CN110187772A - Clapping gesture recognition method - Google Patents

Clapping gesture recognition method

Info

Publication number
CN110187772A
CN110187772A (application CN201910479541.XA; granted as CN110187772B)
Authority
CN
China
Prior art keywords
modulus value
acceleration
hands
clapping
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910479541.XA
Other languages
Chinese (zh)
Other versions
CN110187772B (en)
Inventor
蔡浩原
刘春秀
李文宽
赵晟霖
杨磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Electronics of CAS
Original Assignee
Institute of Electronics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Electronics of CAS
Priority to CN201910479541.XA
Publication of CN110187772A
Application granted
Publication of CN110187772B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a clapping-gesture recognition method, comprising: Step 1: initializing the algorithm-library variables; Step 2: acquiring three-axis acceleration vector data from a sensor of a smart wearable device; Step 3: computing the acceleration modulus from the three-axis acceleration vector data obtained in Step 2; Step 4: from the acceleration moduli computed in Step 3 at different times, obtaining the rising-edge time and the falling-edge time of the acceleration modulus; Step 5: obtaining the maximum acceleration modulus between the rising-edge time and the falling-edge time acquired in Step 4, and computing the height-to-width ratio of the acceleration modulus; and Step 6: completing the recognition of the clapping gesture from the height-to-width ratio computed in Step 5.

Description

Clapping gesture recognition method
Technical field
The present disclosure relates to the technical field of motion recognition, and in particular to a clapping-gesture recognition method based on inertial sensors.
Background art
Smart wearable devices are one of the hot areas of today's consumer electronics. Using the inertial sensors on a wearable device, such as accelerometers and gyroscopes, the wearer's gestures can be detected and recognized. Compared with vision-based gesture recognition methods, this approach has the advantages of being unaffected by ambient light, fast recognition, and low cost. However, existing gesture recognition methods based on inertial sensors suffer from low recognition rates and strict requirements on the user's motion.
Patent application 201610159248.1 proposes a gesture recognition method based on an acceleration sensor, in which an accelerometer and a gyroscope are used to compute the angle between the sensor and the horizontal plane, which is then compared with the gesture-motion features in an action library to recognize the gesture. This method places strict requirements on the distinctiveness of the gesture features: if the user's motion deviates even slightly from the standard motion, the gesture is not recognized.
Summary
(1) Technical problem to be solved
In view of the above problems, the present disclosure provides a clapping-gesture recognition method to alleviate the technical problems of prior-art gesture recognition, namely strict requirements on the distinctiveness of gesture features, strict requirements on the user's motion, and low recognition rates.
(2) technical solution
In an embodiment of the present disclosure, a clapping-gesture recognition method is provided, comprising:
Step 1: initializing the algorithm-library variables;
Step 2: acquiring three-axis acceleration vector data from a sensor of a smart wearable device;
Step 3: computing the acceleration modulus from the three-axis acceleration vector data obtained in Step 2;
Step 4: from the acceleration moduli computed in Step 3 at different times, obtaining the rising-edge time and the falling-edge time of the acceleration modulus;
Step 5: obtaining the maximum acceleration modulus between the rising-edge time and the falling-edge time acquired in Step 4, and computing the height-to-width ratio of the acceleration modulus; and
Step 6: completing the recognition of the clapping gesture from the height-to-width ratio computed in Step 5.
In embodiments of the present disclosure, the variables include: the acceleration modulus; the acceleration peak; the acceleration-modulus rising-edge time; the acceleration-modulus falling-edge time; and the acceleration-modulus height-to-width ratio.
In embodiments of the present disclosure, the acceleration modulus is computed as:
NormAcc = √(ax² + ay² + az²)
where NormAcc is the acceleration modulus, ax is the x-axis acceleration, ay is the y-axis acceleration, and az is the z-axis acceleration.
In embodiments of the present disclosure, in Step 4, if LastNormAcc < Acc_th and CurNormAcc > Acc_th, the current time is recorded as the rising-edge time T1, where LastNormAcc is the acceleration modulus at the previous time, CurNormAcc is the acceleration modulus at the current time, and Acc_th is the acceleration-modulus threshold.
In embodiments of the present disclosure, in Step 4, if LastNormAcc > Acc_th and CurNormAcc < Acc_th and T1 > 0, the current time is recorded as the falling-edge time T2.
In embodiments of the present disclosure, the acceleration-modulus threshold satisfies 0.3 ≤ Acc_th ≤ 0.8.
In embodiments of the present disclosure, in Step 5, denoting the acceleration-modulus height-to-width ratio as Height2Width:
Height2Width = PeakValue / (T2 − T1)
where PeakValue is the maximum acceleration modulus between the rising-edge time and the falling-edge time acquired in Step 4.
In embodiments of the present disclosure, in Step 6, whether a clapping gesture has occurred is determined by comparing the acceleration-modulus height-to-width ratio with a height-to-width-ratio threshold.
In embodiments of the present disclosure, if the acceleration-modulus height-to-width ratio Height2Width is greater than the set threshold H2W_th, one clapping gesture is determined to have occurred.
In embodiments of the present disclosure, the height-to-width-ratio threshold satisfies 0.1 ≤ H2W_th ≤ 1.
(3) Beneficial effects
As can be seen from the above technical solutions, the clapping-gesture recognition method of the present disclosure has at least one, or part, of the following advantages:
(1) a high recognition rate and a low false-recognition rate;
(2) low cost, fast recognition, and low computing-resource usage;
(3) no strict requirements on the user's motion, and therefore high applicability.
Brief description of the drawings
Fig. 1 is a flow diagram of the clapping-gesture recognition method of an embodiment of the present disclosure.
Fig. 2 is a diagram of the operating architecture of the clapping-gesture recognition method of an embodiment of the present disclosure.
Fig. 3 is a time-domain waveform of the acceleration modulus for clapping gestures recognized by the method of an embodiment of the present disclosure.
Specific embodiments
The present disclosure provides a clapping-gesture recognition method that recognizes the clapping gesture by computing the height-to-width ratio of the acceleration modulus. The method uses very few computing resources, achieves a high recognition rate and a low false-recognition rate, and places no strict requirements on the user's motion, giving it high applicability.
To make the objects, technical solutions, and advantages of the present disclosure clearer, the disclosure is further described below with reference to specific embodiments and the accompanying drawings.
In an embodiment of the present disclosure, a clapping-gesture recognition method is provided. Referring to Fig. 1 and Fig. 2, the method comprises:
Step 1: initialize the algorithm-library variables.
First, the variables of the algorithm library are initialized, including the acceleration modulus NormAcc, the acceleration peak PeakValue, the acceleration-modulus rising-edge time T1, the acceleration-modulus falling-edge time T2, and the height-to-width ratio Height2Width.
Step 2: acquire the three-axis acceleration vector data from a sensor of the smart wearable device.
The three-axis acceleration vector data is denoted [ax, ay, az], where ax is the x-axis acceleration, ay is the y-axis acceleration, and az is the z-axis acceleration.
Step 3: compute the acceleration modulus from the three-axis acceleration vector data obtained in Step 2, using the formula:
NormAcc = √(ax² + ay² + az²)
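As a minimal sketch of Step 3 (Python and the function name are illustrative choices, not part of the disclosure):

```python
import math

def acceleration_modulus(ax: float, ay: float, az: float) -> float:
    """Compute NormAcc = sqrt(ax^2 + ay^2 + az^2) for one accelerometer sample."""
    return math.sqrt(ax * ax + ay * ay + az * az)
```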
Step 4: from the acceleration moduli computed in Step 3 at different times, obtain the rising-edge time and the falling-edge time of the acceleration modulus, as follows.
The acceleration modulus at the previous time is saved as LastNormAcc; the acceleration modulus at the current time is CurNormAcc.
If LastNormAcc < Acc_th and CurNormAcc > Acc_th, the current time is recorded as the rising-edge time T1.
If LastNormAcc > Acc_th and CurNormAcc < Acc_th and T1 > 0, the current time is recorded as the falling-edge time T2.
Here Acc_th is the acceleration-modulus threshold. In practice, the noise floor of the sensor in a smart wearable device is around 0.1, so the threshold is set to more than 3 times the noise floor, preferably 0.3 ≤ Acc_th ≤ 0.8.
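A sketch of the Step 4 edge detection, assuming timestamped samples arrive one at a time (variable names mirror the disclosure; the function itself is illustrative):

```python
def detect_edges(last_norm_acc: float, cur_norm_acc: float, now: float,
                 acc_th: float, t1: float, t2: float) -> tuple[float, float]:
    """Update the rising-edge time T1 and falling-edge time T2 for one sample."""
    if last_norm_acc < acc_th and cur_norm_acc > acc_th:
        t1 = now   # rising edge: modulus crosses Acc_th upward
    elif last_norm_acc > acc_th and cur_norm_acc < acc_th and t1 > 0:
        t2 = now   # falling edge: modulus crosses Acc_th downward after a rising edge
    return t1, t2
```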
Step 5: obtain the maximum acceleration modulus between the rising-edge time and the falling-edge time acquired in Step 4, and compute the height-to-width ratio of the acceleration modulus.
The maximum acceleration modulus between the rising-edge time and the falling-edge time is denoted PeakValue, and the height-to-width ratio is denoted Height2Width; then:
Height2Width = PeakValue / (T2 − T1)
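The Step 5 ratio is then a single division; T2 > T1 holds by construction, since a falling edge is only recorded after a rising edge (again an illustrative sketch, not the disclosure's own code):

```python
def height_to_width(peak_value: float, t1: float, t2: float) -> float:
    """Height2Width = PeakValue / (T2 - T1); the divisor is positive because T2 > T1."""
    return peak_value / (t2 - t1)
```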
Step 6: complete the recognition of the clapping gesture from the height-to-width ratio computed in Step 5.
If the height-to-width ratio is greater than a height-to-width-ratio threshold H2W_th, that is, if
Height2Width > H2W_th,
one clap is determined to have occurred. The threshold satisfies 0.1 ≤ H2W_th ≤ 1.
By executing Steps 2 through 6 in a loop, clapping gestures can be recognized continuously, as in the sketch below.
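Putting Steps 1 through 6 together, a minimal streaming detector might look as follows. The class, its default thresholds (0.5 lies inside the disclosure's preferred ranges of 0.3–0.8 for Acc_th and 0.1–1 for H2W_th), and the assumption that the incoming data is such that the modulus rests near the 0.1 noise floor (the disclosure does not say whether gravity is removed first) are assumptions for illustration, not the patent's own code:

```python
import math

class ClapDetector:
    """Streaming clapping-gesture detector following Steps 1-6 of the disclosure."""

    def __init__(self, acc_th: float = 0.5, h2w_th: float = 0.5):
        self.acc_th = acc_th    # acceleration-modulus threshold Acc_th
        self.h2w_th = h2w_th    # height-to-width-ratio threshold H2W_th
        self.last_norm = 0.0    # LastNormAcc
        self.peak = 0.0         # PeakValue
        self.t1 = 0.0           # rising-edge time T1 (0 means "not yet seen")

    def feed(self, t: float, ax: float, ay: float, az: float) -> bool:
        """Process one timestamped sample; return True when a clap is recognized."""
        cur = math.sqrt(ax * ax + ay * ay + az * az)        # Step 3: modulus
        clap = False
        if self.last_norm < self.acc_th and cur > self.acc_th:
            self.t1, self.peak = t, cur                     # Step 4: rising edge
        elif self.t1 > 0:
            self.peak = max(self.peak, cur)                 # Step 5: track maximum
            if self.last_norm > self.acc_th and cur < self.acc_th:
                h2w = self.peak / (t - self.t1)             # Step 5: Height2Width
                clap = h2w > self.h2w_th                    # Step 6: decision
                self.t1, self.peak = 0.0, 0.0               # Step 1: re-initialize
        self.last_norm = cur
        return clap
```

Feeding samples in a loop (`for t, ax, ay, az in stream: if detector.feed(t, ax, ay, az): ...`) then recognizes claps continuously, matching the looped execution of Steps 2 through 6 described above.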
In an embodiment of the present disclosure, Fig. 3 shows the time-domain waveform of the acceleration modulus for two recognized clapping gestures.
The embodiments of the present disclosure have thus been described in detail with reference to the accompanying drawings. It should be noted that implementations not shown or described in the drawings or in the text of the specification take forms known to those of ordinary skill in the art and are not described in detail. Furthermore, the above definitions of the elements and methods are not limited to the specific structures, shapes, or modes mentioned in the embodiments, which those of ordinary skill in the art may simply change or replace.
From the above description, those skilled in the art should have a clear understanding of the clapping-gesture recognition method of the present disclosure.
In summary, the present disclosure provides a clapping-gesture recognition method that uses the inertial sensors on a wearable device, such as accelerometers and gyroscopes, to detect and recognize the wearer's gestures. It has the advantages of fast recognition, low computing-resource usage, a high recognition rate, and a low false-recognition rate.
It should also be noted that directional terms mentioned in the embodiments, such as "upper", "lower", "front", "rear", "left", and "right", refer only to the directions in the drawings and are not intended to limit the scope of protection of the present disclosure. Throughout the drawings, identical elements are denoted by the same or similar reference numerals. Conventional structures or constructions are omitted where they might obscure the understanding of the disclosure.
The shapes and sizes of the components in the drawings do not reflect actual sizes and proportions, but merely illustrate the content of the embodiments of the present disclosure. Furthermore, in the claims, any reference signs placed between parentheses shall not be construed as limiting the claims.
Unless otherwise indicated, the numerical parameters in this specification and the appended claims are approximations that can vary depending on the desired properties sought through the content of the present disclosure. Specifically, all numbers used in the specification and claims to express compositions, reaction conditions, and the like are to be understood as modified in all instances by the term "about". In general, this expresses a variation of ±10% in some embodiments, ±5% in some embodiments, ±1% in some embodiments, and ±0.5% in some embodiments of the stated quantity.
Furthermore, the word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements.
Ordinal words such as "first", "second", and "third" used in the specification and claims to modify a corresponding element do not by themselves imply that the element has any ordinal position, nor do they represent the order of one element relative to another or the order of steps in a manufacturing method; they are used only to distinguish one element with a certain name from another element with the same name.
In addition, unless steps are specifically described or must occur in sequence, the order of the above steps is not limited to that listed above and may be changed or rearranged according to the desired design. The above embodiments may also be mixed and matched with one another or with other embodiments based on considerations of design and reliability; that is, technical features in different embodiments may be freely combined to form further embodiments.
Those skilled in the art will appreciate that the modules in the device of an embodiment may be adaptively changed and arranged in one or more devices different from that embodiment. The modules, units, or components of an embodiment may be combined into one module, unit, or component, and may furthermore be divided into a plurality of sub-modules, sub-units, or sub-components. Except where at least some of such features and/or processes or units are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract, and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract, and drawings) may be replaced by an alternative feature serving the same, an equivalent, or a similar purpose. Moreover, in a claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware.
Similarly, it should be appreciated that, in order to streamline the disclosure and aid understanding of one or more of the various disclosed aspects, in the above description of exemplary embodiments the features of the disclosure are sometimes grouped together in a single embodiment, figure, or description thereof. However, the disclosed method is not to be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, the disclosed aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate embodiment of the present disclosure.
The specific embodiments described above further describe the objects, technical solutions, and beneficial effects of the present disclosure in detail. It should be understood that the above are merely specific embodiments of the present disclosure and are not intended to limit it; any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the present disclosure shall be included within its scope of protection.

Claims (10)

1. A clapping-gesture recognition method, comprising:
Step 1: initializing algorithm-library variables;
Step 2: acquiring three-axis acceleration vector data from a sensor of a smart wearable device;
Step 3: computing an acceleration modulus from the three-axis acceleration vector data obtained in Step 2;
Step 4: from the acceleration moduli computed in Step 3 at different times, obtaining a rising-edge time and a falling-edge time of the acceleration modulus;
Step 5: obtaining the maximum acceleration modulus between the rising-edge time and the falling-edge time acquired in Step 4, and computing a height-to-width ratio of the acceleration modulus; and
Step 6: completing recognition of the clapping gesture from the height-to-width ratio computed in Step 5.
2. The clapping-gesture recognition method according to claim 1, wherein the variables include: the acceleration modulus; the acceleration peak; the acceleration-modulus rising-edge time; the acceleration-modulus falling-edge time; and the acceleration-modulus height-to-width ratio.
3. The clapping-gesture recognition method according to claim 1, wherein the acceleration modulus is computed as NormAcc = √(ax² + ay² + az²), where NormAcc is the acceleration modulus, ax is the x-axis acceleration, ay is the y-axis acceleration, and az is the z-axis acceleration.
4. The clapping-gesture recognition method according to claim 1, wherein in Step 4, if LastNormAcc < Acc_th and CurNormAcc > Acc_th, the current time is recorded as the rising-edge time T1, where LastNormAcc is the acceleration modulus at the previous time, CurNormAcc is the acceleration modulus at the current time, and Acc_th is the acceleration-modulus threshold.
5. The clapping-gesture recognition method according to claim 1, wherein in Step 4, if LastNormAcc > Acc_th and CurNormAcc < Acc_th and T1 > 0, the current time is recorded as the falling-edge time T2.
6. The clapping-gesture recognition method according to claim 1 or 4, wherein the acceleration-modulus threshold satisfies 0.3 ≤ Acc_th ≤ 0.8.
7. The clapping-gesture recognition method according to claim 1, wherein in Step 5, denoting the acceleration-modulus height-to-width ratio as Height2Width: Height2Width = PeakValue / (T2 − T1), where PeakValue is the maximum acceleration modulus between the rising-edge time and the falling-edge time acquired in Step 4.
8. The clapping-gesture recognition method according to claim 1, wherein in Step 6, whether a clapping gesture has occurred is determined by comparing the acceleration-modulus height-to-width ratio with a height-to-width-ratio threshold.
9. The clapping-gesture recognition method according to claim 8, wherein if the acceleration-modulus height-to-width ratio Height2Width is greater than a set threshold H2W_th, one clapping gesture is determined to have occurred.
10. The clapping-gesture recognition method according to claim 9, wherein the height-to-width-ratio threshold satisfies 0.1 ≤ H2W_th ≤ 1.
CN201910479541.XA 2019-06-03 2019-06-03 Clapping gesture recognition method Active CN110187772B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910479541.XA CN110187772B (en) 2019-06-03 2019-06-03 Clapping gesture recognition method


Publications (2)

Publication Number Publication Date
CN110187772A 2019-08-30
CN110187772B 2020-09-25

Family

ID=67720069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910479541.XA Active CN110187772B (en) 2019-06-03 2019-06-03 Clapping gesture recognition method

Country Status (1)

Country Link
CN (1) CN110187772B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080278447A1 (en) * 2007-05-08 2008-11-13 Ming-Yen Lin Three-demensional mouse appratus
US20110265045A1 (en) * 2010-04-26 2011-10-27 Via Technologies, Inc. Electronic system and method for operating touch screen thereof
CN103984416A (en) * 2014-06-10 2014-08-13 北京邮电大学 Gesture recognition method based on acceleration sensor
CN104134028A (en) * 2014-07-29 2014-11-05 广州视源电子科技股份有限公司 Identity authentication method and system based on gesture features
CN105786182A (en) * 2016-02-26 2016-07-20 深圳还是威健康科技有限公司 Method and device for controlling periphery devices based on gesture
CN105824420A (en) * 2016-03-21 2016-08-03 李骁 Gesture recognition method based on acceleration transducer
CN106598231A (en) * 2016-11-22 2017-04-26 深圳市元征科技股份有限公司 Gesture identification method and apparatus
CN107491254A (en) * 2016-06-13 2017-12-19 中兴通讯股份有限公司 A kind of gesture operation method, device and mobile terminal


Also Published As

Publication number Publication date
CN110187772B (en) 2020-09-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant