CN105824420B - Gesture recognition method based on an acceleration sensor - Google Patents

Gesture recognition method based on an acceleration sensor Download PDF

Info

Publication number
CN105824420B
CN105824420B (application CN201610159248.1A / CN201610159248A)
Authority
CN
China
Prior art keywords
gesture
acceleration
data
axis
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610159248.1A
Other languages
Chinese (zh)
Other versions
CN105824420A (en)
Inventor
李骁
杨明胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chen Ailian
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201610159248.1A priority Critical patent/CN105824420B/en
Publication of CN105824420A publication Critical patent/CN105824420A/en
Application granted granted Critical
Publication of CN105824420B publication Critical patent/CN105824420B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Abstract

The invention discloses a gesture recognition method based on an acceleration sensor, belonging to the technical field of gesture recognition. During the calculation, the method automatically determines the start and end of an action; no external control intervention is needed. Three successive checks (angle, features, state) progressively exclude non-conforming gestures, reducing the overall amount of computation while also lowering the probability of false triggering. The features in the action library are obtained by collecting a large number of gesture samples and are entered by the system; they cannot be adjusted by individual users. Although this places high demands on the distinctiveness of gestures, the computation load is small and the recognition rate is high, improving the real-time performance of the algorithm so that fast recognition can be achieved at low cost.

Description

Gesture recognition method based on an acceleration sensor
Technical field
The invention belongs to the technical field of gesture recognition, and more particularly relates to a gesture recognition method based on an acceleration sensor.
Background technology
Smart wearable devices are one of the frontier directions of today's manufacturing field, and motion recognition, including gesture recognition, is widely applied in them. Acceleration sensors are applied to gesture recognition by detecting and analyzing the acceleration signals generated by hand gestures, so as to identify the user's motion state during movement and thereby operate the smart device. However, existing applications of acceleration sensors in the gesture recognition field cannot accurately determine the start and end of an action (or require manual setting) and cannot reliably filter out non-conforming gestures, leading to shortcomings such as false triggering.
Summary of the invention
The object of the invention is to overcome the above shortcomings and deficiencies of the prior art and to provide a gesture recognition method based on an acceleration sensor.
The object of the invention is achieved by the following technical solution. A gesture recognition method based on an acceleration sensor comprises the following steps:
(1) acquiring the acceleration data and angular velocity data of the user's hand motion through a sensor;
(2) applying Kalman filtering to the data acquired in step (1) to remove hand tremor during the gesture and sensor noise;
(3) detecting the boundaries of the action: using the correlation between gesture motion and the acceleration change amount, the start and end of the action are determined automatically, and this segment of action data is extracted as the gesture to be judged;
(4) applying high-pass and low-pass filtering to the action data extracted in step (3) to filter out non-gesture motion data and further reject interference;
(5) attitude calculation: the angle between the sensor and the ground plane during the movement is computed from the sensor data to obtain attitude data, and mis-operation gestures with abnormal orientation are excluded according to the angle;
(6) comparing the angular velocity data, acceleration data and attitude data against the gesture motion features in the action library to confirm whether the gesture is valid and, if so, its type;
(7) the gesture control device matched with this algorithm has multiple working states; according to the current state of the device, if the gesture is valid, the corresponding command is issued. The same gesture thus represents different commands in different states, which effectively reduces the number of gestures, eases recognition and is convenient for users to memorize.
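The state-dependent mapping of step (7) can be sketched as a lookup table keyed by (state, gesture). All names below (states, gestures, commands) are illustrative placeholders, not from the patent:

```python
# Sketch of step (7): the same gesture maps to different commands
# depending on the device's current working state.
COMMAND_TABLE = {
    ("standby", "flick_up"): "wake",
    ("playing", "flick_up"): "volume_up",
    ("playing", "flick_down"): "volume_down",
}

def dispatch(state, gesture):
    """Return the command bound to a recognized gesture in the given
    state, or None when the gesture has no binding in that state."""
    return COMMAND_TABLE.get((state, gesture))
```

Because the binding is per-state, a small gesture vocabulary can cover many commands, which is the point the patent makes about reducing the number of gestures.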
When collecting data in step (1), the acceleration data is sampled at a fixed time interval.
Step (2) uses Kalman filtering: with the measurement variance known, the estimate is updated in real time from the data collected by the sensor, and the optimal estimate of the current true value is calculated, filtering out hand tremor during the gesture and the precision error of the sensor hardware itself.
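As an illustration of step (2), a minimal scalar Kalman filter with known measurement variance might look like the sketch below; the patent does not give its filter equations, so the process variance and initialization here are assumptions:

```python
def kalman_smooth(measurements, meas_var, process_var=1e-3):
    """Scalar Kalman filter: given a known measurement variance,
    update the state estimate with each new accelerometer sample and
    return the sequence of filtered (optimal) estimates."""
    x, p = measurements[0], 1.0   # initial state estimate and its variance
    out = [x]
    for z in measurements[1:]:
        p += process_var          # predict: uncertainty grows between samples
        k = p / (p + meas_var)    # Kalman gain
        x += k * (z - x)          # correct the estimate with measurement z
        p *= (1 - k)              # reduce the estimate variance
        out.append(x)
    return out
```

In practice one filter instance per axis (x, y, z acceleration and each angular rate) would be run on the fixed-interval samples from step (1).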
Step (3) uses a boundary detection method based on the acceleration change amount, calculated as follows:
1) First compute the x, y, z linear acceleration of each sampled point, with the influence of gravity removed:
a_xi = a_oxi − g_xi,  a_yi = a_oyi − g_yi,  a_zi = a_ozi − g_zi
where a_oxi, a_oyi, a_ozi are the x, y, z acceleration readings measured at the i-th sampled point, and g_xi, g_yi, g_zi are the components of gravitational acceleration on the x, y, z axes at that point.
2) Compute the sum of absolute acceleration differences Δa_i of the i-th sampled point:
Δa_i = abs(a_xi − a_x(i−1)) + abs(a_yi − a_y(i−1)) + abs(a_zi − a_z(i−1))
3) Define N as the number of sampled points examined and M_th as the detection threshold for the start and end:
If the sum of Δa over the last N samples, Σ_{k=i−N+1}^{i} Δa_k, exceeds M_th, then i is the gesture starting point;
If that sum falls below M_th and a starting point already exists, then i is the gesture end point.
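The three boundary-detection steps above can be sketched as follows. Since the patent's threshold formulas are only partially legible, the sliding-window form of the start/end test is an assumption:

```python
def detect_boundaries(delta_a, n, m_th):
    """Boundary detection sketch: delta_a[i] is the per-sample sum of
    absolute acceleration differences. A start is declared when the sum
    over a sliding window of n samples exceeds the threshold m_th; an
    end when the sum falls back below m_th after a start exists."""
    start = end = None
    for i in range(n - 1, len(delta_a)):
        window_sum = sum(delta_a[i - n + 1:i + 1])
        if start is None and window_sum > m_th:
            start = i
        elif start is not None and window_sum < m_th:
            end = i
            break
    return start, end
```

The segment delta_a[start:end] is then handed to the preliminary filter of step (4).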
The preliminary filtering of step (4): first survey the maximum and minimum of the acceleration change amount and of the action time length over all gestures in the action library; then, using these boundary rules, judge the action data extracted in step (3) and discard any data outside the maximum/minimum range.
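A minimal sketch of this preliminary filter, assuming the min/max limits have been precomputed from the action-library samples (the parameter layout is illustrative):

```python
def prefilter(segment, dt, limits):
    """Reject a captured segment whose duration or total acceleration
    change falls outside the min/max observed over all gestures in the
    action library. `segment` holds the per-sample delta-a values and
    `limits` = (min_len, max_len, min_change, max_change)."""
    min_len, max_len, min_chg, max_chg = limits
    duration = len(segment) * dt          # dt = fixed sampling interval
    total_change = sum(segment)
    return (min_len <= duration <= max_len) and (min_chg <= total_change <= max_chg)
```

Segments rejected here never reach the attitude and feature checks, which is how the scheme keeps the overall computation small.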
The attitude calculation of step (5) proceeds as follows: first, from the components g_x, g_y, g_z of gravitational acceleration on the x, y, z axes, derive the quaternion q = [w, x, y, z] representing the sensor's rotation (alternatively, the quaternion may be output directly by the sensor); then, from the quaternion q, compute the angles between the sensor and the ground: the yaw angle ψ (z-axis), the pitch angle θ (y-axis) and the roll angle φ (x-axis).
If the angular range of the current gesture is within the preset range, this data is passed on to the next step.
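The quaternion-to-angle step can be illustrated with the standard Z-Y-X (aerospace) Euler conversion. The patent's own formulas were lost with the figures, so this is the conventional form, not necessarily the exact patented one:

```python
import math

def quat_to_euler(q):
    """Convert a unit quaternion q = (w, x, y, z) to yaw (z-axis),
    pitch (y-axis) and roll (x-axis) angles in radians, using the
    standard Z-Y-X Euler convention."""
    w, x, y, z = q
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    # clamp the asin argument to guard against rounding error
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - x * z))))
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    return yaw, pitch, roll
```

The resulting (ψ, θ, φ) triple is what the angle gate compares against the preset range before the data moves on.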
The gesture motions in the action library of step (6) are distinguished by the following feature points:
1. The maximum and minimum of the acceleration change value, i.e. the maximum and minimum of the sum of absolute acceleration differences Δa_i calculated in step (3);
2. The gesture time length L: L = t_end − t_start;
3. The acceleration change value along each of the x, y, z axes, computed over the gesture time span L, and the axis with the largest acceleration change;
4. The angular velocity change value along each of the x, y, z axes, and the axis with the largest angular velocity change, where ψ, θ and φ of each sampled point are the yaw angle (z-axis), pitch angle (y-axis) and roll angle (x-axis) calculated in step (5);
5. The total angle swept on each of the x, y, z axes;
6. The acceleration wave-crest counts P_x, P_y, P_z: the number of acceleration peaks the device produces on the x, y, z axes within the gesture boundary;
7. The angular velocity wave-crest counts P_ψ, P_θ, P_φ: the number of angular velocity peaks the device produces on the x, y, z axes within the gesture boundary.
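Feature points 6 and 7 both count wave crests on an axis. A simple crest counter might look like the sketch below; the height threshold used to ignore small bumps is an assumption, as the patent does not specify one:

```python
def count_peaks(samples, threshold):
    """Count local maxima (wave crests) in one axis of the captured
    segment, ignoring any sample at or below the height threshold."""
    peaks = 0
    for i in range(1, len(samples) - 1):
        if (samples[i] > threshold
                and samples[i] > samples[i - 1]
                and samples[i] >= samples[i + 1]):
            peaks += 1
    return peaks
```

Running this once per axis on the acceleration series gives P_x, P_y, P_z, and on the angle series gives P_ψ, P_θ, P_φ.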
For the data obtained in step (5), the comparison proceeds item by item in the order of the above 7 feature points. If all feature points of some gesture in the action library are satisfied, the segment is determined to be that gesture and the recognition process stops; if no matching action is found, the gesture is judged invalid and the recognition process stops.
The feature data in the action library is obtained by collecting gesture samples under different conditions and from different individuals; it is entered by the system and cannot be adjusted by individual users. This scheme places high demands on the distinctiveness of gestures, but the computation load is small, the recognition rate is high, and no adaptation to a single individual is required, so fast recognition can be achieved at low cost.
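The item-by-item comparison against the action library can be sketched as follows; the feature names and ranges are illustrative, and the library order determines which template wins when several could match:

```python
def match_gesture(features, library):
    """Compare the extracted feature values of a captured segment
    against every template in the action library, in order. Return the
    name of the first template whose every feature range contains the
    measured value, or None when no template matches (invalid gesture)."""
    for name, template in library:
        if all(lo <= features[key] <= hi for key, (lo, hi) in template.items()):
            return name
    return None
```

Because each template is just a set of min/max ranges, the check is a handful of comparisons per gesture, matching the patent's claim of a small computation load.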
Compared with the prior art, the present invention has the following advantages and effects:
1. During the calculation, the start and end of the action are determined automatically; no external control intervention is needed.
2. Three successive checks (angle, features, state) progressively exclude non-conforming gestures, reducing the overall amount of computation while also lowering the probability of false triggering.
3. The action-library features are obtained by collecting a large number of gesture samples and are entered by the system; they cannot be adjusted by individual users. Although this places high demands on the distinctiveness of gestures, the computation load is small, the recognition rate is high, and the real-time performance of the algorithm is improved, so fast recognition can be achieved at low cost.
4. The gesture control device matched with the algorithm has multiple working states, and the same gesture represents different commands in different states. This reduces the number of gestures, which addresses the "high demands on gesture distinctiveness" noted in point 3, eases recognition, and is also convenient for users to memorize.
Description of the drawings
Fig. 1 is a flow chart of the gesture recognition method based on an acceleration sensor of the present invention.
Fig. 2 shows the result of the boundary detection method based on the acceleration change amount in step (3) of the present invention, where A is the starting point and B is the end point.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the embodiments and the accompanying drawings, but the embodiments of the present invention are not limited thereto.
Embodiment 1
As shown in Fig. 1, the present invention provides a gesture recognition method based on an acceleration sensor, comprising the following steps:
(1) acquiring the acceleration data and angular velocity data of the user's hand motion through a sensor;
(2) applying Kalman filtering to the data acquired in step (1) to remove hand tremor during the gesture and sensor noise;
(3) detecting the boundaries of the action: using the correlation between gesture motion and the acceleration change amount, the start and end of the action are determined automatically, and this segment of action data is extracted as the gesture to be judged;
(4) applying high-pass and low-pass filtering to the action data extracted in step (3) to filter out non-gesture motion data and further reject interference;
(5) attitude calculation: the angle between the sensor and the ground plane during the movement is computed from the sensor data to obtain attitude data, and mis-operation gestures with abnormal orientation are excluded according to the angle;
(6) comparing the angular velocity data, acceleration data and attitude data against the gesture motion features in the action library to confirm whether the gesture is valid and, if so, its type;
(7) the gesture control device matched with this algorithm has multiple working states; according to the current state of the device, if the gesture is valid, the corresponding command is issued. The same gesture thus represents different commands in different states, which effectively reduces the number of gestures, eases recognition and is convenient for users to memorize.
When collecting data in step (1), the acceleration data is sampled at a fixed time interval.
Step (2) uses Kalman filtering: with the measurement variance known, the estimate is updated in real time from the data collected by the sensor, and the optimal estimate of the current true value is calculated, filtering out hand tremor during the gesture and the precision error of the sensor hardware itself.
Step (3) uses a boundary detection method based on the acceleration change amount, whose result is shown in Fig. 2; it is calculated as follows:
1) First compute the x, y, z linear acceleration of each sampled point, with the influence of gravity removed:
a_xi = a_oxi − g_xi,  a_yi = a_oyi − g_yi,  a_zi = a_ozi − g_zi
where a_oxi, a_oyi, a_ozi are the x, y, z acceleration readings measured at the i-th sampled point, and g_xi, g_yi, g_zi are the components of gravitational acceleration on the x, y, z axes at that point.
2) Compute the sum of absolute acceleration differences Δa_i of the i-th sampled point:
Δa_i = abs(a_xi − a_x(i−1)) + abs(a_yi − a_y(i−1)) + abs(a_zi − a_z(i−1))
3) Define N as the number of sampled points examined and M_th as the detection threshold for the start and end:
If the sum of Δa over the last N samples, Σ_{k=i−N+1}^{i} Δa_k, exceeds M_th, then i is the gesture starting point;
If that sum falls below M_th and a starting point already exists, then i is the gesture end point.
The preliminary filtering of step (4): first survey the maximum and minimum of the acceleration change amount and of the action time length over all gestures in the action library; then, using these boundary rules, judge the action data extracted in step (3) and discard any data outside the maximum/minimum range.
The attitude calculation of step (5) proceeds as follows: first, from the components g_x, g_y, g_z of gravitational acceleration on the x, y, z axes, derive the quaternion q = [w, x, y, z] representing the sensor's rotation (alternatively, the quaternion may be output directly by the sensor); then, from the quaternion q, compute the angles between the sensor and the ground: the yaw angle ψ (z-axis), the pitch angle θ (y-axis) and the roll angle φ (x-axis).
If the angular range of the current gesture is within the preset range, this data is passed on to the next step.
The gesture motions in the action library of step (6) are distinguished by the following feature points:
1. The maximum and minimum of the acceleration change value, i.e. the maximum and minimum of the sum of absolute acceleration differences Δa_i calculated in step (3);
2. The gesture time length L: L = t_end − t_start;
3. The acceleration change value along each of the x, y, z axes, computed over the gesture time span L, and the axis with the largest acceleration change;
4. The angular velocity change value along each of the x, y, z axes, and the axis with the largest angular velocity change, where ψ, θ and φ of each sampled point are the yaw angle (z-axis), pitch angle (y-axis) and roll angle (x-axis) calculated in step (5);
5. The total angle swept on each of the x, y, z axes;
6. The acceleration wave-crest counts P_x, P_y, P_z: the number of acceleration peaks the device produces on the x, y, z axes within the gesture boundary;
7. The angular velocity wave-crest counts P_ψ, P_θ, P_φ: the number of angular velocity peaks the device produces on the x, y, z axes within the gesture boundary.
For the data obtained in step (5), the comparison proceeds item by item in the order of the above 7 feature points. If all feature points of some gesture in the action library are satisfied, the segment is determined to be that gesture and the recognition process stops; if no matching action is found, the gesture is judged invalid and the recognition process stops.
The feature data in the action library is obtained by collecting gesture samples under different conditions and from different individuals; it is entered by the system and cannot be adjusted by individual users. This scheme places high demands on the distinctiveness of gestures, but the computation load is small, the recognition rate is high, and no adaptation to a single individual is required, so fast recognition can be achieved at low cost.
Compared with the prior art, the present invention has the following advantages and effects:
1. During the calculation, the start and end of the action are determined automatically; no external control intervention is needed.
2. Three successive checks (angle, features, state) progressively exclude non-conforming gestures, reducing the overall amount of computation while also lowering the probability of false triggering.
3. The action-library features are obtained by collecting a large number of gesture samples and are entered by the system; they cannot be adjusted by individual users. Although this places high demands on the distinctiveness of gestures, the computation load is small, the recognition rate is high, and the real-time performance of the algorithm is improved, so fast recognition can be achieved at low cost.
4. The gesture control device matched with the algorithm has multiple working states, and the same gesture represents different commands in different states. This reduces the number of gestures, which addresses the "high demands on gesture distinctiveness" noted in point 3, eases recognition, and is also convenient for users to memorize.
The above embodiment is a preferred embodiment of the present invention, but the embodiments of the present invention are not limited by the above embodiment; any other change, modification, substitution, combination or simplification made without departing from the spirit and principles of the present invention shall be an equivalent replacement and is included within the scope of protection of the present invention.

Claims (6)

1. A gesture recognition method based on an acceleration sensor, characterized by comprising the following steps:
(1) acquiring the acceleration data and angular velocity data of the user's hand motion through a sensor;
(2) applying Kalman filtering to the data acquired in step (1) to remove hand tremor during the gesture and sensor noise;
(3) detecting the boundaries of the action: using the correlation between gesture motion and the acceleration change amount, the start and end of the action are determined automatically, and this segment of action data is extracted as the gesture to be judged;
(4) applying high-pass and low-pass filtering to the action data extracted in step (3) to filter out non-gesture motion data and further reject interference;
(5) attitude calculation: the angle between the sensor and the ground plane during the movement is computed from the sensor data to obtain attitude data, and mis-operation gestures with abnormal orientation are excluded according to the angle;
(6) comparing the angular velocity data, acceleration data and attitude data against the gesture motion features in the action library to confirm whether the gesture is valid and, if so, its type;
(7) the gesture control device matched with this method has multiple working states; according to the current state of the device, if the gesture is valid, the corresponding command is issued, so that the same gesture represents different commands in different states;
wherein the gesture motions in the action library of step (6) are distinguished by the following feature points:
1. the maximum and minimum of the acceleration change value, i.e. the maximum and minimum of the sum of absolute acceleration differences Δa_j calculated in step (3);
2. the gesture time length L: L = t_end − t_start;
3. the acceleration change value along each of the x, y, z axes, computed over the gesture time span L, and the axis with the largest acceleration change;
4. the angle change value along each of the x, y, z axes, and the axis with the largest angle change, where ψ, θ and φ of each sampled point are the yaw angle of the z-axis, the pitch angle of the y-axis and the roll angle of the x-axis calculated in step (5);
5. the total angle swept on each of the x, y, z axes;
6. the acceleration wave-crest counts P_x, P_y, P_z: the number of acceleration peaks the device produces on the x, y, z axes within the gesture boundary;
7. the angular velocity wave-crest counts P_ψ, P_θ, P_φ: the number of angular velocity peaks the device produces on the x, y, z axes within the gesture boundary;
for the data obtained in step (5), the comparison proceeds item by item in the order of the above 7 feature points; if all feature points of some gesture in the action library are satisfied, the segment is determined to be that gesture and the recognition process stops; if no matching action is found, the gesture is judged invalid and the recognition process stops; the feature data in the action library is obtained by collecting gesture samples under different conditions and from different individuals, is entered by the system, and cannot be adjusted by individual users.
2. The gesture recognition method based on an acceleration sensor according to claim 1, characterized in that, when collecting data in step (1), the acceleration data is sampled at a fixed time interval.
3. The gesture recognition method based on an acceleration sensor according to claim 1, characterized in that step (2) uses Kalman filtering: with the measurement variance known, the estimate is updated in real time from the data collected by the sensor, and the optimal estimate of the current true value is calculated, filtering out hand tremor during the gesture and the precision error of the sensor hardware itself.
4. The gesture recognition method based on an acceleration sensor according to claim 1, characterized in that step (3) uses a boundary detection method based on the acceleration change amount, calculated as follows:
1) first compute the x, y, z linear acceleration of each sampled point, with the influence of gravity removed:
a_xj = a_oxj − g_xj,  a_yj = a_oyj − g_yj,  a_zj = a_ozj − g_zj
where a_oxj, a_oyj, a_ozj are the x, y, z acceleration readings measured at the j-th sampled point, and g_xj, g_yj, g_zj are the components of gravitational acceleration on the x, y, z axes at that point;
2) compute the sum of absolute acceleration differences Δa_j of the j-th sampled point:
Δa_j = abs(a_xj − a_x(j−1)) + abs(a_yj − a_y(j−1)) + abs(a_zj − a_z(j−1))
3) define N as the number of sampled points examined and M_th as the detection threshold for the start and end:
if the sum of Δa over the last N samples, Σ_{k=j−N+1}^{j} Δa_k, exceeds M_th, then j is the gesture starting point;
if that sum falls below M_th and a starting point already exists, then j is the gesture end point.
5. The gesture recognition method based on an acceleration sensor according to claim 1, characterized in that the preliminary filtering of step (4) first surveys the maximum and minimum of the acceleration change amount and of the action time length over all gestures in the action library, and then, using these boundary rules, judges the action data extracted in step (3) and discards any data outside the maximum/minimum range.
6. The gesture recognition method based on an acceleration sensor according to claim 1, characterized in that the attitude calculation of step (5) proceeds as follows: first, from the components g_x, g_y, g_z of gravitational acceleration on the x, y, z axes, derive the quaternion q = [w, x, y, z] representing the sensor's rotation; then, from the quaternion q, compute the angles between the sensor and the ground, namely the yaw angle ψ of the z-axis, the pitch angle θ of the y-axis and the roll angle φ of the x-axis;
if the angular range of the current gesture is within the preset range, this data is passed on to the next step.
CN201610159248.1A 2016-03-21 2016-03-21 A kind of gesture identification method based on acceleration transducer Active CN105824420B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610159248.1A CN105824420B (en) 2016-03-21 2016-03-21 A kind of gesture identification method based on acceleration transducer

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610159248.1A CN105824420B (en) 2016-03-21 2016-03-21 A kind of gesture identification method based on acceleration transducer

Publications (2)

Publication Number Publication Date
CN105824420A CN105824420A (en) 2016-08-03
CN105824420B true CN105824420B (en) 2018-09-14

Family

ID=56524667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610159248.1A Active CN105824420B (en) 2016-03-21 2016-03-21 A kind of gesture identification method based on acceleration transducer

Country Status (1)

Country Link
CN (1) CN105824420B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106372673A (en) * 2016-09-06 2017-02-01 深圳市民展科技开发有限公司 Apparatus motion identification method
CN106990841A (en) * 2017-03-30 2017-07-28 无锡云瞳信息科技有限公司 Gesture identification method and the wearable video camera of intelligence based on motion sensor
CN108960016A (en) * 2017-05-25 2018-12-07 富士通株式会社 The method and apparatus for detecting specific action
CN108108015A (en) * 2017-11-20 2018-06-01 电子科技大学 A kind of action gesture recognition methods based on mobile phone gyroscope and dynamic time warping
CN109262608A (en) * 2018-08-22 2019-01-25 南京阿凡达机器人科技有限公司 A kind of method and system that remote-controlled robot is grappled
CN111259694B (en) * 2018-11-30 2021-02-12 北京字节跳动网络技术有限公司 Gesture moving direction identification method, device, terminal and medium based on video
CN109883531A (en) * 2019-03-05 2019-06-14 广州亚美信息科技有限公司 Vehicle vibration kind identification method and system based on acceleration transducer
CN110187772B (en) * 2019-06-03 2020-09-25 中国科学院电子学研究所 Clapping gesture recognition method
CN110377159B (en) * 2019-07-24 2023-06-09 张洋 Action recognition method and device
CN113655879A (en) * 2021-07-15 2021-11-16 上海交通大学 Gesture recognition method and system based on accelerometer
CN115695518B (en) * 2023-01-04 2023-06-30 广东保伦电子股份有限公司 PPT control method based on intelligent mobile equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104185050A (en) * 2014-07-30 2014-12-03 哈尔滨工业大学深圳研究生院 OTT television based intelligent remote control system and control method thereof
CN104731307A (en) * 2013-12-20 2015-06-24 孙伯元 Somatic action identifying method and man-machine interaction device
CN104866099A (en) * 2015-05-27 2015-08-26 东南大学 Error compensation method for improving gesture identification precision of intelligent device based on motion sensor

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101185144B1 (en) * 2006-01-24 2012-09-24 삼성전자주식회사 Method and apparatus for estimating 2-dimension trajectory of a gesture

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104731307A (en) * 2013-12-20 2015-06-24 孙伯元 Somatic action identifying method and man-machine interaction device
CN104185050A (en) * 2014-07-30 2014-12-03 哈尔滨工业大学深圳研究生院 OTT television based intelligent remote control system and control method thereof
CN104866099A (en) * 2015-05-27 2015-08-26 东南大学 Error compensation method for improving gesture identification precision of intelligent device based on motion sensor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于加速度传感器的手势识别 [Gesture recognition based on an acceleration sensor]; 代宏斌 (Dai Hongbin); China Master's Theses Full-text Database; 2014-09-15 (No. 9); pp. 9-27 *

Also Published As

Publication number Publication date
CN105824420A (en) 2016-08-03

Similar Documents

Publication Publication Date Title
CN105824420B (en) A kind of gesture identification method based on acceleration transducer
US9221170B2 (en) Method and apparatus for controlling a robotic device via wearable sensors
CN109579853B (en) Inertial navigation indoor positioning method based on BP neural network
US20190065872A1 (en) Behavior recognition apparatus, learning apparatus, and method and program therefor
CN111666891B (en) Method and device for estimating movement state of obstacle
CN105877757A (en) Multi-sensor integrated human motion posture capturing and recognizing device
CN111611901B (en) Vehicle reverse running detection method, device, equipment and storage medium
EP2975867A1 (en) Method for detecting driving events of a vehicle based on a smartphone
CN110057380B (en) Step counting method, step counting device, terminal and storage medium
CN106184220B (en) Abnormal driving detection method in a kind of track based on vehicle location track
CN111095164A (en) Method and apparatus for detecting user input in dependence on gesture
CN110327050B (en) Embedded intelligent detection method for falling state of person for wearable equipment
CN107766930B (en) Equivalent ROM distance calculation method based on DTW cluster fuzzy clustering SOM neurons
JP2019112049A (en) Method for recognizing driving style of driver of land vehicle and corresponding device
CN109540143B (en) Pedestrian unconventional action direction identification method based on multi-sensing-source dynamic peak fusion
CN107358248B (en) Method for improving falling detection system precision
WO2019033587A1 (en) Smart watch-based gps drifting filtering method, and smart watch
WO2017185222A1 (en) System and method for motion trajectory collection and analysis based on ball games
CN107662613B (en) A kind of extreme driving behavior recognition methods and system based on mobile intelligent perception
CN107203271B (en) Double-hand recognition method based on multi-sensor fusion technology
CN108710828A (en) The method, apparatus and storage medium and vehicle of identification object
CN108051001B (en) Robot movement control method and system and inertial sensing control device
CN106408593A (en) Video-based vehicle tracking method and device
CN106740865B (en) A method of vehicle zig zag event is determined according to acceleration and gyroscope
CN110929766B (en) Self-adaptive pedestrian mobile phone attitude identification method based on Gaussian mixture clustering algorithm

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201022

Address after: No.242, dongshuangding village, Yaxi Town, Rongcheng City, Weihai City, Shandong Province

Patentee after: Chen Ailian

Address before: 518000 Guangdong city of Shenzhen province Baoan District Shengnawei Bay beauty garden 9A-2108

Patentee before: Li Xiao

TR01 Transfer of patent right