CN106406516A - Local real-time movement trajectory characteristic extraction and identification method for smartphone - Google Patents

Local real-time movement trajectory characteristic extraction and identification method for smartphone

Info

Publication number
CN106406516A
CN106406516A CN201610732089.XA
Authority
CN
China
Prior art keywords
mobile phone
data
user
smartphone
smart mobile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610732089.XA
Other languages
Chinese (zh)
Inventor
赵宏
陈攀
侯春宁
王丹丹
韩泽宇
曹昶
张乐
张雨丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lanzhou University of Technology
Original Assignee
Lanzhou University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lanzhou University of Technology filed Critical Lanzhou University of Technology
Priority to CN201610732089.XA priority Critical patent/CN106406516A/en
Publication of CN106406516A publication Critical patent/CN106406516A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08 Feature extraction
    • G06F2218/10 Feature extraction by analysing the shape of a waveform, e.g. extracting parameters relating to peaks

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Telephone Function (AREA)

Abstract

The invention relates to a local real-time movement trajectory feature extraction and recognition method for a smartphone. The method comprises two stages, a training stage and a recognition stage. In the training stage, the user carries the smartphone while performing different actions; the data of the triaxial acceleration sensor is collected; peak and trough feature points are extracted from the user's posture behaviour and binary-coded; the coded feature points are vectorized, and actions with the same number of peaks and troughs are arranged into matrices; multiple samples are collected and trained on a host computer to establish a user action feature standard library. In the recognition stage, the user action feature standard library is transferred to the smartphone; when the user carries the smartphone while performing actions, sensor data is collected on the phone, feature points are extracted locally on the phone, and the standard library is matched, thereby recognizing the user's actions. Because the feature vectors that must be stored for recognizing the smartphone's movement trajectory are binary, the data volume is small and the recognition process is relatively simple, requiring no large amount of computation, so the method is suitable for resource-constrained embedded devices such as smartphones.

Description

Local real-time movement trajectory feature extraction and recognition method for a smartphone
Technical field
The present invention relates to the technical field of smartphone trajectory recognition, and specifically to a local real-time movement trajectory feature extraction and recognition method for a smartphone.
Background technology
A mobile phone has a built-in acceleration sensor that can detect the magnitude and direction of the phone's acceleration and thereby perceive its state in three-dimensional space. The acceleration sensor has coordinate axes in the X, Y and Z directions; when the user carries the phone while performing different actions, different values are produced along the X, Y and Z axes, and the data produced on the three axes together reflects the user's state. At present, feature extraction and recognition of smartphone movement trajectories is relatively complicated, while a smartphone has weak computing power and little storage space, so movement trajectory features have to be extracted and recognized non-real-time by external equipment. This situation cannot meet the need for a smartphone to recognize movement trajectories locally and in real time.
Content of the invention:
The present invention proposes a local real-time movement trajectory feature extraction and recognition method for a smartphone. In the method, the feature point vectors that must be stored for recognizing the phone's movement trajectory are binary and small in size, and the recognition process is relatively simple and requires no large amount of computation, so it can be carried out on resource-constrained embedded devices such as smartphones.
To this end, the technical scheme adopted is as follows:
A local real-time movement trajectory feature extraction and recognition method for a smartphone, the method being divided into a training stage and a recognition stage. In the training stage: the user carries the smartphone while performing different actions, and the data of the smartphone's triaxial acceleration sensor is collected and separated; peak and trough feature points are extracted from the user's posture behaviour; the peak and trough feature points are binary-coded; the coded feature points are vectorized; actions with the same number of peaks and troughs are arranged into matrices; multiple samples are collected and trained on a host computer to establish a user action feature standard library. In the recognition stage: the user action feature standard library is transplanted to the smartphone; when the user carries the smartphone while performing different actions, sensor data is collected on the phone, a buffer is opened on the phone, movement trajectory data is extracted, multiple threads are created to extract segmented feature points, the user action feature standard library is matched, and feature point extraction is performed locally on the smartphone in real time, thereby recognizing the user's action.
Compared with the current situation, in which feature extraction and recognition of smartphone movement trajectories is complicated, the smartphone's computing power is weak and its storage space small, and the movement trajectory features must be extracted and recognized non-real-time by external equipment, the present invention has the following advantages for recognizing smartphone movement trajectories: (1) movement trajectories can be recognized on a smartphone or on small embedded devices; (2) the feature point computation is simple and reduces the normalization of feature points, making it suitable for smartphones with weak computing power; (3) after the movement trajectory feature points are quantized and encoded, the feature point vectors that must be stored for recognizing the phone's movement trajectory are binary and small in size, which suits embedded devices with limited storage resources such as smartphones.
Brief description of the drawings:
Fig. 1 is a flow chart of the smartphone movement trajectory training of the present invention;
Fig. 2 is a flow chart of the smartphone movement trajectory recognition of the present invention;
Fig. 3 is a waveform diagram of the data produced by the smartphone sensor of the present invention;
Fig. 4 is a diagram of the separated sensor data and the feature extraction coding of the present invention;
Fig. 5 is a diagram of peak and trough detection using windows according to the present invention;
Fig. 6 is a diagram of the peak and trough feature vectors of the present invention;
Fig. 7 is a diagram of the peak and trough feature matrix of a specific action of the present invention.
Specific embodiment:
The present invention is described in further detail below with reference to the accompanying drawings.
The training flow of the present invention is shown in Fig. 1:
First, the user carries the smartphone or a small embedded device while performing different actions. The triaxial acceleration sensor produces data, which is separated by the X, Y and Z axes; peak and trough feature points are then found in each direction and encoded; the feature points are vectorized; attitudes with the same number of peaks and troughs are arranged into matrices; multiple samples are collected and the samples are trained with an artificial neural network to form the user action feature standard library.
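As a rough illustration of the host-computer training step, the following sketch groups equally sized 3×M peak-and-trough matrices and fits a small neural network on them to serve as the user action feature standard library. The patent does not name a network architecture or library; scikit-learn's MLPClassifier, the function name build_standard_library and the layer sizes are assumptions.

```python
# Minimal sketch of the host-computer training step (illustrative only).
import numpy as np
from sklearn.neural_network import MLPClassifier

def build_standard_library(samples, labels):
    """samples: list of 3xM binary peak/trough matrices (all with the same M,
    i.e. actions grouped by equal peak/trough count, as described in the text).
    labels: the action name for each sample.
    Returns a trained classifier acting as the user action feature standard library."""
    X = np.array([m.flatten() for m in samples])   # 3xM matrix -> length-3M vector
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
    clf.fit(X, labels)
    return clf
```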
The smartphone acceleration sensor data is separated along the X, Y and Z axes. If a certain action is unique in the X direction, it can already be recognized and the Y and Z axes need not be compared; if it cannot be recognized on the X axis, the Y axis and then the Z axis are processed in turn. This method reduces the processing of data on the phone and saves the smartphone's computing resources.
Peak and trough feature points are extracted from the user's posture behaviour by segmenting the data. This provides a way of extracting peak and trough feature points that is simple, requires little computation, and can filter out noise points while finding the key feature points.
The coded peak and trough feature points are represented in vector form, which puts the feature points into a uniform format, makes further computation convenient and improves computational efficiency.
Actions with the same number of peaks and troughs are arranged into matrices. Each such action forms a 3×M matrix, where M is the number of peaks and troughs on each of the three coordinate axes. This further improves computational efficiency.
The recognition flow of the present invention is shown in Fig. 2:
First, the user action feature standard library is transplanted into the smartphone's storage. When the user carries the smartphone while performing different actions, sensor data is collected on the phone, a buffer is opened on the phone, movement trajectory data is extracted, multiple threads are created to extract segmented feature points, the user action feature standard library is matched, and feature point extraction is performed locally on the smartphone in real time, thereby recognizing the user's action.
The flow is further described below.
As shown in Fig. 3: when the user performs an action, the smartphone acceleration sensor produces data in the X, Y and Z directions; after the data is processed and spliced, it forms three waveforms. The data on the phone's coordinate axes reflects the user's motion state.
As shown in Fig. 4: the three groups of acceleration sensor data in the X, Y and Z directions are processed separately, and the peaks and troughs, in the order in which they appear, serve as the feature points of the waveform. The detected feature points are encoded with a peak as 1 and a trough as 0. Binary coding saves the computer's storage resources.
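The peak/trough coding can be pictured with the following minimal sketch, assuming the detected extrema of one axis are available as a time-ordered list of (kind, value) pairs; the names encode_extrema and extrema are placeholders, not from the patent.

```python
# Illustrative sketch of the binary peak/trough coding (peak -> 1, trough -> 0).
def encode_extrema(extrema):
    """extrema: time-ordered list of ("peak" | "trough", value) pairs for one axis."""
    return [1 if kind == "peak" else 0 for kind, _value in extrema]

# e.g. encode_extrema([("peak", 9.6), ("trough", 2.1), ("peak", 8.8)]) -> [1, 0, 1]
```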
The method of peak and trough feature extraction:
As shown in Fig. 5: the peaks and troughs of a segment of data are identified with a sliding window. A fixed-length segment of the produced data serves as the fixed-size sliding window, and the user's motion state data lies entirely within it. The fixed-size window is divided into equal small sliding windows, and peak and trough detection is carried out in each small window. Each small window may contain a peak and a trough; these are sorted in order of appearance, the peaks are compared with one another and the troughs with one another, and the few largest peaks and the few smallest troughs are taken as the feature points of each axis. This method filters out some of the noise points and finds the key feature points.
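A possible reading of this windowed detection, not necessarily the exact implementation, is sketched below: one axis of the fixed-length segment is split into equal small windows, the local maximum and minimum of each small window are collected, and only the k largest peaks and k smallest troughs are kept in their order of appearance.

```python
# Sketch of the windowed peak/trough detection of Fig. 5 for a single axis.
def detect_peaks_troughs(samples, n_windows=8, k=3):
    size = len(samples) // n_windows
    extrema = []                                   # (index, kind, value), time-ordered
    for w in range(n_windows):
        window = samples[w * size:(w + 1) * size]
        if not window:
            continue
        hi = max(range(len(window)), key=lambda i: window[i])   # local peak
        lo = min(range(len(window)), key=lambda i: window[i])   # local trough
        extrema.append((w * size + hi, "peak", window[hi]))
        extrema.append((w * size + lo, "trough", window[lo]))
    peaks = sorted((e for e in extrema if e[1] == "peak"), key=lambda e: -e[2])[:k]
    troughs = sorted((e for e in extrema if e[1] == "trough"), key=lambda e: e[2])[:k]
    return sorted(peaks + troughs)                 # restore chronological order
```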
As shown in Fig. 6: the peak and trough feature points are vectorized. The peaks and troughs produced over time by the three groups of acceleration sensor data form three vectors, and the data in each vector represents the feature points of the corresponding direction and the order in which they appear.
As shown in Fig. 7: the data formed by a specific action is arranged into a matrix. The data produced in each sensor direction is separated, and if the peak-and-trough sequences of the three coordinate axes are all of the same length, they can be arranged into a 3×M matrix.
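A minimal sketch of this matrixing step, assuming numpy and the placeholder name to_matrix: the three axis codes are stacked into a 3×M matrix only when their lengths match, otherwise they remain three separate vectors.

```python
# Sketch of forming the 3xM matrix from the binary codes of the three axes.
import numpy as np

def to_matrix(code_x, code_y, code_z):
    if not (len(code_x) == len(code_y) == len(code_z)):
        return None                              # unequal lengths: keep the vectors separate
    return np.array([code_x, code_y, code_z])    # shape (3, M)
```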
Recognition stage: the smartphone sensor produces data at every moment, so there is a synchronization problem between the peak and trough computation and the production of data; a buffer is opened and multiple threads are used to solve the synchronization problem.
The implementation is as follows: during peak and trough detection, a large sliding-window buffer is defined, and the data produced by the sensor is iteratively stored into a 3×N fixed-length matrix, where N is a length value set according to the length of the display window. The latest data is appended to the matrix while the earliest stored data is discarded, keeping the length of the matrix at the fixed value N. The fixed-length matrix is then divided into equal parts, and multiple threads are created over the small sliding-window buffers obtained by the division to detect the peak and trough features.
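The buffer and thread arrangement might look like the following sketch, which reuses detect_peaks_troughs from the Fig. 5 sketch above. The fixed-length buffer is modelled with a deque of length N and plain Python threads; on an actual phone this work would sit behind an Android sensor callback, so the names and values here are illustrative only.

```python
# Sketch of the 3xN fixed-length buffer with per-segment detection threads.
import threading
from collections import deque

N = 128                                                # display-window length (example value)
buffers = {axis: deque(maxlen=N) for axis in "xyz"}    # oldest sample drops automatically

def on_sensor_event(ax, ay, az):
    buffers["x"].append(ax); buffers["y"].append(ay); buffers["z"].append(az)

def detect_all(n_parts=3):
    results, threads, lock = [], [], threading.Lock()

    def work(axis, seg):
        found = detect_peaks_troughs(seg, n_windows=2)   # from the Fig. 5 sketch
        with lock:
            results.extend((axis, e) for e in found)

    for axis in "xyz":
        data = list(buffers[axis])
        size = max(1, len(data) // n_parts)
        for p in range(n_parts):                  # one thread per small sliding window
            t = threading.Thread(target=work, args=(axis, data[p * size:(p + 1) * size]))
            threads.append(t)
            t.start()
    for t in threads:
        t.join()
    return results
```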
When comparing the user's motion state with the sample library, the X-axis data is compared first; if a certain action is unique in the X direction, it can already be recognized and the Y and Z axes need not be compared. If it cannot be recognized on the X axis, the Y axis and then the Z axis are processed in turn.
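The axis-by-axis comparison could be sketched as follows, assuming the standard library is a mapping from action names to their 3×M matrices (rows X, Y, Z); match_action and the early-return structure are illustrative, not taken from the patent.

```python
# Sketch of the X -> Y -> Z cascade comparison against the standard library.
def match_action(code_x, code_y, code_z, library):
    candidates = [name for name, m in library.items() if list(m[0]) == code_x]
    if len(candidates) == 1:
        return candidates[0]                      # unique on the X axis: recognized already
    for axis_idx, code in ((1, code_y), (2, code_z)):
        candidates = [n for n in candidates if list(library[n][axis_idx]) == code]
        if len(candidates) == 1:
            return candidates[0]
    return candidates[0] if candidates else None
```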
The effect of the present invention is further illustrated below with specific application scenarios:
Scenario 1: human-computer interaction. Current human-computer interaction requires purchasing supporting equipment, which is expensive. A smartphone is a highly integrated device with many built-in sensors that produce a large amount of data at all times, while its computing power is weak and its storage space small; the present method stores only the key points of the produced data effectively, recognizes the user's action and maps it to other functions, thereby carrying out human-computer interaction.
Scenario 2: feature point normalization. Some waveform feature processing is affected by many disturbances, such as speed and force, and the collected feature points then need to be normalized. With the present method no scaling up or scaling down is needed; it is only necessary to find the peak and trough feature points in chronological order.
Scenario 3: feature point quantization and computation. In some specific fields, the feature points cannot be processed by further computation after they are found. By arranging the feature points into matrices as described herein, the feature points can be processed further.
In summary, by separating the data of the smartphone acceleration sensor, extracting features, encoding the features and vectorizing the feature points, the present invention effectively solves the problem that feature extraction of smartphone movement trajectories is relatively complicated. The waveform features are quantized and encoded so that the key information is stored effectively, which greatly saves the smartphone's storage resources; the feature coding reduces the normalization of feature points, which saves the smartphone's computing resources; and no external equipment is needed, so movement trajectory feature extraction and recognition can be carried out locally and in real time, making the method suitable for resource-constrained embedded devices such as smartphones.

Claims (4)

1. A local real-time movement trajectory feature extraction and recognition method for a smartphone, characterized in that: the method is divided into a training stage and a recognition stage; in the training stage: the user carries the smartphone while performing different actions, and the data of the smartphone's triaxial acceleration sensor is collected and separated; peak and trough feature points are extracted from the user's posture behaviour; the peak and trough feature points are binary-coded; the coded feature points are vectorized; actions with the same number of peaks and troughs are arranged into matrices; multiple samples are collected and trained on a host computer to establish a user action feature standard library; in the recognition stage: the user action feature standard library is transplanted to the smartphone; when the user carries the smartphone while performing different actions, sensor data is collected on the phone, a buffer is opened on the phone, movement trajectory data is extracted, multiple threads are created to extract segmented feature points, the user action feature standard library is matched, and feature point extraction is performed locally on the smartphone in real time, thereby recognizing the user's action.
2. The local real-time movement trajectory feature extraction and recognition method for a smartphone according to claim 1, characterized in that: the smartphone acceleration sensor data is separated along the three axes.
3. The local real-time movement trajectory feature extraction and recognition method for a smartphone according to claim 1, characterized in that: peak and trough feature points are extracted from the user's posture behaviour by segmenting the data.
4. The local real-time movement trajectory feature extraction and recognition method for a smartphone according to claim 3, characterized in that: the data segmentation is specifically as follows: the peaks and troughs of a segment of data are identified with a sliding window; a fixed-length segment of the produced data serves as the fixed-size sliding window, and the user's motion state data lies entirely within it; the fixed-size window is divided into equal small sliding windows, and peak and trough detection is carried out in each small window; each small window may contain a peak and a trough, which are sorted in order of appearance; the peaks are compared with one another and the troughs with one another, and the few largest peaks and the few smallest troughs are taken as the feature points of each axis.
CN201610732089.XA 2016-08-26 2016-08-26 Local real-time movement trajectory characteristic extraction and identification method for smartphone Pending CN106406516A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610732089.XA CN106406516A (en) 2016-08-26 2016-08-26 Local real-time movement trajectory characteristic extraction and identification method for smartphone

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610732089.XA CN106406516A (en) 2016-08-26 2016-08-26 Local real-time movement trajectory characteristic extraction and identification method for smartphone

Publications (1)

Publication Number Publication Date
CN106406516A true CN106406516A (en) 2017-02-15

Family

ID=58004828

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610732089.XA Pending CN106406516A (en) 2016-08-26 2016-08-26 Local real-time movement trajectory characteristic extraction and identification method for smartphone

Country Status (1)

Country Link
CN (1) CN106406516A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102246125A (en) * 2008-10-15 2011-11-16 因文森斯公司 Mobile devices with motion gesture recognition
CN101788861A (en) * 2009-01-22 2010-07-28 华硕电脑股份有限公司 Method and system for identifying three-dimensional motion
CN102772211A (en) * 2012-08-08 2012-11-14 中山大学 Human movement state detection system and detection method
CN103517118A (en) * 2012-12-28 2014-01-15 Tcl集团股份有限公司 Motion recognition method and system for remote controller
CN103345627A (en) * 2013-07-23 2013-10-09 清华大学 Action recognition method and device
CN103886323A (en) * 2013-09-24 2014-06-25 清华大学 Behavior identification method based on mobile terminal and mobile terminal
CN104754111A (en) * 2013-12-31 2015-07-01 北京新媒传信科技有限公司 Control method for mobile terminal application and control device
CN103984416A (en) * 2014-06-10 2014-08-13 北京邮电大学 Gesture recognition method based on acceleration sensor
CN104750386A (en) * 2015-03-20 2015-07-01 广东欧珀移动通信有限公司 Gesture recognition method and device
CN105159441A (en) * 2015-07-28 2015-12-16 东华大学 Autonomous motion identification technology based private coach smart band

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108737623A (en) * 2018-05-31 2018-11-02 南京航空航天大学 The method for identifying ID of position and carrying mode is carried based on smart mobile phone
CN110151187A (en) * 2019-04-09 2019-08-23 缤刻普达(北京)科技有限责任公司 Body-building action identification method, device, computer equipment and storage medium
CN110151187B (en) * 2019-04-09 2022-07-05 缤刻普达(北京)科技有限责任公司 Body-building action recognition method and device, computer equipment and storage medium
CN113283493A (en) * 2021-05-19 2021-08-20 Oppo广东移动通信有限公司 Sample acquisition method, device, terminal and storage medium
CN113780447A (en) * 2021-09-16 2021-12-10 郑州云智信安安全技术有限公司 Sensitive data discovery and identification method and system based on flow analysis
CN113780447B (en) * 2021-09-16 2023-07-11 郑州云智信安安全技术有限公司 Sensitive data discovery and identification method and system based on flow analysis
WO2023178594A1 (en) * 2022-03-24 2023-09-28 广东高驰运动科技股份有限公司 Action counting method and apparatus, device, and storage medium

Similar Documents

Publication Publication Date Title
CN106406516A (en) Local real-time movement trajectory characteristic extraction and identification method for smartphone
Yang et al. Deep convolutional neural networks on multichannel time series for human activity recognition.
Subetha et al. A survey on human activity recognition from videos
CN105956560B (en) A kind of model recognizing method based on the multiple dimensioned depth convolution feature of pondization
CN108287989B (en) Sliding verification code man-machine identification method based on track
CN109858406B (en) Key frame extraction method based on joint point information
CN108509859A (en) A kind of non-overlapping region pedestrian tracting method based on deep neural network
Xia et al. An evaluation of deep learning in loop closure detection for visual SLAM
CN104281853A (en) Behavior identification method based on 3D convolution neural network
CN110232308B (en) Robot-following gesture track recognition method based on hand speed and track distribution
CN105095829A (en) Face recognition method and system
Zhang et al. IF-ConvTransformer: A framework for human activity recognition using IMU fusion and ConvTransformer
CN111723662B (en) Human body posture recognition method based on convolutional neural network
CN106815578A (en) A kind of gesture identification method based on Depth Motion figure Scale invariant features transform
CN110674875A (en) Pedestrian motion mode identification method based on deep hybrid model
CN103440667A (en) Automatic device for stably tracing moving targets under shielding states
Liu et al. Hand Gesture Recognition Based on Single‐Shot Multibox Detector Deep Learning
CN106023249A (en) Moving object detection method based on local binary similarity pattern
CN110533699A (en) The dynamic multiframe speed-measuring method of pixel variation based on optical flow method
CN105160285A (en) Method and system for recognizing human body tumble automatically based on stereoscopic vision
Zhang et al. Trajectory series analysis based event rule induction for visual surveillance
CN113780140A (en) Gesture image segmentation and recognition method and device based on deep learning
CN105631462A (en) Behavior identification method through combination of confidence and contribution degree on the basis of space-time context
CN109029432B (en) Human body action detection and identification method based on six-axis inertial sensing signal
Wang Data feature extraction method of wearable sensor based on convolutional neural network

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170215

RJ01 Rejection of invention patent application after publication