CN109189221B - User behavior identification method across mobile phone platforms - Google Patents
- Publication number: CN109189221B (application CN201810967532.0A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/0346 — Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. using gyroscopes, accelerometers or tilt-sensors
- G06Q10/04 — Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
Abstract
The invention provides a user behavior identification method across mobile phone platforms, comprising the following steps. S1: collect data on a data collection platform A and label the collected data set DataA. S2: process the labeled data set DataA with an acceleration sensor data processing method based on time-frequency domain transformation, then establish a behavior recognition model ModelA. S3: collect data on a data collection platform B to obtain a data set DataB, and process DataB with the same time-frequency domain transformation method. S4: deploy the behavior recognition model ModelA directly on data collection platform B, then apply ModelA to the processed data set DataB to predict the user's real-time behavior.
Description
Technical Field
The invention relates to the technical field of pattern recognition and sensors, in particular to a user behavior recognition method across mobile phone platforms.
Background
In the field of intelligent elderly care, monitoring the everyday behaviors of the elderly — remaining still, walking, jogging, climbing stairs, descending stairs and the like — is of great importance. The process of constructing an acceleration-sensor-based behavior recognition model with a traditional machine learning method is shown in fig. 1, and the process of deploying the trained model to recognize user behavior online is shown in fig. 2. For the behavior recognition model to predict user behavior successfully, the most basic requirement is that DataA and DataB have consistent distributions. In practice, while researching the behavior recognition model and its deployment, we found that data from an Apple iPhone 7 (iOS platform) and a Huawei Mate 8 (Android platform) do not meet this consistency requirement, as shown in figs. 3 and 4. As fig. 3 shows, the composite acceleration read by the iPhone 7 at rest is about 1; since the acceleration of gravity at rest is 1 g, the unit of the iPhone 7's acceleration readings is g. The composite acceleration read by the Mate 8 at rest is about 10; since the acceleration of gravity at rest is 9.8 m/s², the unit of the Mate 8's acceleration readings is m/s². This difference in readings can be understood as a difference in the measurement units used by the acceleration APIs of the two mobile phone platforms. Phones of the two platforms show such an obvious difference in the static data read by their respective acceleration acquisition programs that we wished to explore its cause; however, we found no official statement in the relevant documentation.
Regardless of its cause, the difference in units of the acceleration sensor data collected on the two mobile phone platforms is an objective fact. The curves in fig. 4 show the composite acceleration values of 200 samples collected by the Mate 8 and the iPhone 7 while the user jogged with both phones. On a waveform plot of acceleration data, jogging is a periodic motion that appears as alternating peaks and troughs. As fig. 4 makes clear, the two phones perceive different numbers of peaks and troughs within the 200 samples — that is, they register different numbers of jogging steps. Because the two phones were strapped together, the time between consecutive peaks is the same on either phone; it follows that the two phones took different amounts of time to collect 200 samples, and therefore that their acceleration sampling frequencies differ. This difference in the distributions of the training data and the prediction data means that a behavior recognition model trained on data collected on one platform cannot effectively classify data collected on the other.
Disclosure of Invention
The aim of the invention is to provide a user behavior identification method across mobile phone platforms that normalizes the acceleration data acquired on different data platforms in the time domain and resamples it in the frequency domain, so that the cross-device generality of the model is effectively improved and the technical problem described above is solved.
To achieve this aim, the invention adopts the following technical scheme. A method for identifying user behaviors across mobile phone platforms comprises the following steps:
s1: collecting data on a data collection platform A and labeling a collected data set DataA;
s2: processing the labeled data set DataA by adopting an acceleration sensor data processing method based on time-frequency domain transformation, then establishing a behavior recognition model ModelA, and recording a parameter beta of the established behavior recognition model ModelA;
s3: acquiring data on a data acquisition platform B to obtain a data set DataB, and processing the data set DataB by adopting an acceleration sensor data processing method based on time-frequency domain transformation;
s4: deploy the behavior recognition model ModelA with the parameter beta directly on data acquisition platform B, and then apply ModelA to the processed data set DataB to predict the real-time behavior of the user.
Further, in step S2, a decision tree classification method or a neural network classification method is used when the behavior recognition model ModelA is established.
Further, the acceleration sensor data processing methods based on time-frequency domain transformation used in step S2 and step S3 each include the following steps:
s11: acquiring original triaxial acceleration data, namely fixedly configuring a data acquisition platform A or a data acquisition platform B on a user body, and acquiring triaxial acceleration data of the user in any state through a triaxial acceleration sensor built in the data acquisition platform A or the data acquisition platform B;
s12: the acceleration signals are synthesized, and it is assumed that the triaxial acceleration data collected in step S11 is (a'x,a’y,a’z) Then the combined value of the three-axis acceleration of the user under any state is
S13: detecting a static state, namely, a user places a data acquisition platform A or a data acquisition platform B in a static state, and acquires self triaxial acceleration data of the data acquisition platform A or the data acquisition platform B when the data acquisition platform A or the data acquisition platform B is static through a triaxial acceleration sensor built in the data acquisition platform A or the data acquisition platform B;
s14: the mode S of the stationary combined acceleration is calculated, and the three-axis acceleration data acquired in step S13 is assumed to be (a)x,ay,az) If the data acquisition platform A or the data acquisition platform B is static, the composite value of the self triaxial acceleration is S, and
s15: and (3) synthesizing acceleration normalization, wherein the synthesized acceleration obtained by normalizing the triaxial acceleration data of the user in any state is as follows:
s16: and (3) resampling data within one second, taking T as a time window, generating a synthesized acceleration data oscillogram from the synthesized acceleration data collected by the data collection platform A or the data collection platform B in the time window, and then setting the data resampling frequency as F, namely performing interpolation operation if the sampling frequency of the synthesized acceleration data oscillogram generated by the data collection platform A or the data collection platform B is lower than F, and performing downsampling operation if the sampling frequency is higher than F.
S17: and (4) extracting the subsequent features, namely extracting the features in the synthesized acceleration data waveform diagram generated in the step S16 by adopting a sliding window method.
Further, the time window T is 2 seconds.
Further, the data resampling frequency F is 32 Hz.
Further, the data acquisition platform a and the data acquisition platform B are two mobile phones with different operating systems.
Compared with the prior art, the invention has the following beneficial effect: the method normalizes the acceleration data acquired on different data platforms in the time domain and resamples it in the frequency domain, effectively improving the cross-device generality of the model.
Drawings
FIG. 1 is a schematic flow chart of a conventional machine learning method for constructing an acceleration sensor-based behavior recognition model;
FIG. 2 is a schematic flow chart of a conventional behavior recognition model for recognizing user behavior online;
FIG. 3 is a waveform of the composite acceleration data of an Apple iPhone 7 (iOS platform) and a Huawei Mate 8 (Android platform) at rest;
FIG. 4 is a waveform of the composite acceleration data of an Apple iPhone 7 (iOS platform) and a Huawei Mate 8 (Android platform) during jogging;
FIG. 5 is a schematic flow chart of an acceleration sensor-based behavior recognition model constructed by the present invention;
FIG. 6 is a schematic flow chart of online recognition of user behavior by the behavior recognition model constructed in the present invention;
fig. 7 is a flow chart diagram of an acceleration sensor data processing method based on time-frequency domain transformation.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments, and all other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts belong to the protection scope of the present invention.
A method for identifying user behavior across mobile phone platforms, as shown in fig. 5 and 6, includes the following steps:
s1: collecting data on a data collection platform A and labeling a collected data set DataA;
s2: processing the labeled data set DataA by adopting an acceleration sensor data processing method based on time-frequency domain transformation, then establishing a behavior recognition model ModelA, and recording a parameter beta of the established behavior recognition model ModelA;
s3: acquiring data on a data acquisition platform B to obtain a data set DataB, and processing the data set DataB by adopting an acceleration sensor data processing method based on time-frequency domain transformation;
s4: deploy the behavior recognition model ModelA with the parameter beta directly on data acquisition platform B, and then apply ModelA to the processed data set DataB to predict the real-time behavior of the user.
In step S2, a decision tree classification method or a neural network classification method is adopted to establish the behavior recognition model ModelA. It should be noted that either of the two methods above may be used; the choice is certainly not limited to these two, as a person skilled in the art may establish ModelA with many different classification methods, which are not enumerated here one by one.
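The classifier choice is deliberately left open. As an illustration of the train-on-platform-A / predict-on-platform-B interface only, the sketch below uses a nearest-centroid classifier — a hypothetical stand-in, not the document's decision tree or neural network — and the feature vectors and labels are invented for the example.

```python
# Hypothetical stand-in for the classifier of step S2: a nearest-centroid
# model trained on labeled feature vectors from platform A, whose learned
# centroids play the role of the recorded "parameter beta".
import math

def train(feature_vectors, labels):
    """Compute one centroid per behavior label."""
    sums, counts = {}, {}
    for vec, lab in zip(feature_vectors, labels):
        acc = sums.setdefault(lab, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def predict(model, vec):
    """Assign the label of the nearest centroid (used on platform-B data)."""
    return min(model, key=lambda lab: math.dist(model[lab], vec))

# Invented window features: (mean normalized acceleration, its std dev).
model = train([[0.0, 0.01], [0.9, 0.4]], ["static", "jogging"])
print(predict(model, [0.85, 0.38]))  # → jogging
```

Because the model is just a dictionary of centroids, "deploying ModelA directly on platform B" amounts to copying these numbers across, which is the role the patent assigns to the parameter beta.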
As shown in fig. 7, the acceleration sensor data processing methods based on time-frequency domain transformation used in step S2 and step S3 each include the following steps:
s11: acquiring original triaxial acceleration data, namely fixedly configuring a data acquisition platform A or a data acquisition platform B on a user body, and acquiring triaxial acceleration data of the user in any state through a triaxial acceleration sensor built in the data acquisition platform A or the data acquisition platform B;
s12: the acceleration signals are synthesized, and it is assumed that the triaxial acceleration data collected in step S11 is (a'x,a’y,a’z) Then the combined value of the three-axis acceleration of the user under any state is
S13: detecting a static state, namely, a user places a data acquisition platform A or a data acquisition platform B in a static state, and acquires self triaxial acceleration data of the data acquisition platform A or the data acquisition platform B when the data acquisition platform A or the data acquisition platform B is static through a triaxial acceleration sensor built in the data acquisition platform A or the data acquisition platform B;
s14: the mode S of the stationary combined acceleration is calculated, and the three-axis acceleration data acquired in step S13 is assumed to be (a)x,ay,az) If the data acquisition platform A or the data acquisition platform B is static, the composite value of the self triaxial acceleration is S, and
s15: and (3) synthesizing acceleration normalization, wherein the synthesized acceleration obtained by normalizing the triaxial acceleration data of the user in any state is as follows:
s16: and (3) resampling data within one second, taking T as a time window, generating a synthesized acceleration data oscillogram from the synthesized acceleration data collected by the data collection platform A or the data collection platform B in the time window, and then setting the data resampling frequency as F, namely performing interpolation operation if the sampling frequency of the synthesized acceleration data oscillogram generated by the data collection platform A or the data collection platform B is lower than F, and performing downsampling operation if the sampling frequency is higher than F.
S17: and (4) extracting the subsequent features, namely extracting the features in the synthesized acceleration data waveform diagram generated in the step S16 by adopting a sliding window method.
In a further refinement of the scheme, the time window T is 2 seconds.
In a further refinement of the scheme, the data resampling frequency F is 32 Hz.
In a further refinement of the scheme, data acquisition platform A and data acquisition platform B are two mobile phones with different operating systems.
The present invention is described in detail below:
(1) Method for normalizing the composite acceleration
When the device is static, its composite acceleration is the acceleration of gravity, 1 g; by this criterion, sensor data expressed in different units can be unified.
Assume the acceleration values of the x, y and z axes at rest form a vector (ax, ay, az), the offset of each axis is (ox, oy, oz), and the scale factor of each axis is (sx, sy, sz); with (gx, gy, gz) denoting the true gravity components in units of g, they satisfy:
ai = si(gi + oi), i ∈ {x, y, z}. (1)
In practical experiments, the offsets we measured were zero or too small to affect the accuracy of the subsequent data, i.e. they can be considered negligible, and as manufacturing processes improve the offsets become smaller still; equation (1) can therefore be approximated as:
ai = si·gi, i ∈ {x, y, z}. (2)
In the general case the scale factors of the x, y and z axes are approximately the same, denoted s, so from equation (2):
√(ax² + ay² + az²) = s·√(gx² + gy² + gz²) = s, (3)
since the magnitude of the gravity vector at rest is 1 g; that is, s equals S, the composite value of the acceleration at rest.
Then, for triaxial acceleration data (a′x, a′y, a′z) collected in an arbitrary state, the normalized composite acceleration is:
A = √(a′x² + a′y² + a′z²) / S. (4)
That is, dividing by the static composite acceleration normalizes the acceleration data to a scale on which the static value is 1, so that sensor data with different offsets and scale factors become comparable.
(2) Static state detection
If the user is allowed to participate in the collection of static acceleration data, the burden on the system is reduced: the system need only provide an interactive interface that prompts the user to lay the mobile phone still and press start and stop buttons, and it records the acceleration data in that interval;
If, for ease of use, user involvement is not desired, the system may instead collect data between 2 and 3 a.m. and examine it second by second: when the variance of the data on each axis does not exceed 0.5% of its mean, the device is considered to be static, and the modulus of the composite acceleration over that second is taken as the estimate of the static composite acceleration.
(3) The sampling frequency uniformization method based on the time window comprises the following steps:
1) selection of a time window before resampling
The motion of walking, running, ascending stairs, descending stairs, etc. has periodicity, and on a curve drawn according to the data acquired by the acceleration sensor, the curve shows a form in which peaks and troughs appear alternately. When we consider operating or processing these behavior data, the most intuitive idea is to process in units of one behavior cycle. The existing documents mostly adopt a 2-second time window, namely, it is generally considered that the data collected in the 2-second time window is enough to cover one action cycle; under the condition that the sampling frequency of the acceleration sensor is constant, the number of the data collected per second is the same, so that the data with the number 2 times of the sampling frequency can be directly collected as the data in the time window of 2 seconds. However, the problem faced by us is that the sampling frequency is variable, so that it is not possible to simply use a fixed number as a time window; in the data acquisition process, each piece of data corresponds to a time stamp, and the data format with the time stamp is as follows:
(year, month, day, hour, minute, second, ax, ay, az),
We regard samples sharing the same timestamp as belonging to the same second, and take out the samples within it: (a′1, a′2, …, a′m),
where m is the number of samples in that 1-second window. With time windows divided this way, the number of samples in different windows varies because the sampling frequency is unstable;
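The timestamp-based bucketing can be sketched as follows; the tuple layout mirrors the data format above, and the sample values are invented:

```python
# Group timestamped accelerometer rows into one-second buckets; each
# bucket then holds the m samples sharing one timestamp-second.
from collections import defaultdict

def group_by_second(rows):
    """rows: tuples (year, month, day, hour, minute, second, ax, ay, az)."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[row[:6]].append(row[6:])  # key = (y, mo, d, h, mi, s)
    return buckets

rows = [
    (2018, 8, 23, 10, 0, 0, 0.1, 0.2, 9.8),
    (2018, 8, 23, 10, 0, 0, 0.2, 0.1, 9.7),
    (2018, 8, 23, 10, 0, 1, 0.1, 0.3, 9.9),
]
b = group_by_second(rows)
print({k[-1]: len(v) for k, v in b.items()})  # → {0: 2, 1: 1}
```

The differing bucket sizes (2 samples in second 0, 1 in second 1) illustrate exactly the unstable per-second sample count the resampling step must correct.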
2) selection of data window length after resampling
For applications based on acceleration sensor data, it is an essential stage to extract features from raw data, and a commonly used method for extracting features is a sliding window method. While the size of the sliding window is constant and in order to apply a Fast Fourier Transform (FFT), the size of the window is typically taken to be an integer power of 2. Based on the experience accumulated during the study, we set the size of the window to 64, i.e. the sampling frequency of the acceleration sensor is 32;
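A sliding-window split with the stated size of 64 can be sketched as follows (the 50% overlap stride is an assumption; the document does not state the step between windows):

```python
# Sliding-window split of a resampled composite-acceleration stream into
# fixed windows of 64 samples (a power of two, so an FFT can be applied).
def sliding_windows(values, size=64, step=32):
    """Return all full windows of `size` samples, advancing by `step`."""
    return [values[i:i + size]
            for i in range(0, len(values) - size + 1, step)]

wins = sliding_windows(list(range(200)))
print(len(wins), len(wins[0]))  # → 5 64
```

Each 64-sample window is then the unit from which time- and frequency-domain features are extracted in step S17.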
3) resampling operations within a one-second time window
That is, data collected by acceleration sensors with different sampling frequencies is resampled to 32 Hz: if the sampling frequency is lower than 32 Hz, an interpolation operation is performed; if it is higher than 32 Hz, a downsampling operation is performed. We interpolate the data to 32 Hz with linear interpolation, which not only fills in data effectively while preserving the original waveform well, but also has low time complexity and a small computational load.
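A minimal sketch of the per-second resampling, assuming plain linear interpolation between neighbouring samples for both the upsampling and the downsampling case (the document specifies linear interpolation only for upsampling):

```python
# Resample one second of composite-acceleration samples to a fixed count
# (32 for 32 Hz) by linear interpolation between neighbouring samples.
def resample_linear(values, target_n=32):
    n = len(values)
    if n == 1:
        return [values[0]] * target_n
    out = []
    step = (n - 1) / (target_n - 1)  # source index advance per output point
    for i in range(target_n):
        pos = i * step
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(values[lo] * (1 - frac) + values[hi] * frac)
    return out

# A 20 Hz second is interpolated up to 32 samples; a 50 Hz second is
# brought down to 32. Endpoints are preserved in both cases.
up = resample_linear(list(range(20)))
down = resample_linear(list(range(50)))
print(len(up), round(up[-1], 6), len(down), round(down[-1], 6))  # → 32 19.0 32 49.0
```

After this step every one-second bucket contributes exactly 32 samples regardless of the phone's native sampling rate, which is what makes the 64-sample sliding windows comparable across platforms.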
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (5)
1. A method for identifying user behaviors across mobile phone platforms is characterized by comprising the following steps:
s1: collecting data on a data collection platform A and labeling a collected data set DataA;
s2: processing the labeled data set DataA by adopting an acceleration sensor data processing method based on time-frequency domain transformation, then establishing a behavior recognition model ModelA, and recording a parameter beta of the established behavior recognition model ModelA;
s3: acquiring data on a data acquisition platform B to obtain a data set DataB, and processing the data set DataB by adopting an acceleration sensor data processing method based on time-frequency domain transformation;
the acceleration sensor data processing methods based on time-frequency domain transformation adopted in step S2 and step S3 each include the following steps:
s11: acquiring original triaxial acceleration data, namely fixedly configuring a data acquisition platform A or a data acquisition platform B on a user body, and acquiring triaxial acceleration data of the user in any state through a triaxial acceleration sensor built in the data acquisition platform A or the data acquisition platform B;
s12: the acceleration signals are synthesized, and it is assumed that the triaxial acceleration data collected in step S11 is (a'x,a’y,a’z) Then the combined value of the three-axis acceleration of the user under any state is
S13: detecting a static state, namely, a user places a data acquisition platform A or a data acquisition platform B in a static state, and acquires self triaxial acceleration data of the data acquisition platform A or the data acquisition platform B when the data acquisition platform A or the data acquisition platform B is static through a triaxial acceleration sensor built in the data acquisition platform A or the data acquisition platform B;
s14: the mode S of the stationary combined acceleration is calculated, and the three-axis acceleration data acquired in step S13 is assumed to be (a)x,ay,az) If the data acquisition platform A or the data acquisition platform B is static, the composite value of the self triaxial acceleration is S, and
s15: and (3) synthesizing acceleration normalization, wherein the synthesized acceleration obtained by normalizing the triaxial acceleration data of the user in any state is as follows:
s16: resampling data within one second, taking T as a time window, generating a synthesized acceleration data oscillogram from the synthesized acceleration data collected by the data collection platform A or the data collection platform B in the time window, and then setting the data resampling frequency as F, namely performing interpolation operation if the sampling frequency of the synthesized acceleration data oscillogram generated by the data collection platform A or the data collection platform B is lower than F, and performing downsampling operation if the sampling frequency is higher than F;
s17: extracting subsequent features, namely extracting the features in the synthesized acceleration data oscillogram generated in the step S16 by adopting a sliding window method;
s4: deploy the behavior recognition model ModelA with the parameter beta directly on data acquisition platform B, and then apply ModelA to the processed data set DataB to predict the real-time behavior of the user.
2. The method for identifying user behaviors across mobile phone platforms according to claim 1, wherein: in step S2, a decision tree classification method or a neural network classification method is used when the behavior recognition model is established.
3. The method for identifying user behaviors across mobile phone platforms according to claim 1, wherein: the time window T is 2 seconds.
4. The method for identifying user behaviors across mobile phone platforms according to claim 1, wherein: the data resampling frequency F is 32 Hz.
5. The method for identifying user behaviors across mobile phone platforms according to claim 1, wherein: the data acquisition platform A and the data acquisition platform B are mobile phones with two different operating systems.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810967532.0A CN109189221B (en) | 2018-08-23 | 2018-08-23 | User behavior identification method across mobile phone platforms |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109189221A CN109189221A (en) | 2019-01-11 |
CN109189221B true CN109189221B (en) | 2021-07-16 |
Family
ID=64919682
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810967532.0A Active CN109189221B (en) | 2018-08-23 | 2018-08-23 | User behavior identification method across mobile phone platforms |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109189221B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102681893A (en) * | 2011-03-09 | 2012-09-19 | 腾讯科技(深圳)有限公司 | Cross-platform implementation method for executable programs and mobile terminal |
CN103984416A (en) * | 2014-06-10 | 2014-08-13 | 北京邮电大学 | Gesture recognition method based on acceleration sensor |
CN105046215A (en) * | 2015-07-07 | 2015-11-11 | 中国科学院上海高等研究院 | Posture and behavior identification method without influences of individual wearing positions and wearing modes |
CN106095099A (en) * | 2016-06-12 | 2016-11-09 | 南京邮电大学 | A kind of user behavior motion detection recognition methods |
CN106643722A (en) * | 2016-10-28 | 2017-05-10 | 华南理工大学 | Method for pet movement identification based on triaxial accelerometer |
CN107270931A (en) * | 2016-12-23 | 2017-10-20 | 浙江从泰网络科技有限公司 | A kind of IOS and the general gait auto-correlation pedometer of Android platform |
CN107277222A (en) * | 2016-12-20 | 2017-10-20 | 浙江从泰网络科技有限公司 | User behavior state judging method based on mobile phone built-in sensors |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104662547A (en) * | 2012-10-19 | 2015-05-27 | 迈克菲股份有限公司 | Mobile application management |
US9794229B2 (en) * | 2015-04-03 | 2017-10-17 | Infoblox Inc. | Behavior analysis based DNS tunneling detection and classification framework for network security |
- 2018-08-23: Application filed in China as CN201810967532.0A; granted as patent CN109189221B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN109189221A (en) | 2019-01-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106289309B (en) | Step counting method and device based on 3-axis acceleration sensor | |
CN101695445B (en) | Acceleration transducer-based gait identification method | |
CN106228200B (en) | Action identification method independent of action information acquisition equipment | |
Figo et al. | Preprocessing techniques for context recognition from accelerometer data | |
Candás et al. | An automatic data mining method to detect abnormal human behaviour using physical activity measurements | |
CN104567912B (en) | Method for realizing pedometer on Android mobile phone | |
CN104042191A (en) | Wrist watch type multi-parameter biosensor | |
CN109737952A (en) | Rope skipping data processing method, device and wearable device | |
CN111063437B (en) | Personalized chronic disease analysis system | |
CN107072550A (en) | Body movement recording method and device | |
CN110461215A (en) | Health mark is determined using portable device | |
CN107582077A (en) | A method for analyzing human mental state based on mobile phone touch behavior | |
CN202515671U (en) | Non-contact mental scanning and analyzing device | |
CN108958482A (en) | A similar-action recognition device and method based on convolutional neural networks | |
EP4013303A1 (en) | Method and system for analysing biomechanical activity and exposure to a biomechanical risk factor on a human subject in a context of physical activity | |
CN110532898A (en) | A physical activity recognition method based on smartphone multi-sensor fusion | |
CN112464738A (en) | User behavior identification method based on an improved naive Bayes algorithm using mobile phone sensors | |
Hirawat et al. | A dynamic window-size based segmentation technique to detect driver entry and exit from a car | |
CN109189221B (en) | User behavior identification method across mobile phone platforms | |
CN115137308A (en) | Method for improving accuracy of in-out sleep detection in sleep algorithm of intelligent wearable device | |
Casaseca-de-la-Higuera et al. | Effect of downsampling and compressive sensing on audio-based continuous cough monitoring | |
Saleh et al. | A highly reliable wrist-worn acceleration-based fall detector | |
CN109271889A (en) | An action recognition method based on a two-layer LSTM neural network | |
Chakraborty et al. | An approach for designing low cost deep neural network based biometric authentication model for smartphone user | |
CN109993132B (en) | Pattern recognition generation method and system based on electroencephalogram signals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||