WO2014191803A1 - Acceleration-based step activity detection and classification on mobile devices - Google Patents

Acceleration-based step activity detection and classification on mobile devices Download PDF

Info

Publication number
WO2014191803A1
Authority
WO
WIPO (PCT)
Prior art keywords
activity
windows
peaks
frequency spectrum
user
Prior art date
Application number
PCT/IB2014/000786
Other languages
French (fr)
Inventor
Vivek CHANDEL
Anirban DUTTA Choudhury
Original Assignee
Tata Consultancy Services Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tata Consultancy Services Limited filed Critical Tata Consultancy Services Limited
Publication of WO2014191803A1 publication Critical patent/WO2014191803A1/en

Links

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • the present subject matter relates, in general, to detection and classification of step activity of a user and, particularly but not exclusively, to detection and classification of acceleration-based step activity of a user on a mobile device.
  • the activities may include walking, jogging, and running.
  • Such activities performed by an individual are tracked for monitoring health of the individual.
  • the tracking of activities includes determining a status of their activities including the type of activity being performed and the rate at which the activity is being performed.
  • Figure 1 illustrates a mobile device for detection and classification of an acceleration-based step activity of a user, in accordance with an implementation of the present subject matter.
  • Figure 2 illustrates a method for detection and classification of an acceleration-based step activity of a user on a mobile device, in accordance with an implementation of the present subject matter.
  • the subject matter disclosed herein relates to mobile device(s) for detection and classification of acceleration-based step activity of a user and method(s) for detection and classification of acceleration-based step activity of a user on a mobile device.
  • the user can be an individual performing an acceleration-based step activity, and the performed acceleration-based step activity can be detected and classified on a mobile device carried by the user.
  • Acceleration-based step activity refers to an activity, performed by an individual, in which the individual takes steps to set himself in motion and remain in motion. Each of the steps taken by the individual has multiple acceleration values associated with it, which can be used for step detection.
  • the acceleration-based step activity can include walking, brisk walking, running, jogging, and the like.
  • the acceleration-based step activity hereinafter is interchangeably referred to as step activity.
  • inertial sensors for example, accelerometers
  • MEMS Micro- Electro-Mechanical Systems
  • portable or mobile electronic devices such as cellular phones, portable music players, pedometers, game controllers, and portable computers.
  • inertial sensors have low cost and low power consumption.
  • a step activity performed by the user can be detected by the inertial sensor in the mobile device carried by the user.
  • the detection and classification of a step activity of a user through a mobile device is known.
  • the mobile device is required to be trained or pre-run before the detection and classification of the step activity of the actual user.
  • data sets of acceleration signals, from the inertial sensor are generally taken for a variety of test-users and for a variety of step activities performed by the test-users.
  • the data sets of acceleration signals for the test-users and the step-activities are processed in the mobile device to classify the step activities into a predefined set of step activities. Since the acceleration signals detected for the actual user may be significantly different from those for the test-users considered during the training stage, the conventional methodologies may lead to an inaccuracy in step activity classification.
  • the user himself has to spend time and train the algorithm, used for the classification, using his own data sets of acceleration signals for each of the different step activities.
  • the motion of the user during each of the activities is detected by the inertial sensors and its characteristics are stored in the memory of the mobile device. The stored characteristics are used as reference for the detection and classification of user's step activities in real-time.
  • some conventional methodologies for detection and classification of step activity of the user are dependent on the placement and orientation of the mobile device. For this, the user has to ensure that the mobile device is placed at a predefined position and with a predefined orientation while he is performing the activity. In such cases, depending on how and where the mobile device is kept and any change in the placement and the orientation of the mobile device from its desired position while performing the activity, the acceleration signal profile changes, which leads to a substantially inaccurate classification of the activity of the user. In addition to the restrictions on the placement and orientation of the mobile devices, conventionally, prior training is also required over the data sets for different positions and orientations of the mobile device and for each of the step activities.
  • during the detection and classification of step activity for a user, there may exist stationary periods in which either no activity is being performed or other non-step activities are being performed.
  • the acceleration signals received during these stationary periods may be significant and may have amplitudes comparable to the acceleration signals due to the step activity.
  • the acceleration signals during the stationary periods are unwanted signals and are referred to as noise. Such noise in the acceleration signal may lead to false detection of step activity for the user.
  • the present subject matter describes mobile device(s) for detection and classification of step activity of a user, and method(s) for detection and classification of step activity of the user on the mobile device.
  • the step activity may include walking, brisk walking, running, jogging, and a combination thereof.
  • the mobile device may include a mobile phone, a smart phone, a portable music player, a pedometer, a game controller, a portable computer and the like, having an inertial sensor.
  • motion of the user is detected by the inertial sensor which then converts the motion into acceleration signals.
  • the acceleration signals contain information about the step activity of the user and are processed in order to classify the step activity of the user.
  • the acceleration signals representing the motion of the user, are generated by the inertial sensors in the form of a data stream.
  • the data stream of the acceleration signals from the inertial sensors is divided into data windows of a predetermined time period.
  • the obtained data windows are then processed one by one for the classification of the step activity of the user.
  • the processing of each data window includes zero normalization, linear interpolation, and low-pass filtration. The low-pass filtration removes high frequency signals, which improves the signals within the data window for subsequent processing.
  • a frequency spectrum is obtained, and peaks are identified within the frequency spectrum of each of the data windows. Based on the peaks within the frequency spectrum of a data window, it is identified whether the data window is a non-stationary activity window or a stationary activity window.
  • a non-stationary activity window is a data window in which the signal contains peaks indicative of the step activity of the user.
  • a stationary activity window is a data window in which the signal, including any peaks therein, is indicative of noise and not the step activity of the user.
  • the identification of non-stationary activity windows is based on a maximum amplitude peak, a number of peaks, and a peak sharpness measure of the maximum amplitude peak within the frequency spectrum of a corresponding data window.
  • the maximum amplitude peak is the peak with the maximum amplitude from amongst all peaks in the frequency spectrum of the data window.
  • the number of peaks is based on peaks whose amplitude is within a predefined percentage, for example, 75 %, of the amplitude of the maximum amplitude peak.
  • the present subject matter is based on the identification of non-stationary activity windows.
  • the non-stationary activity windows are identified and subsequently analyzed for classification of the step activity of the user. Since the classification of step activity, in accordance with the present subject matter, is based on the non-stationary activity windows, and the stationary activity windows are not considered for the classification, the classification of the step activity is not affected by the noise in the acceleration signals. This facilitates in substantial elimination of any false detection and classification of the step activity which was present in the conventional methodologies.
  • peaks within the frequency spectrum of each non-stationary activity window are validated for being indicative of true activity steps.
  • Each of the valid peaks is indicative of a step which the user takes while performing the step activity.
  • the number of valid peaks is indicative of the number of steps or the step count of user performing the step activity.
  • the determination of valid peaks is based on Dynamic Time Warping (DTW) process and Individual Peak Analysis (IPA) performed on each non-stationary activity window.
  • DTW Dynamic Time Warping
  • IPA Individual Peak Analysis
  • the step activity of the user is classified into one of predefined acceleration-based step activities.
  • the predefined acceleration-based step activities may include a combination of activities from amongst walking, brisk walking, running, jogging, and the like.
  • activity weights are calculated based on the valid peaks within the frequency spectrum of each of the non-stationary activity windows and predefined threshold frequencies of each of the predefined acceleration-based step activities. Each activity weight is indicative of the contribution of the peaks in the frequency spectrum of the non-stationary activity window to the corresponding predefined acceleration-based step activity.
  • the number of predefined threshold frequencies required and the number of activity weights to be calculated varies depending on the number of predefined acceleration-based step activities into which the step activity of the user is to be classified.
  • the predefined acceleration-based step activity into which the step activity being performed by the user is classified, is displayed on the mobile device.
  • the predefined acceleration-based step activity, into which the step activity being performed by the user is classified, and a step count determined based on the number of valid peaks within the frequency spectrum of the non-stationary activity windows are displayed on the mobile device. This facilitates in providing information to the user about which step activity he is performing and at what rate the step activity is being performed.
  • the present subject matter can be implemented into practice and does not need any prior training with data sets of acceleration signals for a variety of test-users or from the user himself. It saves time of the user since the classification is carried out in real-time and does not require training prior to actual use by the user, which was otherwise required in the conventional methodologies. Also, in the present subject matter, the mobile device is agnostic of position and orientation, and thus can be placed in any position and with any orientation for the detection and classification of the step activity of the user. This removes restrictions on the placement and orientation of the mobile device and also removes the need for training for the different placements and orientations of the mobile device in the conventional methodologies.
  • FIG. 1 illustrates a mobile device 100 for detection and classification of an acceleration-based step activity, in accordance with an implementation of the present subject matter.
  • the acceleration-based step activity is hereinafter referred to as the step activity.
  • the mobile device 100 is a device having an inertial sensor 130 and can be carried while performing the step activity.
  • the inertial sensor 130 may include an accelerometer.
  • the mobile device 100 may include a mobile phone, a smart phone, a portable music player, a pedometer, a game controller, a portable computer, and the like. As shown in Figure 1, the mobile device 100 is carried by a user 102 performing the step activity. The mobile device 100 may belong to the user 102. The user 102 may hold the mobile device 100 in his hand, or place the mobile device 100 in a pocket or a bag, or may couple the mobile device 100 using a coupling means, while performing the step activity.
  • the mobile device 100 includes processor(s) 104.
  • the processor(s) 104 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
  • the processor(s) 104 is configured to fetch and execute computer-readable instructions stored in a memory.
  • the mobile device 100 includes interface(s) 106.
  • the interfaces may include a variety of machine readable instruction-based and hardware-based interfaces that allow the mobile device 100 to communicate with other devices, including servers, data sources and external repositories. Further, the interface(s) 106 may enable the mobile device 100 to communicate with other communication devices, such as network entities, over a communication network.
  • the mobile device 100 includes a memory 108.
  • the memory 108 may be coupled to the processor(s) 104.
  • the memory 108 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM (EPROM), flash memories, hard disks, optical disks, and magnetic tapes.
  • volatile memory such as static random access memory (SRAM) and dynamic random access memory (DRAM)
  • non-volatile memory such as read only memory (ROM), erasable programmable ROM (EPROM), flash memories, hard disks, optical disks, and magnetic tapes.
  • the mobile device 100 includes module(s) 110 and data 112.
  • the module(s) 110 and the data may be coupled to the processor(s) 104.
  • the modules 110 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulate signals based on operational instructions.
  • the data 112 serves, amongst other things, as a repository for storing data that may be fetched, processed, received, or generated by the module(s) 110.
  • the module(s) 110 can be implemented in hardware, instructions executed by a processing unit, or by a combination thereof.
  • the processing unit can comprise a computer, a processor, a state machine, a logic array or any other suitable devices capable of processing instructions.
  • the processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform tasks or, the processing unit can be dedicated to perform the required functions.
  • the module(s) 110 may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the desired functionalities.
  • the machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk or other machine-readable storage medium or non-transitory medium.
  • the machine- readable instructions can also be downloaded to the storage medium via a network connection.
  • the module(s) 110 include a signal processing module 114, a step activity classifier 116, a display module 118, and other module(s) 120.
  • the other module(s) 120 may include programs or coded instructions that supplement applications or functions performed by the mobile device 100.
  • the data 112 includes signal data 122, signal processing data 124, activity classifier data 126, and other data 128.
  • the other data 128, amongst other things, may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s) 110.
  • the description hereinafter describes the detection and classification of the step activity of the user 102 on the mobile device 100 carried by the user 102.
  • the inertial sensor 130 in the mobile device 100 detects the motion of the user 102 due to the step activity and generates a data stream of the acceleration signals corresponding to the motion of the user 102.
  • the data stream may be generated by the inertial sensor 130 at a sampling frequency of 80 Hz.
  • the signal processing module 114 receives the data stream of the acceleration signals generated by the inertial sensor 130, and processes the data stream for the classification of the step activity of the user 102.
  • for the purpose of processing, the signal processing module 114 generates data windows of a predetermined time period by dividing the data stream of the acceleration signals.
  • the predetermined time period is 2 seconds. In an implementation, the predetermined time period may be a value between 1.5 and 2.5 seconds.
  • the data windows thus generated are stored in the signal data 122.
  • each of the data windows is processed for zero normalization, linear interpolation, and low-pass filtration.
  • the low pass filtration may be performed using a low pass discrete time filter. With this, the noise and the disturbances incorporated by the high frequency signals are removed and the signal in the data windows is substantially improved for subsequent processing.
  • a frequency spectrum is obtained and peaks are identified within the frequency spectrum of each of the data windows.
  • the frequency spectrum may be obtained by performing Fast Fourier Transform (FFT), or Discrete Fourier Transform (DFT), on the data windows.
  • FFT Fast Fourier Transform
  • DFT Discrete Fourier Transform
  • the frequency spectrum of a data window may have one or more peaks corresponding to a true step activity and may have one or more peaks corresponding to noise or other non-step based activities, such as movements of the mobile device 100.
  • based on the peaks within the frequency spectrum of each of the data windows, the signal processing module 114 identifies the non-stationary activity windows from amongst all the data windows. By the identification of non-stationary activity windows, any false detection and classification of the step activity of the user 102 can be substantially eliminated.
  • a maximum amplitude peak in the frequency spectrum is identified.
  • the maximum amplitude peak is the peak which has the highest amplitude from amongst all peaks in the frequency spectrum of the data window.
  • the amplitude of the maximum amplitude peak is denoted by A_d
  • the frequency at which the amplitude is maximum is denoted by f_d.
  • a number of peaks within 75% of the height of the amplitude A_d is determined. Let the number of peaks be denoted by P_n.
  • a measure of peak sharpness of the maximum amplitude peak in the frequency spectrum of the corresponding data window is computed.
  • the peak sharpness measure is denoted by P_sharpness. The greater the peak sharpness measure P_sharpness, the sharper is the peak.
  • the peak sharpness measure P_sharpness is computed based on equation (1) from the amplitudes of the DFT components about the maximum amplitude peak, where:
  • k is the number of DFT components about the maximum amplitude A_d
  • f_d is the frequency corresponding to the amplitude A_d
  • A(i) is the amplitude at the i-th DFT component
  • A(i+1) is the amplitude at the DFT component next to the i-th DFT component for the maximum amplitude peak in the frequency spectrum.
  • k may be equal to 1
  • A(0) is the amplitude A_d.
  • the amplitude A_d is compared with a predefined threshold amplitude and the peak sharpness measure P_sharpness is compared to a predefined threshold sharpness.
  • the predefined threshold amplitude may be 1 and the predefined threshold sharpness may be 0.3.
  • the predefined threshold sharpness of 0.3 differentiates distinct sharp peaks from otherwise wide peaks which may be caused due to noises or any other non-step activity.
  • if the amplitude A_d is greater than or equal to the predefined threshold amplitude and the peak sharpness measure P_sharpness is greater than the predefined threshold sharpness, the data window is identified as a non-stationary activity window, else the data window is identified as a stationary activity window. Further, if the amplitude A_d is greater than or equal to 1 and the peak sharpness measure P_sharpness is greater than 0.3 and the number of peaks P_n is 0, 1 or 2, then the data window is identified as a non-stationary activity window, else the data window is identified as a stationary activity window.
  • the signal processing module 114 analyzes the non-stationary windows for the classification of the step activity of the user 102.
  • the frequency spectra of the non-stationary activity windows have peaks, out of which either some or all may be valid peaks.
  • valid peaks in the frequency spectrum of the non-stationary activity windows are determined. The determination of valid peaks is based on Dynamic Time Warping (DTW) process and Individual Peak Analysis (IPA). Each of the valid peaks is indicative of a true activity step, which may represent the step taken by the user 102 while performing the acceleration-based step activity.
  • DTW Dynamic Time Warping
  • IPA Individual Peak Analysis
  • DTW_t is an indicator value used to determine whether the DTW process has succeeded or failed to determine the valid peaks.
  • the confidence score is indicative of the percentage of peaks determined as valid in the frequency spectrum of the non-stationary activity window.
  • if all peaks in the frequency spectrum of the non-stationary activity window are determined as valid, the value of CS is 100%, otherwise the value of CS is less than 100%.
  • the i-th peak is assigned a status of 'check' and the (i-2)-th peak is assigned a status of 'invalid', if its status is not already 'valid'.
  • the tolerance t is 5%.
  • in the IPA, the peaks are individually validated. For validating the i-th peak in the frequency spectrum of the non-stationary window as a true activity step, a product Prod_i is computed, where A_i is the amplitude of the i-th peak and A_PM is the preceding peak minima. The preceding peak minima refers to the minimum amplitude of acceleration between the i-th peak and the last occurred valid peak. Further, SD_i is the sample distance between the i-th peak and the last occurred valid peak before the i-th peak. Further, f_s is the sampling frequency. Based on the value of the product Prod_i, a criterion for validating the i-th peak in the frequency spectrum of the non-stationary window as a true activity step is checked. The criterion is given by equation (3), where:
  • A_d is the amplitude of the maximum amplitude peak in the frequency spectrum of the corresponding non-stationary window, and f_d is the frequency at which the amplitude A_d occurs.
  • T is a constant which is less than 1.
  • T is taken as 0.75 as it substantially differentiates the valid peaks from the invalid peaks. The valid peaks correspond to actual activity steps and the invalid peaks represent false peaks generated by noise. If the criterion given by equation (3) is satisfied for the i-th peak, then that peak is a valid peak.
  • a step cycle length (SCL) for each valid peak is calculated.
  • the SCL_i is the sample distance between the i-th valid peak in the frequency spectrum of the non-stationary activity window and the (i-2)-th valid peak in the frequency spectrum of the non-stationary activity window.
  • the DTW process is performed on the frequency spectrum of the current non-stationary activity window.
  • the confidence score CS is checked. If the confidence score CS for that window is 100%, then the value of DTW_t is updated as the mean of all SCL values found for the current non-stationary activity window. If the confidence score CS for that window is not 100%, then DTW_t is updated as 0, and the IPA is performed on the frequency spectrum of the non-stationary activity window. Based on the SCL values obtained after the IPA, the value of DTW_t is updated as the mean of all SCL values found for the current non-stationary activity window. The above procedure is iteratively repeated for all the non-stationary activity windows.
  • the data thus generated by the signal processing module 114 for the non-stationary activity windows is stored in the signal processing data 124.
  • the step activity classifier 116 classifies the step activity of the user 102 into one of the predefined acceleration-based step activities.
  • the frequency spectra of at least two continuous non-stationary activity windows are processed by the signal processing module 114 as described above.
  • the predefined acceleration-based step activities may include a combination of activities from amongst walking, brisk walking, running, jogging, and the like.
  • the step activity classifier 116 calculates activity weights based on the data for the valid peaks and threshold frequencies of the predefined acceleration-based step activities. The process of calculation of activity weights and subsequent classification of the step activity of the user 102 is described below.
  • the predefined acceleration-based step activities include "walking", "brisk walking”, and "running”.
  • two predefined threshold frequencies, namely a brisk-walking threshold frequency f_bw and a running threshold frequency f_r, are used.
  • the brisk-walking threshold frequency f_bw refers to the threshold frequency marking the separation between walking and brisk walking
  • the running threshold frequency f_r refers to the threshold frequency marking the separation between brisk walking and running.
  • four activity weights, w_1, w_2, w_3, and w_4, are defined.
  • the activity weight w_1 is a measure of the amount of walking activity
  • the activity weights w_2 and w_3 are a measure of the amount of brisk-walking activity
  • the activity weight w_4 is a measure of the amount of running activity. Initially, all the activity weights w_1, w_2, w_3, and w_4 are set to 0.
  • the step activity classifier 116 calculates a step frequency SF_i for the i-th valid peak in the frequency spectrum of the non-stationary activity window.
  • the step frequency SF_i is computed based on the value of SCL_i of the i-th valid peak and the sampling frequency f_s.
  • the value of the step frequency SF_i for the i-th valid peak is calculated based on equation (5).
  • the step activity classifier 116 compares the step frequency SF_i with the threshold frequencies f_bw and f_r. If the step frequency SF_i is greater than 1 and less than the brisk-walking threshold frequency f_bw, then the activity weight w_1 is calculated.
  • if the step frequency SF_i is greater than or equal to the brisk-walking threshold frequency f_bw and less than the running threshold frequency f_r, then the activity weights w_2 and w_3 are calculated.
  • if the step frequency SF_i is greater than or equal to the running threshold frequency f_r, then the activity weight w_4 is calculated.
  • the step activity classifier 116 classifies the step activity of the user 102 into one of the predefined acceleration-based step activities based on the above calculated activity weights. For the classification, the activity weight w_1 is compared to the activity weight w_4, the activity weight w_2 is compared to the activity weight w_1, and the activity weight w_3 is compared to the activity weight w_4. If the value of the activity weight w_1 is greater than the activity weight w_4 and the value of the activity weight w_1 is greater than or equal to the activity weight w_2, then the step activity of the user 102 is classified as "walking".
  • if the value of the activity weight w_1 is greater than the activity weight w_4 but less than the activity weight w_2, the step activity of the user 102 is classified as "brisk walking". If the value of the activity weight w_1 is less than or equal to the activity weight w_4 and the value of the activity weight w_3 is greater than w_4, then also the step activity of the user 102 is classified as "brisk walking". If the value of the activity weight w_1 is less than or equal to the activity weight w_4 and the value of the activity weight w_3 is less than or equal to w_4, then the step activity of the user 102 is classified as "running". The data associated with the classified activity is stored in the activity classifier data 126. A simplified sketch of this classification logic is given after this list.
  • the number of predefined threshold frequencies and the number of activity weights to be calculated vary depending on the number of predefined acceleration-based step activities used for the classification. In an implementation using n predefined acceleration-based step activities, (n - 1) threshold frequencies are used and (n + 1) activity weights are calculated. In the implementation, all the (n + 1) activity weights have 0 as the initial values.
  • the display module 118 displays on the mobile device 100 the acceleration-based step activity under which the step activity being performed by the user 102 is classified. Further, in an implementation, a step count, determined based on the number of valid peaks within the frequency spectrum of the non-stationary activity window, is displayed on the mobile device 100 by the display module 118. This display enables the user 102 to monitor the step activity he is performing and the rate at which he is performing that step activity.
  • the display module 118 may display "stationary" as the activity.
  • the step activity performed by the user 102 along with the step count may be stored in the activity classifier data 126 to generate a report for the user 102 stating the type of step activities performed by the user 102 in a session.
  • the step count for a step activity may be calibrated with respect to calories burnt, and the report may state the number of calories burnt by the user 102 in the session.
  • the acceleration signals generated by the inertial sensor, the data generated based on the processing done by the signal processing module 114, and the data associated with the type of step activity and the step count may be transmitted from the mobile device 100 to another device, such as a personal computer, a laptop, and the like, for various purposes, including keeping a track or a history of the step activities and the step counts of the user 102.
  • the mobile device 100 may be equipped with compatible input/output interfaces for such communication.
  • the communication may take place via a wired communication link, such as a data cable, or via a wireless communication link, such as Bluetooth™, IR, and WiFi.
  • Figure 2 illustrates a method 200 for detection and classification of an acceleration-based step activity of a user 102.
  • the method 200 is implemented in a mobile device 100.
  • the order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 200, or any alternative methods. Additionally, individual blocks may be deleted from the method 200 without departing from the spirit and scope of the subject matter described herein.
  • the method 200 can be implemented in any suitable hardware.
  • the method 200 may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
  • the method 200 may be implemented in any mobile device having an inertial sensor; in an example described in Figure 2, the method 200 is explained in context of the aforementioned mobile device 100, for the ease of explanation.
  • acceleration signals generated by an inertial sensor 130 in the mobile device 100, while the user 102 is performing a step activity, are received from the inertial sensor 130 for processing.
  • the user 102 takes steps which set the user 102 into motion. This motion of the user 102 performing the step activity has an acceleration associated with it.
  • the inertial sensor 130 in the mobile device 100 detects the motion of the user 102 and generates a data stream of acceleration signals for the motion of the user 102.
  • the acceleration signal is divided into data windows of a predetermined period. In an implementation, the predetermined period is 2 seconds. In another implementation, the predetermined period may be between 1.5 and 2.5 seconds. After obtaining the data windows, the data windows are processed for zero normalization, linear interpolation, and low-pass filtration.
  • peaks are identified within a frequency spectrum of each of the data windows.
  • the frequency spectrum of the data windows may be obtained using Fast Fourier Transform (FFT).
  • FFT Fast Fourier Transform
  • non-stationary activity windows are identified from amongst the data windows of the acceleration signals by obtaining a maximum amplitude peak, number of peaks and peak sharpness measure from the frequency spectrum of each data window. The computation of the peak sharpness measure, and the identification of the non-stationary activity windows are performed in a manner as described earlier in the description.
  • valid peaks are determined within the frequency spectrum of the non- stationary activity window. These valid peaks are indicative of true activity steps and are determined based on Dynamic Time Warping (DTW) and Individual Peak Analysis (IPA). The valid peaks are determined as described earlier in the description.
  • DTW Dynamic Time Warping
  • IPA Individual Peak Analysis
  • the step activity of the user 102 is classified into one of the predefined acceleration-based step activities based on the valid peaks obtained within the frequency spectrum of each of the non-stationary activity windows.
  • the classification of the step activity includes computing step frequencies for the valid peaks and computing activity weights based on the step frequencies and predefined threshold frequencies for the predefined acceleration-based step activities.
  • the step activity of the user 102 is then classified based on the activity weights. The computation of step frequency and the activity weights and the classification are done in a manner as described earlier in the description.
  • the classified activity and the step count determined based on the valid peaks in the frequency spectrum of the non-stationary activity windows are displayed on the mobile device 100 for the user 102.
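A simplified sketch of the activity-weight classification referenced in the list above is given below in Python. It is illustrative only: the step-frequency relation SF_i = f_s / SCL_i, the unit increment applied to each weight, and the threshold values f_bw and f_r are assumptions, since equation (5) and the weight-update equations are not reproduced in the text; the final comparisons of w_1 to w_4 follow the rules stated above.

```python
def classify_step_activity(scl_values, fs=80.0, f_bw=2.0, f_r=2.7):
    """Illustrative three-activity classification from SCL values of valid peaks.

    Assumptions: SF_i = fs / SCL_i (equation (5) is not reproduced in the
    text), each valid peak adds 1 to the matching weight (the weight-update
    equations are likewise not reproduced), and f_bw = 2.0 Hz and
    f_r = 2.7 Hz are placeholder threshold frequencies.
    """
    w1 = w2 = w3 = w4 = 0.0
    for scl in scl_values:              # SCL_i for each valid peak (in samples)
        sf = fs / scl                   # assumed form of step frequency SF_i
        if 1.0 < sf < f_bw:             # walking band
            w1 += 1.0
        elif f_bw <= sf < f_r:          # brisk-walking band
            w2 += 1.0
            w3 += 1.0
        elif sf >= f_r:                 # running band
            w4 += 1.0

    # Final comparisons as stated in the document; the first brisk-walking
    # condition (w1 > w4 and w1 < w2) is completed by elimination because the
    # corresponding sentence is truncated in the source text.
    if w1 > w4 and w1 >= w2:
        return "walking"
    if (w1 > w4 and w1 < w2) or (w1 <= w4 and w3 > w4):
        return "brisk walking"
    return "running"


# Example: SCL values of about 40 samples at 80 Hz correspond to a step
# frequency of about 2 Hz, which falls in the assumed brisk-walking band.
print(classify_step_activity([40, 38, 41, 39]))
```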

Abstract

The present subject matter discloses a mobile device (100) and a method for detection and classification of an acceleration-based step activity of a user. According to the present subject matter, the mobile device (100) implements the described method, where the method includes receiving acceleration signals from an inertial sensor (130) in the mobile device (100) carried by the user performing a step activity. The acceleration signals are divided into data windows. Peaks are identified within a frequency spectrum of each of the data windows, and non-stationary activity windows are identified from amongst the data windows, based on the peaks within the frequency spectrum of each of the data windows. Further, valid peaks are determined within each of the non-stationary activity windows, and the step activity of the user is classified into one of predefined acceleration-based step activities based on the valid peaks within each non-stationary activity window.

Description

ACCELERATION-BASED STEP ACTIVITY DETECTION AND CLASSIFICATION ON MOBILE DEVICES
TECHNICAL FIELD
[0001] The present subject matter relates, in general, to detection and classification of step activity of a user and, particularly but not exclusively, to detection and classification of acceleration-based step activity of a user on a mobile device.
BACKGROUND
[0002] Individuals are increasingly becoming conscious of their health and are incorporating a variety of activities in their daily routine to maintain their health. The activities may include walking, jogging, and running. Such activities performed by an individual are tracked for monitoring the health of the individual. The tracking of activities includes determining a status of the activities, including the type of activity being performed and the rate at which the activity is being performed.
BRIEF DESCRIPTION OF DRAWINGS
[0003] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to the same features and components.
[0004] Figure 1 illustrates a mobile device for detection and classification of an acceleration-based step activity of a user, in accordance with an implementation of the present subject matter.
[0005] Figure 2 illustrates a method for detection and classification of an acceleration-based step activity of a user on a mobile device, in accordance with an implementation of the present subject matter.
[0006] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computing device or processor, whether or not such computing device or processor is explicitly shown.
DETAILED DESCRIPTION
[0007] The subject matter disclosed herein relates to mobile device(s) for detection and classification of acceleration-based step activity of a user and method(s) for detection and classification of acceleration-based step activity of a user on a mobile device.
[0008] For the purposes of the present subject matter, the user can be an individual performing an acceleration-based step activity, and the performed acceleration-based step activity can be detected and classified on a mobile device carried by the user. Acceleration-based step activity refers to an activity, performed by an individual, in which the individual takes steps to set himself in motion and remain in motion. Each of the steps taken by the individual has multiple acceleration values associated with it, which can be used for step detection. The acceleration-based step activity can include walking, brisk walking, running, jogging, and the like. For the sake of simplicity, the acceleration-based step activity hereinafter is interchangeably referred to as step activity.
[0009] In order to detect acceleration-based motion or movements of the user, inertial sensors, for example, accelerometers, are commonly used. The Micro-Electro-Mechanical Systems (MEMS) technology has enabled the manufacture of inertial sensors of a size that fits into portable or mobile electronic devices, such as cellular phones, portable music players, pedometers, game controllers, and portable computers. Such inertial sensors have low cost and low power consumption. A step activity performed by the user can be detected by the inertial sensor in a mobile device carried by the user.
[0010] The detection and classification of a step activity of a user through a mobile device is known. Conventionally, the mobile device is required to be trained or pre-run before the detection and classification of the step activity of the actual user. In the training or the pre-running, data sets of acceleration signals, from the inertial sensor, are generally taken for a variety of test-users and for a variety of step activities performed by the test-users. The data sets of acceleration signals for the test-users and the step-activities are processed in the mobile device to classify the step activities into a predefined set of step activities. Since the acceleration signals detected for the actual user may be significantly different from those for the test-users considered during the training stage, the conventional methodologies may lead to an inaccuracy in step activity classification.
[0011] Further, in some conventional methodologies for the detection and classification of the step activity, the user himself has to spend time and train the algorithm, used for the classification, using his own data sets of acceleration signals for each of the different step activities. The motion of the user during each of the activities is detected by the inertial sensors and its characteristics are stored in the memory of the mobile device. The stored characteristics are used as reference for the detection and classification of user's step activities in real-time.
[0012] Further, some conventional methodologies for detection and classification of step activity of the user are dependent on the placement and orientation of the mobile device. For this, the user has to ensure that the mobile device is placed at a predefined position and with a predefined orientation while he is performing the activity. In such cases, depending on how and where the mobile device is kept and any change in the placement and the orientation of the mobile device from its desired position while performing the activity, the acceleration signal profile changes, which leads to a substantially inaccurate classification of the activity of the user. In addition to the restrictions on the placement and orientation of the mobile devices, conventionally, prior training is also required over the data sets for different positions and orientations of the mobile device and for each of the step activities.
[0013] Further, during the detection and classification of step activity for a user, there may exist stationary periods in which either no activity is being performed or other non-step activities are being performed. The acceleration signals received during these stationary periods may be significant and may have amplitudes comparable to the acceleration signals due to the step activity. The acceleration signals during the stationary periods are unwanted signals and are referred to as noise. Such noise in the acceleration signal may lead to false detection of step activity for the user.
[0014] The present subject matter describes mobile device(s) for detection and classification of step activity of a user, and method(s) for detection and classification of step activity of the user on the mobile device. In an implementation, the step activity may include walking, brisk walking, running, jogging, and a combination thereof. The mobile device may include a mobile phone, a smart phone, a portable music player, a pedometer, a game controller, a portable computer and the like, having an inertial sensor.
[0015] In an implementation of the present subject matter, for detection and classification of the step activity of the user, motion of the user is detected by the inertial sensor which then converts the motion into acceleration signals. The acceleration signals contain information about the step activity of the user and are processed in order to classify the step activity of the user.
[0016] In an implementation, the acceleration signals, representing the motion of the user, are generated by the inertial sensors in the form of a data stream. The data stream of the acceleration signals from the inertial sensors is divided into data windows of a predetermined time period. The obtained data windows are then processed one by one for the classification of the step activity of the user. In an implementation, the processing of each data window includes zero normalization, linear interpolation, and low-pass filtration. The low-pass filtration removes high frequency signals, which improves the signals within the data window for subsequent processing.
[0017] Further, for each of the data windows, a frequency spectrum is obtained, and peaks are identified within the frequency spectrum of each of the data windows. Based on the peaks within the frequency spectrum of a data window, it is identified whether the data window is a non-stationary activity window or a stationary activity window. A non-stationary activity window is a data window in which the signal contains peaks indicative of the step activity of the user. A stationary activity window is a data window in which the signal, including any peaks therein, is indicative of noise and not the step activity of the user. In an implementation, the identification of non-stationary activity windows is based on a maximum amplitude peak, a number of peaks, and a peak sharpness measure of the maximum amplitude peak within the frequency spectrum of a corresponding data window. The maximum amplitude peak is the peak with the maximum amplitude from amongst all peaks in the frequency spectrum of the data window. The number of peaks is based on peaks whose amplitude is within a predefined percentage, for example, 75 %, of the amplitude of the maximum amplitude peak.
[0018] Unlike the conventional methodology, the present subject matter is based on the identification of non-stationary activity windows. The non-stationary activity windows are identified and subsequently analyzed for classification of the step activity of the user. Since the classification of step activity, in accordance with the present subject matter, is based on the non-stationary activity windows, and the stationary activity windows are not considered for the classification, the classification of the step activity is not affected by the noise in the acceleration signals. This facilitates in substantial elimination of any false detection and classification of the step activity which was present in the conventional methodologies.
[0019] Once the non-stationary activity windows are identified from amongst the data windows, peaks within the frequency spectrum of each non-stationary activity window are validated for being indicative of true activity steps. Each of the valid peaks is indicative of a step which the user takes while performing the step activity. Thus, the number of valid peaks is indicative of the number of steps or the step count of the user performing the step activity. In an implementation, the determination of valid peaks is based on a Dynamic Time Warping (DTW) process and Individual Peak Analysis (IPA) performed on each non-stationary activity window.
[0020] Further, based on the valid peaks within the frequency spectrum of each of the non-stationary activity windows, the step activity of the user is classified into one of predefined acceleration-based step activities. In an implementation, the predefined acceleration-based step activities may include a combination of activities from amongst walking, brisk walking, running, jogging, and the like.
[0021] For classifying the step activity into one of the predefined acceleration- based step activities, activity weights are calculated based on the valid peaks within the frequency spectrum of each of the non-stationary activity windows and predefined threshold frequencies of each of the predefined acceleration-based step activities. Each activity weight is indicative of the contribution of the peaks in the frequency spectrum of the non-stationary activity window to the corresponding predefined acceleration-based step activity. The number of predefined threshold frequencies required and the number of activity weights to be calculated varies depending on the number of predefined acceleration-based step activities into which the step activity of the user is to be classified.
[0022] Further, in an implementation, the predefined acceleration-based step activity, into which the step activity being performed by the user is classified, is displayed on the mobile device. In an implementation, the predefined acceleration-based step activity, into which the step activity being performed by the user is classified, and a step count determined based on the number of valid peaks within the frequency spectrum of the non-stationary activity windows, are displayed on the mobile device. This facilitates in providing information to the user about which step activity he is performing and at what rate the step activity is being performed.
[0023] The present subject matter can be implemented into practice and does not need any prior training with data sets of acceleration signals for a variety of test-users or from the user himself. It saves time of the user since the classification is carried out in real-time and does not require training prior to actual use by the user, which was otherwise required in the conventional methodologies. Also, in the present subject matter, the mobile device is agnostic of position and orientation, and thus can be placed in any position and with any orientation for the detection and classification of the step activity of the user. This removes restrictions on the placement and orientation of the mobile device and also removes the need for training for the different placements and orientations of the mobile device in the conventional methodologies.
[0024] The manner in which the mobile device(s) and method(s) shall be implemented has been explained in detail with respect to Figure 1 and Figure 2. Although the description herein is with reference to hand-held mobile device(s), the method(s) may be implemented in other portable device(s) and system(s) as well, albeit with a few variations, as will be understood by a person skilled in the art. While aspects of described methods can be implemented in any number of different mobile devices, transmission environments, and/or configurations, the implementations are described in the context of the following hand-held mobile device(s).
[0025] Figure 1 illustrates a mobile device 100 for detection and classification of an acceleration-based step activity, in accordance with an implementation of the present subject matter. For the purpose of description and simplicity, the acceleration-based step activity is hereinafter referred to as the step activity. In an implementation, the mobile device 100 is a device having an inertial sensor 130 and can be carried while performing the step activity. The inertial sensor 130 may include an accelerometer.
[0026] The mobile device 100 may include a mobile phone, a smart phone, a portable music player, a pedometer, a game controller, a portable computer, and the like. As shown in Figure 1, the mobile device 100 is carried by a user 102 performing the step activity. The mobile device 100 may belong to the user 102. The user 102 may hold the mobile device 100 in his hand, or place the mobile device 100 in a pocket or a bag, or may couple the mobile device 100 using a coupling means, while performing the step activity.
[0027] In an implementation, the mobile device 100 includes processor(s) 104. The processor(s) 104 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 104 is configured to fetch and execute computer-readable instructions stored in a memory.
[0028] The mobile device 100 includes interface(s) 106. The interfaces may include a variety of machine readable instruction-based and hardware-based interfaces that allow the mobile device 100 to communicate with other devices, including servers, data sources and external repositories. Further, the interface(s) 106 may enable the mobile device 100 to communicate with other communication devices, such as network entities, over a communication network.
[0029] Further, the mobile device 100 includes a memory 108. The memory 108 may be coupled to the processor(s) 104. The memory 108 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM (EPROM), flash memories, hard disks, optical disks, and magnetic tapes.
[0030] Further, the mobile device 100 includes module(s) 110 and data 112. The module(s) 110 and the data may be coupled to the processor(s) 104. The modules 110, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The modules 110 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulate signals based on operational instructions. The data 112 serves, amongst other things, as a repository for storing data that may be fetched, processed, received, or generated by the module(s) 110. Although the data is shown internal to the mobile device 100, it may be understood that the data 112 can reside in an external repository (not shown in the Figure), which may be coupled to the mobile device 100. The mobile device 100 may communicate with the external repository through the interface(s) 106.
[0031] Further, the module(s) 110 can be implemented in hardware, instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, a state machine, a logic array or any other suitable devices capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform tasks or, the processing unit can be dedicated to perform the required functions. In another aspect of the present subject matter, the module(s) 110 may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the desired functionalities. The machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk or other machine-readable storage medium or non-transitory medium. In an implementation, the machine-readable instructions can also be downloaded to the storage medium via a network connection.
[0032] In an implementation, the module(s) 110 include a signal processing module 114, a step activity classifier 116, a display module 118, and other module(s) 120. The other module(s) 120 may include programs or coded instructions that supplement applications or functions performed by the mobile device 100. In said implementation, the data 112 includes signal data 122, signal processing data 124, activity classifier data 126, and other data 128. The other data 128, amongst other things, may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s) 110.
[0033] The description hereinafter describes the detection and classification of the step activity of the user 102 on the mobile device 100 carried by the user 102. In an implementation, while the user 102 is performing the step activity, the inertial sensor 130 in the mobile device 100 detects the motion of the user 102 due to the step activity and generates a data stream of the acceleration signals corresponding to the motion of the user 102. The data stream may be generated by the inertial sensor 130 at a sampling frequency of 80 Hz.

[0034] The signal processing module 114 receives the data stream of the acceleration signals generated by the inertial sensor 130, and processes the data stream for the classification of the step activity of the user 102. For the purpose of processing, the signal processing module 114 generates data windows of a predetermined time period by dividing the data stream of the acceleration signals. In an implementation, the predetermined time period is 2 seconds. In another implementation, the predetermined time period may be a value between 1.5 seconds and 2.5 seconds. The data windows thus generated are stored in the signal data 122.
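By way of a non-limiting illustration, the windowing of the data stream may be sketched in Python as below. The function name, the fixed-length reshaping, and the discarding of any trailing partial window are assumptions made only for the sketch and are not mandated by the present subject matter.

```python
import numpy as np

FS = 80          # sampling frequency of the inertial sensor, in Hz
WINDOW_SEC = 2   # predetermined time period of each data window, in seconds

def make_windows(accel_stream, fs=FS, window_sec=WINDOW_SEC):
    """Split a 1-D stream of acceleration samples into fixed-length data windows."""
    samples_per_window = int(fs * window_sec)            # 160 samples for 2 s at 80 Hz
    n_windows = len(accel_stream) // samples_per_window
    usable = np.asarray(accel_stream[: n_windows * samples_per_window])
    # Each row of the returned array is one data window of the predetermined time period.
    return usable.reshape(n_windows, samples_per_window)
```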
[0035] In the implementation, each of the data windows is processed for zero normalization, linear interpolation, and low-pass filtration. The low-pass filtration may be performed using a low-pass discrete time filter. With this, the noise and the disturbances introduced by high-frequency components are removed, and the signal in the data windows is substantially improved for subsequent processing.
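A minimal sketch of such pre-processing is given below, assuming a fourth-order Butterworth filter and a 5 Hz cut-off frequency; these parameter values are chosen only for illustration and are not specified in the present subject matter.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_window(timestamps, values, fs=80, cutoff_hz=5.0):
    """Zero-normalize, resample onto a uniform grid, and low-pass filter one data window."""
    # Linear interpolation onto a uniform sampling grid, since sensor events may be jittery.
    uniform_t = np.arange(timestamps[0], timestamps[-1], 1.0 / fs)
    resampled = np.interp(uniform_t, timestamps, values)
    # Zero normalization: remove the mean value of the window.
    centered = resampled - resampled.mean()
    # Low-pass discrete time filtration to suppress high-frequency noise.
    b, a = butter(N=4, Wn=cutoff_hz / (fs / 2.0), btype="low")
    return filtfilt(b, a, centered)
```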
[0036] After processing the data windows, for each of the data windows, a frequency spectrum is obtained and peaks are identified within the frequency spectrum of the each of the data windows. The frequency spectrum may be obtained by performing Fast Fourier Transform (FFT), or Discrete Fourier Transform (DFT), on the data windows. The frequency spectrum of a data window may have one or more peaks corresponding to a true step activity and may have one or more peaks corresponding to noise or other non-step based activities, such as movements of the mobile device 100. Further, based on the peaks within the frequency spectrum of the each of the data windows, the signal processing module 114 identifies the non-stationary activity windows from amongst all the data windows. By the identification of non-stationary activity windows, any false detection and classification of the step activity of the user 102 can be substantially eliminated.
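One way in which the frequency spectrum and its peaks may be obtained is sketched below; the use of scipy.signal.find_peaks and the one-sided amplitude normalization are illustrative assumptions of the sketch.

```python
import numpy as np
from scipy.signal import find_peaks

def spectrum_peaks(window, fs=80):
    """Return the one-sided amplitude spectrum of a data window and the indices of its peaks."""
    n = len(window)
    spectrum = np.abs(np.fft.rfft(window)) / n       # amplitude of each DFT component
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)           # frequency of each DFT component, in Hz
    peak_idx, _ = find_peaks(spectrum)               # local maxima within the spectrum
    return freqs, spectrum, peak_idx
```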
[0037] In an implementation, for the purpose of identification of whether a data window is a non-stationary activity window, a maximum amplitude peak in the frequency spectrum is identified. The maximum amplitude peak is the peak which has the highest amplitude from amongst all peaks in the frequency spectrum of the data window. Let the amplitude of the maximum amplitude peak be denoted by Ad, and the frequency at which the amplitude is maximum be denoted by fd. In addition to the maximum amplitude peak, a number of peaks within 75% of the height of the amplitude Ad is determined. Let the number of peaks be denoted by Pn.
[0038] Further, a measure of peak sharpness of the maximum amplitude peak in the frequency spectrum of the corresponding data window is computed. The peak sharpness measure is denoted by Psharpness. The greater the peak sharpness measure Psharpness, the sharper the peak. The peak sharpness measure Psharpness is computed based on equation (1):

[Equation (1) — peak sharpness measure; reproduced only as an image in the original publication]

where k is a number of DFT components about the maximum amplitude Ad, and fd is the frequency corresponding to the amplitude Ad. A(i) is the amplitude at the ith DFT component, and A(i + 1) is the amplitude at the DFT component next to the ith DFT component, for the maximum amplitude peak in the frequency spectrum. In an example, k may be equal to 1, and the summation in equation (1) is taken for i = -1, 0, 1. A(0) is the amplitude Ad.
[0039] Based on the above identifications and computations, the amplitude Ad is compared with a predefined threshold amplitude and the peak sharpness measure Psharpness is compared with a predefined threshold sharpness. The predefined threshold amplitude may be 1 and the predefined threshold sharpness may be 0.3. The predefined threshold sharpness of 0.3 differentiates distinct sharp peaks from otherwise wide peaks which may be caused due to noise or any other non-step activity. If the amplitude Ad is less than 1, the peak sharpness measure Psharpness is greater than 0.3, and the number of peaks Pn is 0 or 1, then the data window is identified as a non-stationary activity window; else the data window is identified as a stationary activity window. Further, if the amplitude Ad is greater than or equal to 1, the peak sharpness measure Psharpness is greater than 0.3, and the number of peaks Pn is 0, 1, or 2, then the data window is identified as a non-stationary activity window; else the data window is identified as a stationary activity window.
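The decision rule of paragraphs [0037] to [0039] may be expressed compactly as below. The sharpness measure Psharpness is taken as an input because equation (1) is not reproduced in this text; the function and argument names are assumptions of the sketch.

```python
def is_non_stationary(a_d, p_sharpness, p_n,
                      amp_threshold=1.0, sharpness_threshold=0.3):
    """Decide whether a data window is a non-stationary activity window.

    a_d         : amplitude Ad of the maximum amplitude peak of the window's spectrum
    p_sharpness : peak sharpness measure Psharpness of that peak (equation (1))
    p_n         : number Pn of peaks within 75% of the amplitude Ad
    """
    if p_sharpness <= sharpness_threshold:
        return False                    # wide peak: noise or non-step activity
    if a_d < amp_threshold:
        return p_n in (0, 1)            # low-amplitude case of paragraph [0039]
    return p_n in (0, 1, 2)             # high-amplitude case of paragraph [0039]
```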
[0040] After the identification of the non-stationary activity windows from amongst the data windows, the signal processing module 114 analyzes the non-stationary activity windows for the classification of the step activity of the user 102. The frequency spectra of the non-stationary activity windows have peaks, out of which either some or all may be valid peaks. For the classification of the step activity, valid peaks in the frequency spectrum of the non-stationary activity windows are determined. The determination of valid peaks is based on a Dynamic Time Warping (DTW) process and Individual Peak Analysis (IPA). Each of the valid peaks is indicative of a true activity step, which may represent the step taken by the user 102 while performing the acceleration-based step activity.
[0041] In the DTW process of determination of valid peaks, DTWt is an indicator value used to determine whether the DTW process has succeeded or failed to determine the valid peaks. When the DTW process fails for a non-stationary activity window, it is reflected in a confidence score CS. The confidence score is indicative of the percentage of peaks determined as valid in the frequency spectrum of the non-stationary activity window. When all the peaks in the frequency spectrum of the non-stationary window are inferred as valid peaks, the value of CS is 100%; otherwise the value of CS is less than 100%.
[0042] In the DTW process, for the ith peak in the frequency spectrum of the non-stationary activity window, if the sample distance between the ith peak and the (i - 2)th peak lies within a predefined sample distance range of DTWt ± tolerance of t%, then both the ith peak and the (i - 2)th peak are assigned a status of 'valid'. If the sample distance between the ith peak and the (i - 2)th peak does not lie within the predefined sample distance range of DTWt ± tolerance of t%, then the ith peak is assigned a status of 'check' and the (i - 2)th peak is assigned a status of 'invalid', if its status is not already 'valid'. In an implementation, the tolerance t is 5%.
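The status bookkeeping of this paragraph may be sketched as follows, taking the ordered peak sample positions and a current DTWt as inputs; the list-of-strings representation of the statuses is an assumption of the sketch.

```python
def dtw_validate(peak_positions, dtw_t, tolerance=0.05):
    """Assign 'valid' / 'check' / 'invalid' statuses to peaks as per paragraph [0042]."""
    status = ["check"] * len(peak_positions)
    lo, hi = dtw_t * (1 - tolerance), dtw_t * (1 + tolerance)
    for i in range(2, len(peak_positions)):
        distance = peak_positions[i] - peak_positions[i - 2]
        if lo <= distance <= hi:
            status[i] = "valid"
            status[i - 2] = "valid"
        else:
            status[i] = "check"
            if status[i - 2] != "valid":
                status[i - 2] = "invalid"
    return status
```

The confidence score CS of paragraph [0041] then follows as the percentage of entries marked 'valid'.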
[0043] In the IPA, the peaks are individually validated. For validating the ith peak in the frequency spectrum of the non-stationary window as a true activity step, a product Prodi is computed based on equation (2), which is not reproduced in this text, where Ai is the amplitude of the ith peak and APM is the preceding peak minima. The preceding peak minima refers to the minimum amplitude of acceleration between the ith peak and the last occurred valid peak. Further, SDi is the sample distance between the ith peak and the last occurred valid peak before the ith peak. Further, fs is the sampling frequency. Based on the value of the product Prodi, a criterion for validating the ith peak in the frequency spectrum of the non-stationary window as the true activity step is checked. The criterion is given by:
τ * Prodi > Fp    (3)

where

Fp = fd * Ad    (4)

Ad is the amplitude of the maximum amplitude peak in the frequency spectrum of the corresponding non-stationary window, and fd is the frequency at which the amplitude Ad occurs. τ is a constant which is less than 1. In an example, τ is taken as 0.75 as it substantially differentiates the valid peaks from the invalid peaks. The valid peaks correspond to actual activity steps and the invalid peaks represent false peaks generated by noise. If the criterion given by equation (3) is satisfied for the ith peak, then that peak is a valid peak.
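Since equation (2) is not reproduced here, the IPA criterion may be sketched with Prodi supplied by the caller; the function and parameter names are assumptions of the sketch.

```python
def ipa_is_valid(prod_i, f_d, a_d, tau=0.75):
    """Check the Individual Peak Analysis criterion of equations (3) and (4).

    prod_i : product Prodi of equation (2) for the ith peak (computed upstream)
    f_d    : frequency fd of the maximum amplitude peak of the window
    a_d    : amplitude Ad of that peak
    """
    f_p = f_d * a_d             # equation (4)
    return tau * prod_i > f_p   # equation (3): the peak is valid if this holds
```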
[0044] After the peak validation process for the frequency spectrum of a non-stationary activity window is completed, a step cycle length (SCL) for each valid peak is calculated. The SCLi is the sample distance between the ith valid peak in the frequency spectrum of the non-stationary activity window and the (i - 2)th valid peak in the frequency spectrum of the non-stationary activity window. SCLi has a unit of samples per step. For i = 1, 2, valid peaks from the frequency spectrum of the previous non-stationary windows are used.
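A direct transcription of this definition, assuming the sample positions of the valid peaks are already known, could be:

```python
def step_cycle_lengths(valid_positions):
    """SCLi: sample distance between the ith and (i - 2)th valid peaks (paragraph [0044])."""
    return [valid_positions[i] - valid_positions[i - 2]
            for i in range(2, len(valid_positions))]
```

For the first two valid peaks of a window, the positions carried over from the previous non-stationary windows would be prepended to the list before calling the function.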
[0045] The process of identification of valid peaks within the frequency spectrum of the non-stationary activity windows is described hereinafter. For a non-stationary activity window, DTWt of the previous non-stationary activity window is checked and compared with 0. For the first non-stationary activity window, the value of DTWt is set to 0. If the DTWt is 0, IPA is performed on the frequency spectrum of the current non-stationary activity window to identify the valid peaks as described above. Based on the values of SCL for the peaks obtained during the IPA, the value of DTWt is updated as the mean of all SCL values found for the current non-stationary activity window. If the DTWt of the last non-stationary activity window is not 0, then the DTW process is performed on the frequency spectrum of the current non-stationary activity window. After performing the DTW process, the confidence score CS is checked. If the confidence score CS for that window is 100%, then the value of DTWt is updated as the mean of all SCL values found for the current non-stationary activity window. If the confidence score CS for that window is not 100%, then DTWt is updated as 0, and the IPA is performed on the frequency spectrum of the non-stationary activity window. Based on the SCL values obtained after the IPA, the value of DTWt is updated as the mean of all SCL values found for the current non-stationary activity window. The above procedure is iteratively repeated for all the non-stationary activity windows. The data thus generated by the signal processing module 114 for the non-stationary activity windows is stored in the signal processing data 124.
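The control flow of this paragraph may be summarized in the following sketch; run_ipa and run_dtw stand for the IPA and DTW procedures described above and are placeholders, and the fallback to DTWt = 0 when a window yields no SCL values is an assumption of the sketch.

```python
import numpy as np

def process_non_stationary_windows(windows, run_ipa, run_dtw):
    """Iterate the DTW / IPA logic of paragraph [0045] over non-stationary activity windows.

    run_ipa(window)        -> (valid_peaks, scl_values)
    run_dtw(window, dtw_t) -> (valid_peaks, scl_values, confidence_score)
    """
    dtw_t = 0.0                                     # DTWt is 0 for the first window
    all_valid = []
    for window in windows:
        if dtw_t == 0:
            valid, scl = run_ipa(window)
        else:
            valid, scl, cs = run_dtw(window, dtw_t)
            if cs < 100.0:                          # DTW failed: redo with IPA
                dtw_t = 0.0
                valid, scl = run_ipa(window)
        # DTWt becomes the mean of all SCL values found for the current window.
        dtw_t = float(np.mean(scl)) if len(scl) else 0.0
        all_valid.append(valid)
    return all_valid
```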
[0046] Further, based on the valid peaks within the frequency spectrum of each of the non-stationary activity windows, the step activity classifier 116 classifies the step activity of the user 102 into one of the predefined acceleration-based step activities. In an implementation, for the classification of the step activity of the user 102, the frequency spectra of at least two continuous non-stationary activity windows are processed by the signal processing module 114 as described above. In an implementation, the predefined acceleration-based step activities may include a combination of activities from amongst walking, brisk walking, running, jogging, and the like. For this, the step activity classifier 116 calculates activity weights based on the data for the valid peaks and threshold frequencies of the predefined acceleration-based step activities. The process of calculation of activity weights and subsequent classification of the step activity of the user 102 is described below.
[0047] Let us consider a case where the predefined acceleration-based step activities include "walking", "brisk walking", and "running". With this set of predefined acceleration-based step activities, two predefined threshold frequencies, namely a brisk-walking threshold frequency fbw and a running threshold frequency fr, are used. The brisk-walking threshold frequency fbw refers to the threshold frequency marking the separation between walking and brisk walking, and the running threshold frequency fr refers to the threshold frequency marking the separation between brisk walking and running. Further, with this set of predefined acceleration-based step activities, four activity weights w1, w2, w3, and w4 are defined. The activity weight w1 is a measure of the amount of walking activity, the activity weights w2 and w3 are a measure of the amount of brisk-walking activity, and the activity weight w4 is a measure of the amount of running activity. Initially, all the activity weights w1, w2, w3, and w4 are set to 0.
[0048] For the purpose of classification of the step activity of the user 102 into one of "walking", "brisk walking", and "running", the step activity classifier 116 calculates a step frequency SFi for the ith valid peak in the frequency spectrum of the non-stationary activity window. The step frequency SFi is computed based on the value of SCLi of the ith valid peak and the sampling frequency fs. The value of the step frequency SFi for the ith valid peak is calculated based on equation (5) below:

SFi = fs / SCLi    (5)

SFi has the unit of steps per second.

[0049] Further, for calculating the activity weights, the step activity classifier 116 compares the step frequency SFi with the threshold frequencies fbw and fr. If the step frequency SFi is greater than 1 and less than the brisk-walking threshold frequency fbw, then the activity weight w1 is calculated as:
w1 = w1 + (fbw - SFi)^2    (6)

where the initial value of w1 is 0.
[0050] If the step frequency SFi is greater than or equal to the brisk-walking threshold frequency fbw and less than the running threshold frequency fr, then the activity weights w2 and w3 are calculated as follows:

w2 = w2 + (fbw - SFi)^2    (7)

[Equation (8) for the activity weight w3 — reproduced only as an image in the original publication]

where the initial values of w2 and w3 are 0.
[0051] Further, if the step frequency SFi is greater than or equal to the running threshold frequency fr, then the activity weight w4 is calculated as follows:

[Equation (9) for the activity weight w4 — reproduced only as an image in the original publication]

where the initial value of w4 is 0.
[0052] The above procedure of computation of activity weights is iteratively repeated for all the valid peaks of the non-stationary activity window under consideration.
[0053] After computation of the activity weights, the step activity classifier 116 classifies the step activity of the user 102 into one of the predefined acceleration-based step activities based on the above calculated activity weights. For the classification, the activity weight w1 is compared to the activity weight w4, the activity weight w2 is compared to the activity weight w1, and the activity weight w3 is compared to the activity weight w4. If the value of the activity weight w1 is greater than the activity weight w4 and the value of the activity weight w1 is greater than or equal to the activity weight w2, then the step activity of the user 102 is classified as "walking". If the value of the activity weight w1 is greater than the activity weight w4 and the value of the activity weight w1 is less than the activity weight w2, then the step activity of the user 102 is classified as "brisk walking". If the value of the activity weight w1 is less than or equal to the activity weight w4 and the value of the activity weight w3 is greater than w4, then also the step activity of the user 102 is classified as "brisk walking". If the value of the activity weight w1 is less than or equal to the activity weight w4 and the value of the activity weight w3 is less than or equal to w4, then the step activity of the user 102 is classified as "running". The data associated with the classified activity is stored in the activity classifier data 126.
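The weight accumulation and the decision rules of paragraphs [0047] to [0053] may be sketched as below. Since equations (8) and (9) are reproduced only as images, the squared-distance forms used for w3 and w4 are assumptions made by analogy with equations (6) and (7), and the default threshold frequencies are illustrative values rather than values specified in the present subject matter.

```python
def classify_step_activity(step_freqs, f_bw=2.0, f_r=2.7):
    """Accumulate activity weights for one window and classify the step activity.

    step_freqs : step frequencies SFi, in steps per second, of the valid peaks
    f_bw, f_r  : brisk-walking and running threshold frequencies (illustrative defaults)
    """
    w1 = w2 = w3 = w4 = 0.0
    for sf in step_freqs:
        if 1.0 < sf < f_bw:
            w1 += (f_bw - sf) ** 2           # equation (6)
        elif f_bw <= sf < f_r:
            w2 += (f_bw - sf) ** 2           # equation (7)
            w3 += (f_r - sf) ** 2            # assumed form of equation (8)
        elif sf >= f_r:
            w4 += (f_r - sf) ** 2            # assumed form of equation (9)
    # Decision rules of paragraph [0053].
    if w1 > w4:
        return "walking" if w1 >= w2 else "brisk walking"
    return "brisk walking" if w3 > w4 else "running"
```

For instance, a window whose valid peaks yield step frequencies clustered around 1.5 steps per second would accumulate weight mainly in w1 and would be classified as "walking" under these assumed thresholds.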
[0054] The number of predefined threshold frequencies and the number of activity weights to be calculated vary depending on the number of predefined acceleration-based step activities used for the classification. In an implementation using n number of predefined acceleration-based step activities, (n - 1) number of threshold frequencies are used and (n + 1) number of activity weights are calculated. In the implementation, all the (n + 1) activity weights have 0 as the initial values.
[0055] In an implementation, the display module 118 displays on the mobile device 100 the acceleration-based step activity under which the step activity being performed by the user 102 is classified. Further, in an implementation, a step count, determined based on the number of valid peaks within the frequency spectrum of the non-stationary activity window, is displayed on the mobile device 100 by the display module 118. This display enables the user 102 to monitor the step activity he is performing and the rate at which he is performing that step activity.
[0056] In an implementation, if no valid peak is identified in the frequency spectrum of the non-stationary activity windows, then the display module 118 may display "stationary" as the activity.
[0057] In an implementation, the step activity performed by the user 102 along with the step count may be stored in the activity classifier data 126 to generate a report for the user 102 stating the type of step activities performed by the user 102 in a session. Further, in the implementation, the step count for a step activity may be calibrated with respect to calories burnt, and the report may state the number of calories burnt by the user 102 in the session.
[0058] In an implementation, the acceleration signals generated by the inertial sensor, the data generated based on the processing done by the signal processing module 114, and the data associated with the type of step activity and the step count may be transmitted from the mobile device 100 to another device, such as a personal computer, a laptop, and the like, for various purposes including keeping a track or a history of the step activities and the step counts performed by the user 102. The mobile device 100 may be equipped with compatible input/output interfaces for such communication. In an implementation, the communication may take place via a wired communication link, such as a data cable, or via a wireless communication link, such as Bluetooth™, IR, and WiFi.
[0059] Figure 2 illustrates a method 200 for detection and classification of an acceleration-based step activity of a user 102. The method 200 is implemented in a mobile device 100. The order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 200, or any alternative methods. Additionally, individual blocks may be deleted from the method 200 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 200 can be implemented in any suitable hardware.
[0060] The method 200 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. Further, although the method 200 may be implemented in any mobile device having an inertial sensor, in the example described in Figure 2 the method 200 is explained in the context of the aforementioned mobile device 100, for the ease of explanation.
[0061] Referring to Figure 2, at block 202, acceleration signals generated by an inertial sensor 130 in the mobile device 100, while the user 102 is performing a step activity, are received from the inertial sensor 130 for processing. As mentioned earlier, while performing the step activity, the user 102 takes steps which set the user 102 into motion. This motion of the user 102 performing the step activity has an acceleration associated with it. The inertial sensor 130 in the mobile device 100 detects the motion of the user 102 and generates a data stream of acceleration signals for the motion of the user 102.

[0062] At block 204, the acceleration signals are divided into data windows of a predetermined period. In an implementation, the predetermined period is 2 seconds. In another implementation, the predetermined period may be between 1.5 seconds and 2.5 seconds. After obtaining the data windows, the data windows are processed for zero normalization, linear interpolation, and low-pass filtration.
[0063] At block 206, peaks are identified within a frequency spectrum of each of the data windows. In an implementation, the frequency spectrum of the data windows may be obtained using Fast Fourier Transform (FFT). Further, at block 208, non-stationary activity windows are identified from amongst the data windows of the acceleration signals by obtaining a maximum amplitude peak, a number of peaks, and a peak sharpness measure from the frequency spectrum of each data window. The computation of the peak sharpness measure and the identification of the non-stationary activity windows are performed in a manner as described earlier in the description.
[0064] Further, at block 210, within the frequency spectrum of the non-stationary activity window, valid peaks are determined. These valid peaks are indicative of true activity steps and are determined based on Dynamic Time Warping (DTW) and Individual Peak Analysis (IPA). The valid peaks are determined as described earlier in the description.
[0065] At block 212, the step activity of the user 102 is classified into one of the predefined acceleration-based step activities based on the valid peaks obtained within the frequency spectrum of each of the non-stationary activity windows. In an implementation, the classification of the step activity includes computing step frequencies for the valid peaks and computing activity weights based on the step frequencies and predefined threshold frequencies for the predefined acceleration-based step activities. The step activity of the user 102 is then classified based on the activity weights. The computation of the step frequencies and the activity weights and the classification are done in a manner as described earlier in the description.
[0066] In an implementation, the classified activity and the step count determined based on the valid peaks in the frequency spectrum of the non-stationary activity windows are displayed on the mobile device 100 for the user 102.
[0067] Although implementations for mobile device(s) for detection and classification of acceleration-based step activity of a user 102 and method(s) for detection and classification of acceleration-based step activity of a user 102 on a mobile device are described, it is to be understood that the present subject matter is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as implementations to detect and classify the step activity of the user 102 on the mobile device 100.

Claims

claim:
1. A method for detection and classification of an acceleration-based step activity of a user (102) on a mobile device (100), the method comprising: receiving acceleration signals from an inertial sensor (130) in the mobile device (100) carried by the user (102) performing a step activity; dividing the acceleration signals into data windows of a predetermined time period;
identifying peaks within a frequency spectrum of each of the data windows;
identifying non-stationary activity windows, from amongst the data windows, based on the peaks within the frequency spectrum of the each of the data windows;
determining valid peaks within the frequency spectrum of each of the non-stationary activity windows, wherein each of the valid peaks is indicative of a true activity step; and
classifying the step activity of the user (102) into one of predefined acceleration-based step activities based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
2. The method as claimed in claim 1 further comprising processing the data windows by performing zero normalization, linear interpolation, and low pass filtration on the data windows.
3. The method as claimed in claim 1, wherein the identifying of each of the non-stationary activity windows is based on a maximum amplitude peak, a number of peaks, and a measure of peak sharpness of the peaks within the frequency spectrum of a corresponding data window.
4. The method as claimed in claim 1, wherein the determining of the valid peaks within the frequency spectrum of the each of the non-stationary activity windows is based on Dynamic Time Warping process and Individual Peak Analysis.
5. The method as claimed in claim 1, wherein the classifying the step activity of the user (102) into one of the predefined acceleration-based step activities comprises calculating activity weights for the each of the non-stationary activity windows based on the valid peaks within the frequency spectrum of that non-stationary activity window and predefined threshold frequencies for the predefined acceleration-based step activities.
6. The method as claimed in claim 1 further comprising displaying on the mobile device (100) a step count of the one of the predefined acceleration-based step activities being performed by the user (102), wherein the step count is based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
7. The method as claimed in claim 1, wherein the predefined acceleration-based step activities comprise walking, brisk walking, running, and jogging.

8. A mobile device (100) for detection and classification of an acceleration-based step activity of a user (102), the mobile device (100) comprising:
a processor (104);

an inertial sensor (130) to detect motion of the user (102) performing a step activity and to generate acceleration signals based on the motion of the user (102);

a signal processing module (114) coupled to the processor (104), to: generate data windows of a predetermined time period from the acceleration signals;

identify peaks within a frequency spectrum of each of the data windows; identify non-stationary activity windows, from amongst the data windows, based on the peaks within the frequency spectrum of the each of the data windows;

determine valid peaks within the frequency spectrum of each of the non-stationary activity windows, wherein each of the valid peaks is indicative of a true activity step; and

a step activity classifier (116) coupled to the processor (104), to classify the step activity of the user (102) into one of predefined acceleration-based step activities based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
9. The mobile device (100) as claimed in claim 8, wherein the signal processing module (114) performs zero normalization, linear interpolation and low pass filtration on the data windows.
10. The mobile device (100) as claimed in claim 8, wherein the signal processing module (114) identifies each of the non-stationary activity windows based on a maximum amplitude peak, a number of peaks, and a measure of peak sharpness of the peaks within the frequency spectrum of a corresponding data window.
11. The mobile device (100) as claimed in claim 8, wherein the signal processing module (114) determines the valid peaks within the frequency spectrum of the each of the non-stationary activity windows based on Dynamic Time Warping process and Individual Peak Analysis.
12. The mobile device (100) as claimed in claim 8, wherein the step activity classifier (116) classifies the step activity of the user into one of the predefined acceleration-based step activities by calculating activity weights for the each of the non-stationary activity windows based on the valid peaks within the frequency spectrum of that non-stationary activity window and predefined threshold frequencies for the predefined acceleration-based step activities.
13. The mobile device (100) as claimed in claim 8 further comprising a display module (118) to display on the mobile device (100) the one of predefined acceleration-based step activities.
14. The mobile device (100) as claimed in claim 13, wherein the display module (118) displays on the mobile device (100) a step count of the one of predefined acceleration-based step activities being performed by the user (102), wherein the step count is based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
15. A non-transitory computer readable medium having a set of computer readable instructions that, when executed, cause a mobile device to: receive acceleration signals from an inertial sensor in the mobile device carried by a user performing an acceleration-based step activity; divide the acceleration signals into data windows of a predetermined time period; identify peaks within a frequency spectrum of each of the data windows; identify non-stationary activity windows, from amongst the data windows, based on the peaks within the frequency spectrum of the each of the data windows; determine valid peaks within the frequency spectrum of each of the non-stationary activity windows, wherein each of the valid peaks is indicative of a true activity step; and classify the acceleration-based step activity of the user into one of predefined step activities based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
PCT/IB2014/000786 2013-05-27 2014-05-21 Acceleration-based step activity detection and classification on mobile devices WO2014191803A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN1875/MUM/2013 2013-05-27
IN1875MU2013 IN2013MU01875A (en) 2013-05-27 2014-05-21

Publications (1)

Publication Number Publication Date
WO2014191803A1 true WO2014191803A1 (en) 2014-12-04

Family

ID=50942707

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/000786 WO2014191803A1 (en) 2013-05-27 2014-05-21 Acceleration-based step activity detection and classification on mobile devices

Country Status (2)

Country Link
IN (1) IN2013MU01875A (en)
WO (1) WO2014191803A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104457781A (en) * 2014-12-22 2015-03-25 北京航空航天大学 Self-adaption step number detection method based on single-axis accelerometer
CN106971203A (en) * 2017-03-31 2017-07-21 中国科学技术大学苏州研究院 Personal identification method based on characteristic on foot
CN108937852A (en) * 2018-05-28 2018-12-07 深圳市北高智电子有限公司 A kind of intelligence step counting, sleep monitor operation method
CN109214318A (en) * 2018-08-22 2019-01-15 北京天泽智云科技有限公司 A method of finding the faint spike of unstable state time series

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
AKIN AVCI ET AL: "Activity Recognition Using Inertial Sensing for Healthcare, Wellbeing and Sports Applications: A Survey", 23TH INTERNATIONAL CONFERENCE ON ARCHITECTURE OF COMPUTING SYSTEMS, ARCS 2010, 23 February 2010 (2010-02-23), Hannover, Germany, XP055130610, Retrieved from the Internet <URL:http://purl.utwente.nl/publications/70138> [retrieved on 20140721] *
ANNAPURNA SHARMA ET AL: "Frequency based classification of activities using accelerometer data", MULTISENSOR FUSION AND INTEGRATION FOR INTELLIGENT SYSTEMS, 2008. MFI 2008. IEEE INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 20 August 2008 (2008-08-20), pages 150 - 153, XP031346262, ISBN: 978-1-4244-2143-5 *
MELANIA SUSI ET AL: "Motion Mode Recognition and Step Detection Algorithms for Mobile Phone Users", SENSORS, vol. 13, no. 2, 24 January 2013 (2013-01-24), pages 1539 - 1562, XP055130598, DOI: 10.3390/s130201539 *
PREECE S J ET AL: "A Comparison of Feature Extraction Methods for the Classification of Dynamic Activities From Accelerometer Data", IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, IEEE SERVICE CENTER, PISCATAWAY, NJ, USA, vol. 56, no. 3, 1 March 2009 (2009-03-01), pages 871 - 879, XP011342877, ISSN: 0018-9294, DOI: 10.1109/TBME.2008.2006190 *
VO QUANG VIET ET AL: "Balancing Precision and Battery Drain in Activity Recognition on Mobile Phone", PARALLEL AND DISTRIBUTED SYSTEMS (ICPADS), 2012 IEEE 18TH INTERNATIONAL CONFERENCE ON, IEEE, 17 December 2012 (2012-12-17), pages 712 - 713, XP032311003, ISBN: 978-1-4673-4565-1, DOI: 10.1109/ICPADS.2012.108 *
YANG XUE ET AL: "Walking Pattern Discrimination based on Wavelet and Fractal Analysis", PROCEEDINGS ON THE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (ICAI), 1 January 2011 (2011-01-01), Athens, pages 1 - 5, XP055130595 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104457781A (en) * 2014-12-22 2015-03-25 北京航空航天大学 Self-adaption step number detection method based on single-axis accelerometer
CN104457781B (en) * 2014-12-22 2018-01-30 北京航空航天大学 A kind of adaptive step number detection method based on single-axis accelerometer
CN106971203A (en) * 2017-03-31 2017-07-21 中国科学技术大学苏州研究院 Personal identification method based on characteristic on foot
CN106971203B (en) * 2017-03-31 2020-06-09 中国科学技术大学苏州研究院 Identity recognition method based on walking characteristic data
CN108937852A (en) * 2018-05-28 2018-12-07 深圳市北高智电子有限公司 A kind of intelligence step counting, sleep monitor operation method
CN109214318A (en) * 2018-08-22 2019-01-15 北京天泽智云科技有限公司 A method of finding the faint spike of unstable state time series
CN109214318B (en) * 2018-08-22 2021-10-22 北京天泽智云科技有限公司 Method for searching weak peak of unsteady time sequence

Also Published As

Publication number Publication date
IN2013MU01875A (en) 2015-07-03

Similar Documents

Publication Publication Date Title
US10653339B2 (en) Time and frequency domain based activity tracking system
US8930300B2 (en) Systems, methods, and apparatuses for classifying user activity using temporal combining in a mobile device
US9407706B2 (en) Methods, devices, and apparatuses for activity classification using temporal scaling of time-referenced features
US9891701B2 (en) Apparatus, system, and method for automatic identification of sensor placement
JP6155276B2 (en) Method and apparatus for elevator motion detection
US10022071B2 (en) Automatic recognition, learning, monitoring, and management of human physical activities
JP6567658B2 (en) Device and method for classifying user activity and / or counting user steps
US20140278208A1 (en) Feature extraction and classification to determine one or more activities from sensed motion signals
WO2014145114A1 (en) Dynamic control of sampling rate of motion to modify power consumption
JP6083799B2 (en) Mobile device location determination method, mobile device, mobile device location determination system, program, and information storage medium
KR20140116481A (en) Activity classification in a multi-axis activity monitor device
EP2972416A2 (en) Identification of motion characteristics to determine activity
RU2601152C2 (en) Device, method and computer program to provide information to user
CN109414174B (en) Method and system for probabilistically estimating pulse rate of an individual
Ahmed et al. An approach to classify human activities in real-time from smartphone sensor data
WO2014145112A2 (en) Methods and architecture for determining activity and activity types from sensed motion signals
WO2014191803A1 (en) Acceleration-based step activity detection and classification on mobile devices
CN107277222A (en) User behavior state judging method based on mobile phone built-in sensors
EP2972415A2 (en) Intermediate motion signal extraction to determine activity
EP2972930A2 (en) Inline calibration of motion sensor
US20170311899A1 (en) Apparatus and method for identifying movement in a patient
US20180267073A1 (en) Device and method of characterizing motion
JP2012108836A (en) Interpersonal property estimation device, estimation method and estimation program based on daily measurement data
Zhao et al. An adaptive step detection algorithm based on the state machine
US20230029222A1 (en) Wearable device and method for processing acceleration data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14730186

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14730186

Country of ref document: EP

Kind code of ref document: A1