KR101733746B1 - User context based motion counting method, sensor device and wearable device performing the same - Google Patents
User context based motion counting method, sensor device and wearable device performing the same
- Publication number
- KR101733746B1 (application KR1020150148565A)
- Authority
- KR
- South Korea
- Prior art keywords
- user context
- sensor data
- determining
- motion
- motion counting
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P13/00—Indicating or recording presence, absence, or direction, of movement
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A user context based motion counting method, a sensor device for performing the motion counting method, and a wearable device are provided. The user context based motion counting method comprises: extracting at least one feature from sensor data; identifying a user context based on the extracted feature; determining a motion counting algorithm based on the identified user context; and counting a user's movement based on the sensor data using the determined motion counting algorithm.
Description
The present invention relates to a method of counting a user's movement, and more particularly, to a user context-based motion counting method, and to a sensor device and a wearable device that perform the motion counting method.
A wearable device is a computer system provided in a form that can be worn on the wearer's body. A wearable device can measure the movement of the user using various sensors and provide the measurement result to the user. A wearable device can receive a manual input specifying the user's exercise motion and count the number of times the input motion is repeated. A wearable device can also monitor the movement of the user over a specific period of time and provide additional information such as the amount of calories consumed and the activity time.
SUMMARY OF THE INVENTION It is an object of the present invention to provide a user context-based motion counting method that can count a user's movements more accurately.
It is another object of the present invention to provide a user context-based motion counting method that can count a user's movements more intuitively and conveniently.
Another object of the present invention is to provide a sensor device and a wearable device for performing the user context based motion counting method.
The technical objects of the present invention are not limited to those mentioned above, and other objects not mentioned will be clearly understood by those skilled in the art from the following description.
According to an aspect of the present invention, there is provided a method of counting motion based on a user context, the method comprising: extracting at least one feature from sensor data; identifying a user context based on the extracted feature; determining a motion counting algorithm based on the identified user context; and counting a user's movement based on the sensor data using the determined motion counting algorithm.
In some embodiments of the present invention, the extracted feature may comprise at least one statistical value of the sensor data.
In some embodiments of the present invention, identifying the user context may include identifying the user context based on the extracted feature using a K-Nearest Neighbors (KNN) algorithm.
In some embodiments of the present invention, the step of determining the motion counting algorithm may include determining an autocorrelation parameter of the motion counting algorithm based on the identified user context, determining a first autocorrelation parameter for a first user context, and determining a second autocorrelation parameter for a second user context.
Also, the autocorrelation parameter may correspond to a repetition period of a peak included in the sensor data.
In some embodiments of the present invention, the step of determining the motion counting algorithm may include determining, based on the identified user context, a first motion counting algorithm for a first user context and a second motion counting algorithm for a second user context.
In some embodiments of the present invention, the motion counting algorithm may include filtering the sensor data within a predetermined frequency range, and the step of determining the motion counting algorithm may include determining a filtering setting based on the identified user context, determining a first filtering setting for a first user context, and determining a second filtering setting for a second user context.
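As a rough illustration of this per-context determination, the sketch below maps each user context to a set of counting parameters. The exercise names follow the examples given later in the description (push-ups, biceps curls, rowing); the numeric parameter values and the function name are placeholder assumptions, not values disclosed in this patent.

```python
# Illustrative only: per-context counting parameters; the numeric values are invented placeholders.
COUNTING_SETTINGS = {
    # user context: context-specific autocorrelation parameter and band-pass cut-offs (fl, fh)
    "push_up":     {"min_peak_interval_s": 1.0, "band_hz": (0.5, 3.0)},
    "biceps_curl": {"min_peak_interval_s": 1.5, "band_hz": (0.3, 2.0)},
    "rowing":      {"min_peak_interval_s": 1.2, "band_hz": (0.4, 2.5)},
}

def select_counting_settings(user_context: str) -> dict:
    """Return the motion counting parameters determined for the identified user context."""
    return COUNTING_SETTINGS[user_context]
```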
In some embodiments of the present invention, the method may further comprise obtaining the sensor data from an acceleration sensor or a gyro sensor.
According to another aspect of the present invention, there is provided a sensor device capable of performing any one of the above-described user context based motion counting methods.
According to another aspect of the present invention, there is provided a wearable apparatus capable of performing any one of the above-described user context-based motion counting methods.
Other specific details of the invention are included in the detailed description and drawings.
According to the user context based motion counting method of the present invention, a user context is identified and a more suitable motion counting algorithm is determined for each identified user context, so that the motion of the user can be counted more accurately.
In addition, since the user is not required to make a separate direct input or selection for user context identification, the method is more intuitive and user-friendly.
The effects of the present invention are not limited to the above-mentioned effects, and other effects not mentioned can be clearly understood by those skilled in the art from the following description.
FIG. 1 is a flowchart schematically illustrating a user context-based motion counting method according to an embodiment of the present invention.
FIG. 2 is a diagram schematically illustrating identification of a user context using a KNN algorithm.
FIG. 3 is a diagram schematically illustrating determination of autocorrelation parameters based on a user context.
FIG. 4 is a diagram schematically illustrating determination of filtering settings for sensor data based on a user context.
FIG. 5 is a block diagram schematically illustrating a wearable device for performing a user context-based motion counting method according to an embodiment of the present invention.
FIG. 6 is a view schematically showing a state in which the wearable device of FIG. 5 is worn on the wearer's body.
FIG. 7 is a block diagram schematically illustrating a wearable device for performing a user context-based motion counting method according to another embodiment of the present invention.
FIG. 8 is a block diagram schematically illustrating a sensor hub for performing a user context-based motion counting method according to an embodiment of the present invention.
FIG. 9 is a block diagram schematically illustrating a sensor hub for performing a user context-based motion counting method according to another embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described below in conjunction with the accompanying drawings. The present invention is not, however, limited to the embodiments disclosed herein and may be embodied in many different forms; it will be understood by those of ordinary skill in the art that the embodiments may be modified in various ways. Like reference numerals refer to like elements throughout the specification.
Although the terms first, second, and the like are used to describe various elements, components, and/or sections, these elements, components, and/or sections are of course not limited by the terms. The terms are only used to distinguish one element, component, or section from another. Accordingly, a first element, component, or section mentioned below may be a second element, component, or section within the technical spirit of the present invention.
Unless defined otherwise, all terms (including technical and scientific terms) used herein have the meaning commonly understood by one of ordinary skill in the art to which this invention belongs. In addition, commonly used terms that are defined in advance are not to be interpreted ideally or excessively unless explicitly defined otherwise.
The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In this specification, the singular form includes the plural form unless otherwise specified. The terms "comprises" and/or "comprising" as used in the specification do not exclude the presence or addition of one or more elements other than the stated elements.
FIG. 1 is a flowchart schematically illustrating a user context-based motion counting method according to an embodiment of the present invention.
Referring to FIG. 1, in step S10, sensor data is obtained from an acceleration sensor, a gyro sensor, or at least one sensor in which they are combined. The sensor data may include acceleration data in three axial directions (e.g., the X, Y, and Z axes) or angular velocity data in three axial directions (e.g., a pitch axis, a yaw axis, and a roll axis), but is not limited thereto. Depending on the embodiment, processing such as predetermined sampling or quantization may be performed on the sensor data.
Then, in step S20, at least one feature is extracted from the sensor data. A feature indicates an aspect or component by which the sensor data is distinguished from other sensor data. A feature can represent the whole of the sensor data or only a part of it. For example, the feature may include at least one statistical value of the sensor data. The statistical value may include, but is not limited to, a maximum value, a minimum value, a median value, an average value, an interquartile range (IQR) value, or a root mean square (RMS) value of a predetermined number of sensor data samples. The feature may be extracted by arbitrary processing of the sensor data.
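For illustration, a minimal sketch of computing these statistical values from one window of sensor samples is shown below (NumPy-based; the window size, feature order, and function name are assumptions, not part of the patent).

```python
# A minimal sketch (not the patented implementation) of extracting the statistical
# features mentioned above from one window of sensor samples.
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """window: 1-D array of sensor samples taken from a fixed-size window."""
    q1, median, q3 = np.percentile(window, [25, 50, 75])
    return np.array([
        window.max(),                   # maximum value
        window.min(),                   # minimum value
        median,                         # median value
        window.mean(),                  # average value
        q3 - q1,                        # interquartile range (IQR)
        np.sqrt(np.mean(window ** 2)),  # root mean square (RMS)
    ])
```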
Then, in step S30, the user context is identified based on the extracted features. The user context represents the user's situation or information defining that situation. The user context may be related to the type of action that specifies the user's movement. For example, the user context may be classified into various types of exercise motions such as push-ups, biceps curls, and rowing, but is not limited thereto.
For example, the user context can be identified using the KNN algorithm. FIG. 2 is a diagram schematically illustrating identification of a user context using the KNN algorithm. Referring to FIG. 2, a plurality of features are processed using the KNN algorithm. By the KNN algorithm, the features can be compared with pre-classified or learned features and classified into groups having the same or similar tendencies. By combining the classification results, a specific user context can be derived. The k value, the distance metric between neighboring features, and the like can be variously adjusted according to the embodiment. Although not explicitly shown, a weight reflecting the importance of, or the distance to, certain features may be applied. A detailed description of the KNN algorithm itself is omitted so as not to obscure the gist of the present invention.
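The sketch below illustrates the basic KNN idea described above: a feature vector is compared against pre-classified feature vectors and assigned the majority label among its k nearest neighbors. The training data, labels, distance metric, and k value are assumptions for illustration only.

```python
# Illustrative KNN classification of a feature vector into a user context.
import numpy as np
from collections import Counter

def knn_classify(feature, train_features, train_labels, k=5):
    """Assign the majority label among the k nearest pre-classified feature vectors."""
    distances = np.linalg.norm(np.asarray(train_features) - feature, axis=1)  # Euclidean distance
    nearest = np.argsort(distances)[:k]                                       # k nearest neighbors
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]
```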
On the other hand, any machine learning algorithm not illustrated herein may also be used for identification of the user context.
Then, in step S40, a motion counting algorithm is determined based on the identified user context. At this time, the autocorrelation parameters of the motion counting algorithm may be determined together based on the identified user context. The autocorrelation parameters can be determined differently depending on the user context. That is, a first autocorrelation parameter may be determined for a first user context, and a second autocorrelation parameter may be determined for a second user context. The autocorrelation parameter may be used to remove noise or counting errors within the motion counting algorithm.
For example, the autocorrelation parameter may correspond to a repetition period of the peaks included in the sensor data. A peak may be an argument or element used for counting the user's movement, as described below. FIG. 3 is a diagram schematically illustrating determination of autocorrelation parameters based on a user context. Referring to FIG. 3, a plurality of peaks p0 to p2 may be included in the sensor data within a predetermined time range. A peak can be selected based on its amplitude and the time interval from the previous peak. The repetitive pattern of peaks may differ depending on the user context (the user's action). For example, when the user performs push-ups and the repetition period of the peaks, that is, the time interval between the previous peak and the current peak, is only 100 msec, the current peak is presumed to be due to noise or error, considering the repetition pattern of peaks that generally appears during push-ups, and is not used for counting. In the example shown in FIG. 3, if the reference time interval with respect to the repetition period of the peaks is tr1, p1 can be selected as the peak for counting after p0; if the reference time interval is tr2, p2 may instead be selected as the peak for counting after p0.
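A simple sketch of this peak-screening idea follows: a candidate peak is counted only if its amplitude is large enough and it occurs at least a context-specific minimum interval after the previously accepted peak. The threshold, interval, and function names are illustrative assumptions, not the patented implementation.

```python
def count_peaks(signal, timestamps, amplitude_threshold, min_interval):
    """Count peaks, discarding those that repeat faster than the context-specific period."""
    count, last_peak_time = 0, None
    for i in range(1, len(signal) - 1):
        is_local_max = signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]
        if not is_local_max or signal[i] < amplitude_threshold:
            continue
        if last_peak_time is not None and timestamps[i] - last_peak_time < min_interval:
            continue  # presumed noise or error: too soon after the previous peak
        count += 1
        last_peak_time = timestamps[i]
    return count
```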
At least some motion counting algorithms may include filtering the sensor data within a predetermined frequency range to remove noise or counting errors. When the motion counting algorithm is determined, a filtering setting may also be determined based on the identified user context. The filtering setting can be determined differently depending on the user context. That is, a first filtering setting may be determined for a first user context, and a second filtering setting may be determined for a second user context.
FIG. 4 is a diagram schematically illustrating determination of filtering settings for sensor data based on a user context. Referring to FIG. 4, a plurality of filtering settings, each defining a predetermined passband, are shown. In the example shown in FIG. 4, fl1 and fh1 are set as the cut-off frequencies according to the first filtering setting, and fl2 and fh2 are set as the cut-off frequencies according to the second filtering setting, resulting in different passbands. Although a band-pass filter is shown as an example in FIG. 4, the present invention is not limited thereto, and a filtering setting may be provided in substantially the same manner for a high-pass filter or a low-pass filter.
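As a hedged sketch of such context-dependent filtering, the example below applies a Butterworth band-pass filter whose cut-off frequencies are selected per user context using SciPy. The sampling rate, filter order, cut-off values, and names are assumptions for illustration.

```python
from scipy.signal import butter, filtfilt

FILTER_SETTINGS = {            # assumed cut-off frequencies (fl, fh) in Hz per user context
    "push_up":     (0.5, 3.0),
    "biceps_curl": (0.3, 2.0),
}

def filter_sensor_data(sensor_data, user_context, fs=50.0, order=4):
    """Band-pass filter the sensor data with the setting determined for the user context."""
    fl, fh = FILTER_SETTINGS[user_context]
    b, a = butter(order, [fl, fh], btype="band", fs=fs)  # design Butterworth band-pass filter
    return filtfilt(b, a, sensor_data)                   # zero-phase filtering of the data
```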
On the other hand, a separate motion counting algorithm may be determined based on the identified user context (for example, for certain motions that are performed in a completely different way from other motions). In this case, a first motion counting algorithm may be determined for a first user context, and a second motion counting algorithm may be determined for a second user context.
Subsequently, in step S50, the motion of the user is counted based on the sensor data, using the determined motion counting algorithm. For example, the sensor data of the axis having the largest change among the sensor data of the three axes is selected, and the peaks in the sensor data of the selected axis can be counted. The method of counting movements can be variously modified according to specific embodiments.
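Putting step S50 together, a minimal sketch might select the axis with the largest variation and count peaks on it using SciPy's find_peaks, with a context-specific minimum peak height and minimum peak spacing. All parameter values and names here are assumptions, not the patented implementation.

```python
import numpy as np
from scipy.signal import find_peaks

def count_movement(accel_xyz, fs, amplitude_threshold, min_interval_s):
    """accel_xyz: array of shape (n_samples, 3) with X, Y, Z acceleration; fs: sampling rate in Hz."""
    accel_xyz = np.asarray(accel_xyz)
    axis = int(np.argmax(accel_xyz.std(axis=0)))        # axis with the largest change
    peaks, _ = find_peaks(
        accel_xyz[:, axis],
        height=amplitude_threshold,                     # minimum peak amplitude
        distance=max(1, int(min_interval_s * fs)),      # context-specific repetition period
    )
    return len(peaks)
```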
FIG. 5 is a block diagram schematically illustrating a wearable device for performing a user context-based motion counting method according to an embodiment of the present invention.
Referring to FIG. 5, the wearable device includes an acceleration sensor 110, a gyro sensor 120, a storage unit 130, an input unit 140, an output unit 150, a controller 160, and a power supply unit 170.
The acceleration sensor 110 detects acceleration in three axial directions (e.g., X-axis, Y-axis, and Z-axis). Although only one acceleration sensor is shown in FIG. 5, the present invention is not limited thereto, and a plurality of acceleration sensors capable of detecting acceleration in one axial direction may be provided. The acceleration sensor 110 may transmit the acceleration data in the three axial directions to the controller 160.
The gyro sensor 120 detects the angular velocity in three axial directions (for example, a pitch axis, a yaw axis, and a roll axis). Although only one gyro sensor is shown in FIG. 5, the present invention is not limited thereto, and a plurality of gyro sensors each capable of detecting an angular velocity in one axial direction may be provided. The gyro sensor 120 can transmit the angular velocity data in the three axial directions to the controller 160.
The storage unit 130 stores various data and commands. The storage unit 130 may store various software modules, including system software, for the operation of the wearable device.
The input unit 140 receives various information from the user. The input unit 140 may include various input means such as a key, a button, a switch, a wheel, and a touch pad.
The output unit 150 notifies the user of various types of information. The output unit 150 may output information in the form of text, image, or voice. To this end, the output unit 150 may include a display module 151 and a speaker module 152. The display module 151 may be provided in any form, such as a liquid crystal display (LCD), a thin film transistor (TFT) LCD, an organic light emitting diode (OLED) display, a flexible display, a three-dimensional display, or an electronic ink display.
The controller 160 controls the overall operation of the wearable device.
The power supply unit 170 supplies power necessary for the operations of the acceleration sensor 110, the gyro sensor 120, the storage unit 130, the input unit 140, the output unit 150, and the controller 160. The power supply unit 170 may include an internal battery or may convert power supplied from the outside into power suitable for the components.
On the other hand, the components shown in FIG. 5 are not essential, so the wearable device may be implemented with more components or fewer components than those shown.
FIG. 6 is a view schematically showing a state in which the wearable device of FIG. 5 is worn on the wearer's body.
Referring to FIG. 6, the wearable device may be worn on a part of the wearer's body.
FIG. 7 is a block diagram schematically illustrating a wearable device for performing a user context-based motion counting method according to another embodiment of the present invention. For convenience of description, the same components as those of the wearable device of FIG. 5 are denoted by the same reference numerals, and redundant description thereof is omitted.
Referring to FIG. 7, the wearable device further includes a wireless communication unit 230, a vibration unit 260, and a control unit 280.
The wireless communication unit 230 may wirelessly communicate with an external device (such as various servers or user terminals). The wireless communication unit 230 can wirelessly communicate with the external device using a wireless communication method such as mobile communication, WiBro, WiFi, Bluetooth, Zigbee, ultrasonic, infrared, or RF (radio frequency) communication. The wireless communication unit 230 may transmit data and/or information received from the external device to the control unit 280 and may transmit data and/or information from the control unit 280 to the external device. For this purpose, the wireless communication unit 230 may include a mobile communication module, a short-range communication module, and the like.
The vibration unit 260 may perform a vibration notification for notifying the user of various information.
The control unit 280 controls the overall operation of the wearable device.
A wearable device according to an embodiment of the present invention may be provided as any computer system that can be worn on the user's body, including forms not illustrated herein.
FIG. 8 is a block diagram schematically illustrating a sensor hub for performing a user context-based motion counting method according to an embodiment of the present invention.
Referring to FIG. 8, the sensor hub may perform the user context-based motion counting method described above.
FIG. 9 is a block diagram schematically illustrating a sensor hub for performing a user context-based motion counting method according to another embodiment of the present invention.
Referring to FIG. 9, the sensor hub according to another embodiment may likewise perform the user context-based motion counting method described above.
The methods described in connection with the embodiments of the present invention may be implemented as software modules executed by a processor. The software modules may reside in RAM, ROM, EPROM, EEPROM, flash memory, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable recording medium known in the art.
While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, it will be understood by those skilled in the art that the present invention may be embodied in other specific forms without departing from its technical spirit or essential characteristics. Therefore, the above-described embodiments are to be understood as illustrative in all aspects and not restrictive.
Claims (10)
A user context based motion counting method comprising:
Extracting at least one feature from sensor data;
Identifying a user context based on the extracted feature;
Determining a motion counting algorithm based on the identified user context; And
Counting a user's movement based on the sensor data using the determined motion counting algorithm,
Wherein determining the motion counting algorithm comprises:
Determining an autocorrelation parameter of the motion counting algorithm based on the identified user context, determining a first autocorrelation parameter for a first user context, and determining a second autocorrelation parameter for a second user context.
Wherein the extracted feature comprises at least one statistical value of the sensor data.
Wherein identifying the user context comprises:
Wherein the user context is identified based on the extracted feature using a K-Nearest Neighbors (KNN) algorithm.
Wherein the autocorrelation parameter corresponds to a repetition period of peaks included in the sensor data.
Wherein determining the motion counting algorithm comprises:
Determining, based on the identified user context, a first motion counting algorithm for a first user context and a second motion counting algorithm for a second user context.
Wherein the motion counting algorithm includes filtering the sensor data within a predetermined frequency range of the sensor data,
Wherein determining the motion counting algorithm comprises:
Determining a filtering setting for the sensor data within the predetermined frequency range based on the identified user context, determining a first filtering setting for a first user context, and determining a second filtering setting for a second user context.
Further comprising the step of obtaining the sensor data from an acceleration sensor or a gyro sensor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150148565A KR101733746B1 (en) | 2015-10-26 | 2015-10-26 | User context based motion counting method, sensor device and wearable device performing the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150148565A KR101733746B1 (en) | 2015-10-26 | 2015-10-26 | User context based motion counting method, sensor device and wearable device performing the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101733746B1 true KR101733746B1 (en) | 2017-05-08 |
Family
ID=60164277
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150148565A KR101733746B1 (en) | 2015-10-26 | 2015-10-26 | User context based motion counting method, sensor device and wearable device performing the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101733746B1 (en) |
- 2015-10-26: KR application KR1020150148565A filed; granted as patent KR101733746B1 (en), status: active (IP Right Grant)
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190047644A (en) * | 2017-10-27 | 2019-05-08 | 주식회사 뉴클리어스 | Method and wearable device for providing feedback on exercise |
KR20190047648A (en) * | 2017-10-27 | 2019-05-08 | 주식회사 뉴클리어스 | Method and wearable device for providing feedback on action |
KR102089002B1 (en) * | 2017-10-27 | 2020-03-13 | 김현우 | Method and wearable device for providing feedback on action |
KR102108180B1 (en) * | 2017-10-27 | 2020-05-08 | 김현우 | Method and wearable device for providing feedback on exercise |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102347067B1 (en) | Portable device for controlling external apparatus via gesture and operating method for same | |
US10222868B2 (en) | Wearable device and control method using gestures | |
KR102330889B1 (en) | Methods and devices that combine muscle activity sensor signals and inertial sensor signals for gesture-based control | |
CN107079103B (en) | Cloud platform control method, device and holder | |
EP2661663B1 (en) | Method and apparatus for tracking orientation of a user | |
EP2836892B1 (en) | Control of remote device based on gestures | |
JP6640249B2 (en) | Technique for input gesture control of wearable computing device based on fine movement | |
JP2018504682A5 (en) | ||
JP2010015535A (en) | Input device, control system, handheld device, and calibration method | |
WO2014130577A1 (en) | Systems and methods for activity recognition training | |
EP3289435B1 (en) | User interface control using impact gestures | |
US10877562B2 (en) | Motion detection system, motion detection method and computer-readable recording medium thereof | |
US11237632B2 (en) | Ring device having an antenna, a touch pad, and/or a charging pad to control a computing device based on user motions | |
KR101733746B1 (en) | User context based motion counting method, sensor device and wearable device performing the same | |
US20190103108A1 (en) | Input device, electronic device, system comprising the same and control method thereof | |
CN102024316B (en) | Wireless intelligent sensing method, device and system | |
CN113906371A (en) | Electronic device for providing exercise information according to exercise environment and method for operating the same | |
US11029753B2 (en) | Human computer interaction system and human computer interaction method | |
WO2015137014A1 (en) | Information input and output apparatus and information input and output method | |
JP2017191426A (en) | Input device, input control method, computer program, and storage medium | |
CN108491074B (en) | Electronic device, exercise assisting method and related product | |
KR101870542B1 (en) | Method and apparatus of recognizing a motion | |
US10156907B2 (en) | Device for analyzing the movement of a moving element and associated method | |
US11934588B1 (en) | Controller for sensing downward force applied to a movable thumbstick and providing a haptic response thereto, and methods of use thereof | |
WO2017061639A1 (en) | User context based motion counting method, sensor device and wearable device performing same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |