US20140278219A1 - System and Method For Monitoring Movements of a User - Google Patents
- Publication number
- US20140278219A1
- Authority
- US
- United States
- Prior art keywords
- user
- repetitive
- movement
- motion
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A63B24/0062—Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114—Tracking parts of the body
- A61B5/1123—Discriminating type of movement, e.g. walking or running
- A61B5/1124—Determining motor skills
- A61B5/486—Bio-feedback
- A61B5/681—Wristwatch-type devices
- A61B5/6831—Straps, bands or harnesses
- A61B5/7246—Details of waveform analysis using correlation, e.g. template matching or determination of similarity
- A61B5/742—Details of notification to user or communication with user or patient; user input means using visual displays
- A61B2503/10—Athletes
- A61B2505/09—Rehabilitation or training
Definitions
- the disclosure generally relates to the field of tracking user movements, and in particular to monitoring and quantifying repetitive and non-repetitive movements made by a user.
- Motion processing and wireless communication technology allows people to track things such as their sleeping patterns and the number of steps they take each day.
- motion capturing devices and functionality have not seen much success in the marketplace, in part because of limits on the functions that can be performed and the movements that can be monitored.
- Embodiments include a motion tracking system that monitors the motions performed by a user in real time, based on motion data received from one or more sensors.
- the motion tracking system may include a motion tracking device with one or more sensors, a smart device with one or more sensors and/or a server, for example.
- the user may wear and/or carry the motion tracking device or the smart device while performing motions.
- the motion data generated by one or more sensors is processed by a software application.
- the software application may be present on the smart device, the server, and/or the motion tracking device.
- the software application generates interpreted data based on the motion data and contextual data such as the equipment being used by the user.
- the interpreted data may include the performance of the user as the user performs a motion and/or feedback provided to the user during or after the user performs a motion or set of motions.
- the software application identifies the movement being performed by the user based on features present in one or more signals of the motion data.
- the software application may count repetitions and/or generate motion metrics associated with the performance of the user as the user performs a motion.
- the interpreted data is then provided to the user during and/or after the user has performed a motion or a set of motions.
- the feedback provided to the user may be visual, audio or tactile, for example.
- the software application monitors the user's movements, evaluating and keeping track of qualitative and quantitative metrics such as the current exercise being performed by the user, the number of repetitions performed by the user and the form of the user, in real time and/or after the user has performed the motion, a set of motions, multiple sets of motions and/or one or more routines.
- the user does not have to provide input to the application by interacting with the smart device or the motion tracking device.
- the user has the freedom to perform the workout at his/her own pace, without the interruption of periodically providing user input to the application via the smart device or the motion tracking device.
- FIG. 1 is a perspective view of a motion tracking system, according to one embodiment.
- FIG. 2 is a flowchart illustrating one implementation of the motion tracking system, according to one embodiment.
- FIG. 3 is a flowchart illustrating the motion tracking system monitoring user movements, according to one embodiment.
- FIG. 4 illustrates repeated and non-repeated movements present in the processed signal, according to one embodiment.
- FIG. 5 is a flowchart illustrating the motion tracking system identifying user movements based on motion data, according to one embodiment.
- FIG. 1 is a perspective view of a motion tracking system 100 , according to one embodiment.
- a user 23 wears a motion tracking device 24 while such user 23 is performing motions such as weight training, walking and cardiovascular movements and/or lifting objects.
- the motion tracking system 100 monitors the motion of a user in real time.
- the motion tracking device 24 includes a motion processing unit 5 which measures a repetitive movement 32 or a non-repetitive movement 33 performed by the user 23 .
- the motion processing unit 5 includes one or more sensors, such as an accelerometer 6 , a gyroscope 7 and/or a magnetometer 8 .
- the motion data 25 measured by the sensors and the motion processing unit 5 may be used to monitor the movements of a user in real time.
- the motion data 25 is transmitted to an auxiliary smart device 18 running a software application 19 .
- the application 19 analyzes the motion data 25 and generates interpreted data 28 to provide to the user 23 .
- the application 19 also provides the user 23 with feedback regarding the user's movements.
- the application 19 may analyze motion data 25 related to a user performing an exercise and provide feedback to the user 23 in real time.
- the feedback may include the quality of the form of the user's motion, recommendations for other exercises or the performance of the user.
- Motion data 25 is also, in one aspect, analyzed by the application 19 along with contextual data 26 .
- the contextual data 26 may be gathered from a number of sources, such as other application data on the smart device 18 (e.g., geographical location, time of day, etc.) or from capturing devices such as a camera or an RFID tag/reader 2 . Associating contextual data 26 with motion data 25 allows the application 19 on the auxiliary smart device 18 to provide additional information to the user related to the health, fitness or motions being performed by the user.
- the motion tracking device 24 houses a microcontroller 1 .
- Microcontroller 1 may be a small computer on a single integrated circuit containing a processor core, memory, and programmable input/output peripherals which manage multiple inputs and outputs that take place within the motion tracking device 24 .
- Microcontroller 1 may receive direct inputs from user input 11 to power the motion tracking device 24 on/off, to trigger data visualization sent to a display 10 and to turn down the volume on a speaker 13 .
- microcontroller 1 is coupled to other components via a single printed circuit board or flexible circuit board.
- the motion processing unit 5 is connected to the microcontroller 1 and a regulated power supply 17 .
- Motion processing unit 5 includes multiple sensors which measure user 23 's repetitive movements 32 and non-repetitive movements 33 .
- Each component within the motion processing unit 5 measures a type of motion.
- the accelerometer 6 detects changes in orientation and acceleration of the motion tracking device 24
- the gyroscope 7 measures the angular velocity
- the magnetometer 8 measures the strength and direction of magnetic fields.
- the sensors in the motion processing unit 5 allow the motion tracking device 24 to track the movements performed by the user 23 .
- once motion data 25 is recorded by the motion processing unit 5 , it may be sent to one or more locations.
- motion data 25 is sent from the motion processing unit 5 to the microcontroller 1 , where motion data 25 may be temporarily stored in an onboard memory 9 .
- motion data 25 along with the possible contextual data 26 , are sent to smart device 18 via a communications module 4 .
- motion data 25 may be sent directly to smart device 18 by the communications module 4 .
- Communications module 4 is, in one embodiment, a Bluetooth module, but could also include Wi-Fi, ZigBee or any other form of wireless communication, either in conjunction with or instead of Bluetooth.
- the communications module 4 is coupled to other components such as the microcontroller 1 and a regulated power supply 17 .
- the regulated power supply 17 regulates the power transferred to different components from a battery 16 .
- a recharge management 15 component acquires power from a USB input 12 and delivers it to the battery 16 .
- the recharge management 15 component acquires power from other forms of input and is not limited to acquiring power from the USB input 12 .
- Battery 16 may be, but is not limited to, a rechargeable or non-rechargeable lithium ion battery, a rechargeable or non-rechargeable nickel metal hydride battery, a rechargeable or a non-rechargeable alkaline battery.
- the battery 16 sends the power needed to the regulated power supply 17 . The regulated power supply then distributes power to all components which need it.
- the motion tracking device 24 may be powered using solar cells mounted on a surface of the motion tracking device 24 .
- the speaker 13 is connected to the microcontroller 1 and/or the regulated power supply 17 .
- the speaker 13 receives audio cues from microcontroller 1 . Sound from speaker 13 is emitted through one or more speaker ports 34 , which may be, but are not limited to, perforations located on the surface of the motion tracking device 24 .
- Microcontroller 1 may also use the vibrator 14 to send tactile cues to the user 23 . Vibrator 14 can be an off-axis motor which, when triggered by microcontroller 1 , creates a vibrating sensation for user 23 . Vibrator 14 is connected to microcontroller 1 and regulated power supply 17 ; power is pulled from battery 16 to power the component.
- the motion tracking device 24 is a wearable apparatus intended to be worn by the user 23 while performing repetitive movements 32 .
- Motion tracking device 24 may be wrapped around a limb or part of the user 23 's body using a strap band and a strap connector (not shown in FIG. 1 ).
- Motion tracking device 24 has a surface which may be intended to communicate and/or display data to the user 23 via components such as display 10 and speaker ports 34 .
- Display 10 is a visual screen that the user 23 can read. Functions pertaining to display 10 may be, but are not limited to, displaying interpreted data 28 , managing interpreted data 28 , displaying battery life and managing the settings installed on motion tracking device 24 such as the volume associated with speaker 13 .
- the display 10 may be, but is not limited to, an LED display, an LCD display, an electronic ink display, a plasma display or an ELD display, and may be, but is not limited to being, mounted on the surface of the motion tracking device 24 .
- the speaker port is a collection of perforations that emit audio cues given off by speaker 13 .
- the speaker port may be, but is not limited to being, located on the surface of the motion tracking device 24 ; it may also be located in other locations such as on a side wall of the motion tracking device 24 .
- User inputs 11 , for example buttons, protrude through the surface 36 of motion tracking device 24 .
- User inputs 11 may be located on any other exterior surface of motion tracking device 24 such as a side wall.
- Functions of user inputs 11 may be, but are not limited to, scrolling through interpreted data 28 on display 10 , turning motion tracking device 24 on/off, managing interpreted data 28 via display 10 , visualizing battery life, displaying notifications regarding motion tracking device 24 and managing volume levels of speaker 13 .
- Motion tracking device 24 is charged via a charging port.
- the charging port may be, but is not limited to being, located on the side wall of the motion tracking device 24 .
- the charging port may be a micro USB input 12 , a mini USB port, an audio input, or any other means of transferring power.
- the motion tracking device 24 may be, but is not limited to being, manufactured out of a flexible composite, so it may naturally convert from lying flat to being wrapped around a limb.
- motion tracking device 24 , including the strap bands and the surface of the motion tracking device 24 , is injection-molded out of a water-resistant silicone, capable of various ranges of motion without causing stress on the silicone or the internal components.
- the strap bands may be made of rugged textile, capable of various ranges of movement.
- the strap connectors 41 have contact surfaces which may be, but are not limited to, a Velcro™ adhesive, magnetic tape, a snapping mechanism or any combination thereof.
- the strap bands are embedded with magnets which then create the resulting connection between each strap band 40 .
- the motion tracking device 24 may be of various lengths and sizes, dependent on the part of the body from which motion data 25 is being recorded.
- strap bands 40 may be capable of stretching to meet the length requirements necessary to secure motion tracking device around user 23 via strap connector 41 .
- the motion tracking device 24 houses components for capturing contextual data 26 such as a camera or a RFID tag reader 2 .
- the camera captures images or videos of the environment the user is in or items the user is interacting with. For example, if a user is performing a curl, the camera may capture an image of the dumbbell being used by the user to perform the curl, as the user is performing the curl.
- the microcontroller 1 receives the image from the camera and sends the image to the software application 19 .
- the software application 19 may process the captured image (contextual data 26 ) and generate interpreted data 28 identifying the weight of the dumbbell being used by the user.
- an RFID tag reader 2 may capture an RFID tag associated with the dumbbell being used by the user to perform a curl.
- the microcontroller 1 receives the RFID tag identifier from the RFID tag reader 2 and sends the RFID tag to the software application 19 .
- the software application 19 may process the RFID tag (contextual data 26 ) and generate interpreted data 28 identifying the weight of the dumbbell being used by the user, as identified by the RFID tag.
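As an illustration only (not part of the disclosure), the tag-to-equipment lookup described above might be sketched as a simple mapping from tag identifier to equipment attributes; the tag identifiers, field names and weights below are hypothetical.

```python
# Hypothetical mapping from RFID tag identifiers to equipment attributes;
# in practice this table could live on the smart device or the server.
EQUIPMENT_BY_TAG = {
    "tag-00A1": {"type": "dumbbell", "weight_lb": 30},
    "tag-00A2": {"type": "dumbbell", "weight_lb": 35},
}

def contextual_data_for_tag(tag_id):
    """Return equipment attributes for a scanned tag, or None if unknown."""
    return EQUIPMENT_BY_TAG.get(tag_id)

info = contextual_data_for_tag("tag-00A1")
```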
- the user may input the contextual information via the software application 19 and/or the motion tracking device 24 .
- motion data 25 and contextual data 26 are sent to the software application 19 installed onto smart device 18 .
- Software application 19 interprets motion data 25 and contextual data 26 into interpreted data 28 .
- the interpreted data 28 may include the user 23 's movements, pauses in movement, collections of movements and any other contextual information related to the user's movements.
- Interpreted data 28 can also be interpretations of contextual data 26 , which can also, in one aspect, include estimates of the calories burned by the user during a given exercise reflected by a given set of motion data 25 using a piece of equipment identified by a given set of contextual data 26 .
- the interpreted data 28 includes the performance of the user during a set of motions and feedback provided to the user during and/or after the user performs a set of motions.
- the smart device 18 may be any device capable of accepting wireless data transfer such as a smartphone, tablet or a laptop. In one embodiment the smart device 18 has computing power sufficient to run the software application 19 .
- the software application 19 interacts with the smart device 18 through a smart device API.
- the software application 19 receives motion data 25 from the motion tracking device 24 by using the smart device API to interact with the communication module 4 .
- the software application 19 may be adapted to interact with a variety of smart device APIs. This allows the software application 19 to function on a variety of smart devices 18 , each having its own smart device API. Hence, the user is not restricted to a specific smart device 18 in order to use the application 19 .
- the software application 19 is hosted or installed on the motion tracking device 24 .
- the software application 19 may be executed by the processor on the microcontroller 1 .
- the analysis of the motion data 25 may be performed by the software application 19 on the motion tracking device 24 , independent of the smart device 18 or in combination with the smart device 18 and/or a remote processing device, e.g., server 21 .
- the software application may be installed on a device with at least one sensing component, such as a smartphone.
- the software application 19 in this embodiment may use motion data 25 provided by the sensors on the device to generate interpreted data 28 , and not on the pairing of the smart device 18 and the motion tracking device 24 .
- the application 19 installed on a smartphone 18 may use the motion data generated by the accelerometer 6 on the smartphone to determine the number of steps taken by the user as the user 23 walked from his/her house to work.
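For illustration, a smartphone-only step count of this kind could be sketched as upward crossings of the acceleration magnitude through a threshold; the 1.2 G threshold and the sample values are assumptions, not taken from the disclosure.

```python
import math

def count_steps(accel_xyz, threshold=1.2):
    """Count steps as upward crossings of the acceleration magnitude
    (in G) through `threshold`; 1.2 G is an assumed walking threshold."""
    steps = 0
    above = False
    for ax, ay, az in accel_xyz:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1       # rising edge: one footfall
            above = True
        elif mag <= threshold:
            above = False    # magnitude fell back below threshold
    return steps

# Two simulated footfalls: the magnitude rises above 1.2 G twice
samples = [(0, 0, 1.0), (0, 0, 1.4), (0, 0, 1.0), (0, 0, 1.5), (0, 0, 1.0)]
steps = count_steps(samples)
```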
- the motion tracking system 100 is not restricted to the coupling of a smart device 18 and a motion tracking device 24 , and can be performed in any number of steps with any number of devices involving the transfer of motion data 25 to the software application 19 .
- sensor information from multiple devices e.g., smart device 18 and motion tracking device 24 , can be used by software application 19 .
- the interpreted data 28 is sent from the smart device 18 to a remote processing device (cloud based device and system), e.g., server 21 via a wireless data transfer or a network.
- server 21 will be used in this description, but any remote (e.g., cloud-based) processing device, including multiple devices such as a remote database, storage, memory and processor(s), can be used.
- the server 21 can be any remote processing device.
- the server 21 attaches/correlates the interpreted data 28 to a user profile 29 .
- the user 23 may then review, access and/or visualize the interpreted data 28 history associated with their user profile 29 via any device capable of wireless data transfer such as, without limitation, a smart phone, a tablet, a motion tracking device, or a computer, using a dedicated software application or a web browser to display the interpreted data 28 .
- the interpreted data 28 is also relayed back to the user 23 through software application 19 installed on the smart device 18 or on the motion tracking device 24 .
- Interpreted data 28 may be displayed by the software application 19 for the user 23 to see during and/or following the user 23 performing movements. Feedback regarding the interpreted data 28 may be provided to the user 23 in real time in a number of ways.
- visual feedback is provided to the user 23 either on the smart device 18 or on the display 10 of the motion tracking device 24 .
- audio feedback is provided to the user through speakers on the smart device 18 or on the motion tracking device 24 .
- Tactile feedback may also be provided to the user through the vibrator 14 .
- the software application 19 may be stored on the server 21 .
- the software application 19 on the server 21 may analyze the motion data 25 sent to the server and generate the interpreted data 28 to associate with a user profile 29 .
- the smart device 18 may send motion data 25 received from the motion tracking device 24 to the server 21 for processing.
- the processing of the motion data 25 by the software application is not limited to taking place on the smart device 18 or on the motion tracking device 24 .
- the software application 19 or other code stored on the smart device 18 or the motion tracking device 24 may regulate the power consumed by the sensors by turning on or off one or more sensors.
- the sensors are turned off when the user has not activated or moved the device 24 .
- one or more sensors are turned on or off for particular movements performed by the user.
- FIG. 2 is a flowchart illustrating one implementation of the motion tracking system 100 , according to one embodiment.
- the user is using the motion tracking system 100 as an artificial aid and a monitoring unit while performing a fitness routine.
- the user activates 205 the motion tracking device 24 or the application 19 on the motion tracking device 24 by either pressing user input 11 or moving the motion tracking device 24 .
- the application 19 on the motion tracking device 24 identifies that the user has activated the device based on motion data 25 received from the sensors.
- the user then begins the fitness routine by either following a routine suggested by the application 19 or by following a routine the user would like to perform.
- the routine suggested by the application 19 may include 3 sets of hammer curls using 30 pound dumbbells with a rest period of 60 seconds between each set, followed by 4 sets of 20 crunches with a rest period of 30 seconds between each set.
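A routine like the one in this example could be represented as simple structured data; the field names below are illustrative, not taken from the disclosure (the example does not specify repetitions for the curls, so that field is left unset).

```python
# Illustrative representation of the suggested fitness routine above;
# field names are hypothetical, not from the disclosure.
routine = [
    {"exercise": "hammer curl", "sets": 3, "reps_per_set": None,
     "weight_lb": 30, "rest_between_sets_s": 60},
    {"exercise": "crunch", "sets": 4, "reps_per_set": 20,
     "rest_between_sets_s": 30},
]

total_sets = sum(item["sets"] for item in routine)
```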
- the application 19 monitors a number of characteristics related to the movements performed by the user based on the motion data 25 . For example, the application 19 determines and monitors 215 the type of exercise being performed by the user, the quality of the user's form as the user is performing the exercise and/or the number of counts or repetitions performed by the user. In one embodiment, the application 19 suggests and monitors 215 the rest time observed by the user in between sets of exercises as the user goes through the fitness routine.
- the application 19 may also provide feedback 220 to the user in real-time as the user performs the fitness routine.
- the vibrator 14 on the motion tracking device 24 may vibrate, notifying the user of bad form as the user is performing a curl.
- the feedback includes charts and tables displayed on the display 10 of the motion tracking device 24 describing the performance of the user through the fitness routine.
- the application 19 sends 225 the interpreted data 28 and performance data to the server 21 .
- the performance data may include statistics describing the performance of the user throughout the fitness routine, or quantitative metrics (e.g., percentage of routine completed, goals reached, repetitions of each exercise, etc) evaluating the fitness routine performed by the user.
- the server 21 then associates or attaches 230 the performance data and/or the interpreted data 28 to the user's user profile 29 .
- FIG. 3 is a flowchart illustrating the motion tracking system 100 monitoring user movements, according to one embodiment.
- the application 19 monitors the movements made by the user based on the raw real time motion data 25 obtained 305 from the sensors.
- the sensors generate raw real time motion data 25 based on the movements of the user.
- the accelerometer 6 generates acceleration data and change in acceleration data based on the relative movement of the device 18 , 24 on the user.
- the application 19 processes 310 the real time motion data obtained 305 from one or more of the sensors or the motion tracking device 24 .
- Processing 310 the raw real time data or signal removes the noise and other irrelevant features carried by the signal.
- a low pass filter is used to filter out the noise in the raw signal obtained 305 from the sensors.
- a moving average filter is used to filter out the noise in the raw signal obtained 305 from the sensors. It is understood that other filters can be used to increase the signal-to-noise ratio of the raw signals.
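A minimal sketch of the moving-average filtering mentioned above, assuming a small fixed window; the window width and the toy signal are illustrative, as the disclosure does not fix filter parameters.

```python
def moving_average(signal, window=5):
    """Smooth a raw sensor signal with a simple moving-average filter.
    `window` is a hypothetical width; edges use a truncated window."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

# Toy raw accelerometer trace: high-frequency noise alternating 0/1
raw = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
smooth = moving_average(raw, window=3)
```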
- the application 19 determines 315 a classification of the movement performed by the user based on one or more processed real time signals. Classifying the movement is important because subsequent analysis depends on knowing which movement is being performed. For example, the application 19 first determines 315 that the user is performing a curl before identifying the characteristics associated with the user performing the curl, such as the form of the user's movements with respect to that of a correct curl movement.
- a classification algorithm may be a machine learning algorithm, a pattern recognition algorithm, a template matching algorithm, a statistical inference algorithm, and/or an artificial intelligence algorithm that operates based on a learning model.
- Examples of such learning models include k-Nearest Neighbor (kNN), Support Vector Machines (SVM), Artificial Neural Networks (ANN) and Decision Trees.
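As an illustration only (not part of the disclosure), the classification step could be sketched with a k-Nearest Neighbor model over feature vectors extracted from the processed signals; the two features, training values and class labels below are hypothetical.

```python
import math
from collections import Counter

def knn_classify(sample, train, k=3):
    """Label a feature vector by majority vote among its k nearest
    training examples (Euclidean distance)."""
    dists = sorted((math.dist(sample, x), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical features per signal window: (mean |accel|, std |accel|)
train = [
    ((0.10, 0.05), "walking"), ((0.12, 0.06), "walking"),
    ((0.90, 0.40), "curl"), ((0.85, 0.45), "curl"), ((0.95, 0.38), "curl"),
]
label = knn_classify((0.88, 0.42), train)
```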
- the application 19 quantifies 320 characteristics of the movement being performed by a user such as the count of the number of repetitive movements made by the user to determine the repetitions of a movement performed by a user. For example, the application 19 determines the number of times a user has performed a curl during a given set of curls, based on the number of repetitive movements (that have been classified as a curl) performed by the user. In one embodiment, the application 19 determines the number of real peaks present in a rolling window of one or more signals.
- a real peak may be determined based on the amplitude of the peak relative to the whole signal and/or other contextual information such as the expected pattern of peaks or duration of peaks for the classified or identified movement being performed by the user.
- the application 19 may have identified that the user is performing a curl. Based on this information, real peaks may be known to appear in the rolling window of the z-axis of the accelerometer 6 signal above an amplitude of 0.6 G and over a time period n as a user is performing a curl. Similarly real peaks may be known to appear in the rolling window of the y-axis of the gyroscope 7 signal above an amplitude of 0.5 radians/sec and over a period of 2n as a user is performing a curl.
- the application 19 may count the number of real peaks present in the z-axis accelerometer 6 signal as 1 per time period of n, and those present in the y-axis gyroscope 7 signal as 2 per period of 2n, thereby counting the number of curls performed by the user.
- the application 19 may quantify 320 other characteristics of the movement being performed by the user such as the speed of the movement being performed by the user.
- the application 19 may determine the time period over which a peak, valley or other morphological feature in one or more signals occurs to determine the rate at which each repetitive movement is performed by the user. Longer time periods may correspond to slower movement speeds, and shorter time periods may correspond to faster movement speeds.
- the application 19 may thus quantify 320 a number of characteristics associated with the movements performed by a user based on the morphological features present in one or more signals.
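A minimal sketch of the real-peak counting described above, assuming a fixed amplitude threshold (the 0.6 G figure echoes the curl example) and a minimum sample gap standing in for the expected peak duration; both parameters and the simulated signal are illustrative.

```python
import math

def count_real_peaks(signal, threshold=0.6, min_gap=10):
    """Count local maxima above `threshold` (in G) that are at least
    `min_gap` samples apart, a proxy for the expected peak duration."""
    count, last_peak = 0, -min_gap
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1]
                and i - last_peak >= min_gap):
            count += 1
            last_peak = i
    return count

# Five simulated curl repetitions on a z-axis accelerometer signal:
# positive half-sine humps peaking at 0.8 G, one hump per 100 samples
z_accel = [max(0.0, 0.8 * math.sin(2 * math.pi * i / 100)) for i in range(500)]
reps = count_real_peaks(z_accel)
```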
- FIG. 4 illustrates repeated 405 and non-repeated 410 movements present in the processed signal, according to one embodiment.
- the application 19 counts the repetitive movements 405 performed by the user during a fitness routine. For example, if the repeated movements 405 were that of curls performed by the user, the application 19 would determine that the user performed 5 repeated movements or 5 curls.
- the application 19 differentiates between the non-repeated movements 410 represented by a portion of the processed signal and the repeated movements 405 represented by a different portion of the processed signal.
- the application 19 identifies groups of repeated movements 405 performed by a user.
- the fitness routine suggested by the application may include the user receiving instructions to perform 3 sets of 5 curls with a rest time of 30 seconds between each set.
- based on the processed real time signal, the application 19 first identifies and classifies the user's movements as curls. The application 19 then detects that the user is performing the first set of curls, based on the user performing repetitive curl movements 405.
- After the application 19 has recorded group 1 (415), comprising 5 curls, the application 19 also monitors the transition time 1 (430), or the rest time, represented by the non-repeated movements 410 between groups 1 (415) and 2 (420).
- the application 19 then monitors group 2 (420), comprising 5 curls, and the transition time 2 (435) between group 2 (420) and group 3 (425).
- the application 19 identifies that the user has finished the 3 sets of curls once the application has finished monitoring group 3 ( 425 ), the last set of curls performed by the user.
- the application 19 monitors the fitness routine followed by the user based on the processed real time signal, representing the movements performed by a user.
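The grouping of repetitions into sets separated by rest periods, as illustrated in FIG. 4, can be sketched as follows. The function name and the 20-second rest gap are hypothetical; the patent does not specify this logic.

```python
def group_into_sets(rep_times, rest_gap):
    """Group repetition timestamps (seconds) into sets; a gap of at
    least `rest_gap` seconds between reps starts a new set."""
    sets = []
    for t in rep_times:
        if sets and t - sets[-1][-1] < rest_gap:
            sets[-1].append(t)   # continue the current set
        else:
            sets.append([t])     # rest period detected: start a new set
    return sets

# 3 sets of 5 curls with ~30 s rest (transition time) between sets:
times = [0, 2, 4, 6, 8, 40, 42, 44, 46, 48, 80, 82, 84, 86, 88]
sets = group_into_sets(times, rest_gap=20)  # 3 groups of 5 reps each
```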
- FIG. 5 is a flowchart illustrating the motion tracking system 100 identifying user movements based on motion data 25 , according to one embodiment.
- the application 19 determines the movement performed by the user based on one or more processed real time signals.
- the application 19 extracts 505 a set of statistical or morphological features present in a rolling window of one or more processed signals. Features may include amplitude, mean value, variance, standard deviation of the signal, the number of valleys and/or peaks, the order of valleys and/or peaks, the amplitude of valleys and/or peaks, the frequency of valleys and/or peaks and/or the time period of valleys and/or peaks in one or more processed signals.
- the z-axis of the accelerometer 6 might record a repetitive pattern of a single peak over a time period n, followed by a single valley also over a time period n.
- the y-axis of the accelerometer 6 might record a repetitive pattern of a valley between 2 peaks during the same time period n.
- the extracted features are used by the classification algorithm to detect the movement being performed by the user.
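The feature extraction step 505 can be sketched as follows; this is an illustrative sketch, assuming simple local-extrema definitions of peaks and valleys, and is not the patent's implementation.

```python
import statistics

def extract_features(window):
    """Extract some of the statistical/morphological features named
    above from a rolling window of one processed signal."""
    peaks = [i for i in range(1, len(window) - 1)
             if window[i] > window[i - 1] and window[i] >= window[i + 1]]
    valleys = [i for i in range(1, len(window) - 1)
               if window[i] < window[i - 1] and window[i] <= window[i + 1]]
    return {
        "mean": statistics.mean(window),
        "stdev": statistics.pstdev(window),
        "amplitude": max(window) - min(window),
        "num_peaks": len(peaks),
        "num_valleys": len(valleys),
    }

# A repetitive pattern: peak, valley, peak (e.g., y-axis accelerometer):
features = extract_features([0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0])
```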
- the application 19 applies a template matching algorithm to identify repetitive features (such as peaks and valleys in the signal) in a rolling window of one or more processed signals.
- the application 19 compares the repetitive features in the rolling window of one or more processed signals to a set of movement templates 27 stored in a movement template database 22 on the smart device 18 or the motion tracking device 24 . Based on the comparison, the application 19 then selects the movement templates 27 in the database 22 having repetitive features most similar to or most closely matching those present in one or more processed signals. Based on one or more or a combination of the selected templates the application 19 identifies and classifies 515 the movement being performed by the user.
- the application 19 compares the repetitive features present in the z-axis of the accelerometer 6 signal and the y-axis of the gyroscope 7 as a user is performing a curl, with the movement templates 27 stored in the movement template database 22 .
- the application 19 selects a z-axis acceleration signal movement template 27 and a y-axis gyroscope 7 signal movement template 27 similar to that of the recorded signals.
- the application 19 identifies 515 that the user is performing a curl, as the two movement templates 27 selected by the application 19 are known to be associated with a curl movement.
- One example of a template matching algorithm is the cross-correlation algorithm. Another example is dynamic time warping.
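A minimal sketch of template matching by normalized cross-correlation, the first example algorithm named above. The curl template values and signal are fabricated for illustration; a movement template 27 would in practice come from the movement template database 22.

```python
import math

def normalized_cross_correlation(signal, template):
    """Slide the template over the signal and return the best
    normalized cross-correlation score and its offset."""
    n = len(template)
    t_mean = sum(template) / n
    t_dev = [t - t_mean for t in template]
    t_norm = math.sqrt(sum(d * d for d in t_dev))
    best_score, best_offset = -1.0, 0
    for offset in range(len(signal) - n + 1):
        seg = signal[offset:offset + n]
        s_mean = sum(seg) / n
        s_dev = [s - s_mean for s in seg]
        s_norm = math.sqrt(sum(d * d for d in s_dev))
        if s_norm == 0 or t_norm == 0:
            continue
        score = sum(a * b for a, b in zip(s_dev, t_dev)) / (s_norm * t_norm)
        if score > best_score:
            best_score, best_offset = score, offset
    return best_score, best_offset

# Hypothetical stored template for one curl repetition, and a recorded
# signal containing the same shape shifted and offset:
curl_template = [0.0, 0.5, 1.0, 0.5, 0.0]
signal = [0.0, 0.0, 0.1, 0.6, 1.1, 0.6, 0.1, 0.0]
score, offset = normalized_cross_correlation(signal, curl_template)
```

Because the score is normalized, a match is detected even though the recorded signal is shifted by a constant offset relative to the stored template.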
- the application 19 guides the user through a fitness routine. As the application 19 is guiding the user through the fitness routine, the application 19 is aware of the repetitive movement being performed by the user. Thus, the application 19 may verify the movement identified by the application 19 based on the recorded motion data 25 against the movement the application 19 expects the user to perform based on the fitness routine. In a second embodiment, as the application 19 is aware of the movement being performed by the user, the application 19 no longer needs to identify the movement being performed by the user based on the motion data, and hence begins to count the repetitive features present in one or more processed signals to determine the repetitions performed by the user.
- the application 19 may compare the recorded motion data to a subset of movement templates 27 in the movement template database 22, wherein the subset of movement templates 27 represents templates related to the movements the application 19 expects the user to perform. For example, if the application 19 is aware that the user is currently performing a curl as part of a fitness routine, the application 19 would compare the recorded motion data 25 with movement templates 27 associated with the curl classification of movements.
- the application 19 determines 510 the statistical characteristics, such as a mean or standard deviation, associated with the repetitive features in one or more signals. For example, the application 19 may determine 510 the mean of the amplitude of the peaks recorded in one or more signals while the user is performing curls. If the mean is found to be relatively greater than the expected threshold for characterizing real peaks, the application 19 may raise the weight for the next set of curls suggested to the user, as a relatively higher mean implies that the user was able to perform the current curls more easily (at a faster rate) than expected. In another embodiment, the application may determine the standard deviation of the amplitude and frequency of the peaks recorded in one or more signals while the user is performing curls.
- If the standard deviation is found to be outside of an expected range of standard deviation values for a curl action, it is possible that even though the pattern of features may have been identified as matching a curl, the user may not actually be performing a curl, but may be performing a different motion similar to a curl.
- the statistical characteristics of the features in one or more signals thus provide additional information for identifying 515 the movement performed by the user.
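A sketch of using a statistical characteristic (here the standard deviation of peak amplitudes) to confirm or reject a tentative classification, as described above. The function name and the expected range are hypothetical assumptions, not values from the patent.

```python
import statistics

def validate_classification(peak_amplitudes, expected_stdev_range):
    """Accept a tentative movement classification (e.g., 'curl') only if
    the spread of the detected peak amplitudes falls inside the expected
    range for that movement."""
    stdev = statistics.pstdev(peak_amplitudes)
    low, high = expected_stdev_range
    return low <= stdev <= high

# Consistent curl peaks pass; an erratic pattern is rejected even though
# its peak/valley pattern may have matched a curl template:
ok = validate_classification([0.7, 0.72, 0.69, 0.71], (0.0, 0.1))
bad = validate_classification([0.3, 1.4, 0.2, 1.6], (0.0, 0.1))
```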
- the application 19 uses machine learning algorithms to detect movements, and/or classify or identify 515 movements performed by a user in a rolling window of one or more signals based on the repetitive features or morphological features present in one or more signals.
- An example of a recognition and learning algorithm that may be used is described in Ling Bao et al., "Activity Recognition from User-Annotated Acceleration Data", which is incorporated by reference herein in its entirety.
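As a stand-in for the machine learning approaches referenced above, a minimal nearest-centroid classifier over extracted feature vectors is sketched below. The cited work uses more sophisticated recognition algorithms; the feature values and centroids here are fabricated for illustration.

```python
def nearest_centroid_classify(feature_vector, centroids):
    """Classify a window's feature vector by the closest movement
    centroid (squared Euclidean distance) -- a minimal stand-in for
    learned activity-recognition models."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(feature_vector, centroids[label]))

# Hypothetical feature vectors: (mean peak amplitude, peaks per window)
centroids = {"curl": (0.7, 2.0), "press": (1.2, 1.0)}
label = nearest_centroid_classify((0.65, 2.2), centroids)  # -> "curl"
```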
- the user may edit the interpreted data 28 generated by the motion tracking system 100 .
- the user may edit the number of repetitive movements or the count associated with a motion or exercise performed by the user as determined by the motion tracking system 100 .
- the edits performed by the user along with the motion data associated with the motion performed by the user are analyzed by the motion tracking system 100 .
- the motion tracking system 100 modifies the classification algorithm, the template matching algorithm, or the machine learning algorithm used to generate the interpreted data 28 .
- the algorithms are modified to capture the discrepancies in the interpreted data generated prior to the user editing the interpreted data, and thereby cater to the behavior and unique movements and tendencies associated with the user.
- the motion tracking system 100 determines that the user performed 11 counts of a repetitive motion.
- the user may edit the count from 11 to 10.
- the motion tracking system 100 identifies the discrepancy between the features of the motion data associated with the repetitive movement and one or more algorithms used to classify the repetitive movement and quantify the repetitive movement. For example, based on the template matching algorithm, the motion tracking system 100 may have classified a feature similar to that associated with the repetitive movement as a repetitive movement performed by the user. Based on the edit made by the user, the motion tracking system 100 modifies the applied algorithm to no longer classify and quantify the feature similar to that associated with the repetitive movement.
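The modification described above, where a user's correction of the repetition count (e.g., from 11 to 10) adjusts the quantification algorithm, can be sketched as a threshold update. The function name and amplitude values are hypothetical; the patent does not specify how the algorithm is modified.

```python
def adjust_peak_threshold(peak_amplitudes, detected_count, corrected_count,
                          threshold):
    """Raise the peak amplitude threshold just enough that only
    `corrected_count` of the originally detected peaks still qualify."""
    if corrected_count >= detected_count:
        return threshold  # nothing to reject
    ranked = sorted(peak_amplitudes, reverse=True)
    # Midpoint between the weakest kept peak and the strongest rejected one:
    return (ranked[corrected_count - 1] + ranked[corrected_count]) / 2

# 11 detected peaks; the user corrects the count to 10, so the spurious
# low-amplitude peak (0.45) should fall below the new threshold:
amps = [0.8, 0.75, 0.82, 0.78, 0.79, 0.81, 0.77, 0.76, 0.8, 0.74, 0.45]
new_threshold = adjust_peak_threshold(amps, 11, 10, threshold=0.4)
```

Future windows counted with `new_threshold` would no longer classify the 0.45 G feature as a repetition, capturing the user's unique movement tendencies.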
- the motion tracking system 100 receives an edit from the user correcting the user's heart rate measured as the user was performing a motion.
- the motion tracking system 100 may retrieve the edit made by the user and the motion data associated with the edit (e.g., motion data associated with the user using a cardio fitness machine as the motion tracking system 100 measured the user's heart rate).
- the motion tracking system 100 may modify the algorithm applied to quantify the user's heart rate (based on the heart rate data received).
- the motion tracking system 100 may then apply the modified algorithm when the user performs the motion or similar motion data is received in the future.
- the motion tracking system 100 groups the edit information received from a number of users based on one or more characteristics associated with the users such as data associated with the physical measurements of a user and/or the type of edits made by the user to the interpreted data 28 generated by the motion tracking system. Based on the types of edits made by a user and the grouping associated with the type of edits, the motion tracking system 100 may modify the algorithms applied to classify and quantify repetitive motions performed by the user. For example, users of a certain height range may perform particular motions prior to beginning a set of pull ups. These motions may accidentally be classified and quantified as a count or pull up performed by the user. The motion tracking system 100 may modify the algorithms classifying and quantifying pull ups performed by the user based on the height information associated with the user.
- Certain aspects of the embodiments include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the embodiments can be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by a variety of operating systems. The embodiments can also be embodied in a computer program product which can be executed on a computing system.
- the embodiments also relate to an apparatus for performing the operations herein.
- This apparatus may be specially constructed for the required purposes, e.g., a specific computer, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
- a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
- Memory can include any of the above and/or other devices that can store information/data/programs and can be a transient or non-transient medium, where a non-transient or non-transitory medium can include memory/storage that stores information for more than a minimal duration.
- the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
Abstract
A motion tracking system monitors the motions performed by a user based on motion data received from one or more sensors. The motion tracking system may include a motion tracking device with one or more sensors, a smart device with one or more sensors and/or a server. As the user interacts with the motion tracking system or smart device the motion data generated by one or more sensors is processed by a software application. The software application may be present on the smart device, the server, and/or the motion tracking device. The software application generates interpreted data based on the motion data and contextual data such as the equipment being used by the user. The interpreted data is then provided to the user during and/or after the user has performed a motion or a set of motions. The feedback provided to the user may be visual, audio or tactile.
Description
- This application claims the benefit of U.S. Provisional Patent Application No. 61/792,601, filed on Mar. 15, 2013, U.S. Provisional Patent Application No. 61/873,339, filed on Sep. 3, 2013, and U.S. Provisional Patent Application No. 61/873,347, filed on Sep. 3, 2013, which are all incorporated by reference herein in their entirety.
- This application is related to U.S. patent application titled “System and Method for Identifying and Interpreting Repetitive Motions”, filed on Mar. 14, 2014, the contents of which are hereby incorporated by reference.
- The disclosure generally relates to the field of tracking user movements, and in particular to monitoring and quantifying repetitive and non-repetitive movements made by a user.
- Motion processing and wireless communication technology allows people to track things such as their sleeping patterns and the number of steps they walk each day. However, motion capturing devices and functionality have not seen much success in the marketplace because of limits in, for example, the functions that can be performed and the movements that can be monitored.
- Embodiments include a motion tracking system that monitors the motions performed by a user in real time, based on motion data received from one or more sensors. The motion tracking system may include a motion tracking device with one or more sensors, a smart device with one or more sensors and/or a server, for example. The user may wear the motion tracking device and/or carry the motion tracking device or the smart device while performing motions. As the user interacts with the motion tracking system or smart device, the motion data generated by one or more sensors is processed by a software application. The software application may be present on the smart device, the server, and/or the motion tracking device.
- The software application generates interpreted data based on the motion data and contextual data, such as the equipment being used by the user. The interpreted data may include the performance of the user as the user performs a motion and/or feedback provided to the user during or after the user performs a motion or set of motions. The software application identifies the movement being performed by the user based on features present in one or more signals of the motion data. The software application may count and/or generate motion metrics associated with the performance of the user as the user performs a motion. The interpreted data is then provided to the user during and/or after the user has performed a motion or a set of motions. The feedback provided to the user may be visual, audio or tactile, for example.
- The software application monitors the user's movements, evaluating and keeping track of qualitative and quantitative metrics such as the current exercise being performed by the user, the number of repetitions performed by the user and the form of the user, in real time and/or after the user has performed a motion, a set of motions, multiple sets of motions and/or one or more routines. Thus, the user does not have to provide input to the application by interacting with the smart device or the motion tracking device. Hence, the user has the freedom to perform the workout at his/her own pace, without the interruption of periodically providing user input to the application via the smart device or the motion tracking device.
- The features and advantages described in the specification are not all inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.
- The drawings presented herein are for the purposes of illustration; the embodiments are not limited to the precise arrangements and instrumentalities shown.
- FIG. 1 is a perspective view of a motion tracking system, according to one embodiment.
- FIG. 2 is a flowchart illustrating one implementation of the motion tracking system, according to one embodiment.
- FIG. 3 is a flowchart illustrating the motion tracking system monitoring user movements, according to one embodiment.
- FIG. 4 illustrates repeated and non-repeated movements present in the processed signal, according to one embodiment.
- FIG. 5 is a flowchart illustrating the motion tracking system identifying user movements based on motion data, according to one embodiment.
- The figures depict various embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the embodiments described herein.
- Embodiments are now described with reference to the figures where like reference numbers indicate identical or functionally similar elements. Also in the figures, the left most digit(s) of each reference number corresponds to the figure in which the reference number is first used.
-
FIG. 1 is a perspective view of a motion tracking system 100, according to one embodiment. In one aspect of an embodiment, as discussed in detail with reference to the figures below, a user 23 wears a motion tracking device 24 while such user 23 is performing motions such as weight training, walking and cardiovascular movements and/or lifting objects. The motion tracking system 100 monitors the motion of a user in real time. In one embodiment, the motion tracking device 24 includes a motion processing unit 5 which measures a repetitive movement 32 or a non-repetitive movement 33 performed by the user 23. The motion processing unit 5 includes one or more sensors, such as an accelerometer 6, a gyroscope 7 and/or a magnetometer 8. The motion data 25 measured by the sensors and the motion processing unit 5 may be used to monitor the movements of a user in real time. - The
motion data 25 is transmitted to an auxiliary smart device 18 running a software application 19. The application 19 analyzes the motion data 25 and generates interpreted data 28 to provide to the user 23. The application 19 also provides the user 23 with feedback regarding the user's movements. For example, the application 19 may analyze motion data 25 related to a user performing an exercise and provide feedback to the user 23 in real time. The feedback may include the quality of the form of the user's motion, recommendations for other exercises or the performance of the user. Motion data 25 is also, in one aspect, analyzed by the application 19 along with contextual data 26. The contextual data 26 may be gathered from a number of sources such as other application data on the smart device 18 (e.g., geographical location, time of day, etc.) or from capturing devices such as a camera or an RFID tag/reader 2. Associating contextual data 26 with motion data 25 allows the application 19 on the auxiliary smart device 18 to provide additional information to the user related to the health, fitness or motions being performed by the user. - In one embodiment the
motion tracking device 24 houses a microcontroller 1. Microcontroller 1 may be a small computer on a single integrated circuit containing a processor core, memory, and programmable input/output peripherals which manage multiple inputs and outputs that take place within the motion tracking device 24. Microcontroller 1 may receive direct inputs from user input 11 to power the motion tracking device 24 on/off, to trigger data visualization sent to a display 10 and to turn down the volume on a speaker 13. In one embodiment, microcontroller 1 is coupled to other components via a single printed circuit board or flexible circuit board. - In one embodiment the
motion processing unit 5 is connected to the microcontroller 1 and a regulated power supply 17. Motion processing unit 5 includes multiple sensors which measure user 23's repetitive movements 32 and non-repetitive movements 33. Each component within the motion processing unit 5 measures a type of motion. For example, the accelerometer 6 detects changes in orientation and acceleration of the motion tracking device 24, the gyroscope 7 measures the angular velocity and the magnetometer 8 measures the strength and direction of magnetic fields. Hence, the sensors in the motion processing unit 5 allow the motion tracking device 24 to track the movements performed by the user 23. When motion data 25 is recorded by the motion processing unit 5, it may be sent to one or more locations. In one aspect of the present disclosure, motion data 25 is sent from the motion processing unit 5 to the microcontroller 1, where motion data 25 may be temporarily stored in an onboard memory 9. In one embodiment, motion data 25, along with the possible contextual data 26, are sent to smart device 18 via a communications module 4. - In one aspect of the present disclosure,
motion data 25 may be sent directly to smart device 18 by the communications module 4. Communications module 4 is, in one embodiment, a Bluetooth module, but could also include Wi-Fi, ZigBee, or any other form of wireless communication, either in conjunction with or instead of Bluetooth. The communications module 4 is coupled to other components such as the microcontroller 1 and a regulated power supply 17. The regulated power supply 17 regulates the power transferred to different components from a battery 16. - In one embodiment, a
recharge management 15 component acquires power from a USB input 12 and delivers it to the battery 16. In another embodiment, the recharge management 15 component acquires power from other forms of input and is not limited to acquiring power from the USB input 12. Battery 16 may be, but is not limited to, a rechargeable or non-rechargeable lithium ion battery, a rechargeable or non-rechargeable nickel metal hydride battery, or a rechargeable or non-rechargeable alkaline battery. In one embodiment, the battery 16 sends the power needed to the regulated power supply 17. The regulated power supply then distributes power to all components which need it. These components include but are not limited to the microcontroller 1, communications module 4, motion processing unit 5, memory 9, display 10, speaker 13 and a vibrator 14. In one aspect of the present disclosure, the motion tracking device 24 may be powered using solar cells mounted on a surface of the motion tracking device 24. - In one embodiment the
speaker 13 is connected to the microcontroller 1 and/or the regulated power supply 17. The speaker 13 receives audio cues from microcontroller 1. Sound from speaker 13 is emitted through one or more speaker ports. Speaker ports 34 may be, but are not limited to, perforations located on the surface of the motion tracking device 24. Microcontroller 1 may also use the vibrator 14 to send tactile cues to the user 23. Vibrator 14 can be an off-axis motor which, when triggered by microcontroller 1, creates a vibrating sensation for user 23. Vibrator 14 is connected to microcontroller 1 and regulated power supply 17; power is pulled from battery 16 to power the component. - In one embodiment, the
motion tracking device 24 is a wearable apparatus intended to be worn by the user 23 while performing repetitive movements 32. Motion tracking device 24 may be wrapped around a limb or part of the user 23's body using a strap band and a strap connector (not shown in FIG. 1). Motion tracking device 24 has a surface which may be intended to communicate and/or display data to the user 23 via components such as display 10 and speaker ports 34. Display 10 is a visual screen that the user 23 can read. Functions pertaining to display 10 may be, but are not limited to, displaying interpreted data 28, managing interpreted data 28, displaying battery life and managing the settings installed on motion tracking device 24 such as the volume associated with speaker 13. The display 10 may be, but is not limited to, an LED display, an LCD display, an electronic ink display, a plasma display or an ELD display, and may be, but is not limited to being, mounted on the surface of the motion tracking device 24. The speaker port is a collection of perforations that emit audio cues given off by speaker 13. The speaker port may be, but is not limited to being, located on the surface of the motion tracking device 24; it may be located in other locations such as on a side wall of the motion tracking device 24. User inputs 11, for example buttons, protrude through the surface 36 of motion tracking device 24. User inputs 11 may be located on any other exterior surface of motion tracking device 24, such as a side wall. Functions of user inputs 11 may be, but are not limited to, scrolling through interpreted data 28 on display 10, turning motion tracking device 24 on/off, managing interpreted data 28 via display 10, visualizing battery life, displaying notifications regarding motion tracking device 24 and managing volume levels of speaker 13. Motion tracking device 24 is charged via a charging port. The charging port may be, but is not limited to being, located on the side wall of the motion tracking device 24.
The charging port may be a micro USB input 12, a mini USB port, an audio input, or any other means of transferring power. - The
motion tracking device 24 may be, but is not limited to being, manufactured out of a flexible composite, so it may naturally convert from laid out flat to wrapped around a limb. In one aspect, motion tracking device 24, including the strap bands and the surface of the motion tracking device 24, is injection-molded out of a water resistant silicone, capable of various ranges of motion without causing stress on the silicone or the internal components. According to one aspect of the present disclosure, the strap bands may be made of rugged textile, capable of various ranges of movement. The strap connectors 41 have contact surfaces which may be, but are not limited to, a Velcro™ adhesive, magnetic tape, a snapping mechanism or any other components thereof. In one aspect of the present disclosure, the strap bands are embedded with magnets which then create the resulting connection between each strap band 40. - The
motion tracking device 24 may be of various lengths and sizes, dependent on the part of the body from which motion data 25 is being recorded. In one aspect of the present disclosure, strap bands 40 may be capable of stretching to meet the length requirements necessary to secure the motion tracking device around user 23 via strap connector 41. - In one embodiment, the
motion tracking device 24 houses components for capturing contextual data 26, such as a camera or an RFID tag reader 2. The camera captures images or videos of the environment the user is in or items the user is interacting with. For example, if a user is performing a curl, the camera may capture an image of the dumbbell being used by the user to perform the curl. The microcontroller 1 receives the image from the camera and sends the image to the software application 19. The software application 19 may process the captured image (contextual data 26) and generate interpreted data 28 identifying the weight of the dumbbell being used by the user. In another example, an RFID tag reader 2 may capture an RFID tag associated with the dumbbell being used by the user to perform a curl. The microcontroller 1 receives the RFID tag identifier from the RFID tag reader 2 and sends the RFID tag to the software application 19. The software application 19 may process the RFID tag (contextual data 26) and generate interpreted data 28 identifying the weight of the dumbbell being used by the user, as identified by the RFID tag. In an alternate embodiment, the user may input the contextual information via the software application 19 and/or the motion tracking device 24. - In one embodiment,
motion data 25 and contextual data 26 are sent to the software application 19 installed onto smart device 18. Software application 19 interprets motion data 25 and contextual data 26 into interpreted data 28. In one embodiment, the interpreted data 28 may include the user 23's movements, pauses in movement, collections of movements and any other contextual information related to the user's movements. Interpreted data 28 can also be interpretations of contextual data 26, which can also, in one aspect, include estimates of the calories burned by the user during a given exercise reflected by a given set of motion data 25 using a piece of equipment identified by a given set of contextual data 26. In another embodiment, the interpreted data 28 includes the performance of the user during a set of motions and feedback provided to the user during and/or after the user performs a set of motions. - The
smart device 18 may be any device capable of accepting wireless data transfer, such as a smartphone, tablet or a laptop. In one embodiment the smart device 18 has computing power sufficient to run the software application 19. Persons having skill in the art will realize that communication is not necessarily direct between the motion tracking device 24 and the smart device 18, and could instead be indirect, via one or more intermediary devices and/or via a network such as the Internet. The software application 19 interacts with the smart device 18 through a smart device API. The software application 19 receives motion data 25 from the motion tracking device 24 by using the smart device API to interact with the communication module 4. The software application 19 may be adapted to interact with a variety of smart device APIs. This would allow the software application 19 to function on a variety of smart device 18 platforms, each having its own smart device API. Hence, the user is not restricted to a specific smart device 18 in order to be able to use the application 19. - In one embodiment the
software application 19 is hosted or installed on the motion tracking device 24. In this embodiment, the software application 19 may be executed by the processor on the microcontroller 1. Hence, the analysis of the motion data 25 may be performed by the software application 19 on the motion tracking device 24, independent of the smart device 18 or in combination with the smart device 18 and/or a remote processing device, e.g., server 21. In another embodiment, the software application may be installed on a device with at least one sensing component, such as a smartphone. The software application 19 in this embodiment may use motion data 25 provided by the sensors on the device to generate interpreted data 28, rather than relying on the pairing of the smart device 18 and the motion tracking device 24. For example, the application 19 installed on a smartphone 18 may use the motion data generated by the accelerometer 6 on the smartphone to determine the number of steps taken by the user 23 as he/she walked from home to work. Hence the motion tracking system 100 is not restricted to the coupling of a smart device 18 and a motion tracking device 24, and can operate in any number of steps with any number of devices involving the transfer of motion data 25 to the software application 19. In alternate embodiments, sensor information from multiple devices, e.g., smart device 18 and motion tracking device 24, can be used by software application 19. - In one embodiment the interpreted
data 28 is sent from the smart device 18 to a remote processing device (cloud based device and system), e.g., server 21, via a wireless data transfer or a network. For ease of reference, server 21 will be used in this description, but any remote, e.g., cloud based, processing device, including multiple devices such as a remote database, storage, memory and processor(s), can be used. The server 21 attaches/correlates/identifies the interpreted data 28 to a user profile 29. The user 23 may then review, access and/or visualize the interpreted data 28 history associated with their user profile 29 via any device capable of wireless data transfer such as, without limitation, a smart phone, a tablet, a motion tracking device, or a computer, using a dedicated software application or a web browser to display the interpreted data 28. In another embodiment, the interpreted data 28 is also relayed back to the user 23 through the software application 19 installed on the smart device 18 or on the motion tracking device 24. Interpreted data 28 may be displayed by the software application 19 for the user 23 to see during and/or after performing movements. Feedback regarding the interpreted data 28 may be provided to the user 23 in real time in a number of ways. In one example, visual feedback is provided to the user 23 either on the smart device 18 or on the display 10 of the motion tracking device 24. In another example, audio feedback is provided to the user through speakers on the smart device 18 or on the motion tracking device 24. Tactile feedback may also be provided to the user through the vibrator 14. - In one embodiment, the
software application 19 may be stored on the server 21. The software application 19 on the server 21 may analyze the motion data 25 sent to the server and generate the interpreted data 28 to associate with a user profile 29. For example, in the instance that the user 23 would like to save the power consumed by the smart device 18, the smart device 18 may send motion data 25 received from the motion tracking device 24 to the server 21 for processing. Hence, the processing of the motion data 25 by the software application is not limited to taking place on the smart device 18 or on the motion tracking device 24. - In one embodiment, the
software application 19 or other code stored on the smart device 18 or the motion tracking device 24 may regulate the power consumed by the sensors by turning one or more sensors on or off. In one example, the sensors are turned off when the user has not activated or moved the device 24. In another example, one or more sensors are turned on or off for particular movements performed by the user. -
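A minimal sketch of this power-gating policy follows. The sensor names, the idle timeout, and the movement-to-sensor map are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical power-gating policy: decide which sensors stay powered
# based on recent motion and the movement currently being performed.
IDLE_TIMEOUT_S = 30.0

# Which sensors a given (hypothetical) movement class actually needs.
SENSORS_FOR_MOVEMENT = {
    "curl": {"accelerometer", "gyroscope"},
    "walk": {"accelerometer"},
}
ALL_SENSORS = {"accelerometer", "gyroscope", "magnetometer"}

def active_sensors(seconds_since_motion, movement=None):
    """Return the set of sensors that should be powered on."""
    if seconds_since_motion > IDLE_TIMEOUT_S:
        return set()                      # device idle: everything off
    if movement in SENSORS_FOR_MOVEMENT:
        return SENSORS_FOR_MOVEMENT[movement]
    return ALL_SENSORS                    # unknown movement: keep all on
```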
FIG. 2 is a flowchart illustrating one implementation of the motion tracking system 100, according to one embodiment. In this embodiment the user is using the motion tracking system 100 as an artificial aide and a monitoring unit while performing a fitness routine. The user activates 205 the motion tracking device 24 or the application 19 on the motion tracking device 24 by either pressing the user input 11 or moving the motion tracking device 24. In one example the application 19 on the motion tracking device 24 identifies that the user has activated the device based on motion data 25 received from the sensors. - The user then begins the fitness routine by either following a routine suggested by the
application 19 or by following a routine the user would like to perform. For example, the routine suggested by the application 19 may include 3 sets of hammer curls using 30 pound dumbbells with a rest period of 60 seconds between each set, followed by 4 sets of 20 crunches with a rest period of 30 seconds between each set. As the user performs the routine, the application 19 monitors a number of characteristics related to the movements performed by the user based on the motion data 25. For example, the application 19 determines and monitors 215 the type of exercise being performed by the user, the quality of the user's form as the user is performing the exercise, and/or the number of counts or repetitions performed by the user. In one embodiment, the application 19 suggests and monitors 215 the rest time observed by the user between sets of exercises as the user goes through the fitness routine. - The
application 19 may also provide feedback 220 to the user in real time as the user performs the fitness routine. For example, the vibrator 14 on the motion tracking device 24 may vibrate, notifying the user of bad form as the user is performing a curl. In another example, the feedback includes charts and tables displayed on the display 10 of the motion tracking device 24 describing the performance of the user through the fitness routine. - In one embodiment the
application 19 sends 225 the interpreted data 28 and performance data to the server 21. The performance data may include statistics describing the performance of the user throughout the fitness routine, or quantitative metrics (e.g., percentage of routine completed, goals reached, repetitions of each exercise, etc.) evaluating the fitness routine performed by the user. The server 21 then associates or attaches 230 the performance data and/or the interpreted data 28 to the user's user profile 29. -
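The send 225 and attach 230 steps might look like the following sketch, where the metric names and the in-memory profile store stand in for the server 21 and are assumptions for illustration only:

```python
# Illustrative sketch of performance data sent 225 to the server and
# attached 230 to a user profile 29.
def build_performance_data(routine, completed):
    """Summarize a routine as quantitative metrics."""
    planned = sum(s["reps"] for s in routine)
    done = sum(s["reps"] for s in completed)
    return {
        "percent_completed": round(100.0 * done / planned, 1),
        "repetitions": {s["exercise"]: s["reps"] for s in completed},
    }

user_profiles = {}   # stand-in for the server-side profile store

def attach_to_profile(user_id, performance_data):
    user_profiles.setdefault(user_id, []).append(performance_data)

routine = [{"exercise": "curl", "reps": 10}, {"exercise": "crunch", "reps": 20}]
completed = [{"exercise": "curl", "reps": 10}, {"exercise": "crunch", "reps": 14}]
attach_to_profile("user23", build_performance_data(routine, completed))
```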
FIG. 3 is a flowchart illustrating the motion tracking system 100 monitoring user movements, according to one embodiment. The application 19 monitors the movements made by the user based on the raw real time motion data 25 obtained 305 from the sensors. The sensors generate raw real time motion data 25 based on the movements of the user. For example, the accelerometer 6 generates acceleration data and change in acceleration data based on the relative movement of the device. - The
application 19 then processes 310 the real time motion data obtained 305 from one or more of the sensors or the motion tracking device 24. Processing 310 the raw real time data or signal removes the noise and other irrelevant features carried by the signal. In one embodiment a low pass filter is used to filter out the noise in the raw signal obtained 305 from the sensors. In another embodiment a moving average filter is used to filter out the noise in the raw signal obtained 305 from the sensors. It is understood that other filters can be used to increase the signal-to-noise ratio of the raw signals. - In one embodiment the
application 19 determines 315 a classification of the movement performed by the user based on one or more processed real time signals. Classifying the movement performed by the user is important because it helps the system identify and understand the movement being performed. For example, the application 19 first determines 315 that the user is performing a curl prior to identifying the characteristics associated with the user performing the curl, such as the form of the user's movements with respect to that of a correct curl movement. - In one embodiment, a classification algorithm may be a machine learning algorithm, a pattern recognition algorithm, a template matching algorithm, a statistical inference algorithm, and/or an artificial intelligence algorithm that operates based on a learning model. Examples of such algorithms are k-Nearest Neighbor (kNN), Support Vector Machines (SVM), Artificial Neural Networks (ANN), and Decision Trees.
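As one concrete instance of the algorithms listed above, the classification step 315 might be sketched as a k-Nearest Neighbor vote over simple window features. The feature choice (mean, standard deviation, peak count) and the training values below are illustrative assumptions, not taken from the disclosure:

```python
import math

def extract_features(window):
    """Mean, standard deviation and peak count of one signal window."""
    n = len(window)
    mean = sum(window) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    peaks = sum(1 for i in range(1, n - 1)
                if window[i] > window[i - 1] and window[i] > window[i + 1])
    return (mean, std, float(peaks))

def knn_classify(features, training_set, k=3):
    """Label a feature vector by majority vote of its k nearest neighbors."""
    nearest = sorted(training_set, key=lambda t: math.dist(t[0], features))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# Hypothetical training examples: (features, movement label).
training = [
    ((0.00, 0.05, 0.0), "rest"),
    ((0.05, 0.10, 1.0), "rest"),
    ((0.50, 0.80, 4.0), "curl"),
    ((0.60, 0.90, 5.0), "curl"),
]

label = knn_classify(extract_features([0.0, 0.9, 0.1, 1.0, 0.0, 0.8, 0.1]), training)
```

An SVM, neural network, or decision tree trained on the same feature vectors could be substituted without changing the surrounding flow.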
- In one embodiment, after the application classifies 315 the movement being performed by the user, the
application 19 quantifies 320 characteristics of the movement being performed by the user, such as counting the number of repetitive movements made by the user to determine the repetitions of a movement performed. For example, the application 19 determines the number of times a user has performed a curl during a given set of curls, based on the number of repetitive movements (that have been classified as a curl) performed by the user. In one embodiment, the application 19 determines the number of real peaks present in a rolling window of one or more signals. A real peak may be determined based on the amplitude of the peak relative to the whole signal and/or other contextual information, such as the expected pattern of peaks or duration of peaks for the classified or identified movement being performed by the user. For example, the application 19 may have identified that the user is performing a curl. Based on this information, real peaks may be known to appear in the rolling window of the z-axis of the accelerometer 6 signal above an amplitude of 0.6 G and over a time period n as a user is performing a curl. Similarly, real peaks may be known to appear in the rolling window of the y-axis of the gyroscope 7 signal above an amplitude of 0.5 radians/sec and over a period of 2n as a user is performing a curl. Hence, the application 19 may count the number of real peaks present in the z-axis accelerometer 6 signal as 1 per time period of n, and those present in the y-axis gyroscope 7 signal as 2 per period of 2n, thereby counting the number of curls performed by the user. - In another embodiment, the
application 19 may quantify 320 other characteristics of the movement being performed by the user, such as the speed of the movement. The application 19 may determine the time period over which a peak, valley or other morphological feature in one or more signals occurs to determine the rate at which each repetitive movement is performed by the user. Longer time periods may correspond to slower movement speeds, and shorter time periods may correspond to faster movement speeds. The application 19 may thus quantify 320 a number of characteristics associated with the movements performed by a user based on the morphological features present in one or more signals. -
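The peak counting and speed estimation 320 described above can be sketched as follows. The moving-average smoothing mirrors the filtering 310 of FIG. 3, and the 0.6 threshold echoes the 0.6 G curl example, but the specific values, window width and sample rate are illustrative assumptions:

```python
def moving_average(signal, width=3):
    """Smooth the raw signal 25 with a simple moving-average filter."""
    half = width // 2
    return [sum(signal[max(0, i - half):i + half + 1]) /
            len(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

def count_real_peaks(signal, threshold=0.6):
    """Count local maxima whose amplitude exceeds the 'real peak' threshold."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > signal[i - 1]
             and signal[i] > signal[i + 1]
             and signal[i] > threshold]
    return len(peaks), peaks

def rep_rate(peak_indices, sample_rate_hz):
    """Average repetitions per second from the spacing between real peaks."""
    if len(peak_indices) < 2:
        return 0.0
    gaps = [b - a for a, b in zip(peak_indices, peak_indices[1:])]
    return sample_rate_hz / (sum(gaps) / len(gaps))
```

Longer inter-peak gaps yield a lower `rep_rate`, matching the observation that longer time periods correspond to slower movement speeds.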
FIG. 4 illustrates repeated 405 and non-repeated 410 movements present in the processed signal, according to one embodiment. Referring to FIG. 4 with respect to the method illustrated in FIG. 3, the application 19 counts the repetitive movements 405 performed by the user during a fitness routine. For example, if the repeated movements 405 were that of curls performed by the user, the application 19 would determine that the user performed 5 repeated movements or 5 curls. The application 19 differentiates between the non-repeated movements 410, represented by one portion of the processed signal, and the repeated movements 405, represented by a different portion of the processed signal. - In one embodiment, the
application 19 identifies groups of repeated movements 405 performed by a user. For example, the fitness routine suggested by the application may include the user receiving instructions to perform 3 sets of 5 curls with a rest time of 30 seconds between each set. The application 19, based on the processed real time signal, first identifies and classifies the user's movements as curls. Then the application 19 is notified of the user performing the first set of curls, based on the user performing repetitive curl movements 405. After the application 19 has recorded group 1 (415) comprising 5 curls, the application 19 also monitors the transition time 1 (430), or the rest time, represented by the non-repeated movements 410 between groups 1 (415) and 2 (420). The application 19 then monitors group 2 (420) comprising 5 curls, and the transition time 2 (435) between group 2 (420) and group 3 (425). The application 19 identifies that the user has finished the 3 sets of curls once the application has finished monitoring group 3 (425), the last set of curls performed by the user. Hence, the application 19 monitors the fitness routine followed by the user based on the processed real time signal representing the movements performed by the user. -
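The grouping shown in FIG. 4 (sets 415/420/425 separated by rest transitions 430/435) can be sketched by splitting repetition timestamps wherever the gap between consecutive repetitions exceeds a rest threshold. The timestamps and the threshold value below are illustrative assumptions:

```python
def group_repetitions(rep_times, rest_threshold=5.0):
    """Split rep timestamps into sets; also return rest times between sets.

    rep_times must be a non-empty, sorted list of timestamps (seconds).
    """
    groups, rests = [[rep_times[0]]], []
    for prev, cur in zip(rep_times, rep_times[1:]):
        if cur - prev > rest_threshold:      # long gap => new set begins
            rests.append(cur - prev)
            groups.append([cur])
        else:
            groups[-1].append(cur)
    return groups, rests

# Three sets of 5 curls with 30 s rests in between (times in seconds):
times = [0, 2, 4, 6, 8, 38, 40, 42, 44, 46, 76, 78, 80, 82, 84]
sets, rests = group_repetitions(times, rest_threshold=10.0)
```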
FIG. 5 is a flowchart illustrating the motion tracking system 100 identifying user movements based on motion data 25, according to one embodiment. The application 19 determines the movement performed by the user based on one or more processed real time signals. The application 19 extracts 505 a set of statistical or morphological features present in a rolling window of one or more processed signals. Features may include amplitude, mean value, variance, standard deviation of the signal, the number of valleys and/or peaks, the order of valleys and/or peaks, the amplitude of valleys and/or peaks, the frequency of valleys and/or peaks, and/or the time period of valleys and/or peaks in one or more processed signals. For example, while performing a curl, the z-axis of the accelerometer 6 might record a repetitive pattern of a single peak over a time period n, followed by a single valley also over a time period n. The y-axis of the accelerometer 6 might record a repetitive pattern of a valley between 2 peaks during the same time period n. The extracted features are used by the classification algorithm to detect the movement being performed by the user. - In one embodiment the
application 19 applies a template matching algorithm to identify repetitive features (such as peaks and valleys in the signal) in a rolling window of one or more processed signals. The application 19 compares the repetitive features in the rolling window of one or more processed signals to a set of movement templates 27 stored in a movement template database 22 on the smart device 18 or the motion tracking device 24. Based on the comparison, the application 19 then selects the movement templates 27 in the database 22 having repetitive features most similar to or most closely matching those present in one or more processed signals. Based on one or more or a combination of the selected templates, the application 19 identifies and classifies 515 the movement being performed by the user. For example, the application 19 compares the repetitive features present in the z-axis of the accelerometer 6 signal and the y-axis of the gyroscope 7 as a user is performing a curl with the movement templates 27 stored in the movement template database 22. The application 19 selects a z-axis acceleration signal movement template 27 and a y-axis gyroscope 7 signal movement template 27 similar to those of the recorded signals. The application 19 then identifies 515 that the user is performing a curl, as the two movement templates 27 selected by the application 19 are known to be associated with a curl movement. One example of a template matching algorithm is cross-correlation; another is dynamic time warping. - In one embodiment, the
application 19 guides the user through a fitness routine. As the application 19 is guiding the user through the fitness routine, the application 19 is aware of the repetitive movement being performed by the user. Thus, the application 19 may verify the movement identified by the application 19 based on the recorded motion data 25 against the movement the application 19 expects the user to perform based on the fitness routine. In a second embodiment, as the application 19 is aware of the movement being performed by the user, the application 19 no longer needs to identify the movement based on the motion data, and hence begins to count the repetitive features present in one or more processed signals to determine the repetitions performed by the user. In a third embodiment, as the application 19 is aware of the movement being performed by the user, the application 19 may compare the recorded motion data to a subset of movement templates 27 in the movement template database 22, wherein the subset of movement templates 27 represents templates related to the movements the application 19 expects the user to perform. For example, if the application 19 is aware that the user is currently performing a curl as part of a fitness routine, the application 19 would compare the recorded motion data 25 with that of movement templates 27 associated with the curl classification of movements. - In one embodiment, the
application 19 determines 510 the statistical characteristics, such as a mean or standard deviation, associated with the repetitive features in one or more signals. For example, the application 19 may determine 510 the mean of the amplitude of the peaks recorded in one or more signals while the user is performing curls. If the mean is found to be relatively greater than the expected threshold for characterizing real peaks, the application 19 may raise the weight suggested for the next set of curls, as a relatively higher mean implies that the user was able to perform the current curls more easily (at a faster rate) than expected. In another embodiment, the application may determine the standard deviation of the amplitude and frequency of the peaks recorded in one or more signals while the user is performing curls. If the standard deviation is found to be outside of an expected range of standard deviation values for a curl action, it is possible that even though the pattern of features may have been identified to match a curl, the user may not really be performing a curl, but may be performing a different motion similar to a curl. Hence, the statistical characteristics of the features in one or more signals provide additional information toward identifying 515 the movement performed by the user. - In another embodiment, the
application 19 uses machine learning algorithms to detect movements, and/or classify or identify 505 movements performed by a user in a rolling window of one or more signals based on the repetitive features or morphological features present in one or more signals. An example of a recognition and learning algorithm that may be used is described in Ling Bao et al., "Activity Recognition from User-Annotated Acceleration Data", which is incorporated by reference herein in its entirety. - In one example, in addition to calibrating the
motion tracking system 100, the user may edit the interpreted data 28 generated by the motion tracking system 100. For example, the user may edit the number of repetitive movements, or the count, associated with a motion or exercise performed by the user as determined by the motion tracking system 100. The edits performed by the user, along with the motion data associated with the motion performed by the user, are analyzed by the motion tracking system 100. In one example, based on the edits performed to the interpreted data 28, the motion tracking system 100 modifies the classification algorithm, the template matching algorithm, or the machine learning algorithm used to generate the interpreted data 28. The algorithms are modified to capture the discrepancies in the interpreted data generated prior to the user editing the interpreted data, and thereby cater to the behavior and the unique movements and tendencies associated with the user. - In one example, the
motion tracking system 100 determines that the user performed 11 counts of a repetitive motion. The user may edit the count from 11 to 10. The motion tracking system 100 identifies the discrepancy between the features of the motion data associated with the repetitive movement and one or more algorithms used to classify the repetitive movement and quantify the repetitive movement. For example, based on the template matching algorithm, the motion tracking system 100 may have classified a feature similar to that associated with the repetitive movement as a repetitive movement performed by the user. Based on the edit made by the user, the motion tracking system 100 modifies the applied algorithm to no longer classify and quantify the feature similar to that associated with the repetitive movement. - In another example, the
motion tracking system 100 receives an edit from the user correcting the user's heart rate measured as the user was performing a motion. The motion tracking system 100 may retrieve the edit made by the user and the motion data associated with the edit (e.g., motion data associated with the user using a cardio fitness machine as the motion tracking system 100 measured the user's heart rate). The motion tracking system 100 may modify the algorithm applied to quantify the user's heart rate (based on the heart rate data received). The motion tracking system 100 may then apply the modified algorithm when the user performs the motion or when similar motion data is received in the future. - In one embodiment, the
motion tracking system 100 groups the edit information received from a number of users based on one or more characteristics associated with the users, such as data associated with the physical measurements of a user and/or the type of edits made by the user to the interpreted data 28 generated by the motion tracking system. Based on the types of edits made by a user and the grouping associated with the type of edits, the motion tracking system 100 may modify the algorithms applied to classify and quantify repetitive motions performed by the user. For example, users of a certain height range may perform particular motions prior to beginning a set of pull ups. These motions may accidentally be classified and quantified as a count or a pull up performed by the user. The motion tracking system 100 may modify the algorithms classifying and quantifying pull ups performed by the user based on the height information associated with the user. - Reference in the specification to "one embodiment" or to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment. The appearances of the phrase "in one embodiment" or "an embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
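The count-editing example described above (11 corrected to 10) can be sketched as a feedback loop that nudges the peak-acceptance threshold until the counter reproduces the user's corrected count. The adjustment step and the signal below are illustrative assumptions, not the disclosed algorithm:

```python
def count_peaks(signal, threshold):
    """Count local maxima exceeding the acceptance threshold."""
    return sum(1 for i in range(1, len(signal) - 1)
               if signal[i] > signal[i - 1]
               and signal[i] > signal[i + 1]
               and signal[i] > threshold)

def adapt_threshold(signal, threshold, corrected_count, step=0.05, max_iter=100):
    """Raise/lower the threshold until the count matches the user's edit."""
    for _ in range(max_iter):
        count = count_peaks(signal, threshold)
        if count == corrected_count:
            break
        threshold += step if count > corrected_count else -step
    return threshold

sig = [0, 1, 0, 1, 0, 1, 0, 0.62, 0]   # four raw peaks; user corrects count to 3
adapted = adapt_threshold(sig, threshold=0.5, corrected_count=3)
```

The adapted threshold would then be persisted with the user profile 29 so future sets of the same movement are counted with the corrected sensitivity.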
- Some portions of the detailed description are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps (instructions) leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. Furthermore, it is also convenient at times, to refer to certain arrangements of steps requiring physical manipulations or transformation of physical quantities or representations of physical quantities as modules or code devices, without loss of generality.
- However, all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device (such as a specific computing machine), that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
- Certain aspects of the embodiments include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the embodiments can be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by a variety of operating systems. The embodiments can also be embodied in a computer program product which can be executed on a computing system.
- The embodiments also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, e.g., a specific computer, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each coupled to a computer system bus. Memory can include any of the above and/or other devices that can store information/data/programs and can be a transient or non-transient medium, where a non-transient or non-transitory medium can include memory/storage that stores information for more than a minimal duration. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the method steps. The structure for a variety of these systems will appear from the description herein. In addition, the embodiments are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the embodiments as described herein, and any references herein to specific languages are provided for disclosure of enablement and best mode.
- In addition, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the embodiments, which is set forth in the claims.
- While particular embodiments and applications have been illustrated and described herein, it is to be understood that the embodiments are not limited to the precise construction and components disclosed herein and that various modifications, changes, and variations may be made in the arrangement, operation, and details of the methods and apparatuses of the embodiments without departing from the spirit and scope of the embodiments as defined in the appended claims.
Claims (20)
1. A method for monitoring movements corresponding to a user of a motion tracking device, the method comprising:
receiving motion data from the motion tracking device, the motion data representing motions performed by the user of the motion tracking device;
classifying, based on the motion data, a repetitive movement being performed by the user of the motion tracking device;
identifying, based on the motion data and the classification of the repetitive movement, one or more characteristics associated with the repetitive movement being performed by the user;
generating interpreted data based on the classified repetitive movement and a quantitative measure of one or more characteristics associated with the repetitive movement; and
displaying the interpreted data.
2. The method of claim 1, wherein classifying, based on the motion data, the repetitive movement being performed by the user comprises:
identifying repetitive features in the motion data; and
classifying, based on the identified repetitive features, the repetitive movement being performed by the user.
3. The method of claim 2, wherein classifying, based on the identified repetitive features, the repetitive movement being performed by the user comprises:
applying a template matching algorithm to a portion of the motion data, the template matching algorithm comparing the identified repetitive features in the portion of the motion data to a set of movement templates;
identifying, based on the template matching algorithm, a movement template of the set of movement templates matching the identified repetitive features; and
classifying, based on the identified movement template, the repetitive movement being performed by the user.
4. The method of claim 3, further comprising:
receiving modifications to the interpreted data from the user; and
modifying the template matching algorithm applied to the portion of the motion data or an algorithm applied to determine the quantitative measure of one or more characteristics associated with the repetitive movement, based on the modifications.
5. The method of claim 2, further comprising:
determining statistical characteristics associated with the identified repetitive features;
verifying, based on the statistical characteristics, the classification of the identified repetitive movement being performed by the user; and
modifying, based on the verification or the statistical characteristics, the interpreted data.
6. The method of claim 2, wherein a feature comprises an amplitude associated with one or more signals of the motion data, a mean value associated with the one or more signals of the motion data, a number of valleys or peaks associated with one or more signals of the motion data, an order of valleys or peaks associated with one or more signals of the motion data, or a frequency of valleys or peaks associated with one or more signals.
7. The method of claim 1, further comprising:
obtaining contextual data associated with the user of the motion tracking device; and
generating interpreted data based on the contextual data, the classified repetitive movement and a quantitative measure of the characteristics associated with the repetitive movement.
8. The method of claim 7, wherein the contextual data comprises information identifying fitness equipment being used by the user or a location of the user.
9. The method of claim 8, wherein the interpreted data comprises feedback, the feedback associated with the repetitive movement performed by the user or the location of the user.
10. The method of claim 1, wherein the one or more characteristics associated with the repetitive movement comprises a count identifying a number of repetitive movements performed by the user.
11. The method of claim 1 , wherein the one or more characteristics associated with the repetitive movement comprises a form with which the user is performing the repetitive movement.
12. The method of claim 1 , wherein the one or more characteristics associated with the repetitive movement comprises time periods between sets of a repetitive movement.
13. The method of claim 1 , wherein the interpreted data comprises a feedback, the feedback directed towards improving or altering the repetitive movement performed by the user.
14. A computer program product comprising a computer-readable medium having instructions encoded thereon that, when executed by a processor, cause the processor to:
receive motion data from the motion tracking device, the motion data representing motions performed by the user of the motion tracking device;
classify, based on the motion data, a repetitive movement being performed by the user of the motion tracking device;
identify, based on the motion data and the classification of the repetitive movement, one or more characteristics associated with the repetitive movement being performed by the user;
generate interpreted data, based on the classified repetitive movement and a quantitative measure of one or more characteristics associated with the repetitive movement; and
display the interpreted data.
15. The computer program product of claim 14, wherein classify, based on the motion data, the repetitive movement being performed by the user comprises:
identify repetitive features in the motion data; and
classify, based on the identified repetitive features, the repetitive movement being performed by the user.
16. The computer program product of claim 15, wherein classify, based on the identified repetitive features, the repetitive movement being performed by the user comprises:
apply a template matching algorithm to a portion of the motion data, the template matching algorithm comparing the identified repetitive features in the portion of the motion data to a set of movement templates;
identify, based on the template matching algorithm, a movement template of the set of movement templates matching the identified repetitive features; and
classify, based on the identified movement template, the repetitive movement being performed by the user.
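The template matching of claims 16 and 20 can be sketched as nearest-template classification: compare an observed feature vector against a stored template per movement and pick the closest. Everything concrete below — the movement names, the feature ordering, the Euclidean distance metric, and the template values — is an assumption for illustration, not taken from the patent.

```python
import numpy as np

# Hypothetical movement templates: one feature vector per movement,
# e.g. (amplitude, peaks per repetition, mean). Values are made up.
TEMPLATES = {
    "bicep_curl": np.array([2.0, 1.0, 0.5]),
    "squat":      np.array([1.2, 1.0, -0.3]),
    "push_up":    np.array([0.8, 2.0, 0.1]),
}

def classify_movement(feature_vector, templates=TEMPLATES):
    """Return the name of the template closest (Euclidean distance)
    to the observed feature vector."""
    feature_vector = np.asarray(feature_vector, dtype=float)
    best_name, best_dist = None, float("inf")
    for name, template in templates.items():
        dist = np.linalg.norm(feature_vector - template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name
```

Distance thresholds or probabilistic matching would be needed in practice to report "no matching template" rather than forcing the nearest class.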
17. The computer program product of claim 15, wherein the instructions further cause the processor to:
determine statistical characteristics associated with the identified repetitive features;
verify, based on the statistical characteristics, the classification of the identified repetitive movement being performed by the user; and
modify, based on the verification or the statistical characteristics, the interpreted data.
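One way to make the "statistical characteristics" verification of claims 5 and 17 concrete is to check that the identified repetitive features recur at regular intervals, e.g. via the coefficient of variation of inter-peak spacing. The cutoff value and the choice of statistic below are assumptions, not part of the claims.

```python
import statistics

def verify_classification(peak_times, max_cv=0.3):
    """Verify a repetitive-movement classification by testing interval
    regularity: accept only if the coefficient of variation of the
    inter-peak intervals is at most `max_cv` (a made-up cutoff)."""
    intervals = [b - a for a, b in zip(peak_times, peak_times[1:])]
    if len(intervals) < 2:
        return False  # too few repetitions to verify statistically
    mean = statistics.mean(intervals)
    cv = statistics.stdev(intervals) / mean
    return cv <= max_cv
```

Failing this check could trigger the claimed modification of the interpreted data, for example by discarding or re-labeling the detected movement.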
18. A system for monitoring movements of a user of a motion tracking device, the system comprising:
one or more sensors configured to generate motion data, the motion data representing motions performed by the user of the motion tracking device; and
a computer program product stored on a non-transitory computer-readable storage medium comprising computer-readable instructions for execution by a processor, the instructions when executed by the processor cause the processor to:
receive the motion data from the motion tracking device;
classify, based on the motion data, a repetitive movement being performed by the user of the motion tracking device;
identify, based on the motion data and the classification of the repetitive movement, one or more characteristics associated with the repetitive movement being performed by the user;
generate, based on the classified repetitive movement and a quantitative measure of one or more characteristics associated with the repetitive movement, interpreted data; and
provide, for display, the interpreted data.
19. The system of claim 18, wherein classify, based on the motion data, the repetitive movement being performed by the user comprises:
identify repetitive features in the motion data; and
classify, based on the identified repetitive features, the repetitive movement being performed by the user.
20. The system of claim 19, wherein classify, based on the identified repetitive features, the repetitive movement being performed by the user comprises:
apply a template matching algorithm to a portion of the motion data, the template matching algorithm comparing the identified repetitive features in the portion of the motion data to a set of movement templates;
identify, based on the template matching algorithm, a movement template of the set of movement templates matching the identified repetitive features; and
classify, based on the identified movement template, the repetitive movement being performed by the user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/214,046 US20140278219A1 (en) | 2013-03-15 | 2014-03-14 | System and Method For Monitoring Movements of a User |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361792601P | 2013-03-15 | 2013-03-15 | |
US201361873339P | 2013-09-03 | 2013-09-03 | |
US201361873347P | 2013-09-03 | 2013-09-03 | |
US14/214,046 US20140278219A1 (en) | 2013-03-15 | 2014-03-14 | System and Method For Monitoring Movements of a User |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140278219A1 true US20140278219A1 (en) | 2014-09-18 |
Family
ID=51527229
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/213,935 Active 2034-06-11 US9314666B2 (en) | 2013-03-15 | 2014-03-14 | System and method for identifying and interpreting repetitive motions |
US14/214,046 Abandoned US20140278219A1 (en) | 2013-03-15 | 2014-03-14 | System and Method For Monitoring Movements of a User |
US15/067,123 Active 2034-12-17 US10335637B2 (en) | 2013-03-15 | 2016-03-10 | System and method for identifying and interpreting repetitive motions |
US16/415,967 Active US10799760B2 (en) | 2013-03-15 | 2019-05-17 | System and method for identifying and interpreting repetitive motions |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/213,935 Active 2034-06-11 US9314666B2 (en) | 2013-03-15 | 2014-03-14 | System and method for identifying and interpreting repetitive motions |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/067,123 Active 2034-12-17 US10335637B2 (en) | 2013-03-15 | 2016-03-10 | System and method for identifying and interpreting repetitive motions |
US16/415,967 Active US10799760B2 (en) | 2013-03-15 | 2019-05-17 | System and method for identifying and interpreting repetitive motions |
Country Status (1)
Country | Link |
---|---|
US (4) | US9314666B2 (en) |
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150209617A1 (en) * | 2014-01-27 | 2015-07-30 | Wanin Interantional Co., Ltd. | Fitness equipment combining with a cloud service system |
US20150352404A1 (en) * | 2014-06-06 | 2015-12-10 | Head Technology Gmbh | Swing analysis system |
DE102015113936A1 (en) | 2014-08-21 | 2016-02-25 | Affectomatics Ltd. | Rating of vehicles based on affective response |
US20160125348A1 (en) * | 2014-11-03 | 2016-05-05 | Motion Insight LLC | Motion Tracking Wearable Element and System |
DE102016101650A1 (en) | 2015-01-29 | 2016-08-04 | Affectomatics Ltd. | CORRECTION OF BIAS IN MEASURES OF THE AFFECTIVE RESPONSE |
US20170028256A1 (en) * | 2015-07-29 | 2017-02-02 | Athalonz, Llc | Arm fatigue analysis system |
US20170061817A1 (en) * | 2015-08-28 | 2017-03-02 | Icuemotion, Llc | System for movement skill analysis and skill augmentation and cueing |
WO2017040318A1 (en) * | 2015-08-28 | 2017-03-09 | Focus Ventures, Inc. | Automated motion of interest recognition, detection and self-learning |
US9641991B2 (en) * | 2015-01-06 | 2017-05-02 | Fitbit, Inc. | Systems and methods for determining a user context by correlating acceleration data from multiple devices |
US20170136339A1 (en) * | 2014-07-07 | 2017-05-18 | Leila Benedicte Habiche | Device for practising sport activities |
US9901776B2 (en) | 2011-08-29 | 2018-02-27 | Icuemotion Llc | Racket sport inertial sensor motion tracking analysis |
US20190201112A1 (en) * | 2017-12-28 | 2019-07-04 | Ethicon Llc | Computer implemented interactive surgical systems |
US10410297B2 (en) | 2014-11-03 | 2019-09-10 | PJS of Texas Inc. | Devices, systems, and methods of activity-based monitoring and incentivization |
US10652696B2 (en) * | 2014-07-30 | 2020-05-12 | Trusted Positioning, Inc. | Method and apparatus for categorizing device use case for on foot motion using motion sensor data |
US10668353B2 (en) | 2014-08-11 | 2020-06-02 | Icuemotion Llc | Codification and cueing system for sport and vocational activities |
RU2801426C1 (en) * | 2022-09-18 | 2023-08-08 | Эмиль Юрьевич Большаков | Method and system for real-time recognition and analysis of user movements |
US11819231B2 (en) | 2017-10-30 | 2023-11-21 | Cilag Gmbh International | Adaptive control programs for a surgical system comprising more than one type of cartridge |
US11832899B2 (en) | 2017-12-28 | 2023-12-05 | Cilag Gmbh International | Surgical systems with autonomously adjustable control programs |
WO2023239025A1 (en) * | 2022-06-10 | 2023-12-14 | Samsung Electronics Co., Ltd. | Electronic device and wearable device for providing evaluation information on user's exercise motion, and method for operating same
US11844579B2 (en) | 2017-12-28 | 2023-12-19 | Cilag Gmbh International | Adjustments based on airborne particle properties |
US11844545B2 (en) | 2018-03-08 | 2023-12-19 | Cilag Gmbh International | Calcified vessel identification |
US11857152B2 (en) | 2017-12-28 | 2024-01-02 | Cilag Gmbh International | Surgical hub spatial awareness to determine devices in operating theater |
US11864845B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Sterile field interactive control displays |
US11864728B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Characterization of tissue irregularities through the use of mono-chromatic light refractivity |
US11871901B2 (en) | 2012-05-20 | 2024-01-16 | Cilag Gmbh International | Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage |
US11890065B2 (en) | 2017-12-28 | 2024-02-06 | Cilag Gmbh International | Surgical system to limit displacement |
US11896443B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Control of a surgical system through a surgical barrier |
US11896322B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub |
US11903587B2 (en) | 2017-12-28 | 2024-02-20 | Cilag Gmbh International | Adjustment to the surgical stapling control based on situational awareness |
US11925350B2 (en) | 2019-02-19 | 2024-03-12 | Cilag Gmbh International | Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge |
US11931027B2 (en) | 2018-03-28 | 2024-03-19 | Cilag Gmbh International | Surgical instrument comprising an adaptive control system
US11969142B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws |
US11969216B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
US11986233B2 (en) | 2018-03-08 | 2024-05-21 | Cilag Gmbh International | Adjustment of complex impedance to compensate for lost power in an articulating ultrasonic device |
Families Citing this family (82)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9339691B2 (en) | 2012-01-05 | 2016-05-17 | Icon Health & Fitness, Inc. | System and method for controlling an exercise device |
US9254409B2 (en) | 2013-03-14 | 2016-02-09 | Icon Health & Fitness, Inc. | Strength training apparatus with flywheel and related methods |
US9314666B2 (en) * | 2013-03-15 | 2016-04-19 | Ficus Ventures, Inc. | System and method for identifying and interpreting repetitive motions |
WO2014194337A1 (en) * | 2013-05-30 | 2014-12-04 | Atlas Wearables, Inc. | Portable computing device and analyses of personal data captured therefrom |
JP6539272B2 (en) | 2013-08-07 | 2019-07-03 | ナイキ イノベイト シーブイ | Computer-implemented method, non-transitory computer-readable medium, and single device |
WO2015048884A1 (en) * | 2013-10-03 | 2015-04-09 | Push Design Solutions, Inc. | Systems and methods for monitoring lifting exercises |
US10136840B2 (en) * | 2013-10-14 | 2018-11-27 | Nike, Inc. | Fitness training system for merging energy expenditure calculations from multiple devices |
US9626478B2 (en) | 2013-10-24 | 2017-04-18 | Logitech Europe, S.A. | System and method for tracking biological age over time based upon heart rate variability |
US20150118669A1 (en) * | 2013-10-24 | 2015-04-30 | JayBird LLC | System and method for providing an intelligent goal recommendation for activity level |
US20160058378A1 (en) * | 2013-10-24 | 2016-03-03 | JayBird LLC | System and method for providing an interpreted recovery score |
EP3974036A1 (en) | 2013-12-26 | 2022-03-30 | iFIT Inc. | Magnetic resistance mechanism in a cable machine |
US10433612B2 (en) | 2014-03-10 | 2019-10-08 | Icon Health & Fitness, Inc. | Pressure sensor to quantify work |
US11056238B1 (en) * | 2014-03-27 | 2021-07-06 | Apple Inc. | Personality based wellness coaching |
US9288298B2 (en) * | 2014-05-06 | 2016-03-15 | Fitbit, Inc. | Notifications regarding interesting or unusual activity detected from an activity monitoring device |
US10401380B2 (en) * | 2014-05-22 | 2019-09-03 | The Trustees Of The University Of Pennsylvania | Wearable system for accelerometer-based detection and classification of firearm use |
WO2015191445A1 (en) | 2014-06-09 | 2015-12-17 | Icon Health & Fitness, Inc. | Cable system incorporated into a treadmill |
WO2015195965A1 (en) | 2014-06-20 | 2015-12-23 | Icon Health & Fitness, Inc. | Post workout massage device |
US9305441B1 (en) | 2014-07-11 | 2016-04-05 | ProSports Technologies, LLC | Sensor experience shirt |
US9724588B1 (en) | 2014-07-11 | 2017-08-08 | ProSports Technologies, LLC | Player hit system |
US9398213B1 (en) | 2014-07-11 | 2016-07-19 | ProSports Technologies, LLC | Smart field goal detector |
WO2016007970A1 (en) | 2014-07-11 | 2016-01-14 | ProSports Technologies, LLC | Whistle play stopper |
US9610491B2 (en) | 2014-07-11 | 2017-04-04 | ProSports Technologies, LLC | Playbook processor |
US9474933B1 (en) | 2014-07-11 | 2016-10-25 | ProSports Technologies, LLC | Professional workout simulator |
DE112015003279T5 (en) * | 2014-07-15 | 2017-04-06 | Asahi Kasei Kabushiki Kaisha | Input device, biosensor, program, computer-readable medium and mode setting method |
KR102130801B1 (en) * | 2014-07-22 | 2020-08-05 | 엘지전자 주식회사 | Apparatus for detecting wrist step and method thereof |
US10264175B2 (en) | 2014-09-09 | 2019-04-16 | ProSports Technologies, LLC | Facial recognition for event venue cameras |
US20160131677A1 (en) * | 2014-11-10 | 2016-05-12 | International Business Machines Corporation | Motion pattern based event detection using a wearable device |
US20160175646A1 (en) * | 2014-12-17 | 2016-06-23 | Vibrado Technologies, Inc. | Method and system for improving biomechanics with immediate prescriptive feedback |
US10258828B2 (en) | 2015-01-16 | 2019-04-16 | Icon Health & Fitness, Inc. | Controls for an exercise device |
US10197416B2 (en) * | 2015-01-21 | 2019-02-05 | Quicklogic Corporation | Multiple axis wrist worn pedometer |
US20160249832A1 (en) * | 2015-02-27 | 2016-09-01 | Amiigo, Inc. | Activity Classification Based on Classification of Repetition Regions |
US10372757B2 (en) | 2015-05-19 | 2019-08-06 | Spotify Ab | Search media content based upon tempo |
US10055413B2 (en) * | 2015-05-19 | 2018-08-21 | Spotify Ab | Identifying media content |
US20170039480A1 (en) * | 2015-08-06 | 2017-02-09 | Microsoft Technology Licensing, Llc | Workout Pattern Detection |
US20170038848A1 (en) * | 2015-08-07 | 2017-02-09 | Fitbit, Inc. | User identification via collected sensor data from a wearable fitness monitor |
US10940360B2 (en) | 2015-08-26 | 2021-03-09 | Icon Health & Fitness, Inc. | Strength exercise mechanisms |
US20170178532A1 (en) * | 2015-12-22 | 2017-06-22 | Mei Lu | Coaching Feedback Adjustment Mechanism |
US10970661B2 (en) | 2016-01-11 | 2021-04-06 | RaceFit International Company Limited | System and method for monitoring motion and orientation patterns associated to physical activities of users |
TWI621968B (en) * | 2016-02-05 | 2018-04-21 | 財團法人工業技術研究院 | Method for controlling electronic equipment and wearable device |
US10272317B2 (en) | 2016-03-18 | 2019-04-30 | Icon Health & Fitness, Inc. | Lighted pace feature in a treadmill |
US10561894B2 (en) | 2016-03-18 | 2020-02-18 | Icon Health & Fitness, Inc. | Treadmill with removable supports |
US10493349B2 (en) | 2016-03-18 | 2019-12-03 | Icon Health & Fitness, Inc. | Display on exercise device |
US10625137B2 (en) | 2016-03-18 | 2020-04-21 | Icon Health & Fitness, Inc. | Coordinated displays in an exercise device |
US10441840B2 (en) | 2016-03-18 | 2019-10-15 | Icon Health & Fitness, Inc. | Collapsible strength exercise machine |
US10293211B2 (en) | 2016-03-18 | 2019-05-21 | Icon Health & Fitness, Inc. | Coordinated weight selection |
US10252109B2 (en) | 2016-05-13 | 2019-04-09 | Icon Health & Fitness, Inc. | Weight platform treadmill |
US11113346B2 (en) | 2016-06-09 | 2021-09-07 | Spotify Ab | Search media content based upon tempo |
US10984035B2 (en) | 2016-06-09 | 2021-04-20 | Spotify Ab | Identifying media content |
US10441844B2 (en) | 2016-07-01 | 2019-10-15 | Icon Health & Fitness, Inc. | Cooling systems and methods for exercise equipment |
US10471299B2 (en) | 2016-07-01 | 2019-11-12 | Icon Health & Fitness, Inc. | Systems and methods for cooling internal exercise equipment components |
US10983894B2 (en) * | 2016-07-22 | 2021-04-20 | Intel Corporation | Autonomously adaptive performance monitoring |
WO2018045319A1 (en) | 2016-09-01 | 2018-03-08 | Catalyft Labs, Inc. | Multi-functional weight rack and exercise monitoring system for tracking exercise movements |
US20180071583A1 (en) * | 2016-09-15 | 2018-03-15 | FitTech Software LLC | Software Platform Configured to Provide Analytics and Recommendations |
US10671705B2 (en) | 2016-09-28 | 2020-06-02 | Icon Health & Fitness, Inc. | Customizing recipe recommendations |
JP6871708B2 (en) * | 2016-10-06 | 2021-05-12 | りか 高木 | Methods, systems, programs, and computer devices for identifying the causative site of compensatory movements, and methods and systems for eliminating compensatory movements. |
US10500473B2 (en) | 2016-10-10 | 2019-12-10 | Icon Health & Fitness, Inc. | Console positioning |
US10207148B2 (en) | 2016-10-12 | 2019-02-19 | Icon Health & Fitness, Inc. | Systems and methods for reducing runaway resistance on an exercise device |
US10376736B2 (en) | 2016-10-12 | 2019-08-13 | Icon Health & Fitness, Inc. | Cooling an exercise device during a dive motor runway condition |
TWI646997B (en) | 2016-11-01 | 2019-01-11 | 美商愛康運動與健康公司 | Distance sensor for console positioning |
US10661114B2 (en) | 2016-11-01 | 2020-05-26 | Icon Health & Fitness, Inc. | Body weight lift mechanism on treadmill |
US10967221B2 (en) * | 2016-11-29 | 2021-04-06 | James L. O'Sullivan | Device and method for monitoring exercise performance |
TWI680782B (en) | 2016-12-05 | 2020-01-01 | 美商愛康運動與健康公司 | Offsetting treadmill deck weight during operation |
US10424183B1 (en) * | 2017-01-20 | 2019-09-24 | Dp Technologies, Inc. | Smart seating system |
JP7005975B2 (en) * | 2017-07-14 | 2022-01-24 | セイコーエプソン株式会社 | Portable electronic devices |
US11451108B2 (en) | 2017-08-16 | 2022-09-20 | Ifit Inc. | Systems and methods for axial impact resistance in electric motors |
US10949810B2 (en) | 2017-11-22 | 2021-03-16 | International Business Machines Corporation | Health condition monitoring and action |
CN108096807A (en) * | 2017-12-11 | 2018-06-01 | 丁贤根 | A kind of exercise data monitoring method and system |
US10729965B2 (en) | 2017-12-22 | 2020-08-04 | Icon Health & Fitness, Inc. | Audible belt guide in a treadmill |
CN108460322A (en) * | 2017-12-28 | 2018-08-28 | 惠州市德赛工业研究院有限公司 | A kind of stroke recognition methods and application |
US20200005027A1 (en) * | 2018-06-27 | 2020-01-02 | Johnson Health Tech Co., Ltd. | Weight training intelligence system |
AU2018204669A1 (en) * | 2018-06-27 | 2020-01-30 | JointAction Group Pty Ltd | Monitors for movements of workers |
WO2020205276A1 (en) * | 2019-03-29 | 2020-10-08 | Alive Fitness, Llc | Methods and systems for exercise recognition and analysis |
CN111803903A (en) * | 2019-04-10 | 2020-10-23 | 深圳先进技术研究院 | Body-building action recognition method and system and electronic equipment |
US20220226695A1 (en) * | 2019-05-28 | 2022-07-21 | CoPilot Systems Inc. | Systems and methods for workout tracking and classification |
WO2021035208A1 (en) * | 2019-08-22 | 2021-02-25 | The Trustees Of Columbia University In The City Of New York | Limb motion tracking biofeedback platform and method of rehabilitation therapy for patients with spasticity |
US20210394020A1 (en) * | 2020-06-17 | 2021-12-23 | FitForm Technologies Inc. | Tracking three-dimensional motion during an activity |
TWI820347B (en) | 2020-09-04 | 2023-11-01 | 仁寶電腦工業股份有限公司 | Activity recognition method, activity recognition system, and handwriting identification system |
US11794073B2 (en) | 2021-02-03 | 2023-10-24 | Altis Movement Technologies, Inc. | System and method for generating movement based instruction |
EP4054157A1 (en) * | 2021-03-05 | 2022-09-07 | Sony Group Corporation | System and method for monitoring activity in a gym environment |
US11701546B1 (en) * | 2022-01-17 | 2023-07-18 | Tonal Systems, Inc. | Exercise machine struggle detection |
US20230293941A1 (en) * | 2022-03-21 | 2023-09-21 | Samsung Electronics Company, Ltd. | Systems and Method for Segmentation of Movement Repetitions and Extraction of Performance Metrics |
WO2023243863A1 (en) * | 2022-06-13 | 2023-12-21 | Samsung Electronics Co., Ltd. | Method and system for mitigating physical risks in an iot environment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100069203A1 (en) * | 2008-09-18 | 2010-03-18 | Omron Healthcare Co., Ltd. | Body motion discriminating apparatus and activity monitor |
US20120123226A1 (en) * | 2009-07-20 | 2012-05-17 | Koninklijke Philips Electronics N.V. | Method for operating a monitoring system |
US20120268592A1 (en) * | 2010-12-13 | 2012-10-25 | Nike, Inc. | Processing Data of a User Performing an Athletic Activity to Estimate Energy Expenditure |
US20130203475A1 (en) * | 2012-01-26 | 2013-08-08 | David H. Kil | System and method for processing motion-related sensor data with social mind-body games for health application |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2002255568B8 (en) * | 2001-02-20 | 2014-01-09 | Adidas Ag | Modular personal network systems and methods |
US7651442B2 (en) * | 2002-08-15 | 2010-01-26 | Alan Carlson | Universal system for monitoring and controlling exercise parameters |
CN101778653B (en) | 2007-08-08 | 2013-04-10 | 皇家飞利浦电子股份有限公司 | Process and system for monitoring exercise motions of a person |
US8113991B2 (en) * | 2008-06-02 | 2012-02-14 | Omek Interactive, Ltd. | Method and system for interactive fitness training program |
US8996332B2 (en) | 2008-06-24 | 2015-03-31 | Dp Technologies, Inc. | Program setting adjustments based on activity identification |
US10369353B2 (en) * | 2008-11-11 | 2019-08-06 | Medtronic, Inc. | Seizure disorder evaluation based on intracranial pressure and patient motion |
US8500604B2 (en) | 2009-10-17 | 2013-08-06 | Robert Bosch Gmbh | Wearable system for monitoring strength training |
US20110117528A1 (en) | 2009-11-18 | 2011-05-19 | Marciello Robert J | Remote physical therapy apparatus |
US8884872B2 (en) * | 2009-11-20 | 2014-11-11 | Nuance Communications, Inc. | Gesture-based repetition of key activations on a virtual keyboard |
WO2012021902A2 (en) | 2010-08-13 | 2012-02-16 | Net Power And Light Inc. | Methods and systems for interaction through gestures |
US9223936B2 (en) | 2010-11-24 | 2015-12-29 | Nike, Inc. | Fatigue indices and uses thereof |
US9457256B2 (en) | 2010-11-05 | 2016-10-04 | Nike, Inc. | Method and system for automated personal training that includes training programs |
DE102010060592A1 (en) | 2010-11-16 | 2012-05-16 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Training system, mobile terminal and training method for one person |
US9314666B2 (en) * | 2013-03-15 | 2016-04-19 | Ficus Ventures, Inc. | System and method for identifying and interpreting repetitive motions |
2014
- 2014-03-14 US US14/213,935 patent/US9314666B2/en active Active
- 2014-03-14 US US14/214,046 patent/US20140278219A1/en not_active Abandoned
2016
- 2016-03-10 US US15/067,123 patent/US10335637B2/en active Active
2019
- 2019-05-17 US US16/415,967 patent/US10799760B2/en active Active
Non-Patent Citations (1)
Title |
---|
Mathie et al. Classification of basic daily movements using a triaxial accelerometer,Med. Biol. Eng. Comput., 2004, 42, 679-687 * |
Cited By (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10610732B2 (en) | 2011-08-29 | 2020-04-07 | Icuemotion Llc | Inertial sensor motion tracking and stroke analysis system |
US9901776B2 (en) | 2011-08-29 | 2018-02-27 | Icuemotion Llc | Racket sport inertial sensor motion tracking analysis |
US11871901B2 (en) | 2012-05-20 | 2024-01-16 | Cilag Gmbh International | Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage |
US20150209617A1 (en) * | 2014-01-27 | 2015-07-30 | Wanin Interantional Co., Ltd. | Fitness equipment combining with a cloud service system |
US20150352404A1 (en) * | 2014-06-06 | 2015-12-10 | Head Technology Gmbh | Swing analysis system |
US20170136339A1 (en) * | 2014-07-07 | 2017-05-18 | Leila Benedicte Habiche | Device for practising sport activities |
US10220289B2 (en) * | 2014-07-07 | 2019-03-05 | Leila Benedicte Habiche | Device for practicing sport activities |
US10652696B2 (en) * | 2014-07-30 | 2020-05-12 | Trusted Positioning, Inc. | Method and apparatus for categorizing device use case for on foot motion using motion sensor data |
US10668353B2 (en) | 2014-08-11 | 2020-06-02 | Icuemotion Llc | Codification and cueing system for sport and vocational activities |
US11455834B2 (en) | 2014-08-11 | 2022-09-27 | Icuemotion Llc | Codification and cueing system for sport and vocational activities |
DE102015113931A1 (en) | 2014-08-21 | 2016-02-25 | Affectomatics Ltd. | Calculation of after-effects from affective reactions |
DE102015113936A1 (en) | 2014-08-21 | 2016-02-25 | Affectomatics Ltd. | Rating of vehicles based on affective response |
DE102015113942A1 (en) | 2014-08-21 | 2016-02-25 | Affectomatics Ltd. | Rating of holiday destinations based on affective response |
US20160125348A1 (en) * | 2014-11-03 | 2016-05-05 | Motion Insight LLC | Motion Tracking Wearable Element and System |
US10410297B2 (en) | 2014-11-03 | 2019-09-10 | PJS of Texas Inc. | Devices, systems, and methods of activity-based monitoring and incentivization |
US9641991B2 (en) * | 2015-01-06 | 2017-05-02 | Fitbit, Inc. | Systems and methods for determining a user context by correlating acceleration data from multiple devices |
DE102016101650A1 (en) | 2015-01-29 | 2016-08-04 | Affectomatics Ltd. | CORRECTION OF BIAS IN MEASURES OF THE AFFECTIVE RESPONSE |
DE102016101643A1 (en) | 2015-01-29 | 2016-08-04 | Affectomatics Ltd. | FILTERING BIAS RESTRICTED BY BIAS TO THE AFFECTIVE RESPONSE |
DE102016101661A1 (en) | 2015-01-29 | 2016-08-04 | Affectomatics Ltd. | BASED ON DATA PRIVACY CONSIDERATIONS BASED ON CROWD BASED EVALUATIONS CALCULATED ON THE BASIS OF MEASURES OF THE AFFECTIVE REACTION |
US11213205B2 (en) | 2015-07-29 | 2022-01-04 | Gary McCoy | Arm fatigue analysis system |
US10610101B2 (en) * | 2015-07-29 | 2020-04-07 | Athalonz, Llc | Arm fatigue analysis system |
US20170028256A1 (en) * | 2015-07-29 | 2017-02-02 | Athalonz, Llc | Arm fatigue analysis system |
WO2017040318A1 (en) * | 2015-08-28 | 2017-03-09 | Focus Ventures, Inc. | Automated motion of interest recognition, detection and self-learning |
US20170061817A1 (en) * | 2015-08-28 | 2017-03-02 | Icuemotion, Llc | System for movement skill analysis and skill augmentation and cueing |
EP3341093A4 (en) * | 2015-08-28 | 2019-05-08 | Icuemotion LLC | Systems and methods for movement skill analysis and skill augmentation and cueing |
CN108463271A (en) * | 2015-08-28 | 2018-08-28 | 伊虎智动有限责任公司 | System and method for motor skill analysis and technical ability enhancing and prompt |
US10854104B2 (en) * | 2015-08-28 | 2020-12-01 | Icuemotion Llc | System for movement skill analysis and skill augmentation and cueing |
US9654234B2 (en) | 2015-08-28 | 2017-05-16 | Focus Ventures, Inc. | System and method for automatically time labeling repetitive data |
US11367364B2 (en) | 2015-08-28 | 2022-06-21 | Icuemotion Llc | Systems and methods for movement skill analysis and skill augmentation |
WO2017040242A1 (en) | 2015-08-28 | 2017-03-09 | Icuemotion, Llc | Systems and methods for movement skill analysis and skill augmentation and cueing |
US11763697B2 (en) | 2015-08-28 | 2023-09-19 | Icuemotion Llc | User interface system for movement skill analysis and skill augmentation |
US11925373B2 (en) | 2017-10-30 | 2024-03-12 | Cilag Gmbh International | Surgical suturing instrument comprising a non-circular needle |
US11819231B2 (en) | 2017-10-30 | 2023-11-21 | Cilag Gmbh International | Adaptive control programs for a surgical system comprising more than one type of cartridge |
US11903587B2 (en) | 2017-12-28 | 2024-02-20 | Cilag Gmbh International | Adjustment to the surgical stapling control based on situational awareness |
US20190201112A1 (en) * | 2017-12-28 | 2019-07-04 | Ethicon Llc | Computer implemented interactive surgical systems |
US11844579B2 (en) | 2017-12-28 | 2023-12-19 | Cilag Gmbh International | Adjustments based on airborne particle properties |
US11969216B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
US11857152B2 (en) | 2017-12-28 | 2024-01-02 | Cilag Gmbh International | Surgical hub spatial awareness to determine devices in operating theater |
US11864845B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Sterile field interactive control displays |
US11864728B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Characterization of tissue irregularities through the use of mono-chromatic light refractivity |
US11832899B2 (en) | 2017-12-28 | 2023-12-05 | Cilag Gmbh International | Surgical systems with autonomously adjustable control programs |
US11890065B2 (en) | 2017-12-28 | 2024-02-06 | Cilag Gmbh International | Surgical system to limit displacement |
US11896443B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Control of a surgical system through a surgical barrier |
US11896322B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub |
US11969142B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws |
US11918302B2 (en) | 2017-12-28 | 2024-03-05 | Cilag Gmbh International | Sterile field interactive control displays |
US11844545B2 (en) | 2018-03-08 | 2023-12-19 | Cilag Gmbh International | Calcified vessel identification |
US11986233B2 (en) | 2018-03-08 | 2024-05-21 | Cilag Gmbh International | Adjustment of complex impedance to compensate for lost power in an articulating ultrasonic device |
US11931027B2 (en) | 2018-03-28 | 2024-03-19 | Cilag Gmbh International | Surgical instrument comprising an adaptive control system |
US11998193B2 (en) | 2018-12-19 | 2024-06-04 | Cilag Gmbh International | Method for usage of the shroud as an aspect of sensing or controlling a powered surgical device, and a control algorithm to adjust its default operation |
US11925350B2 (en) | 2019-02-19 | 2024-03-12 | Cilag Gmbh International | Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge |
US12009095B2 (en) | 2022-02-03 | 2024-06-11 | Cilag Gmbh International | Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes |
WO2023239025A1 (en) * | 2022-06-10 | 2023-12-14 | Samsung Electronics Co., Ltd. | Electronic device and wearable device for providing evaluation information on user's exercise motion, and method for operating same |
RU2801426C1 (en) * | 2022-09-18 | 2023-08-08 | Emil Yuryevich Bolshakov | Method and system for real-time recognition and analysis of user movements |
Also Published As
Publication number | Publication date |
---|---|
US10799760B2 (en) | 2020-10-13 |
US9314666B2 (en) | 2016-04-19 |
US20160192874A1 (en) | 2016-07-07 |
US10335637B2 (en) | 2019-07-02 |
US20140270375A1 (en) | 2014-09-18 |
US20190269970A1 (en) | 2019-09-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140278219A1 (en) | System and Method For Monitoring Movements of a User | |
US10022071B2 (en) | Automatic recognition, learning, monitoring, and management of human physical activities | |
US10824954B1 (en) | Methods and apparatus for learning sensor data patterns of physical-training activities | |
Liang et al. | Energy-efficient motion related activity recognition on mobile devices for pervasive healthcare | |
CN104135911B (en) | Activity classification in multi-axial cord movement monitoring device | |
Reddy et al. | Using mobile phones to determine transportation modes | |
Wang et al. | A hierarchical approach to real-time activity recognition in body sensor networks | |
US20180178061A1 (en) | Rehabilitation compliance devices | |
Ghasemzadeh et al. | Power-aware activity monitoring using distributed wearable sensors | |
Mitra et al. | KNOWME: a case study in wireless body area sensor network design | |
KR101939683B1 (en) | Apparatus and method for recognizing user activity | |
Wang et al. | Hear sign language: A real-time end-to-end sign language recognition system | |
US20190320920A1 (en) | Heart rate data processing in computing environment having a wearable device | |
KR102089002B1 (en) | Method and wearable device for providing feedback on action | |
Kodali et al. | Applications of deep neural networks for ultra low power IoT | |
US11819734B2 (en) | Video-based motion counting and analysis systems and methods for virtual fitness application | |
WO2015034824A1 (en) | System and method for identifying and interpreting repetitive motions | |
Radhakrishnan et al. | ERICA: enabling real-time mistake detection & corrective feedback for free-weights exercises | |
CN104919396A (en) | Leveraging physical handshaking in head mounted displays | |
Khan et al. | Robust human locomotion and localization activity recognition over multisensory | |
CN115516531A (en) | System and method for real-time interaction and guidance | |
Guo et al. | When your wearables become your fitness mate | |
Kabir et al. | CSI-IANet: An inception attention network for human-human interaction recognition based on CSI signal | |
Xie et al. | Genetic programming based activity recognition on a smartphone sensory data benchmark | |
Chawla et al. | Using Machine Learning Techniques for User Specific Activity Recognition. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FOCUS VENTURES, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CANAVAN, CAVAN;HUGHES, GRANT;REEL/FRAME:035829/0481 Effective date: 20150608 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |