US20220219046A1 - Smart Gym - Google Patents

Smart Gym

Info

Publication number
US20220219046A1
US20220219046A1
Authority
US
United States
Prior art keywords
user
joint
exercise
processor circuitry
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/712,004
Inventor
Sean Yit Loh
Yoke Ming Tan
Michelle Ching Yee Lim
Woon Soon Wong
Wei Th'ng Yew
Cheah Cheat Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Corp
Original Assignee
Intel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Intel Corp
Priority to US17/712,004
Publication of US20220219046A1
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0062 Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6895 Sport equipment
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0075 Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B2503/10 Athletes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0247 Pressure sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6892 Mats
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00 Measuring physiological parameters of the user
    • A63B2230/01 User's weight
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00 Measuring physiological parameters of the user
    • A63B2230/75 Measuring physiological parameters of the user calorie expenditure

Definitions

  • Fitness or activity trackers are common devices that can be integrated into a number of wearables.
  • the trackers may be components of smart watch devices.
  • Devices that include trackers can monitor a user's physical activity as well as biometric data.
  • a fitness tracker may calculate a person's steps and movement throughout the day. This data may then be used to determine the person's caloric expenditure for a particular length of time.
  • FIG. 1 is a block diagram of a smart gym
  • FIG. 3 is a process flow diagram of a method that enables a smart gym
  • FIG. 4 is a block diagram of a system
  • FIG. 6 is a block diagram showing computer readable media that stores code for enabling a smart gym.
  • a caloric expenditure may refer to a number of calories burned by a person during a particular time window.
  • traditional fitness trackers use exercise profiles.
  • An exercise profile may indicate the particular activity to be performed. By having beforehand knowledge of an activity to be performed by a user, a fitness tracker can more accurately calculate the caloric expenditures of a user.
  • Exercise profiles may include, for example, swimming, running and high intensity interval training (HIIT).
  • Even with exercise profiles, traditional fitness trackers are limited to monitoring a wearer's biometric information, such as heart rate. As such, traditional fitness trackers lack the functionality to track free weight exercises or floor exercises. Free weight exercises are movements performed with free weights, as opposed to machine weights. Free weight exercises apply additional weight to the muscles of the body while moving the body in and out of a skeletal configuration.
  • Floor exercises, as used herein, refer to exercises that require the human body to achieve a skeletal configuration with the goal of stressing muscles in that configuration. These may also be referred to as free-form exercises.
  • Traditional fitness trackers cannot accurately track total body movement for exercises such as free weight exercises and floor exercises. While some traditional trackers can account for movement from a single point of reference, they cannot track the entire movement associated with moving in and out of a skeletal configuration. For example, the movement during a weighted barbell squat can appear relatively small as detected by a fitness tracker worn at a single point, so a movement-based tracker will only detect a small movement. However, the actual calories burned during an entire weighted barbell squat can be significantly higher due to the weight added to the barbell. This inaccurate tracking of free weight and floor exercises can render traditional trackers useless, as free weight exercises and floor exercises are integral to a complete fitness routine. Thus, the inability to accurately track floor exercises leaves a void in interactive fitness. Traditional fitness trackers are also unable to track calories burned and count repetitions for dumbbell and weight lifting exercises.
  • the fitness data may be shared, in real time, with coaches or other professionals who can provide coaching based on the derived fitness data. Additionally, in embodiments, the present techniques enable augmented-reality applications for users who want to virtually train with public figures, such as a favorite athlete or celebrity.
  • FIG. 1 is a block diagram of a smart gym 100 .
  • the smart gym 100 includes a base 102 and one or more cameras 104 .
  • two cameras 104 A and 104 B are illustrated in the smart gym 100 .
  • the base 102 may be a platform with a number of sensors embedded in the platform.
  • the platform may include a number of sensors to capture the weight of one or more users on the platform.
  • the platform is a weight scale or a plurality of weight scales with designated workout areas for one or more users.
  • the platform may include markings such as lines, circles, or any combination thereof to indicate one or more workout areas.
  • Images captured by the cameras 104 A and 104 B may be used to extract a skeleton frame that represents the person.
  • the various points on the skeleton frame may be analyzed to identify and track particular joint movements executed by the person.
  • a processing unit such as a vision processing unit (VPU) or a graphics processing unit (GPU), may be used to extract the skeleton frame of the person and identify and track the particular movements executed by the person.
  • the processing unit may include a neural compute engine that is a dedicated hardware accelerator for deep neural network deep-learning inferences. Configurations of the skeleton frame can be used to train on-device deep neural networks and computer vision applications executed by the VPU. Additionally, the configurations of the skeleton frame can be analyzed for calorie calculation.
  • FIG. 2 is a skeleton frame 200 .
  • the skeleton frame may be extracted from a plurality of images captured by cameras in a smart gym. In embodiments, the skeleton frame is extracted as a set of joints. As illustrated in FIG. 2 , the skeleton frame 200 has a total of 12 joints. Configurations that result from movement of the skeleton frame 200 may be tracked according to the movement of each joint 202 A, 202 B, 204 A, 204 B, 206 A, 206 B, 208 A, 208 B, 210 A, 210 B, 212 A, and 212 B. As illustrated, joint 202 A is a right shoulder joint, and joint 202 B is a left shoulder joint.
  • Joint 204 A is a right elbow joint, and joint 204 B is a left elbow joint.
  • Joint 206 A is a right wrist joint, and joint 206 B is a left wrist joint.
  • Joint 208 A is a right hip joint, and joint 208 B is a left hip joint.
  • Joint 210 A is a right knee joint, and joint 210 B is a left knee joint.
  • Joint 212 A is a right ankle joint, and joint 212 B is a left ankle joint.
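The joint layout described above can be sketched as a simple data structure. This is an illustrative representation only: the joint names and reference numerals follow FIG. 2, while the `Joint` class and `make_frame` helper are hypothetical.

```python
from dataclasses import dataclass

# The 12 joints of the example skeleton frame (FIG. 2), keyed by the
# reference numerals used in the description.
JOINTS = {
    "202A": "right shoulder", "202B": "left shoulder",
    "204A": "right elbow",    "204B": "left elbow",
    "206A": "right wrist",    "206B": "left wrist",
    "208A": "right hip",      "208B": "left hip",
    "210A": "right knee",     "210B": "left knee",
    "212A": "right ankle",    "212B": "left ankle",
}

@dataclass
class Joint:
    """One tracked point on the skeleton frame, in normalized image coordinates."""
    name: str
    x: float
    y: float

def make_frame(points):
    """Build a skeleton frame from a dict of {reference numeral: (x, y)}."""
    return {ref: Joint(JOINTS[ref], x, y) for ref, (x, y) in points.items()}
```

Additional points (e.g. the neck, or a series of spine points, as noted below) could be added to the same mapping without changing the frame-building logic.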
  • a particular set of joints is illustrated. However, any number of joints may be found along the human skeleton. Additionally, any movement possible with the human skeleton that is associated with one or more joints can be tracked according to the present techniques.
  • the particular movement possible may be defined by the type of joint.
  • the movements include, but are not limited to angular movements, such as flexion and extension, abduction, adduction, circumduction, rotation, medial rotation, lateral rotation, external rotation, internal rotation, inversion, eversion, protraction, retraction, elevation, depression, opposition, supination, pronation, or any combinations thereof.
  • abnormal joint movements may include hyperextension and hyperflexion.
  • FIG. 2 is not intended to indicate that the example skeleton frame 200 is to include all of the joints shown in FIG. 2. Rather, the example skeleton frame 200 can be implemented using fewer or additional components not illustrated in FIG. 2 (e.g., additional joints, other skeleton frame points, etc.).
  • the neck may serve as an additional joint location on the skeleton frame.
  • a series of points or a line representing the spine of the person may serve as another skeleton frame point that may be tracked.
  • cameras of the smart gym may be used to pinpoint the 12 points (joints) derived along the skeleton frame for movement identification.
  • the captured images and derived skeleton frame information may be captured and stored in the cloud.
  • the smart gym includes a cloud storage solution that is used to store the personal information, exercise activity logs, duration of exercise, and the calories burned for a person.
  • the platform may include weight sensors.
  • a user may scan-in to begin free weight exercise or floor exercises. During the scan-in process, an initial user weight may be captured. After the scan-in, the weight sensors of the platform may be used to capture the weight of any accessories used by the user during the workout. For example, if the user picks up barbells to begin a free weight exercise, the platform can capture the total weight of the user, which will be increased by the weight of the barbells when compared to the weight of the user during the scan-in. The weight of the user during exercise may be captured simultaneously with the images of the user at a processing unit.
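The accessory-weight capture described above reduces to a difference against the scan-in weight. The following is a minimal sketch; the function name and tolerance value are assumptions for illustration.

```python
def accessory_weight(scan_in_weight_kg, platform_weight_kg, tolerance_kg=0.5):
    """Estimate the weight of accessories (e.g. barbells) held by the user
    as the difference between the current platform reading and the user's
    scan-in weight. Readings within tolerance_kg count as no accessory,
    absorbing normal sensor noise."""
    delta = platform_weight_kg - scan_in_weight_kg
    return delta if delta > tolerance_kg else 0.0
```

For example, a 70 kg user whose platform reading rises to 90 kg after picking up a barbell would be credited with 20 kg of accessory weight.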
  • FIG. 3 is a process flow diagram of a method 300 that enables a smart gym.
  • the example method 300 can be implemented in the systems 400 of FIG. 4 , the computing device 500 of FIG. 5 , or the computer readable media 600 of FIG. 6 .
  • the method 300 can be implemented using the processing unit 406 , CPU 502 , VPU/GPU 508 , or the processor 602 .
  • the method begins.
  • the profile information of the user is obtained from the user or a data store. Accordingly, at block 306 , a user profile is obtained.
  • the user profile may include user identifications or identifiers such as name, age, and weight.
  • the user profile may also include other information used to calculate a calorie expenditure, such as sex, height, and the like.
  • the particular free weight, if any, being used in an exercise by the user is determined.
  • the free weight may be determined from an image captured via cameras in the smart gym.
  • the free weight may also be determined according to a weight difference of the user when compared to the original weight of the user during the scan-in process.
  • the workout area is monitored to detect movement. The movement detected may be the movement of a user in and out of a particular skeletal configuration. If movement of the user is detected, the process flow continues to block 312 . If movement is not detected, process flow returns to block 308 where the present techniques scan for the particular weight held by the user. At block 312 , movement capturing and analysis occurs.
  • the movements of the skeleton frame of a user can be extracted by tracking the plurality of joints captured by images of the user.
  • the movement capture may also capture movements of the user such as jumping, sidestepping, and other movements that may move the entire skeleton frame within the workout area of the platform with little to no joint movement.
  • the weights used by the user during the movement are also captured by the cameras (image sensors) of the smart gym or by weight sensors in a platform of the smart gym. In this manner, the smart gym is able to extract the particular movements executed by the user and the weights used by the user during exercise.
  • a repetition counting module is executed.
  • the repetition counting module will count a number of repetitions that a user successfully completes of an exercise.
  • a repetition is defined as a complete movement that counts as one instance of a particular exercise.
  • exercises may be defined by known joint movements for each joint during the exercise.
  • a repetition of an exercise may be complete once the joints of the skeleton frame have satisfied each known joint movement of an exercise.
  • a user can define custom exercises. In a custom exercise, the user can define the required movements for each joint, along with a sequence for each movement of the exercise, and then store the custom exercise.
  • the repetition counting module may track a user's movement during a workout session within the smart gym workout area and compare the user's movement with pre-defined exercises and custom exercises (known joint movements) to determine the particular exercise being performed. Once the exercise being performed is determined, the successful repetitions of the exercise may be counted.
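The matching-and-counting behavior described above can be sketched as a small state machine that advances through the known joint-movement sequence of an exercise and counts a repetition each time the full sequence is satisfied in order. The class and phase names here are hypothetical, not from the patent.

```python
class RepetitionCounter:
    """Counts repetitions by matching observed joint movements against the
    known movement sequence of an exercise (pre-defined or custom)."""

    def __init__(self, exercise_sequence):
        self.sequence = exercise_sequence  # ordered phases of one repetition
        self.position = 0                  # next phase we expect to observe
        self.reps = 0

    def observe(self, movement):
        """Feed one classified joint movement; advance when it matches the
        expected phase, and count a rep when the sequence completes."""
        if movement == self.sequence[self.position]:
            self.position += 1
            if self.position == len(self.sequence):
                self.position = 0
                self.reps += 1
        return self.reps

# Example: a squat defined as three phases, performed twice.
squat = RepetitionCounter(["hips_lowered", "thighs_parallel", "standing"])
for m in ["hips_lowered", "thighs_parallel", "standing"] * 2:
    squat.observe(m)
# squat.reps is now 2
```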
  • one repetition of a squat exercise may be to lower the hips with the thighs parallel to the floor, and return to a standing position.
  • a known joint movement may be lowering the right hip joint 208 A and the left hip joint 208 B a predefined amount, along with joints 202 A, 202 B, 204 A, 204 B also lowering somewhat.
  • the camera may track movement of the joints 210 A, 210 B, 212 A, and 212 B to ensure the knee joints maintain proper form and do not extend past the toes located near ankle joints 212 A and 212 B.
  • a location of the toes may be captured by the camera or inferred by their proximity to the ankle joint.
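The knee-over-toes form check described above might be approximated from 2D joint coordinates in a side view, with the toe position inferred as a small offset from the ankle joint. The function name, offset, and coordinate convention are illustrative assumptions.

```python
def knees_past_toes(knee_x, ankle_x, toe_offset=0.05, facing_right=True):
    """Side-view form check: flag when the knee joint (210A/210B) extends
    horizontally past the toes. The toe position is approximated as the
    ankle x-coordinate (212A/212B) plus a small offset in the facing
    direction, in normalized image coordinates."""
    toe_x = ankle_x + toe_offset if facing_right else ankle_x - toe_offset
    return knee_x > toe_x if facing_right else knee_x < toe_x
```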
  • when the current joint movements and the sequence of the movements by a user in the smart gym match known joint movements stored as an exercise or a custom exercise, each occurrence or instance is counted as a repetition of the exercise.
  • repetition counting for each exercise is different. For example, when a person is doing hammer curls, repetitions of each instance of the exercise may be determined based on a distance between the wrist joint 206 A and shoulder joint 202 A as well as an angle created at the elbow joint 204 A during the exercise.
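The elbow angle used in the hammer-curl example can be computed from the 2D coordinates of the shoulder, elbow, and wrist joints. This is a generic angle-between-vectors sketch, not the patent's specific method.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by points a-b-c, e.g. the angle at
    the elbow (204A) between the shoulder (202A) and the wrist (206A)."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from elbow to shoulder
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from elbow to wrist
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_t = dot / (math.hypot(*v1) * math.hypot(*v2))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))
```

A straight arm yields roughly 180 degrees; as the curl tightens, the angle at the elbow shrinks, which, combined with the wrist-to-shoulder distance, can signal one instance of the movement.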
  • a calorie counting module is executed.
  • the calorie counting module may calculate the user's caloric expenditure based on the exercise being performed. In embodiments, the calculation of the number of calories burned by the user during exercise may be based on the user's profile and the particular weights, if any, being held by the user during exercise.
  • the captured information is synchronized to a cloud storage location.
  • the method 300 ends.
  • the processing unit may also derive the calories burned based on a number of parameters, such as (1) the weight of the user from the user profile, (2) the type of activity the user is doing, and (3) the intensity of the activity, which together determine the metabolic equivalent of a task (MET) used to calculate the calories burned. All information, including the calories burned, may be transmitted to a cloud storage location. The information can be viewed on any number of devices from the cloud storage location. For example, the data may be displayed on the user's phone. A person can use a mobile device to view fitness data such as which activities they completed, the total repetitions for each exercise, the METs associated with each exercise, the duration of each exercise, the total duration of the workout, the calories burned during each exercise, and the total number of calories burned for the entire workout.
  • This fitness data may also be used as a training aid for professional athletes and coaches.
  • virtual coaching may be enabled via the smart gym using haptic, auditory, or visual feedback.
  • the known joint movement that is tracked during each exercise may be reviewed by coaches, physicians, and other professionals to derive training goals or correct issues observable during exercise.
  • Algorithms can be used to analyze joint movement of a person.
  • a person can receive immediate feedback on exercises performed in the smart gym on any mobile device via the cloud-based data.
  • historic data can be maintained for a person and can be compared with new data captured via the smart gym. In this manner, a person has the information necessary to improve or change techniques in real time instead of waiting for coaches to download and review the footage.
  • a person's workout can be changed instantaneously in response to metrics observed during a current workout in the smart gym. For example, consider a person with a workout that includes several lower body exercises. The person may have trouble completing repetitions of an exercise with the usual proper form (as indicated by the historic data) due to an injury or fatigue. However, the change in form may not be visible to humans. The change may be as small as a change in squat depth of a few centimeters. The smart gym may alert the person or coach of the change in the person's typical movement. In response to this alert, immediate changes to the workout may be made to prevent any further injury or fatigue.
  • data captured by the smart gym may be used to seed augmented-reality applications for users who want to virtually train with their favorite athletes, celebrities, or friends and family.
  • the cameras may capture the user during a workout and render the user in another environment.
  • the display may render the user in a second environment.
  • the second environment is a secondary workout space.
  • the second environment may be a training environment with third parties, such as coaches or friends.
  • the free weights, flooring, or other accessories used during the workout are enhanced by computer generated perceptual information.
  • the perceptual information may include, for example, visual, auditory, haptic, somatosensory, and olfactory information.
  • FIG. 4 is a block diagram of a system 400 that enables a smart gym. Similar to the example of FIG. 3, when a person 404 enters the free-weight platform and scans in, his or her user profile, such as name, age, and weight, will be captured. This information may be obtained from a stored profile, or the user can create a profile by entering this information into an application prior to the exercise. When movement is detected on a platform in the smart gym, the camera 402 will capture the movement of the person 404 and the processing unit 406 will begin analysis. Images captured by the camera will be analyzed and matched against a database of exercises. The exercises may include pre-defined exercises and custom exercises. In embodiments, the exercises stored in the database may include exercises previously performed by the person.
  • the processing unit may also count repetitions of the exercise, as well as the total weight used during the exercise. During repetition counting, the camera 402 in the free weight area will capture the image of the person 404. The camera then sends the human image to the processing unit 406 to create the skeleton frame that corresponds to the person. In embodiments, the skeleton frame may have 12 points or joints. Thus, the camera 402 will capture movement of the user 404 and the processing unit 406 will begin analysis. The processing unit may derive a skeletal frame 408 of the user and track movement according to the movement of joints of the skeletal frame.
  • the MET is a unit that estimates the amount of energy used by the body during physical activity, as compared to resting metabolism.
  • any measure of the rate at which a person expends energy relative to the weight of the person while performing activities can be used.
  • the use of a person's weight when calculating this measure enables the measure to be standardized so that comparisons can be made between the fitness and activity levels between different people.
  • the MET is standardized so it can apply to people of varying body weight and compare different activities.
  • MET can be expressed in terms of oxygen use or kilocalories (what is commonly referred to as calories). Generally, the harder a person works during a given activity, the more oxygen is consumed and the higher the MET. Table 1 illustrates general MET ranges:
  • Table 2 provides examples of moderate physical activity and vigorous physical activity:
  • Table 3 provides exemplary MET values for a number of activities:
  • the MET values may be stored in a database and used as a lookup table when calculating a total caloric expenditure.
  • the total calories burned may be calculated as follows:
  • the total calories burned during an exercise is the duration of the exercise in minutes, multiplied by the MET for the exercise, multiplied by 3.5, multiplied by the person's weight in kilograms, divided by 200.
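The stated formula translates directly to code. The sketch below implements it as written; the MET values shown are illustrative placeholders, not values from the patent's tables.

```python
def calories_burned(duration_min, met, weight_kg):
    """Total calories burned: duration (minutes) x MET x 3.5 x
    body weight (kg), divided by 200."""
    return duration_min * met * 3.5 * weight_kg / 200.0

# Illustrative MET lookup table (placeholder values, not from the patent).
MET_TABLE = {
    "walking": 3.5,
    "vigorous_weight_lifting": 6.0,
}

# e.g. a 70 kg user lifting dumbbells vigorously for 5 minutes:
# calories_burned(5, MET_TABLE["vigorous_weight_lifting"], 70.0) -> 36.75
```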
  • An intensity of a workout may be calculated as follows:
  • a user enters a smart gym and lifts dumbbells for 5 minutes during a lunch break.
  • his or her user profile is captured or retrieved.
  • the user profile may also include sex, height, and any other physical information about the person.
  • the person then ventures to the workout area and obtains a dumbbell and begins a workout on a smart gym platform.
  • the person's activity is captured and analyzed to determine that the person is lifting a dumbbell.
  • the platform is triggered by the user when the user steps on the platform.
  • the computing device 500 may be, for example, a laptop computer, desktop computer, tablet computer, mobile device, or wearable device, among others.
  • the computing device 500 may include a central processing unit (CPU) 502 that is configured to execute stored instructions, as well as a memory device 504 that stores instructions that are executable by the CPU 502 .
  • the CPU 502 may be coupled to the memory device 504 by a bus 506 .
  • the CPU 502 can be a single core processor, a multi-core processor, a computing cluster, or any number of other configurations.
  • the computing device 500 may include more than one CPU 502 .
  • the CPU 502 may be a system-on-chip (SoC) with a multi-core processor architecture.
  • the CPU 502 can be a specialized digital signal processor (DSP) used for image processing.
  • the memory device 504 can include random access memory (RAM), read only memory (ROM), flash memory, or any other suitable memory systems.
  • the memory device 504 may include dynamic random-access memory (DRAM).
  • the computing device 500 may also include a vision processing unit or graphics processing unit (GPU) 508 .
  • the CPU 502 may be coupled through the bus 506 to the GPU 508 .
  • the GPU 508 may be configured to perform any number of graphics operations within the computing device 500 .
  • the GPU 508 may be configured to render or manipulate graphics images, graphics frames, videos, or the like, to be displayed to a viewer of the computing device 500 .
  • the CPU 502 may also be connected through the bus 506 to an input/output (I/O) device interface 512 configured to connect the computing device 500 to one or more I/O devices 514 .
  • the I/O devices 514 may include, for example, a keyboard and a pointing device, wherein the pointing device may include a touchpad or a touchscreen, among others.
  • the I/O devices 514 may be built-in components of the computing device 500 , or may be devices that are externally connected to the computing device 500 .
  • the memory 504 may be communicatively coupled to I/O devices 514 through direct memory access (DMA).
  • the CPU 502 may also be linked through the bus 506 to a display interface 516 configured to connect the computing device 500 to display devices 518.
  • the display devices 518 may include a display screen that is a built-in component of the computing device 500 .
  • the display devices 518 may also include a computer monitor, television, or projector, among others, that is internal to or externally connected to the computing device 500 .
  • the display devices 518 may also include a head mounted display.
  • the computing device 500 also includes a storage device 520 .
  • the storage device 520 is a physical memory such as a hard drive, an optical drive, a thumbdrive, an array of drives, a solid-state drive, or any combinations thereof.
  • the storage device 520 may also include remote storage drives.
  • the computing device 500 may also include a network interface controller (NIC) 522 .
  • the NIC 522 may be configured to connect the computing device 500 through the bus 506 to a network 524 .
  • the network 524 may be a wide area network (WAN), local area network (LAN), or the Internet, among others.
  • the device may communicate with other devices through a wireless technology.
  • the device may communicate with other devices via a wireless local area network connection.
  • the device may connect and communicate with other devices via Bluetooth® or similar technology.
  • the computing device 500 further includes a smart gym manager 528 .
  • the smart gym manager 528 may be configured to enable monitoring and tracking of free weight exercise or floor exercises along with a calculation of the number of calories burned during exercise.
  • images captured by a plurality of cameras 526 may be processed with data captured by the cameras 526 such that a user can virtually train with a third-party, such as athletes, trainers, and coaches.
  • the smart gym manager 528 includes an identification unit 530, an equipment recognition module 532, a movement capture and analysis module 534, a repetition counting module 536, and a calorie counting module 538.
  • the identification unit 530 may be configured to identify a person within the smart gym workout area. In particular, the identification unit may retrieve a user's profile data in response to the person entering the workout area. The person may self-identify prior to entering the workout area by providing all profile information or by providing authentication so that the identification unit 530 can retrieve the person's user profile from a data store.
  • An equipment recognition module 532 may be configured to identify the particular equipment used by the person during a workout. For example, the equipment recognition module 532 may capture the weight of the person during exercise while the person is holding weighted exercise accessories, such as dumbbells or barbells. The equipment used may be recognized by determining the weights of the equipment. The equipment used may also be recognized by capturing the equipment via the cameras 526 and using object identification to identify each weight. Moreover, each piece of equipment may include identifiers on the equipment that can be captured via the cameras 526 and identified via a matching process.
  • a movement capture and analysis module 534 may be configured to extract a skeleton frame corresponding to the person within the workout area.
  • the skeleton frame may be expressed as a set of joints with relative locations.
  • the movements of the skeleton frame of the person are tracked in a series of images captured while the person is exercising.
  • a repetition counting module 536 may be configured to count a number of repetitions of an exercise that a person successfully completes.
  • the particular exercise being performed may be initially determined by comparing the actual movements of the person to known joint movements stored in a database of the electronic device 500 .
  • custom exercises may be defined in the exercise database of the electronic device 500 .
  • when the current movements of the person engaging in exercise match the known joint movements of an exercise stored in the exercise database, the repetition counting module 536 counts each successful repetition of the exercise.
  • a successful repetition of the exercise may include satisfying each particular movement in a particular sequence associated with the exercise.
  • a calorie counting module 538 may be configured to calculate the caloric expenditure of the person during the exercise. MET values for various exercises may be stored in a lookup table. The total caloric expenditure may be calculated as the duration of the exercise in minutes, multiplied by the MET for the exercise, multiplied by 3.5, multiplied by the person's weight in kilograms, and divided by 200.
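The calculation above can be sketched as a small helper (the function name and example values are illustrative, not from the patent):

```python
def calories_burned(duration_min: float, met: float, weight_kg: float) -> float:
    """Total caloric expenditure per the formula described above:
    duration (minutes) x MET x 3.5 x body weight (kg) / 200."""
    return duration_min * met * 3.5 * weight_kg / 200.0

# e.g. 30 minutes at MET 6.0 for a 70 kg person -> 220.5 kcal
```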
  • the block diagram of FIG. 5 is not intended to indicate that the computing device 500 is to include all of the components shown in FIG. 5 . Rather, the computing device 500 can include fewer or additional components not illustrated in FIG. 5 , such as additional buffers, additional processors, and the like.
  • the computing device 500 may include any number of additional components not shown in FIG. 5 , depending on the details of the specific implementation. Furthermore, any of the functionalities of the smart gym manager 528 , identification unit 530 , equipment recognition module 532 , movement capture and analysis module 534 , repetition counting module 536 , and calorie counting module 538 may be partially, or entirely, implemented in hardware and/or in the processor 502 .
  • FIG. 6 is a block diagram showing computer readable media 600 that stores code for enabling a smart gym.
  • the computer readable media 600 may be accessed by a processor 602 over a computer bus 604 .
  • the computer readable medium 600 may include code configured to direct the processor 602 to perform the methods described herein.
  • the computer readable media 600 may be non-transitory computer readable media.
  • the computer readable media 600 may be storage media.
  • the various software components discussed herein may be stored on one or more computer readable media 600 , as indicated in FIG. 6 .
  • an identification module 606 may be stored on the computer readable media 600 .
  • equipment recognition module 608 may be stored on the computer readable media 600 .
  • movement capture and analysis module 610 may be stored on the computer readable media 600 .
  • repetition counting module 612 may be stored on the computer readable media 600 .
  • calorie counting module 614 may be stored on the computer readable media 600 .
  • the identification module 606 may be configured to identify a person within the smart gym workout area.
  • the equipment recognition module 608 may be configured to identify the particular equipment used by the person during a workout.
  • the movement capture and analysis module 610 may be configured to extract a skeleton frame corresponding to the person within the workout area.
  • the repetition counting module 612 may be configured to count a number of repetitions of an exercise that a person successfully completes.
  • the calorie counting module 614 may be configured to calculate the caloric expenditure of the person during the exercise.
  • The block diagram of FIG. 6 is not intended to indicate that the computer readable media 600 is to include all of the components shown in FIG. 6 . Further, the computer readable media 600 may include any number of additional components not shown in FIG. 6 , depending on the details of the specific implementation.
  • Example 1 is an apparatus.
  • the apparatus includes a platform, wherein the platform comprises at least a weight sensor; at least one camera, wherein the camera is configured to capture movements on the platform; a processor, wherein the processor is configured to: capture a person that has entered the platform area; derive a skeleton frame for the person in the platform area; track joint movements of the skeleton frame; identify one or more exercises performed by the tracked movement and count the number of repetitions performed; and calculate a caloric expenditure based on the number of repetitions performed, a weight of the person, and a weight used when performing the exercise.
  • Example 2 includes the apparatus of example 1, including or excluding optional features.
  • an exercise is identified by comparing the joint movement to a known joint movement associated with the exercise.
  • the known joint movement is an average of the user's previous movements.
  • the known joint movement is reviewed to derive training goals.
  • Example 3 includes the apparatus of any one of examples 1 to 2, including or excluding optional features.
  • the apparatus includes rendering the tracked movement of the skeleton frame on a display; and using augmented reality to place a third party on the screen with the rendering of the person.
  • Example 4 includes the apparatus of any one of examples 1 to 3, including or excluding optional features.
  • the exercise is identified via machine learning.
  • Example 5 includes the apparatus of any one of examples 1 to 4, including or excluding optional features.
  • calculating the caloric expenditure is based on a metabolic equivalent for a task (MET).
  • Example 6 includes the apparatus of any one of examples 1 to 5, including or excluding optional features.
  • virtual coaching is enabled via the smart gym with haptic, auditory, visual feedback.
  • Example 7 includes the apparatus of any one of examples 1 to 6, including or excluding optional features.
  • in person coaching is enabled via the smart gym with post exercise playback or analysis.
  • Example 8 is a method. The method includes obtaining a user profile corresponding to a person in the workout area; extracting a skeleton frame from images captured of the person; tracking joint movements of the skeleton frame; identifying an exercise performed by the tracked movement and counting a number of repetitions of the exercise performed; and calculating a caloric expenditure based on the number of repetitions performed, a weight of the person, and a weight used when performing the exercise.
  • Example 9 includes the method of example 8, including or excluding optional features.
  • an exercise is identified by comparing the joint movement to a known joint movement associated with the exercise.
  • the known joint movement is an average of the user's previous movements.
  • the known joint movement is reviewed to derive training goals.
  • Example 10 includes the method of any one of examples 8 to 9, including or excluding optional features.
  • the method includes rendering the tracked movement of the skeleton frame on a display; and using augmented reality to place a third party on the screen with the rendering of the person.
  • Example 11 includes the method of any one of examples 8 to 10, including or excluding optional features.
  • the exercise is identified via machine learning.
  • Example 12 includes the method of any one of examples 8 to 11, including or excluding optional features.
  • the caloric expenditure is based on a MET value obtained from a lookup table.
  • Example 13 includes the method of any one of examples 8 to 12, including or excluding optional features.
  • virtual coaching is enabled via the smart gym with haptic, auditory, visual feedback.
  • Example 14 includes the method of any one of examples 8 to 13, including or excluding optional features.
  • in person coaching is enabled via the smart gym with post exercise playback or analysis.
  • Example 15 is at least one computer readable medium having instructions stored therein that enable a smart gym.
  • the computer-readable medium includes instructions that direct the processor to obtain a user profile corresponding to a person in the workout area; extract a skeleton frame from images captured of the person; track joint movements of the skeleton frame; identify an exercise performed by the tracked movement and count a number of repetitions of the exercise performed; and calculate a caloric expenditure based on the number of repetitions performed, a weight of the person, and a weight used when performing the exercise.
  • Example 16 includes the computer-readable medium of example 15, including or excluding optional features.
  • an exercise is identified by comparing the joint movement to a known joint movement associated with the exercise.
  • the known joint movement is an average of the user's previous movements.
  • the known joint movement is reviewed to derive training goals.
  • Example 17 includes the computer-readable medium of any one of examples 15 to 16, including or excluding optional features.
  • the computer-readable medium includes rendering the tracked movement of the skeleton frame on a display; and using augmented reality to place a third party on the screen with the rendering of the person.
  • Example 18 includes the computer-readable medium of any one of examples 15 to 17, including or excluding optional features.
  • the exercise is identified via machine learning.
  • Example 19 includes the computer-readable medium of any one of examples 15 to 18, including or excluding optional features.
  • the caloric expenditure is based on a MET value obtained from a lookup table.
  • the elements in some cases may each have a same reference number or a different reference number to suggest that the elements represented could be different and/or similar.
  • an element may be flexible enough to have different implementations and work with some or all of the systems shown or described herein.
  • the various elements shown in the figures may be the same or different. Which one is referred to as a first element and which is called a second element is arbitrary.


Abstract

An apparatus for a smart gym is described herein. An electronic device includes a sensor to detect movement of a user in a workout area, at least one memory, instructions, and processor circuitry to execute the instructions to: generate a skeleton frame representative of the user in the workout area; analyze the detected movements of the user; identify a posture of the user performing an exercise; and output a signal to cause posture feedback to be displayed, the posture feedback indicative of proper form of the skeleton frame of the user when performing the exercise.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This patent arises from a continuation of U.S. patent application Ser. No. 16/720,775, which was filed on Dec. 19, 2019, and titled “Smart Gym,” which is incorporated by reference herein in its entirety. Priority to U.S. patent application Ser. No. 16/720,775 is hereby claimed.
  • BACKGROUND
  • Fitness or activity trackers are common devices that can be integrated into a number of wearables. For example, the trackers may be components of smart watch devices. Devices that include trackers can monitor a user's physical activity as well as biometric data. For example, a fitness tracker may calculate a person's steps and movement throughout the day. This data may then be used to determine the person's caloric expenditure for a particular length of time.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a smart gym;
  • FIG. 2 is a skeleton frame;
  • FIG. 3 is a process flow diagram of a method that enables a smart gym;
  • FIG. 4 is a block diagram of a system;
  • FIG. 5 is a block diagram illustrating a computing device that enables a smart gym; and
  • FIG. 6 is a block diagram showing computer readable media that stores code for enabling a smart gym.
  • The same numbers are used throughout the disclosure and the figures to reference like components and features. Numbers in the 100 series refer to features originally found in FIG. 1; numbers in the 200 series refer to features originally found in FIG. 2; and so on.
  • DESCRIPTION OF THE EMBODIMENTS
  • As discussed above, traditional fitness or activity trackers are commonly integrated into wearable devices, such as smart watches. Data captured by a fitness tracker can be used to calculate a caloric expenditure for a person. A caloric expenditure may refer to a number of calories burned by a person during a particular time window. To accurately calculate caloric expenditures, traditional fitness trackers use exercise profiles. An exercise profile may indicate the particular activity to be performed. By having beforehand knowledge of an activity to be performed by a user, a fitness tracker can more accurately calculate the caloric expenditures of a user. Exercise profiles may include, for example, swimming, running and high intensity interval training (HIIT).
  • Exercise profiles are limited to monitoring a wearer's biometric information, such as heart rate. As such, traditional fitness trackers lack the functionality to track free weight exercise or floor exercises. Free weight exercises are movements that are performed with free weights, as opposed to machine weights. Free weight exercises apply additional weight to muscles of the body while moving or configuring the body in and out of a skeletal configuration. Floor exercises, as used herein, refer to exercises that require the human body to achieve a skeletal configuration with the goal of stressing muscles in that configuration. These may be referred to as free-form exercises.
  • Traditional fitness trackers cannot accurately track total body movement for exercises such as free-weight exercises and floor exercises. While some traditional trackers can account for movement from a single point of reference, the traditional trackers cannot track the entire movement associated with moving in and out of a skeletal configuration. For example, the movement for doing a weighted barbell squat can be relatively small as detected by a fitness tracker worn at a single point. Thus, a tracker based on movement will only detect a small movement. However, the actual calories burnt during an entire weighted barbell squat can be significantly higher due to the weight added to the barbell. This inaccurate tracking of free weights and floor exercise can render traditional trackers useless, as free weight exercises and floor exercises are integral to a complete fitness routine. Thus, the inability to accurately track floor exercises leaves a void in interactive fitness. Traditional fitness trackers are also unable to track calories burnt or count repetitions for dumbbell exercises and weight lifting.
  • The present techniques enable a smart gym. As described herein, the smart gym enables an accurate calorie expenditure calculation when performing exercises. In particular, the present techniques enable an accurate calorie expenditure calculation when tracking free weight exercises or floor exercises. Parameters such as skeletal movements and the weight held during the skeletal movements may be tracked via a plurality of sensors. The sensor may include image sensors and weight sensors. Fitness data, including a calorie expenditure, may be derived from the parameters. Additionally, repetitions of the free weights may be counted for the free weight exercise or floor exercises. In embodiments, the present techniques enable virtual coaching. For example, the smart gym enables tracking during free weight exercise or floor exercises as a training aid for professional athletes and coaches. The fitness data may be shared, in real time, with coaches or other professionals who can provide coaching based on the derived fitness data. Additionally, in embodiments, the present techniques enable augmented-reality applications for users who want to virtually train with public figures, such as a favorite athlete or celebrity.
  • FIG. 1 is a block diagram of a smart gym 100. The smart gym 100 includes a base 102 and one or more cameras 104. In the example of FIG. 1, two cameras 104A and 104B are illustrated in the smart gym 100. The base 102 may be a platform with a number of sensors embedded in the platform. Thus, the platform may include a number of sensors to capture the weight of one or more users on the platform. In embodiments, the platform is a weight scale or a plurality of weight scales with designated workout areas for one or more users. The platform may include markings such as lines, circles, or any combination thereof to indicate one or more workout areas. Thus, the platform may designate a workout area for one or more users, where all free weight exercise or floor exercises are performed on the platform within the workout area. As illustrated, the two cameras 104A and 104B are located above the workout area in opposing corners to capture a person engaging in free weight exercise or floor exercises. In embodiments, the cameras are wide-angle cameras that can capture the entire workout area. The cameras may capture movements made by a person while within the workout area.
  • Images captured by the cameras 104A and 104B may be used to extract a skeleton frame that represents the person. The various points on the skeleton frame may be analyzed to identify and track particular joint movements executed by the person. In embodiments, a processing unit, such as a vision processing unit (VPU) or a graphics processing unit (GPU), may be used to extract the skeleton frame of the person and identify and track the particular movements executed by the person. In embodiments, the processing unit may include a neural compute engine that is a dedicated hardware accelerator for deep neural network inferences. Configurations of the skeleton frame can be used to train on-device deep neural networks and computer vision applications executed by the VPU. Additionally, the configurations of the skeleton frame can be analyzed to compute a calorie calculation.
  • FIG. 2 is a skeleton frame 200. The skeleton frame may be extracted from a plurality of images captured by cameras in a smart gym. In embodiments, the skeleton frame is extracted as a set of joints. As illustrated in FIG. 2, the skeleton frame 200 has a total of 12 joints. Configurations that result from movement of the skeleton frame 200 may be tracked according to the movement of each joint 202A, 202B, 204A, 204B, 206A, 206B, 208A, 208B, 210A, 210B, 212A, and 212B. As illustrated, joint 202A is a right shoulder joint, and joint 202B is a left shoulder joint. Joint 204A is a right elbow joint, and joint 204B is a left elbow joint. Joint 206A is a right wrist joint, and joint 206B is a left wrist joint. Joint 208A is a right hip joint, and joint 208B is a left hip joint. Joint 210A is a right knee joint, and joint 210B is a left knee joint. Joint 212A is a right ankle joint, and joint 212B is a left ankle joint. For ease of description, a particular set of joints is illustrated. However, any number of joints may be found along the human skeleton, and any movement of the human skeleton associated with one or more joints can be tracked according to the present techniques.
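A minimal sketch of this 12-joint skeleton frame as a data structure, assuming 2-D relative joint locations (the joint names and validation helper are illustrative, not part of the patent):

```python
from typing import Dict, Tuple

# The 12 joints illustrated in FIG. 2 (names are illustrative).
JOINT_NAMES = (
    "right_shoulder", "left_shoulder",  # 202A, 202B
    "right_elbow", "left_elbow",        # 204A, 204B
    "right_wrist", "left_wrist",        # 206A, 206B
    "right_hip", "left_hip",            # 208A, 208B
    "right_knee", "left_knee",          # 210A, 210B
    "right_ankle", "left_ankle",        # 212A, 212B
)

# A skeleton frame expressed as a set of joints with relative (x, y) locations.
SkeletonFrame = Dict[str, Tuple[float, float]]

def make_skeleton_frame(locations: SkeletonFrame) -> SkeletonFrame:
    """Validate that a frame contains exactly the expected joints."""
    missing = set(JOINT_NAMES) - set(locations)
    if missing:
        raise ValueError(f"missing joints: {sorted(missing)}")
    return locations
```

Tracking a configuration over time then reduces to storing one such frame per captured image.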
  • The particular movement possible may be defined by the type of joint. The movements include, but are not limited to, angular movements such as flexion and extension, abduction, adduction, circumduction, rotation, medial rotation, lateral rotation, external rotation, internal rotation, inversion, eversion, protraction, retraction, elevation, depression, opposition, supination, pronation, or any combinations thereof. Additionally, abnormal joint movements may include hyperextension and hyperflexion.
  • The diagram of FIG. 2 is not intended to indicate that the example skeleton frame 200 is to include all of the joints shown in FIG. 2. Rather, the example skeleton frame 200 can be implemented using fewer or additional components not illustrated in FIG. 2 (e.g., additional joints, other skeleton frame points, etc.). For example, the neck may serve as an additional joint location on the skeleton frame. Moreover, a series of points or a line representing the spine of the person may serve as another skeleton frame point that may be tracked.
  • In embodiments, cameras of the smart gym may be used to pinpoint the 12 points (joints) along the skeleton frame for movement identification. In embodiments, the captured images and derived skeleton frame information may be stored in the cloud. Accordingly, the smart gym may use a cloud storage solution to store a person's personal information, exercise activity logs, duration of exercise, and calories burned.
  • In embodiments, movements of the skeleton frame are combined with other sensor data. For example, the platform may include weight sensors. A user may scan-in to begin free weight exercise or floor exercises. During the scan-in process, an initial user weight may be captured. After the scan-in, the weight sensors of the platform may be used to capture the weight of any accessories used by the user during the workout. For example, if the user picks up barbells to begin a free weight exercise, the platform can capture the total weight of the user, which will be increased by the weight of the barbells when compared to the weight of the user during the scan-in. The weight of the user during exercise may be captured simultaneously with the images of the user at a processing unit.
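The equipment-weight inference described above can be sketched as follows (the function name and the noise tolerance are assumptions, not from the patent):

```python
def equipment_weight(platform_reading_kg: float, scan_in_weight_kg: float,
                     tolerance_kg: float = 0.5) -> float:
    """Infer the weight of accessories held by the user as the difference
    between the current platform reading and the scan-in body weight.
    Differences within the tolerance are treated as sensor noise."""
    delta = platform_reading_kg - scan_in_weight_kg
    return delta if delta > tolerance_kg else 0.0

# e.g. scan-in weight 70 kg; platform reads 90 kg while the user holds
# two 10 kg dumbbells -> 20 kg of equipment
```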
  • FIG. 3 is a process flow diagram of a method 300 that enables a smart gym. The example method 300 can be implemented in the systems 400 of FIG. 4, the computing device 500 of FIG. 5, or the computer readable media 600 of FIG. 6.
  • For example, the method 300 can be implemented using the processing unit 406, CPU 502, VPU/GPU 508, or the processor 602. At block 302, the method begins. At block 304, it is determined if a user is scanned in. If the user is scanned in, process flow continues to block 306. If the user is not scanned in, process flow returns to block 302 where the method starts and the user is polled until the user is scanned in. During a scan-in, the profile information of the user is obtained from the user or a data store. Accordingly, at block 306, a user profile is obtained. The user profile may include user identifications or identifiers such as name, age, and weight. The user profile may also include other information used to calculate a calorie expenditure, such as sex, height, and the like.
  • At block 308, the particular free weight, if any, being used in an exercise by the user is determined. The free weight may be determined from an image captured via cameras in the smart gym. The free weight may also be determined according to a weight difference of the user when compared to the original weight of the user during the scan-in process. At block 310, the workout area is monitored to detect movement. The movement detected may be the movement of a user in and out of a particular skeletal configuration. If movement of the user is detected, the process flow continues to block 312. If movement is not detected, process flow returns to block 308 where the present techniques scan for the particular weight held by the user. At block 312, movement capturing and analysis occurs. During movement capture and analysis, the movements of the skeleton frame of a user can be extracted by tracking the plurality of joints captured by images of the user. The movement capture may also capture movement of the user such as jumping, sidestepping, and other movements that may move the entire skeleton frame within the workout area of the platform with little to no joint movement. Simultaneously, the weights used by the user during the movement are also captured by the cameras (image sensors) of the smart gym or by weight sensors in a platform of the smart gym. In this manner, the smart gym is able to extract the particular movements executed by the user and the weights used by the user during exercise.
  • At block 314, a repetition counting module is executed. The repetition counting module will count a number of repetitions that a user successfully completes of an exercise. As used herein, a repetition (rep) is defined as a complete movement that counts as one instance of a particular exercise. In embodiments, exercises may be defined by known joint movements for each joint during the exercise. A repetition of an exercise may be complete once the joints of the skeleton frame have satisfied each known joint movement of an exercise. In embodiments, a user can define custom exercises. In a custom exercise, the user can define the required movements for each joint, along with a sequence for each movement of the exercise, and then store the custom exercise. The repetition counting module may track a user's movement during a workout session within the smart gym workout area and compare the user's movement with pre-defined exercises and custom exercises (known joint movements) to determine the particular exercise being performed. Once the exercise being performed is determined, the successful repetitions of the exercise may be counted.
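One way to sketch counting repetitions against a stored sequence of known joint movements (the movement labels and the sequential-matching strategy are illustrative assumptions, not the patent's algorithm):

```python
# Hypothetical sketch: an exercise stored as an ordered sequence of named
# joint movements. A repetition counts once every movement in the sequence
# has been observed in order.
SQUAT = ["hips_lower", "knees_bend", "hips_raise", "knees_extend"]

def count_reps(observed, exercise):
    """Count complete passes through the exercise sequence in the observed
    movement stream, skipping movements that do not match the next step."""
    reps, step = 0, 0
    for movement in observed:
        if movement == exercise[step]:
            step += 1
            if step == len(exercise):
                reps += 1
                step = 0  # start matching the next repetition
    return reps
```

A partial sequence, such as lowering the hips without returning to standing, does not count as a repetition.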
  • For example, one repetition of a squat exercise may be to lower the hips with the thighs parallel to the floor, and return to a standing position. Thus, a known joint movement may be lowering the right hip joint 208A and the left hip joint 208B a predefined amount, along with joints 202A, 202B, 204A, 204B also lowering somewhat. Additionally, the camera may track movement of the joints 210A, 210B, 212A, and 212B to ensure the knee joints maintain proper form and do not extend past the toes located near ankle joints 212A and 212B. In this example, a location of the toes may be captured by the camera or inferred by their proximity to the ankle joint. When an occurrence or instance of the current joint movements and the sequence of the movements by a user in the smart gym matches known joint movements stored as an exercise or a custom exercise, each occurrence or instance is counted as a repetition of the exercise. In embodiments, repetition counting for each exercise is different. For example, when a person is doing hammer curls, repetitions of each instance of the exercise may be determined based on a distance between the wrist joint 206A and shoulder joint 202A as well as an angle created at the elbow joint 204A during the exercise.
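The elbow-angle check in the hammer-curl example can be sketched with basic vector geometry (this is a generic three-point angle helper under 2-D coordinates, not the patent's specific algorithm):

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by points a-b-c, e.g. the angle
    at the elbow joint (b) between the shoulder (a) and wrist (c)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / norm))
```

A hammer-curl repetition could then be detected when the elbow angle sweeps from near full extension to a small flexed angle and back.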
  • At block 316, a calorie counting module is executed. The calorie counting module may calculate the user's caloric expenditure based on the exercise being performed. In embodiments, the calculation of the number of calories burned by the user during exercise may be based on the user's profile and the particular weights, if any, being held by the user during exercise. At block 318, the captured information is synchronized to a cloud storage location. At block 320 the method 300 ends.
  • This process flow diagram is not intended to indicate that the blocks of the example method 300 are to be executed in any particular order, or that all of the blocks are to be included in every case. Further, any number of additional blocks not shown may be included within the example method 300, depending on the details of the specific implementation.
  • The processing unit may also derive the calories burned based on a number of parameters, such as (1) the weight of the user from the user profile, (2) the type of activity the user is doing, and (3) the intensity of the activity, which determines the metabolic equivalent for a task (MET) used to calculate the calories burnt. All information, including the calories burned, may be transmitted to a cloud storage location. The information can be viewed at any number of devices from the cloud storage location. For example, the data may be displayed on the user's phone. A person can use a mobile device to view fitness data such as what activities they completed, the total repetitions for each exercise, METs associated with each exercise, the duration of each exercise, the total duration of the workout, the calories burnt during each exercise, and a total number of calories burned for the entire workout.
  • This fitness data may also be used as a training aid for professional athletes and coaches. In embodiments, virtual coaching may be enabled via the smart gym using haptic, auditory, visual feedback. For example, the known joint movement that is tracked during each exercise may be reviewed by coaches, physicians, and other professionals to derive training goals or correct issues observable during exercise. Algorithms can be used to analyze joint movement of a person. In embodiments, a person can receive immediate feedback on exercises performed in the smart gym on any mobile device via the cloud-based data. Moreover, historic data can be maintained for a person and can be compared with new data captured via the smart gym. In this manner, a person has the information necessary to improve or change techniques in real time instead of waiting for coaches to download and review the footage. Further, a person's workout can be changed instantaneously in response to metrics observed during a current workout in the smart gym. For example, consider a person with a workout that includes several lower body exercises. The person may have trouble completing repetitions of an exercise with the usual proper form (as indicated from the historic data) due to an injury or fatigue. However, the change in form may not be visible to humans. The change may be as small as a change in squat depth of a few centimeters. The smart gym may alert the person or coach of the change in the person's typical movement. In response to this alert, immediate changes to the workout may be made to prevent any further injury or fatigue.
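The form-deviation alert described above can be sketched as a comparison against a historic baseline (the squat-depth metric, averaging, and threshold are illustrative assumptions):

```python
def form_deviation_alert(current_depth_cm, historic_depths_cm,
                         threshold_cm=3.0):
    """Return True when the current squat depth deviates from the user's
    historic average by more than the threshold, signaling possible
    injury or fatigue."""
    baseline = sum(historic_depths_cm) / len(historic_depths_cm)
    return abs(current_depth_cm - baseline) > threshold_cm
```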
  • Given that the present techniques are based on movement capturing and analysis, they can be further extended in the future to include posture analysis as feedback to users to improve their workout. Additionally, in embodiments, data captured by the smart gym may be used to seed augmented-reality applications for users who want to virtually train with their favorite athletes, celebrities, or friends and family. In an augmented reality application, the cameras may capture the user during a workout and render the user in another environment. For example, while the user may physically be in a smart gym as described herein, the display may render the user in a second environment. In some cases, the second environment is a secondary workout space. The second environment may be a training environment with third parties, such as coaches or friends. In some examples, in an augmented reality application, the free weights, flooring, or other accessories used during the workout are enhanced by computer generated perceptual information. The perceptual information may include, for example, visual, auditory, haptic, somatosensory, and olfactory information.
  • FIG. 4 is a block diagram of a system 400 that enables a smart gym. Similar to the example of FIG. 3, when a person 404 enters the free-weight platform and scans in, his or her user profile, such as name, age, and weight, will be captured. This information may be obtained from a stored profile, or the user can create a profile by entering this information into an application prior to the exercise. When movement is detected on a platform in the smart gym, the camera 402 will capture the person's 404 movement and the processing unit 406 will begin analysis. Images captured by the camera will be analyzed and matched against a database of exercises. The exercises may include pre-defined exercises and custom exercises. In embodiments, the exercises stored in the database may include exercises previously performed by the person.
  • The processing unit may also count repetitions of the exercise, as well as the total weight used during the exercise. During repetition counting, the camera 402 in the free weight area will capture an image of the person 404. The camera then sends the human image to the processing unit 406 to create the skeleton frame that corresponds to the person. In embodiments, the skeleton frame may have 12 points or joints. Thus, the camera 402 will capture movement of the user 404 and the processing unit 406 will begin analysis. The processing unit may derive a skeletal frame 408 of the user and track movement according to the movement of joints of the skeletal frame.
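A 12-joint skeleton frame can be sketched as a simple mapping from joint names to image coordinates. The joint names follow the 12-joint set recited in the claims (shoulders, elbows, wrists, hips, knees, ankles); the coordinate representation and function name are assumptions for illustration only.

```python
# Minimal sketch of a 12-point skeleton frame: one (x, y) image coordinate
# per joint. The joint list matches the claims; everything else is assumed.

JOINTS = [
    "right_shoulder", "left_shoulder", "right_elbow", "left_elbow",
    "right_wrist", "left_wrist", "right_hip", "left_hip",
    "right_knee", "left_knee", "right_ankle", "left_ankle",
]

def make_skeleton_frame(points):
    """Map each of the 12 joints to a detected (x, y) coordinate."""
    if len(points) != len(JOINTS):
        raise ValueError("expected one point per joint")
    return dict(zip(JOINTS, points))

# Tracking movement then reduces to comparing joint positions across frames.
frame = make_skeleton_frame([(i, 2 * i) for i in range(12)])
```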
  • The diagram of FIG. 4 is not intended to indicate that the example system 400 is to include all of the components shown in FIG. 4. Rather, the example system 400 can be implemented using fewer or additional components not illustrated in FIG. 4 (e.g., additional cameras, neural networks, processing units, multiple people, multiple skeleton frames, etc.).
  • For ease of description, the present techniques describe exercise intensities according to a metabolic equivalent for task (MET). The MET is a unit that estimates the amount of energy used by the body during physical activity, as compared to resting metabolism. However, any measure of the rate at which a person expends energy relative to the weight of the person while performing activities can be used. The use of a person's weight when calculating this measure enables the measure to be standardized so that comparisons can be made between the fitness and activity levels between different people. In embodiments, the MET is standardized so it can apply to people of varying body weight and compare different activities. MET can be expressed in terms of oxygen use or kilocalories (what is commonly referred to as calories). Generally, the harder a person works during a given activity, the more oxygen is consumed and the higher the MET. Table 1 illustrates general MET ranges:
  • TABLE 1
    Under 3 MET    Light-intensity activities
    3 to 6 MET     Moderate-intensity aerobic physical activities;
                   burns 3.5 to 7 Calories per minute (kcal/min)
    Over 6 MET     Vigorous-intensity aerobic physical activities
  • Table 2 provides examples of moderate physical activity and vigorous physical activity:
  • TABLE 2
    Moderate physical activity            Vigorous physical activity
    Walking on a treadmill at a           Running or jogging
      speed of about 3 mph                Swimming laps
    Water aerobics                        Playing basketball or soccer
    Ballroom dancing                      Doing calisthenics like push-ups
    Playing doubles tennis                  and jumping jacks
                                          Playing tennis
  • Table 3 provides exemplary MET values for a number of activities:
  • TABLE 3
    Activity               Specific Motion                                     Intensity  METs
    Conditioning exercise  bicycling, stationary, RPM/Spin bike class          Moderate   8.5
                           calisthenics (e.g., pushups, sit ups,               Vigorous   8
                             pull-ups, jumping jacks)
                           calisthenics (e.g., pushups, sit ups,               Moderate   3.8
                             pull-ups, lunges)
                           calisthenics (e.g., sit-ups, abdominal crunches)    Light      2.8
                           elliptical trainer                                  Moderate   5
                           resistance training (weight lifting, free weight,   Vigorous   6
                             nautilus or universal), power lifting or
                             body building
                           resistance (weight) training, squats, slow or       Moderate   5
                             explosive effort
                           resistance (weight) training, multiple exercises,   Light      3.5
                             8-15 repetitions at varied resistance
                           rope skipping, general                              Vigorous   12.3
                           rowing, stationary ergometer, general               Vigorous   6
                           rowing, stationary, general                         Moderate   4.8
    Running                jogging, general                                    Light      7
                           running                                             Moderate   8
                           running, marathon                                   Vigorous   13.3
    Walking                walking for pleasure                                Light      3.5
    Water activities       swimming, breaststroke, general, training or        Vigorous   10.3
                             competition
                           swimming, breaststroke, recreational                Light      5.3
                           swimming, leisurely, not lap swimming               Moderate   6
  • The MET values may be stored in a database and used as a lookup table when calculating a total caloric expenditure. In embodiments, the total calories burned may be calculated as follows:
  • Total caloric expenditure = (Duration * MET * 3.5 * Weight) / 200
  • Thus, the total calories burned during an exercise is the duration of the exercise in minutes, multiplied by the MET for the exercise, multiplied by 3.5, multiplied by the person's weight in kilograms, divided by 200. An intensity of a workout may be calculated as follows:
  • Intensity = Duration / Repetition Count
  • As indicated by the above equation, the intensity of an exercise may be calculated as the duration of the exercise in minutes divided by the repetition count for that exercise. Table 4 describes the intensity range for light, moderate, and vigorous activities:
  • TABLE 4
    Intensity Number   Activity
    0.51-1.0           Light
    0.11-0.5           Moderate
    ≤0.1               Vigorous
  • At rest or sitting idly, the average person expends 1 MET, which equals 1 kilocalorie per kilogram of body weight per hour and 3.5 milliliters of oxygen per kilogram of body weight per minute of activity. By using METs, the exertion required for different activities can be compared. At 2 METs, a person uses twice the calories per minute used at rest. The number of calories burned each minute depends on a person's body weight: a person who weighs more will burn more calories per minute.
  • The harder a person's body works during any given activity, the more oxygen is consumed and the higher the MET level. Tables 1-3 illustrate sample MET data. Generally, light intensity aerobic physical activities are considered to be under three METs. Moderate intensity aerobic physical activities are considered to be between three and six METs. Typically, activities over six METs are considered vigorous intensity aerobic physical activities. Generally, the weight training or resistance training that occurs during free weight exercises is considered moderate or vigorous intensity aerobic physical activity. As such, the resistance training that occurs during free weight exercises is integral to a fitness program. Thus, accurate tracking of free weight exercises is paramount to the training of athletes as well as individuals looking to get in better physical shape.
  • An intensity of a workout is determined by calculating the duration in minutes of the exercise divided by the number of repetitions of the exercise. For example, if a weight is lifted for seventy successful repetitions over ten minutes, the intensity would be as follows:

  • Intensity=10/70=0.14 (Moderate)
  • Consider the following use case. A user enters a smart gym and lifts dumbbells for 5 minutes during a lunch break. When the user scans in to the smart gym, his or her user profile is captured or retrieved. In the present example, the user profile includes the following information: Name=Jane Doe, Age=30, Weight=52 kg. In embodiments, the user profile may also include sex, height, and any other physical information about the person. The person then ventures to the workout area, obtains a dumbbell, and begins a workout on a smart gym platform. The person's activity is captured and analyzed to determine that the person is lifting a dumbbell. The platform is triggered when the user steps on the platform. In response to the trigger, the platform captures a total weight of the user and the dumbbell as 57 kg. A processing unit of the smart gym system may calculate that the person is lifting 5 kg of dumbbell weight, since the person's weight is known to be 52 kg. A camera captures that the person did a total of 50 repetitions of lifting the dumbbell. The intensity of the workout may be calculated as
  • Intensity of workout = Duration (in minutes) / Repetition count = 5 / 50 = 0.1 (Vigorous)
  • So, Jane Doe's intensity for the five-minute workout in the smart gym is considered vigorous, as found in Table 4. Referring to Table 3, the MET value for the resistance training performed by Jane Doe is 6.
  • Total calories burnt = Duration (in minutes) * (MET * 3.5 * weight in kg) / 200 = 5 * (6 * 3.5 * 57) / 200 = 5985 / 200 = 29.93 Calories burnt
  • In conclusion, Jane's 5-minute dumbbell workout during lunch is as follows: she lifted a 5 kg dumbbell for 50 repetitions in 5 minutes, which is vigorous. Her MET value is 6 and her total calories burnt is 29.93.
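The worked example can be reproduced end to end. Note that, following the example's own arithmetic, the calorie formula uses the combined weight captured by the platform (57 kg) rather than the profile weight alone; the function and lookup-table names below are assumptions.

```python
# End-to-end sketch of the use case: derive the lifted weight from the
# platform reading, classify intensity, look up the MET (an excerpt of
# Table 3), and compute calories burned.

MET_LOOKUP = {("resistance training", "Vigorous"): 6}  # Table 3 excerpt

def workout_summary(profile_weight_kg, platform_weight_kg, reps, duration_min,
                    activity="resistance training"):
    lifted_kg = platform_weight_kg - profile_weight_kg
    value = duration_min / reps
    level = ("Vigorous" if value <= 0.1
             else "Moderate" if value <= 0.5 else "Light")
    met = MET_LOOKUP[(activity, level)]
    # As in the worked example, the platform (combined) weight feeds the formula.
    calories = duration_min * met * 3.5 * platform_weight_kg / 200
    return {"lifted_kg": lifted_kg, "intensity": level,
            "met": met, "calories": calories}

summary = workout_summary(52, 57, 50, 5)  # Jane Doe's lunch-break workout
```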
  • Referring now to FIG. 5, a block diagram is shown illustrating a computing device that enables a smart gym. The computing device 500 may be, for example, a laptop computer, desktop computer, tablet computer, mobile device, or wearable device, among others. The computing device 500 may include a central processing unit (CPU) 502 that is configured to execute stored instructions, as well as a memory device 504 that stores instructions that are executable by the CPU 502. The CPU 502 may be coupled to the memory device 504 by a bus 506. Additionally, the CPU 502 can be a single core processor, a multi-core processor, a computing cluster, or any number of other configurations. Furthermore, the computing device 500 may include more than one CPU 502. In some examples, the CPU 502 may be a system-on-chip (SoC) with a multi-core processor architecture. In some examples, the CPU 502 can be a specialized digital signal processor (DSP) used for image processing. The memory device 504 can include random access memory (RAM), read only memory (ROM), flash memory, or any other suitable memory systems. For example, the memory device 504 may include dynamic random-access memory (DRAM).
  • The computing device 500 may also include a vision processing unit or graphics processing unit (GPU) 508. As shown, the CPU 502 may be coupled through the bus 506 to the GPU 508. The GPU 508 may be configured to perform any number of graphics operations within the computing device 500. For example, the GPU 508 may be configured to render or manipulate graphics images, graphics frames, videos, or the like, to be displayed to a viewer of the computing device 500.
  • The CPU 502 may also be connected through the bus 506 to an input/output (I/O) device interface 512 configured to connect the computing device 500 to one or more I/O devices 514. The I/O devices 514 may include, for example, a keyboard and a pointing device, wherein the pointing device may include a touchpad or a touchscreen, among others. The I/O devices 514 may be built-in components of the computing device 500, or may be devices that are externally connected to the computing device 500. In some examples, the memory 504 may be communicatively coupled to I/O devices 514 through direct memory access (DMA).
  • The CPU 502 may also be linked through the bus 506 to a display interface 516 configured to connect the computing device 500 to display devices 518. The display devices 518 may include a display screen that is a built-in component of the computing device 500. The display devices 518 may also include a computer monitor, television, or projector, among others, that is internal to or externally connected to the computing device 500. The display devices 518 may also include a head mounted display.
  • The computing device 500 also includes a storage device 520. The storage device 520 is a physical memory such as a hard drive, an optical drive, a thumbdrive, an array of drives, a solid-state drive, or any combinations thereof. The storage device 520 may also include remote storage drives.
  • The computing device 500 may also include a network interface controller (NIC) 522. The NIC 522 may be configured to connect the computing device 500 through the bus 506 to a network 524. The network 524 may be a wide area network (WAN), local area network (LAN), or the Internet, among others. In some examples, the device may communicate with other devices through a wireless technology. For example, the device may communicate with other devices via a wireless local area network connection. In some examples, the device may connect and communicate with other devices via Bluetooth® or similar technology.
  • The computing device 500 further includes a smart gym manager 528. The smart gym manager 528 may be configured to enable monitoring and tracking of free weight exercises or floor exercises, along with a calculation of the number of calories burned during exercise. In particular, images captured by a plurality of cameras 526 may be processed such that a user can virtually train with a third party, such as athletes, trainers, and coaches. The smart gym manager 528 includes an identification unit 530, an equipment recognition module 532, a movement capture and analysis module 534, a repetition counting module 536, and a calorie counting module 538.
  • The identification unit 530 may be configured to identify a person within the smart gym workout area. In particular, the identification unit may retrieve a user's profile data in response to the person entering the workout area. The person may self-identify prior to entering the workout area by providing all profile information or by providing authentication so that the identification unit 530 can retrieve the person's user profile from a data store. An equipment recognition module 532 may be configured to identify the particular equipment used by the person during a workout. For example, the equipment recognition module 532 may capture the weight of the person during exercise while holding weighted exercise accessories, such as dumbbells or barbells. The equipment used may be recognized by determining the weights of the equipment. The equipment used may also be recognized by capturing the equipment via the cameras 526 and using object identification to identify each weight. Moreover, each piece of equipment may include identifiers that can be captured via the cameras 526 and identified via a matching process.
  • A movement capture and analysis module 534 may be configured to extract a skeleton frame corresponding to the person within the workout area. The skeleton frame may be expressed as a set of joints with relative locations. The movements of the skeleton frame of the person are tracked in a series of images captured while the person is exercising. A repetition counting module 536 may be configured to count a number of repetitions of an exercise that a person successfully completes. The particular exercise being performed may be initially determined by comparing the actual movements of the person to known joint movements stored in a database of the electronic device 500. In embodiments, custom exercises may be defined in the exercise database of the electronic device 500. When the current movements of the person engaging in exercise match the known joint movements of an exercise stored in the exercise database, the repetition counting module 536 then counts each successful repetition of the exercise. A successful repetition of the exercise may include satisfying each particular movement in a particular sequence associated with the exercise. A calorie counting module 538 may be configured to calculate the caloric expenditure of the person during the exercise. MET values for various exercises may be stored in a lookup table. The total caloric expenditure may be calculated as the duration of the exercise in minutes, multiplied by the MET for the exercise, multiplied by 3.5, multiplied by the person's weight in kilograms, divided by 200.
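The repetition-counting rule described above, where a repetition counts only when each movement in the exercise's sequence is satisfied in order, can be sketched as a small state machine. The phase labels below are hypothetical stand-ins for matched joint-movement patterns, and the function name is an assumption.

```python
# Count only successful repetitions: every phase of the known movement
# sequence must be observed in order before the counter increments.

def count_repetitions(observed_phases, exercise_sequence):
    reps, i = 0, 0
    for phase in observed_phases:
        if phase == exercise_sequence[i]:
            i += 1
            if i == len(exercise_sequence):  # full sequence satisfied
                reps += 1
                i = 0
        elif phase == exercise_sequence[0]:  # restart a partial attempt
            i = 1
        else:                                # out-of-order: not a valid rep
            i = 0
    return reps

# Two full down/up squat cycles plus one incomplete descent -> 2 reps.
count_repetitions(["down", "up", "down", "up", "down"], ["down", "up"])
```

In a real system each "phase" would be the output of matching tracked joint positions against the stored joint movements for the exercise; the sequential check is what distinguishes a successful repetition from an incomplete one.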
  • The block diagram of FIG. 5 is not intended to indicate that the computing device 500 is to include all of the components shown in FIG. 5. Rather, the computing device 500 can include fewer or additional components not illustrated in FIG. 5, such as additional buffers, additional processors, and the like. The computing device 500 may include any number of additional components not shown in FIG. 5, depending on the details of the specific implementation. Furthermore, any of the functionalities of the smart gym manager 528, identification unit 530, equipment recognition module 532, movement capture and analysis module 534, repetition counting module 536, and calorie counting module 538 may be partially, or entirely, implemented in hardware and/or in the processor 502. For example, the functionality may be implemented with an application specific integrated circuit, in logic implemented in the processor 502, or in any other device. For example, the functionality of the smart gym manager 528 may be implemented with an application specific integrated circuit, in logic implemented in a processor, in logic implemented in a specialized graphics processing unit such as the VPU/GPU 508, or in any other device.
  • FIG. 6 is a block diagram showing computer readable media 600 that stores code for enabling a smart gym. The computer readable media 600 may be accessed by a processor 602 over a computer bus 604. Furthermore, the computer readable medium 600 may include code configured to direct the processor 602 to perform the methods described herein. In some embodiments, the computer readable media 600 may be non-transitory computer readable media. In some examples, the computer readable media 600 may be storage media.
  • The various software components discussed herein may be stored on one or more computer readable media 600, as indicated in FIG. 6. For example, an identification module 606, equipment recognition module 608, a movement capture and analysis module 610, a repetition counting module 612, and a calorie counting module 614 may be stored on the computer readable media 600.
  • The identification module 606 may be configured to identify a person within the smart gym workout area. The equipment recognition module 608 may be configured to identify the particular equipment used by the person during a workout. The movement capture and analysis module 610 may be configured to extract a skeleton frame corresponding to the person within the workout area. The repetition counting module 612 may be configured to count a number of repetitions of an exercise that a person successfully completes. The calorie counting module 614 may be configured to calculate the caloric expenditure of the person during the exercise.
  • The block diagram of FIG. 6 is not intended to indicate that the computer readable media 600 is to include all of the components shown in FIG. 6. Further, the computer readable media 600 may include any number of additional components not shown in FIG. 6, depending on the details of the specific implementation.
  • Example 1 is an apparatus. The apparatus includes a platform, wherein the platform comprises at least a weight sensor; at least one camera, wherein the camera is configured to capture movements on the platform; a processor, wherein the processor is configured to: capture a person that has entered the platform area; derive a skeleton frame for the person in the platform area; track joint movements of the skeleton frame; identify one or more exercises performed by the tracked movement and count the number of repetitions performed; and calculate a caloric expenditure based on the number of repetitions performed, a weight of the person, and a weight used when performing the exercise.
  • Example 2 includes the apparatus of example 1, including or excluding optional features. In this example, an exercise is identified by comparing the joint movement to a known joint movement associated with the exercise. Optionally, the known joint movement is an average of the user's previous movements. Optionally, the known joint movement is reviewed to derive training goals.
  • Example 3 includes the apparatus of any one of examples 1 to 2, including or excluding optional features. In this example, the apparatus includes rendering the tracked movement of the skeleton frame on a display; and using augmented reality to place a third party on the screen with the rendering of the person.
  • Example 4 includes the apparatus of any one of examples 1 to 3, including or excluding optional features. In this example, the exercise is identified via machine learning.
  • Example 5 includes the apparatus of any one of examples 1 to 4, including or excluding optional features. In this example, calculating the caloric expenditure is based on a metabolic equivalent for a task (MET).
  • Example 6 includes the apparatus of any one of examples 1 to 5, including or excluding optional features. In this example, virtual coaching is enabled via the smart gym with haptic, auditory, or visual feedback.
  • Example 7 includes the apparatus of any one of examples 1 to 6, including or excluding optional features. In this example, in-person coaching is enabled via the smart gym with post-exercise playback or analysis.
  • Example 8 is a method. The method includes obtaining a user profile corresponding to a person in the workout area; extracting a skeleton frame from images captured of the person; tracking joint movements of the skeleton frame; identifying an exercise performed by the tracked movement and counting a number of repetitions of the exercise performed; and calculating a caloric expenditure based on the number of repetitions performed, a weight of the person, and a weight used when performing the exercise.
  • Example 9 includes the method of example 8, including or excluding optional features. In this example, an exercise is identified by comparing the joint movement to a known joint movement associated with the exercise. Optionally, the known joint movement is an average of the user's previous movements. Optionally, the known joint movement is reviewed to derive training goals.
  • Example 10 includes the method of any one of examples 8 to 9, including or excluding optional features. In this example, the method includes rendering the tracked movement of the skeleton frame on a display; and using augmented reality to place a third party on the screen with the rendering of the person.
  • Example 11 includes the method of any one of examples 8 to 10, including or excluding optional features. In this example, the exercise is identified via machine learning.
  • Example 12 includes the method of any one of examples 8 to 11, including or excluding optional features. In this example, the caloric expenditure is based on a MET value obtained from a lookup table.
  • Example 13 includes the method of any one of examples 8 to 12, including or excluding optional features. In this example, virtual coaching is enabled via the smart gym with haptic, auditory, or visual feedback.
  • Example 14 includes the method of any one of examples 8 to 13, including or excluding optional features. In this example, in-person coaching is enabled via the smart gym with post-exercise playback or analysis.
  • Example 15 is at least one computer readable medium that enables a smart gym, having instructions stored therein. The computer-readable medium includes instructions that direct the processor to obtain a user profile corresponding to a person in the workout area; extract a skeleton frame from images captured of the person; track joint movements of the skeleton frame; identify an exercise performed by the tracked movement and count a number of repetitions of the exercise performed; and calculate a caloric expenditure based on the number of repetitions performed, a weight of the person, and a weight used when performing the exercise.
  • Example 16 includes the computer-readable medium of example 15, including or excluding optional features. In this example, an exercise is identified by comparing the joint movement to a known joint movement associated with the exercise. Optionally, the known joint movement is an average of the user's previous movements. Optionally, the known joint movement is reviewed to derive training goals.
  • Example 17 includes the computer-readable medium of any one of examples 15 to 16, including or excluding optional features. In this example, the computer-readable medium includes rendering the tracked movement of the skeleton frame on a display; and using augmented reality to place a third party on the screen with the rendering of the person.
  • Example 18 includes the computer-readable medium of any one of examples 15 to 17, including or excluding optional features. In this example, the exercise is identified via machine learning.
  • Example 19 includes the computer-readable medium of any one of examples 15 to 18, including or excluding optional features. In this example, the caloric expenditure is based on a MET value obtained from a lookup table.
  • Not all components, features, structures, characteristics, etc. described and illustrated herein need be included in a particular aspect or aspects. If the specification states a component, feature, structure, or characteristic “may”, “might”, “can” or “could” be included, for example, that particular component, feature, structure, or characteristic is not required to be included. If the specification or claim refers to “a” or “an” element, that does not mean there is only one of the element. If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
  • It is to be noted that, although some aspects have been described in reference to particular implementations, other implementations are possible according to some aspects. Additionally, the arrangement and/or order of circuit elements or other features illustrated in the drawings and/or described herein need not be arranged in the particular way illustrated and described. Many other arrangements are possible according to some aspects.
  • In each system shown in a figure, the elements in some cases may each have a same reference number or a different reference number to suggest that the elements represented could be different and/or similar. However, an element may be flexible enough to have different implementations and work with some or all of the systems shown or described herein. The various elements shown in the figures may be the same or different. Which one is referred to as a first element and which is called a second element is arbitrary.
  • It is to be understood that specifics in the aforementioned examples may be used anywhere in one or more aspects. For instance, all optional features of the computing device described above may also be implemented with respect to either of the methods or the computer-readable medium described herein. Furthermore, although flow diagrams and/or state diagrams may have been used herein to describe aspects, the techniques are not limited to those diagrams or to corresponding descriptions herein. For example, flow need not move through each illustrated box or state or in exactly the same order as illustrated and described herein.
  • The present techniques are not restricted to the particular details listed herein. Indeed, those skilled in the art having the benefit of this disclosure will appreciate that many other variations from the foregoing description and drawings may be made within the scope of the present techniques. Accordingly, it is the following claims including any amendments thereto that define the scope of the present techniques.

Claims (44)

What is claimed is:
1. An electronic device comprising:
a sensor to detect movement of a user in a workout area;
at least one memory;
instructions; and
processor circuitry to execute the instructions to:
generate a skeleton frame representative of the user in the workout area;
analyze the detected movements of the user;
identify a posture of the user performing an exercise; and
output a signal to cause posture feedback to be displayed, the posture feedback indicative of proper form of a skeleton frame of the user when performing an exercise.
2. The electronic device of claim 1, wherein the sensor includes a camera.
3. The electronic device of claim 1, wherein the processor circuitry is to extract the skeleton frame as a set of joints.
4. The electronic device of claim 3, wherein the set of joints includes a right shoulder joint, a left shoulder joint, a right elbow joint, a left elbow joint, a right wrist joint, a left wrist joint, a right hip joint, a left hip joint, a right knee joint, a left knee joint, a right ankle joint, and a left ankle joint.
5. The electronic device of claim 1, further including a platform to define the workout area.
6. The electronic device of claim 5, wherein the platform includes a mat.
7. The electronic device of claim 1, wherein the processor circuitry is to count a number of repetitions that the user completes during an exercise.
8. The electronic device of claim 1, wherein the processor circuitry is to determine calories burned based on a number of parameters including at least one of (1) a weight of the user from the user profile, (2) a type of activity performed by the user, or (3) a metabolic equivalent for a task (MET).
9. The electronic device of claim 1, wherein the processor circuitry is to calculate a caloric expenditure based on the number of repetitions performed, a weight of the user, and a weight used when performing the one or more exercises.
10. The electronic device of claim 1, wherein the processor circuitry is to identify an object held by the user.
11. The electronic device of claim 1, wherein the processor circuitry is to cause the tracked movements of the skeleton frame to be presented on a display.
12. The electronic device of claim 1, wherein posture feedback includes a demonstration of the exercise performed with proper form.
13. The electronic device of claim 1, wherein the posture feedback is presented by a live coach.
14. The electronic device of claim 1, wherein the posture feedback is presented by a virtual coach.
15. The electronic device of claim 1, further including a user input interface to enable the user to input information.
16. The electronic device of claim 1, wherein the processor circuitry is to detect an amount of weight used in the exercise.
17. At least one computer readable medium comprising instructions that, when executed, cause processor circuitry to at least:
detect movement of a user in a workout area based on an output from a sensor;
generate a skeleton frame representative of the user in the workout area;
analyze the detected movements of the user;
identify a posture of the user; and
cause posture feedback to be displayed, the posture feedback indicative of proper form of the skeleton frame of the user when performing an exercise.
18. The at least one computer readable medium of claim 17, wherein the sensor includes a camera.
19. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to extract the skeleton frame as a set of joints.
20. The at least one computer readable medium of claim 19, wherein the set of joints includes a right shoulder joint, a left shoulder joint, a right elbow joint, a left elbow joint, a right wrist joint, a left wrist joint, a right hip joint, a left hip joint, a right knee joint, a left knee joint, a right ankle joint, and a left ankle joint.
21. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to count a number of repetitions that the user completes during an exercise.
22. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to determine calories burned based on a number of parameters including at least one of (1) a weight of the user from the user profile, (2) a type of activity performed by the user, or (3) a metabolic equivalent for a task (MET).
23. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to calculate a caloric expenditure based on the number of repetitions performed, a weight of the user, and a weight used when performing the one or more exercises.
24. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to identify an object held by the user.
25. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to cause the tracked movements of the skeleton frame to be presented on a display.
26. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to provide a demonstration of the exercise performed with proper form when providing posture feedback.
27. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to present a live coach when providing the posture feedback.
28. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to present a virtual coach when providing the posture feedback.
29. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to enable a user to input information.
30. The at least one computer readable medium of claim 17, wherein the instructions cause the processor circuitry to detect an amount of weight used in the exercise.
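Claims 7, 21, and 35 recite counting the number of repetitions the user completes during an exercise. One common way to do this over a skeleton-frame stream (an illustrative assumption, not recited in the claims) is a two-state hysteresis over a joint angle, so that noise near a single threshold does not produce spurious counts:

```python
def count_reps(angles, down_thresh=90.0, up_thresh=160.0):
    """Count repetitions from a stream of joint-angle samples (degrees),
    e.g. the elbow angle during curls. A repetition completes when the
    angle drops below down_thresh and subsequently rises above up_thresh.
    The two thresholds form a hysteresis band that rejects jitter."""
    reps, phase = 0, "up"
    for a in angles:
        if phase == "up" and a < down_thresh:
            phase = "down"          # user has entered the bottom of the rep
        elif phase == "down" and a > up_thresh:
            phase = "up"            # full extension reached: one rep done
            reps += 1
    return reps
```

The threshold values here are hypothetical; in practice they would be tuned per exercise.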
31. A method comprising:
detecting, via a sensor, movement of a user in a workout area;
generating a skeleton frame representative of the user in the workout area;
analyzing the detected movements of the user;
identifying a posture of the user; and
displaying posture feedback, the posture feedback indicative of proper form of the skeleton frame of the user when performing an exercise.
32. The method of claim 31, further including extracting the skeleton frame as a set of joints.
33. The method of claim 32, further including identifying the set of joints as a right shoulder joint, a left shoulder joint, a right elbow joint, a left elbow joint, a right wrist joint, a left wrist joint, a right hip joint, a left hip joint, a right knee joint, a left knee joint, a right ankle joint, and a left ankle joint.
34. The method of claim 31, further including identifying a platform defining the workout area.
35. The method of claim 31, further including counting a number of repetitions that the user completes during an exercise.
36. The method of claim 31, further including determining calories burned based on a number of parameters including at least one of (1) a weight of the user from the user profile, (2) a type of activity performed by the user, or (3) a metabolic equivalent for a task (MET).
37. The method of claim 31, further including calculating a caloric expenditure based on the number of repetitions performed, a weight of the user, and a weight used when performing the one or more exercises.
38. The method of claim 31, further including identifying an object held by the user.
39. The method of claim 31, further including displaying the tracked movements of the skeleton frame.
40. The method of claim 31, further including displaying a demonstration of the exercise performed with proper form when providing posture feedback.
41. The method of claim 31, further including displaying the posture feedback via a live coach.
42. The method of claim 31, further including displaying the posture feedback via a virtual coach.
43. The method of claim 31, further including displaying a user input interface to enable the user to input information.
44. The method of claim 31, further including detecting an amount of weight used in the exercise.
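The independent claims recite generating a skeleton frame, extracting it as a set of joints, and identifying the posture of the user. A typical building block for such posture analysis (assumed here for illustration; the claims do not recite a particular computation) is the angle at a joint formed by three 2-D keypoints, such as shoulder-elbow-wrist:

```python
import math

def joint_angle(a, b, c):
    """Return the angle in degrees at joint b formed by keypoints a-b-c,
    where each keypoint is an (x, y) pair from a 2-D skeleton frame."""
    v1 = (a[0] - b[0], a[1] - b[1])   # vector from joint to first keypoint
    v2 = (c[0] - b[0], c[1] - b[1])   # vector from joint to second keypoint
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))
```

Comparing such per-joint angles against reference ranges for an exercise is one plausible way the recited posture feedback could be derived.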

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/712,004 US20220219046A1 (en) 2019-12-19 2022-04-01 Smart Gym

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/720,775 US11351419B2 (en) 2019-12-19 2019-12-19 Smart gym
US17/712,004 US20220219046A1 (en) 2019-12-19 2022-04-01 Smart Gym

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/720,775 Continuation US11351419B2 (en) 2019-12-19 2019-12-19 Smart gym

Publications (1)

Publication Number Publication Date
US20220219046A1 (en) 2022-07-14

Family

ID=70279059

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/720,775 Active 2040-09-30 US11351419B2 (en) 2019-12-19 2019-12-19 Smart gym
US17/712,004 Pending US20220219046A1 (en) 2019-12-19 2022-04-01 Smart Gym

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/720,775 Active 2040-09-30 US11351419B2 (en) 2019-12-19 2019-12-19 Smart gym

Country Status (2)

Country Link
US (2) US11351419B2 (en)
DE (1) DE102020129804A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9261526B2 (en) * 2010-08-26 2016-02-16 Blast Motion Inc. Fitting system for sporting equipment
CA3031040C (en) 2015-07-16 2021-02-16 Blast Motion Inc. Multi-sensor event correlation system
US10853934B2 (en) 2018-09-19 2020-12-01 Indus.Ai Inc Patch-based scene segmentation using neural networks
US10769422B2 (en) * 2018-09-19 2020-09-08 Indus.Ai Inc Neural network-based recognition of trade workers present on industrial sites
US20220072381A1 (en) * 2020-09-04 2022-03-10 Rajiv Trehan Method and system for training users to perform activities
CN112932470B (en) * 2021-01-27 2023-12-29 上海萱闱医疗科技有限公司 Assessment method and device for push-up training, equipment and storage medium
FR3119777B1 (en) * 2021-02-15 2024-03-01 Pleyo Interactive training system
US20230025516A1 (en) * 2021-07-22 2023-01-26 Google Llc Multi-Modal Exercise Detection Framework
CN113762087A (en) * 2021-10-19 2021-12-07 安徽中电光达通信技术有限公司 Data processing method and device and computer storage medium
JP2023096823A (en) * 2021-12-27 2023-07-07 株式会社日立製作所 Activity amount calculation device and activity amount calculation method
WO2023140039A1 (en) * 2022-01-21 2023-07-27 ソニーグループ株式会社 Information processing device, information processing method, program, and information analysis system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120190505A1 (en) * 2011-01-26 2012-07-26 Flow-Motion Research And Development Ltd Method and system for monitoring and feed-backing on execution of physical exercise routines
EP3226229A1 (en) * 2016-03-31 2017-10-04 MEDIACLINICS S.r.l. Motion evaluation method and system in a sport context
US20180104541A1 (en) * 2016-09-28 2018-04-19 Bodbox, Inc. Evaluation And Coaching Of Athletic Performance
US10765378B2 (en) * 2016-02-24 2020-09-08 Preaction Technology Corporation Method and system for determining physiological status of users based on marker-less motion capture and generating appropriate remediation plans

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110251493A1 (en) * 2010-03-22 2011-10-13 Massachusetts Institute Of Technology Method and system for measurement of physiological parameters
US9852271B2 (en) * 2010-12-13 2017-12-26 Nike, Inc. Processing data of a user performing an athletic activity to estimate energy expenditure
US20140267611A1 (en) * 2013-03-14 2014-09-18 Microsoft Corporation Runtime engine for analyzing user motion in 3d images
US10078795B2 (en) * 2014-08-11 2018-09-18 Nongjian Tao Systems and methods for non-contact tracking and analysis of physical activity using imaging
US10953305B2 (en) * 2015-08-26 2021-03-23 Icon Health & Fitness, Inc. Strength exercise mechanisms
US11004356B2 (en) * 2015-08-26 2021-05-11 Nike, Inc. Providing workout recap
US11511156B2 (en) * 2016-03-12 2022-11-29 Arie Shavit Training system and methods for designing, monitoring and providing feedback of training
US11355226B2 (en) * 2016-08-05 2022-06-07 Koninklijke Philips N.V. Ambulatory path geometric evaluation
US10737140B2 (en) * 2016-09-01 2020-08-11 Catalyft Labs, Inc. Multi-functional weight rack and exercise monitoring system for tracking exercise movements
KR102457296B1 (en) * 2018-05-29 2022-10-21 큐리어서 프로덕츠 인크. Reflective video display device and methods of use for interactive training and demonstration
US20200020165A1 (en) * 2018-07-12 2020-01-16 Bao Tran Smart device
KR20200077775A (en) * 2018-12-21 2020-07-01 삼성전자주식회사 Electronic device and method for providing information thereof
US10960266B2 (en) * 2019-05-06 2021-03-30 Samuel Messinger System of an artificial intelligence (AI) powered wireless gym
CN113128283A (en) * 2019-12-31 2021-07-16 沸腾时刻智能科技(深圳)有限公司 Evaluation method, model construction method, teaching machine, teaching system and electronic equipment


Also Published As

Publication number Publication date
DE102020129804A1 (en) 2021-06-24
US11351419B2 (en) 2022-06-07
US20200121987A1 (en) 2020-04-23

Similar Documents

Publication Publication Date Title
US11351419B2 (en) Smart gym
US11033776B2 (en) Method and system for athletic motion analysis and instruction
US11673024B2 (en) Method and system for human motion analysis and instruction
US11389085B2 (en) Human physical functional ability and muscle ability comprehensive assessment system and method thereof
US20210379447A1 (en) Interactive exercise apparatus
Velloso et al. Qualitative activity recognition of weight lifting exercises
Fukuda Assessments for sport and athletic performance
US20110281249A1 (en) Method And System For Creating Personalized Workout Programs
CN109637625B (en) Self-learning fitness plan generation system
KR101999748B1 (en) IoT FITNESS EQUIPMENT, EXERCISE INSTRUCTION SYSTEM, AND EXERCISE INSTRUCTION METHOD USING THEREOF
US20210093920A1 (en) Personal Fitness Training System With Biomechanical Feedback
CN106999104A (en) Cardiovascular fitness is assessed
TWI679557B (en) Adaptive sport posture sensing system and method
Nicol et al. The association of range of motion, Dryland strength–power, anthropometry, and velocity in elite breaststroke swimmers
US20200215390A1 (en) Fitness monitoring system
Akkari-Ghazouani et al. Effect of glissade-step on kinetic and kinematic variables of stag ring leaps with and without throw-catch of the ball in rhythmic gymnastics
KR101940032B1 (en) Customized smart health care system for measuring momentum
Cai et al. PoseBuddy: Pose estimation workout mobile application
Petrovic et al. The novel single-stroke kayak test: Can it discriminate between 200-m and longer-distance (500-and 1000-m) specialists in canoe sprint?
US20140073383A1 (en) Method and system for motion comparison
KR20100044588A (en) System for managing personal exercise history of digital muscular exerciser
JP6939939B2 (en) Information processing equipment, information processing methods, and programs
CN113641856A (en) Method and apparatus for outputting information
KR20230005693A (en) Exercise management healthcare service system using tag
TWI722188B (en) Muscle strength training system

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER