US20200043361A1 - Physical Training System and Method - Google Patents


Info

Publication number
US20200043361A1
Authority
US
United States
Prior art keywords
user
touch
sensors
drill
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/601,559
Inventor
Kevin L. Martin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US16/601,559
Publication of US20200043361A1
Status: Abandoned


Classifications

    • A61B 5/162: Testing reaction times
    • G09B 19/00: Teaching not covered by other main groups of this subclass
    • A61B 5/112: Gait analysis
    • A61B 5/1128: Measuring movement of the entire body or parts thereof using image analysis
    • A61B 5/6892: Detecting, measuring or recording means mounted on external non-worn devices; mats
    • A63B 71/0622: Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • G09B 19/0038: Sports
    • A61B 2505/09: Rehabilitation or training
    • A63B 21/4037: Exercise mats for personal use, with or without hand-grips or foot-grips, e.g. for Yoga or supine floor exercises
    • A63B 23/0458: Step exercisers without moving parts
    • A63F 13/214: Input arrangements for video game devices for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 2300/1068: Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/8047: Music games
    • G09B 19/003: Repetitive work cycles; Sequence of movements

Definitions

  • the present disclosure relates to methods, systems, and computer-readable media for providing physical training. More particularly, the present disclosure relates to methods, systems, and computer-readable media for providing instant feedback and objective assessment of the effectiveness of the physical training.
  • an athlete's training process is normally directed by a coach.
  • the coach typically designs training sessions and monitors the athlete's performance.
  • the assessment of an athlete's performance largely depends on the coach's experience, judgment, and patience. It is often difficult for the athlete to receive instant feedback from the coach. In addition, the feedback may not be based on objective measures.
  • a physician often assesses a patient's condition based on overall physical appearance. Subtle imperfections, such as slight imbalance between injured and non-injured legs, are difficult to capture. Therefore, it is desirable to develop systems and methods for providing instant feedback and objective assessment of the effectiveness of physical training.
  • Some disclosed embodiments may involve methods, systems, and computer-readable media for providing physical training to a user.
  • One such system may include a memory for storing a set of instructions.
  • the system may also include a processor communicatively connected to the memory.
  • When executing the set of instructions, the processor may be configured to display a visual indicator on a display panel indicating a movement to be completed by the user, and to determine whether the user completes the movement.
  • the processor may be configured to record one or more characteristics of the movement and provide feedback to the user based on the one or more characteristics.
  • FIG. 1 is a schematic diagram of an exemplary physical training system, in accordance with some disclosed embodiments.
  • FIG. 2 is a schematic diagram of an exemplary controller, in accordance with some disclosed embodiments.
  • FIG. 3 illustrates a block diagram of an exemplary physical training system, in accordance with some disclosed embodiments.
  • FIG. 4A is a schematic diagram of an exemplary sensor pad, in accordance with some disclosed embodiments.
  • FIG. 4B is a schematic diagram of an exemplary sensor unit, in accordance with some disclosed embodiments.
  • FIG. 5 is a flow chart of an exemplary method of determining time duration of touching a sensor unit, in accordance with some disclosed embodiments.
  • FIG. 6 shows an exemplary setup screen for performing an agility drill, in accordance with some disclosed embodiments.
  • Embodiments of the present disclosure may involve systems, methods, and computer-readable media for providing physical training to a user.
  • physical training may include physical condition testing, sports training, physical therapy, athletic performance training, recreational exercise, or a general purpose workout, etc.
  • the physical training may involve one or more physical movements of the body part(s) of the user, such as upper limbs, lower limbs, or whole body movements. A particular set of movements may be referred to as a “drill.”
  • the user may perform one or more pre-set drills and/or customized drills.
  • a customized drill may be built by the user from scratch. Alternatively, a customized drill may be modified from a pre-set drill.
  • FIG. 1 illustrates an exemplary training system 100 .
  • Training system 100 may include a sensor pad 110 .
  • Sensor pad 110 may also be referred to as a touch board, a sensing board, etc.
  • Sensor pad 110 may include one or more sensor units 112 .
  • Sensor unit 112 may sense applied pressure or force and provide signals indicating whether the sensor unit has been pressed or touched. In some embodiments, sensor unit 112 may also provide signals indicating the time duration of the pressure or force applied to the sensor unit 112 .
  • Controller 120 may include a general-purpose portable computing device such as a tablet, a PDA, a mobile phone, a laptop, or any other suitable computing apparatus equipped with a physical training software application.
  • controller 120 may include a dedicated computing device for providing physical training functions.
  • Training system 100 may include a communication interface 130 to provide information exchange between controller 120 and sensor pad 110 .
  • Communication interface 130 may include a wired connection, e.g., via a hardware cable or wire, to provide a communication channel between controller 120 and sensor pad 110 .
  • the hardware cable may include general-purpose cables, such as USB cables.
  • the hardware cable may include some information processing functions.
  • the hardware cable may include built-in electronic chips to perform analog-to-digital signal conversion.
  • communication interface 130 may include a wireless connection, e.g., via WiFi, Bluetooth, infrared, RF, near field communication, etc., to provide a communication channel between controller 120 and sensor pad 110 .
  • controller 120 may be placed on a supporting structure 140 , such as a tripod, so that controller 120 can be held at a proper height to receive input from and provide feedback to the user.
  • Supporting structure 140 may also include a rack, a cart, a hanging rod, or any other suitable means that can be used to hold controller 120 at a proper height.
  • Supporting structure 140 may be adjustable, flexible, moveable, rotatable, etc.
  • training system 100 may include a network interface 150 to connect controller 120 to a network 160 .
  • Network 160 may include a LAN, a WAN, a telecommunication network, the Internet, a VPN, etc.
  • Network interface 150 may include wired and/or wireless connections, such as WiFi, Ethernet, 3G, 4G, LTE, etc.
  • Controller 120 may exchange information with other computers, such as servers or peers, through network 160 .
  • training system 100 may include a server 170 that connects to network 160 .
  • Server 170 may include a database 172 for storing data related to one or more users of training system 100 .
  • FIG. 2 illustrates an exemplary controller 200 .
  • controller 200 may include a display panel 202 .
  • Display panel 202 may display video, image, and/or text information to a user.
  • display panel 202 may display one or more visual indicators 204 .
  • Visual indicator 204 may have various shapes (circle, square, triangle, etc.), colors (yellow, red, green, blue, etc.), sizes, brightness, arrangements (different numbers of indicators in different rows/columns), etc.
  • Visual indicator 204 may be individually rendered.
  • different visual indicators may have different colors, brightness, etc.
  • Display panel 202 may also display one or more input buttons 206 .
  • display panel 202 may include a touch-sensitive layer enabling controller 200 to receive input from a user when the user touches display panel 202 .
  • Controller 200 may also include one or more hard buttons 208, a power switch 212, a connector interface 210, a built-in camera 216, and a wireless communication module 214.
  • connector interface 210 may be used to connect controller 200 to sensor pad 110 through cable 130 .
  • wireless communication module 214 may be used to connect controller 200 to sensor pad 110 via, for example, Bluetooth, WiFi, etc.
  • wireless communication module 214 may be used to connect controller 200 to network 160 via, for example, WiFi, 3G, 4G, LTE, etc.
  • FIG. 3 shows a block diagram of an exemplary physical training system.
  • the system may include a controller 300 .
  • Controller 300 may be a general purpose computer such as a laptop, a portable computing device such as a tablet or a mobile phone, or a computing device dedicated for physical training.
  • controller 300 may include a processor 310 , a memory/storage module 320 , a user input device 330 , a display device 340 , and a communication interface 350 .
  • Processor 310 can be a central processing unit (“CPU”) or a mobile processor. Depending on the type of hardware being used, processor 310 can include one or more printed circuit boards, and/or a microprocessor chip.
  • Processor 310 can execute sequences of computer program instructions to perform various methods that will be explained in greater detail below.
  • Memory/storage module 320 can include, among other things, a random access memory ("RAM"), a read-only memory ("ROM"), and a flash memory. Computer program instructions can be stored in the ROM, the flash memory, or any other suitable memory location, and loaded into the RAM for execution by processor 310.
  • memory/storage module 320 may store an operating system 321 , a software application 322 , and a database 323 . Further, memory/storage module 320 may store an entire software application or only a part of a software application that is executable by processor 310 .
  • software application 322 or portions of it may be stored on a computer readable medium, such as a hard drive, computer disk, CD-ROM, DVD±R, CD±RW or DVD±RW, HD or Blu-ray DVD, flash drive, SD card, memory stick, or any other suitable medium, and can be read and acted upon by processor 310 using routines that have been loaded to memory/storage module 320.
  • input device 330 and display device 340 may be coupled to processor 310 through appropriate interfacing circuitry.
  • input device 330 may be a hardware keyboard, a keypad, or a touch screen, through which a user may input information to controller 300 .
  • Display device 340 may include one or more display screens that display the training interface, result, or any related information to the user.
  • Communication interface 350 may provide communication connections such that controller 300 may exchange data with external devices.
  • controller 300 may be connected to network 380 through communication channel 390 .
  • Network 380 may be a LAN, a WAN, or the Internet.
  • controller 300 may be connected to an accessory 360 through communication channel 370 .
  • Accessory 360 may include, for example, sensor pad 110 or an external camera (not shown).
  • FIG. 4A is a schematic diagram of an exemplary sensor pad, in accordance with some disclosed embodiments.
  • sensor pad 400 may include a number of sensor units 410 and an interface 420 .
  • Sensor units 410 may be assigned predetermined identifiers, such as numbers 1-5, as illustrated in FIG. 4A. It is noted that a different number of sensor units, different kinds of identifiers, and different manners of identifier assignment (e.g., the arrangement and order of the identifiers) may also be used.
  • Sensor pad 400 may provide signals indicating whether a particular sensor unit is touched or pressed. In some embodiments, sensor pad 400 may also provide signals indicating the time duration of the pressure or force applied to sensor pad 400. The signals may be sent to controller 120 (FIG. 1) through interface 420.
  • FIG. 4B is a schematic diagram of an exemplary sensor unit, in accordance with some disclosed embodiments.
  • the sensor unit includes a top panel 412 and a bottom panel 414 .
  • when the sensor unit is touched or pressed, top panel 412 moves downward toward the dashed-line position, changing the resistance sensed by sensor 416.
  • Sensor 416 may in turn generate a signal indicating that the sensor unit has been touched or pressed, based on the resistance change, and output the signal to interface 420 for communication with controller 120 .
  • FIG. 5 is a flow chart of an exemplary method of determining time duration of touching a sensor unit, in accordance with some disclosed embodiments.
  • method 500 may include a series of steps, some of which may be optional.
  • In step 502, the resistance of a next sensor unit is read.
  • the resistance of sensor units 1-5 may be read sequentially, and the next sensor unit may be, for example, sensor unit 5 after the steps described in FIG. 5 have finished a cycle for sensor unit 4.
  • the position of the sensor unit can be determined based on the resistance. For example, the resistance may be smaller when top panel 412 moves DOWN than when top panel 412 moves UP.
  • the UP/DOWN position information can be saved.
  • In step 506, it is determined whether the position information has changed, compared with the previously saved position information. If the position information has changed, the new position can be sent to controller 120.
  • In step 508, it is determined whether the position has changed from UP to DOWN. If so, the new position indicates that the sensor unit has been touched or pressed, and a timer can be started to measure the contact time.
  • In step 510, it is determined whether the position has changed from DOWN to UP. If so, the new position indicates that the sensor unit has been released. The timer can be stopped and the time duration of the pressure or force applied to the sensor unit, for example, can be saved. The process then returns to step 502 to read the next sensor unit (e.g., sensor unit 1).
  • Method 500 may be initiated upon receiving a request from controller 120 . Contact time information may be sent to controller 120 , once available.
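The polling and timing logic of method 500 might be sketched as follows. The `read_resistance` callback, the resistance threshold, and the per-cycle yield are illustrative assumptions; the patent does not specify an implementation.

```python
import time

RESISTANCE_THRESHOLD = 500.0  # assumed cutoff: below this, top panel 412 is DOWN

def poll_sensor_units(read_resistance, num_units=5, on_change=print):
    """Generator sketch of method 500: cycle through sensor units, track
    UP/DOWN positions, and time each press.

    read_resistance(unit_id) -> float is a hypothetical hardware read.
    Yields the accumulated contact durations after each polling cycle.
    """
    positions = {u: "UP" for u in range(1, num_units + 1)}
    press_start = {}
    durations = {u: [] for u in range(1, num_units + 1)}
    while True:
        for unit in range(1, num_units + 1):        # step 502: read next unit
            position = ("DOWN" if read_resistance(unit) < RESISTANCE_THRESHOLD
                        else "UP")                  # step 504: position from resistance
            if position != positions[unit]:         # step 506: position changed
                on_change(unit, position)           # report new position to controller
                if position == "DOWN":              # step 508: UP -> DOWN, start timer
                    press_start[unit] = time.monotonic()
                else:                               # step 510: DOWN -> UP, save duration
                    durations[unit].append(time.monotonic() - press_start.pop(unit))
                positions[unit] = position
        yield durations                             # lets the caller stop or inspect
```

A caller could advance the generator once per polling cycle and stop it when the controller ends the drill.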
  • software application 322 may include a plurality of pre-set drills for performing physical training.
  • pre-set drills may include count drills, react drills, sequence drills, vertical drills, and agility drills, as described herein.
  • a user may also create a customized drill, a new category of drills, and/or a playlist of drills.
  • Count drills are designed to test and/or improve speed or quickness of movements. The user is instructed to perform pre-determined movements during count drills.
  • controller 120 may activate one or more sensor units 112 (or all sensor units in some embodiments) on sensor pad 110. Controller 120 may record the total number of touches on the activated sensor units during specific time durations. Alternatively, controller 120 may record the time duration for finishing a pre-set number of touches. The goal of the user is to touch the activated sensor units as quickly as possible. When the user touches a sensor unit, the corresponding visual indicator may light up on display panel 202. The number of touches and/or the time (elapsed and/or remaining) can be displayed on display panel 202 in real time.
  • the user may be able to choose or program the pre-determined movements (e.g., type or manner of movements, sensor units to be activated, specific order of touches, etc.), number of touches to be accomplished, and/or time duration of the drill.
  • the user can stand on the front or back two sensor units (e.g., 1 and 2 or 4 and 5 in FIG. 4A ), and controller 120 may activate the two sensor units as a left and a right touch target.
  • the number of left and right touches can be displayed separately on display panel 202 in real time.
  • the time (elapsed and/or remaining) can also be displayed on display panel 202.
  • Contact time may also be determined by, for example, method 500 illustrated in FIG. 5 .
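Tallying a count drill such as the left/right example above could be sketched like this. The event format (timestamped unit touches) and the unit assignments are assumptions for illustration.

```python
def score_count_drill(touch_events, drill_seconds, left_unit=1, right_unit=2):
    """Tally left/right touches that occur within the drill window.

    touch_events: iterable of (timestamp_s, unit_id) tuples, with timestamps
    measured from the start of the drill. This interface is hypothetical.
    """
    left = right = 0
    for t, unit in touch_events:
        if t > drill_seconds:
            break                      # drill time expired
        if unit == left_unit:
            left += 1
        elif unit == right_unit:
            right += 1
    return {"left": left, "right": right, "total": left + right}
```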
  • React drills are designed to test and/or improve a user's speed of reaction to visual indicators 204 displayed on display panel 202 .
  • a sequence of visual indicators 204 can be programmed to be displayed on display panel 202 , and a user needs to react to the visual indicators 204 and perform a series of movements, e.g., by touching the corresponding sensor units 112 on sensor pad 110 .
  • the visual indicators can be assigned a certain color, and the user is instructed to react to the visual indicators with the assigned color.
  • the visual indicators can be assigned different colors, and the user is instructed to react to certain colors, but not others. The user reacts to the visual indicator by touching the corresponding sensor unit as quickly as possible.
  • controller 120 may receive a signal from sensor pad 110 indicating a corresponding sensor unit has been touched. Controller 120 may then determine that the user has completed the expected movement and record a correct touch. If the user touches an incorrect sensor unit that does not correspond to the “react to” visual indicator, or the user touches a sensor unit corresponding to a “don't react to” visual indicator, or the user fails to touch the corresponding sensor unit fast enough (e.g., the reaction/response time is longer than a predetermined threshold), controller 120 may determine that the user does not complete the expected movement and record an incorrect touch. The total number of correct touches and the total number of incorrect touches can be displayed on display panel 202 in real time.
  • the time can also be displayed on display panel 202 .
  • response times may also be recorded.
  • the response time may be measured from the time the visual indicator first appears on display panel 202 to the time when the sensor unit (either a correct or an incorrect one) senses the responding/reacting touch.
  • Contact time may also be determined by, for example, method 500 illustrated in FIG. 5 .
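Classifying one react-drill response from these rules might look like the following sketch; the function name, parameters, and the 2-second default threshold are illustrative assumptions.

```python
def score_react_touch(indicator_unit, touched_unit, shown_at, touched_at,
                      max_response_s=2.0, react_to=True):
    """Classify a single react-drill response.

    indicator_unit: sensor unit corresponding to the displayed indicator.
    react_to: False for a "don't react to" indicator.
    Returns (correct, response_time_s), where response time runs from the
    moment the indicator appears to the moment a sensor unit is touched.
    """
    response_time = touched_at - shown_at
    correct = (react_to
               and touched_unit == indicator_unit
               and response_time <= max_response_s)
    return correct, response_time
```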
  • controller 120 may be configured such that the same order of visual indicators in a random sequence is not repeated.
  • Visual indicators may be displayed in different manners.
  • the visual indicator may be displayed in a solid mode, in which the visual indicator remains displayed until the user touches a sensor unit.
  • the visual indicator may be displayed in a flash mode, in which the visual indicator illuminates for a pre-determined duration (e.g., 0.10 to 10.0 seconds in increments of 0.10 seconds) and then disappears, regardless of whether the user touches a sensor unit during this time duration. If the user fails to touch the corresponding sensor unit during the pre-determined time duration, the failure to touch will be treated as an incorrect touch or an incomplete movement.
  • the react drill may include a flip mode.
  • the flip mode the user is instructed to react to the opposite sensor corresponding to the displayed visual indicator.
  • the user may be instructed to react to a diagonally opposite sensor. For example, if the top right visual indicator appears on the display panel, the user should react to the sensor unit on the bottom left.
  • the user may be instructed to react to a lateral opposite sensor. For example, if the top right visual indicator appears on the display panel, the user should react to the top left sensor unit.
  • the user may be instructed to react to a linear opposite sensor. For example, if the top right visual indicator appears on the display panel, the user should react to the bottom right sensor unit.
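The three flip-mode mappings can be expressed compactly by reflecting coordinates on a 2x2 grid of corner targets. How these corners map onto the numbered sensor units of FIG. 4A is an assumption for illustration.

```python
# Corner names for a 2x2 arrangement of targets, as (row, col) coordinates.
CORNERS = {"top_left": (0, 0), "top_right": (0, 1),
           "bottom_left": (1, 0), "bottom_right": (1, 1)}

def flip_target(corner, mode):
    """Return the corner the user should touch in flip mode."""
    row, col = CORNERS[corner]
    if mode == "diagonal":        # e.g. top right -> bottom left
        row, col = 1 - row, 1 - col
    elif mode == "lateral":       # e.g. top right -> top left
        col = 1 - col
    elif mode == "linear":        # e.g. top right -> bottom right
        row = 1 - row
    else:
        raise ValueError(f"unknown flip mode: {mode}")
    return next(name for name, pos in CORNERS.items() if pos == (row, col))
```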
  • Sequence drills are similar to react drills.
  • One of the differences is that instead of random sequences, pre-determined sequences of visual indicators are used in sequence drills.
  • the pre-determined sequences can be generated by the software application or by the user.
  • the user may set up a sequence by inputting a series of numbers indicating the corresponding sensor units.
  • the total number of sensor units in a sequence may vary. For example, in some embodiments, the total number may be from 1 to 20.
  • the number of correct and incorrect touches can also be provided and displayed on display panel 202 in real time, similar to that of the react drills.
  • the time (elapsed and/or remaining) can also be displayed on display panel 202.
  • Contact time may also be determined by, for example, method 500 illustrated in FIG. 5 .
  • a time delay can be set between the appearance of a visual indicator and the receiving of a user response (e.g., a correct touch or an incorrect touch). For example, a 2- to 3-second delay may be added before the user responds to the visual indicator.
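Scoring a sequence drill against a user-entered series of unit numbers could be sketched as follows; the interface is hypothetical.

```python
def score_sequence_drill(expected_units, touched_units):
    """Compare touches against a pre-determined sequence of unit numbers.

    A touch counts as correct when it matches the expected unit at the
    same position in the sequence.
    """
    correct = sum(1 for e, t in zip(expected_units, touched_units) if e == t)
    return {"correct": correct, "incorrect": len(touched_units) - correct}
```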
  • controller 120 may save the result of the drill.
  • controller 120 may provide comparative data about the current result and previous results of the user, thereby showing whether the user's performance has improved over time.
  • the comparative data may be displayed on display panel 202 using a tabular view and/or a graphical view.
  • the tabular view may include percentage improvement or decrease, in addition to current result and past results.
  • the graphical view may include bar/circle/curve graphical representations of the results comparisons.
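The percentage improvement or decrease shown in the tabular view reduces to a simple relative change; this helper is an illustrative sketch, not the patent's method.

```python
def percent_change(current, previous):
    """Percentage change of a current result vs. a previous one.

    Whether a positive value is an improvement depends on the metric:
    more correct touches is better, while a shorter completion time is better.
    """
    if previous == 0:
        return float("inf") if current else 0.0
    return (current - previous) / previous * 100.0
```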
  • Drills may be performed using either lower limbs (legs) or upper limbs (hands).
  • in a lower limb mode, sensor pad 110 can be used to sense the user's movements/touches.
  • in an upper limb mode, the user may perform the drill by touching display panel 202 directly using his/her fingers.
  • Agility drills may be performed to test and/or improve the agility of a user.
  • the user responds to a sequence of arrows displayed on display panel 202 and moves in the direction of the arrow.
  • FIG. 6 shows an exemplary setup screen for performing an agility drill in an arrow sequence mode.
  • eight arrows represent eight directions identified by numbers 1 to 8.
  • the user is able to program the sequence of the drill by inputting the sequence of numbers.
  • Number 9 can be used as a wild card.
  • arrows may appear randomly.
  • the length of the agility drill can be programmed by time (e.g., 1 second to 60 mins) or by arrow count (e.g., 1 to 20 arrows).
  • the delay time between the appearance of two adjacent arrows can also be programmed (e.g., from 1 to 20 seconds in increments of 0.5 seconds).
  • Vertical drills may be performed to assess height of a user's jump.
  • the user stands on one or more of the sensor units, jumps, and lands on the same sensor unit.
  • Controller 120 can detect the duration of time between the user's lift off from the sensor unit (e.g., sensor unit released) and the user's next contact with the sensor unit (e.g., sensor unit compressed again). Contact time may also be determined by, for example, method 500 illustrated in FIG. 5 .
  • Body weight of the user can be taken into account to improve accuracy. Similar to count drills, the number of jumps and the time (elapsed and/or remaining) can be displayed on display panel 202 in real time.
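The flight time between lift-off and landing can be converted to an estimated jump height using the standard projectile relation h = g·t²/8, a common approach for jump mats; the patent itself does not state a formula, so this is an illustrative sketch.

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(flight_time_s):
    """Estimate jump height in meters from time airborne.

    With a symmetric rise and fall, the time to peak is t/2, so
    h = g * (t/2)**2 / 2 = g * t**2 / 8.
    """
    return G * flight_time_s ** 2 / 8.0
```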
  • the user can program the length of the drill by specifying either the time duration of the drill or the target number of touches.
  • the target number of touches may include the number of correct touches, the number of total touches (both correct and/or incorrect), or the number of visual indicators appearing on the display panel.
  • Videos of a user performing a drill can be recorded.
  • a video can be recorded using camera 216 ( FIG. 2 ) that is equipped with controller 120 .
  • the recorded video may be provided to the user along with other performance data collected, for example, from sensor pad 110 .
  • sensor pad 110 may be replaced by a high speed camera.
  • the high speed camera may be set up to synchronize with controller 120 to monitor the user performing a drill.
  • the user can place any type of targets on the ground, e.g., a rubber mat, rubber dots, spray painted dots, etc., for the high speed camera to capture and register the user's touch/contact actions or movements.
  • the camera can be used for gait testing and analysis, training and rehabilitation for high performance athletes, or general rehabilitation patients.
  • a user may create a use profile.
  • the user profile may include use data such as name, gender, age, height, weight, sports, position, injuries, etc.
  • Performance data and videos of pre-set and customized drills may be saved based on user profiles.
  • performance data may be exported (e.g., as .csv files).
  • an online database may be provided to users (e.g., for an annual fee or other suitable fee structures).
  • the database may allow users to upload their saved drill results for comparison with other users.
  • Databases can be developed for the general consumers, high school sports teams, collegiate sports teams, professional teams, Olympic athletes, physical therapy clinics, sports medicine clinics, etc.
  • Controller 120 may upload saved drill results and the user profile data in order to compare the results.
  • a user profile may include fields for segregating, filtering, or targeting certain user information, such as sports, position, professional level, demographic information to specific drills data and settings.
  • Drills listed in the online database may contain demo videos to display how the drills should be performed.
  • Exemplary online databases include: (1) a performance database for individuals, high schools and colleges to compare results to other users around the world; (2) a rehabilitation database for physical therapy clinics, collegiate/professional sports medicine staffs and sports medicine clinics; and (3) a high level athletic performance and rehabilitation database for professional sports teams or Olympic level training facilities that includes normative data of athletes who possess similar performance abilities.
  • the high level database may isolate normative data from professional leagues, e.g., NBA, NFL, MLB, NHL, etc.

Abstract

Systems, methods, and computer-readable media for providing a physical training routine for a user are disclosed. One such method may include displaying a visual indicator indicating a movement to be completed by the user and determining whether the user completes the movement. The method may also include recording one or more characteristics of the movement and providing feedback to the user based on the one or more characteristics.

Description

    RELATED APPLICATION
  • The present application claims priority to U.S. Provisional Patent Application No. 61/604,417, filed on Feb. 28, 2012, which is fully incorporated herein by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to methods, systems, and computer-readable media for providing physical training. More particularly, the present disclosure relates to methods, systems, and computer-readable media for providing instant feedback and objective assessment of the effectiveness of the physical training.
  • BACKGROUND
  • In sports training, an athlete's training process is normally directed by a coach. The coach typically designs training sessions and monitors the athlete's performance. In this traditional framework, the assessment of an athlete's performance largely depends on the coach's experience, judgment, and patience. It is often difficult for the athlete to receive instant feedback from the coach. In addition, the feedback may not be based on objective measures. In physical therapy, especially rehabilitation treatment, a physician often assesses a patient's condition based on overall physical appearance. Subtle imperfections, such as a slight imbalance between injured and non-injured legs, are difficult to capture. Therefore, it is desirable to develop systems and methods for providing instant feedback and objective assessment of the effectiveness of physical training.
  • SUMMARY
  • Some disclosed embodiments may involve methods, systems, and computer-readable media for providing physical training to a user. One such system may include a memory for storing a set of instructions. The system may also include a processor communicatively connected to the memory. When executing the set of instructions, the processor may be configured to display a visual indicator on a display panel indicating a movement to be completed by the user and determine whether the user completes the movement. Moreover, the processor may be configured to record one or more characteristics of the movement and provide feedback to the user based on the one or more characteristics.
  • The preceding summary is not intended to restrict in any way the scope of the claimed invention. In addition, it is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate various embodiments and exemplary aspects of the present invention and, together with the description, explain principles of the invention. In the drawings:
  • FIG. 1 is a schematic diagram of an exemplary physical training system, in accordance with some disclosed embodiments;
  • FIG. 2 is a schematic diagram of an exemplary controller, in accordance with some disclosed embodiments;
  • FIG. 3 illustrates a block diagram of an exemplary physical training system, in accordance with some disclosed embodiments;
  • FIG. 4A is a schematic diagram of an exemplary sensor pad, in accordance with some disclosed embodiments;
  • FIG. 4B is a schematic diagram of an exemplary sensor unit, in accordance with some disclosed embodiments;
  • FIG. 5 is a flow chart of an exemplary method of determining time duration of touching a sensor unit, in accordance with some disclosed embodiments; and
  • FIG. 6 shows an exemplary setup screen for performing an agility drill, in accordance with some disclosed embodiments.
  • DESCRIPTION OF EXEMPLARY EMBODIMENTS
  • Reference will now be made in detail to exemplary embodiments of the invention, examples of which are illustrated in the accompanying drawings. When appropriate, the same reference numbers are used throughout the drawings to refer to the same or like parts.
  • Embodiments of the present disclosure may involve systems, methods, and computer-readable media for providing physical training to a user. As used herein, physical training may include physical condition testing, sports training, physical therapy, athletic performance training, recreational exercise, or a general purpose workout, etc. The physical training may involve one or more physical movements of the body part(s) of the user, such as upper limbs, lower limbs, or whole body movements. A particular set of movements may be referred to as a “drill.” During a training session, the user may perform one or more pre-set drills and/or customized drills. A customized drill may be built by the user from scratch. Alternatively, a customized drill may be modified from a pre-set drill.
  • FIG. 1 illustrates an exemplary training system 100. Training system 100 may include a sensor pad 110. Sensor pad 110 may also be referred to as a touch board, a sensing board, etc. Sensor pad 110 may include one or more sensor units 112. Sensor unit 112 may sense applied pressure or force and provide signals indicating whether the sensor unit has been pressed or touched. In some embodiments, sensor unit 112 may also provide signals indicating the time duration of the pressure or force applied to the sensor unit 112.
  • Training system 100 may include a controller 120. Controller 120 may include a general-purpose portable computing device such as a tablet, a PDA, a mobile phone, a laptop, or any other suitable computing apparatus equipped with a physical training software application. In some embodiments, controller 120 may include a dedicated computing device for providing physical training functions.
  • Training system 100 may include a communication interface 130 to provide information exchange between controller 120 and sensor pad 110. Communication interface 130 may include a wired connection, e.g., via a hardware cable or wire, to provide a communication channel between controller 120 and sensor pad 110. The hardware cable may include general-purpose cables, such as USB cables. In some embodiments, the hardware cable may include some information processing functions. For example, the hardware cable may include built-in electronic chips to perform analog-to-digital signal conversion. In some embodiments, communication interface 130 may include a wireless connection, e.g., via WiFi, Bluetooth, infrared, RF, near field communication, etc., to provide a communication channel between controller 120 and sensor pad 110.
  • In some embodiments, such as during a training session, controller 120 may be placed on a supporting structure 140, such as a tripod, so that controller 120 can be held at a proper height to receive input from and provide feedback to the user. Supporting structure 140 may also include a rack, a cart, a hanging rod, or any other suitable means that can be used to hold controller 120 at a proper height. Supporting structure 140 may be adjustable, flexible, moveable, rotatable, etc.
  • In some embodiments, training system 100 may include a network interface 150 to connect controller 120 to a network 160. Network 160 may include a LAN, a WAN, a telecommunication network, the Internet, a VPN, etc. Network interface 150 may include wired and/or wireless connections, such as WiFi, Ethernet, 3G, 4G, LTE, etc. Controller 120 may exchange information with other computers, such as servers or peers, through network 160.
  • In some embodiments, training system 100 may include a server 170 that connects to network 160. Server 170 may include a database 172 for storing data related to one or more users of training system 100.
  • FIG. 2 illustrates an exemplary controller 200. As shown in FIG. 2, controller 200 may include a display panel 202. Display panel 202 may display video, image, and/or text information to a user. For example, display panel 202 may display one or more visual indicators 204. Visual indicator 204 may have various shapes (circle, square, triangle, etc.), colors (yellow, red, green, blue, etc.), sizes, brightness, arrangements (different numbers of indicators in different rows/columns), etc. Visual indicator 204 may be individually rendered. For example, different visual indicators may have different colors, brightness, etc. Display panel 202 may also display one or more input buttons 206. In some embodiments, display panel 202 may include a touch-sensitive layer enabling controller 200 to receive input from a user when the user touches display panel 202.
  • Controller 200 may also include one or more hard buttons 208, a power switch 212, a connector interface 210, a built-in camera 216, and a wireless communication module 214. In some embodiments, connector interface 210 may be used to connect controller 200 to sensor pad 110 through cable 130. In some embodiments, wireless communication module 214 may be used to connect controller 200 to sensor pad 110 via, for example, Bluetooth, WiFi, etc. In some embodiments, wireless communication module 214 may be used to connect controller 200 to network 160 via, for example, WiFi, 3G, 4G, LTE, etc.
  • FIG. 3 shows a block diagram of an exemplary physical training system. Consistent with some embodiments, the system may include a controller 300. Controller 300 may be a general purpose computer such as a laptop, a portable computing device such as a tablet or a mobile phone, or a computing device dedicated for physical training. As shown in FIG. 3, controller 300 may include a processor 310, a memory/storage module 320, a user input device 330, a display device 340, and a communication interface 350. Processor 310 can be a central processing unit (“CPU”) or a mobile processor. Depending on the type of hardware being used, processor 310 can include one or more printed circuit boards, and/or a microprocessor chip. Processor 310 can execute sequences of computer program instructions to perform various methods that will be explained in greater detail below.
  • Memory/storage module 320 can include, among other things, a random access memory (“RAM”), a read-only memory (“ROM”), and a flash memory. The computer program instructions can be stored, accessed, and read from the ROM or flash, or any other suitable memory location, and loaded into the RAM for execution by processor 310. For example, memory/storage module 320 may store an operating system 321, a software application 322, and a database 323. Further, memory/storage module 320 may store an entire software application or only a part of a software application that is executable by processor 310.
  • In some embodiments, software application 322 or portions of it may be stored on a computer readable medium, such as a hard drive, computer disk, CD-ROM, DVD±R, CD±RW or DVD±RW, HD or Blu-ray DVD, flash drive, SD card, memory stick, or any other suitable medium, and can be read and acted upon by processor 310 using routines that have been loaded to memory/storage module 320.
  • In some embodiments, input device 330 and display device 340 may be coupled to processor 310 through appropriate interfacing circuitry. In some embodiments, input device 330 may be a hardware keyboard, a keypad, or a touch screen, through which a user may input information to controller 300. Display device 340 may include one or more display screens that display the training interface, result, or any related information to the user.
  • Communication interface 350 may provide communication connections such that controller 300 may exchange data with external devices. For example, controller 300 may be connected to network 380 through communication channel 390. Network 380 may be a LAN, a WAN, or the Internet. In some embodiments, controller 300 may be connected to an accessory 360 through communication channel 370. Accessory 360 may include, for example, sensor pad 110 or an external camera (not shown).
  • FIG. 4A is a schematic diagram of an exemplary sensor pad, in accordance with some disclosed embodiments. Referring to FIG. 4A, sensor pad 400 may include a number of sensor units 410 and an interface 420. Sensor units 410 may be assigned predetermined identifiers, such as numbers 1-5, as illustrated in FIG. 4A. It is noted that different numbers of sensor units, different kinds of identifiers, and different manners of identifier assignment (e.g., the arrangement and order of the identifiers) may also be used. Sensor pad 400 may provide signals indicating whether a particular sensor unit is touched or pressed. In some embodiments, sensor pad 400 may also provide signals indicating the time duration of the pressure or force applied to sensor pad 400. The signals may be sent to controller 120 (FIG. 1) through interface 420.
  • FIG. 4B is a schematic diagram of an exemplary sensor unit, in accordance with some disclosed embodiments. In FIG. 4B, the sensor unit includes a top panel 412 and a bottom panel 414. When a user touches or presses the sensor unit, top panel 412 moves downward toward the dashed-line position. Such position change may lead to a change of resistance, which may be sensed by sensor 416. Sensor 416 may in turn generate a signal indicating that the sensor unit has been touched or pressed, based on the resistance change, and output the signal to interface 420 for communication with controller 120.
  • FIG. 5 is a flow chart of an exemplary method of determining the time duration of touching a sensor unit, in accordance with some disclosed embodiments. As shown in FIG. 5, method 500 may include a series of steps, some of which may be optional. In step 502, sensor 416 reads the resistance of the next sensor unit. For example, as shown in FIG. 4A, the resistance of sensor units 1-5 may be read sequentially, and the next sensor unit may be, for example, sensor unit 5 after the steps described in FIG. 5 have finished a cycle for sensor unit 4. In step 504, the position of the sensor unit can be determined based on the resistance. For example, the resistance may be smaller when top panel 412 is DOWN than when top panel 412 is UP. The UP/DOWN position information can be saved. In step 506, it is determined whether the position information has changed compared with the previously saved position information. If the position information has changed, the new position can be sent to controller 120. In step 508, it is determined whether the position has changed from UP to DOWN. If so, the new position indicates that the sensor unit has been touched or pressed, and a timer can be started to measure the contact time. In step 510, it is determined whether the position has changed from DOWN to UP. If so, the new position indicates that the sensor unit has been released. The timer can be stopped, and the time duration of the pressure or force applied to the sensor unit can be saved. The process then returns to step 502 to read the next sensor unit (e.g., sensor unit 1). Method 500 may be initiated upon receiving a request from controller 120. Contact time information may be sent to controller 120 once available.
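As an illustration of the scan loop of method 500, the sketch below polls each sensor unit in turn and starts/stops a timer on UP-to-DOWN and DOWN-to-UP transitions. All names here (`poll_sensor_units`, the `read_position` and `now` callables) are hypothetical; the disclosure does not specify an implementation.

```python
import time

def poll_sensor_units(read_position, num_units=5, cycles=1, now=time.monotonic):
    """Scan sensor units 1..num_units round-robin; on an UP->DOWN
    transition start a timer, on DOWN->UP record the contact time.
    read_position(unit) must return "UP" or "DOWN"."""
    last = {u: "UP" for u in range(1, num_units + 1)}
    started = {}                       # unit -> timestamp of touch
    contact_times = []                 # (unit, duration) events
    for _ in range(cycles):
        for unit in range(1, num_units + 1):
            pos = read_position(unit)          # steps 502/504
            if pos != last[unit]:              # step 506: position changed
                if pos == "DOWN":              # step 508: touched/pressed
                    started[unit] = now()
                elif unit in started:          # step 510: released
                    contact_times.append((unit, now() - started.pop(unit)))
                last[unit] = pos
    return contact_times
```

Injecting `read_position` and `now` as callables keeps the loop testable without hardware; on the real device, `read_position` would derive UP/DOWN from the measured resistance.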
  • In some embodiments, software application 322 may include a plurality of pre-set drills for performing physical training. Examples of pre-set drills may include count drills, react drills, sequence drills, vertical drills, and agility drills, as described herein. A user may also create a customized drill, a new category of drills, and/or a playlist of drills.
  • Count drills are designed to test and/or improve speed or quickness of movements. The user is instructed to perform pre-determined movements during count drills. In an exemplary count drill, controller 120 may activate one or more sensor units 112 (or all sensor units in some embodiments) on sensor pad 110. Controller 120 may record the total number of touches on the activated sensor units during a specific time duration. Alternatively, controller 120 may record the time duration for finishing a pre-set number of touches. The goal of the user is to touch the activated sensor units as quickly as possible. When the user touches a sensor unit, the corresponding visual indicator may light up on display panel 202. The number of touches and/or the time (spent and/or remaining) can be displayed on display panel 202 in real time. The user may be able to choose or program the pre-determined movements (e.g., type or manner of movements, sensor units to be activated, specific order of touches, etc.), the number of touches to be accomplished, and/or the time duration of the drill. In another example, the user can stand on the front or back two sensor units (e.g., 1 and 2 or 4 and 5 in FIG. 4A), and controller 120 may activate the two sensor units as a left and a right touch target. The numbers of left and right touches can be displayed separately on display panel 202 in real time. The time (spent and/or remaining) can also be displayed on display panel 202. Contact time may also be determined by, for example, method 500 illustrated in FIG. 5.
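The count-drill bookkeeping described above can be sketched as follows, assuming touch events arrive as (timestamp, unit) pairs; the function name and event format are illustrative assumptions, not taken from the disclosure.

```python
def score_count_drill(touches, activated_units, drill_seconds):
    """Count the touches on activated sensor units that occur before
    the drill time expires. touches is a list of (timestamp, unit)."""
    return sum(1 for t, unit in touches
               if t <= drill_seconds and unit in activated_units)
```

The same event stream could be split per unit (e.g., left vs. right targets) by filtering `activated_units` to a single unit at a time.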
  • React drills are designed to test and/or improve a user's speed of reaction to visual indicators 204 displayed on display panel 202. In an exemplary react drill, a sequence of visual indicators 204 can be programmed to be displayed on display panel 202, and a user needs to react to the visual indicators 204 and perform a series of movements, e.g., by touching the corresponding sensor units 112 on sensor pad 110. In some embodiments, the visual indicators can be assigned a certain color, and the user is instructed to react to the visual indicators with the assigned color. In some embodiments, the visual indicators can be assigned different colors, and the user is instructed to react to certain colors but not others. The user reacts to a visual indicator by touching the corresponding sensor unit as quickly as possible. If the user correctly touches the corresponding sensor unit, controller 120 may receive a signal from sensor pad 110 indicating that the corresponding sensor unit has been touched. Controller 120 may then determine that the user has completed the expected movement and record a correct touch. If the user touches an incorrect sensor unit that does not correspond to the "react to" visual indicator, touches a sensor unit corresponding to a "don't react to" visual indicator, or fails to touch the corresponding sensor unit fast enough (e.g., the reaction/response time is longer than a predetermined threshold), controller 120 may determine that the user did not complete the expected movement and record an incorrect touch. The total number of correct touches and the total number of incorrect touches can be displayed on display panel 202 in real time. The time (spent and/or remaining) can also be displayed on display panel 202. In some embodiments, response times may also be recorded. The response time may be measured from the time the visual indicator first appears on display panel 202 to the time when a sensor unit (either a correct or an incorrect one) senses the responding/reacting touch. Contact time may also be determined by, for example, method 500 illustrated in FIG. 5.
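A minimal sketch of the correct/incorrect classification and response-time measurement described above; the threshold value and all names are illustrative assumptions, not taken from the disclosure.

```python
def score_react_touch(target_unit, touched_unit, shown_at, touched_at,
                      react_to=True, max_response=1.5):
    """Classify one react-drill response and return its response time.
    A touch is correct only if the indicator is a "react to" indicator,
    the right unit is touched, and the response beats the threshold."""
    response_time = touched_at - shown_at   # indicator shown -> touch sensed
    correct = (react_to and touched_unit == target_unit
               and response_time <= max_response)
    return ("correct" if correct else "incorrect", response_time)
```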
  • The sequence of visual indicators in react drills may be randomly generated. In some embodiments, controller 120 may be configured such that the same order of visual indicators in a random sequence is not repeated.
  • Visual indicators may be displayed in different manners. For example, a visual indicator may be displayed in a solid mode, in which the visual indicator remains displayed until the user touches a sensor unit. In another example, a visual indicator may be displayed in a flash mode, in which the visual indicator illuminates for a pre-determined duration (e.g., 0.10 to 10.0 seconds in increments of 0.10 seconds) and then disappears, regardless of whether the user touches a sensor unit during this time duration. If the user fails to touch the corresponding sensor unit during the pre-determined time duration, the failure to touch is treated as an incorrect touch or an incomplete movement.
  • In some embodiments, the react drill may include a flip mode. In the flip mode, the user is instructed to react to the opposite sensor corresponding to the displayed visual indicator. In one embodiment, the user may be instructed to react to a diagonally opposite sensor. For example, if the top right visual indicator appears on the display panel, the user should react to the sensor unit on the bottom left. In another embodiment, the user may be instructed to react to a lateral opposite sensor. For example, if the top right visual indicator appears on the display panel, the user should react to the top left sensor unit. In yet another embodiment, the user may be instructed to react to a linear opposite sensor. For example, if the top right visual indicator appears on the display panel, the user should react to the bottom right sensor unit.
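The three flip-mode mappings can be expressed as a small coordinate transform. The sketch below is illustrative only: it assumes the corner sensors are addressed by (row, column) with row 0 at the top and column 0 at the left, a coordinate scheme not given in the disclosure.

```python
def flip_target(indicator, mode, rows=2, cols=2):
    """Map a visual indicator's position to the sensor the user
    should touch in the given flip mode."""
    r, c = indicator
    if mode == "diagonal":        # e.g., top-right -> bottom-left
        return (rows - 1 - r, cols - 1 - c)
    if mode == "lateral":         # e.g., top-right -> top-left
        return (r, cols - 1 - c)
    if mode == "linear":          # e.g., top-right -> bottom-right
        return (rows - 1 - r, c)
    raise ValueError(f"unknown flip mode: {mode}")
```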
  • Sequence drills are similar to react drills. One of the differences is that instead of random sequences, pre-determined sequences of visual indicators are used in sequence drills. The pre-determined sequences can be generated by the software application or by the user. The user may set up a sequence by inputting a series of numbers indicating the corresponding sensor units. The total number of sensor units in a sequence may vary. For example, in some embodiments, the total number may be from 1 to 20. In sequence drills, the numbers of correct and incorrect touches can also be provided and displayed on display panel 202 in real time, similar to react drills. The time (spent and/or remaining) can also be displayed on display panel 202. Contact time may also be determined by, for example, method 500 illustrated in FIG. 5.
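As a rough sketch, scoring a sequence drill amounts to comparing the touched unit numbers against the programmed sequence position by position; the function name and counting convention below are illustrative assumptions.

```python
def score_sequence_drill(programmed, touched):
    """Compare touched unit numbers against the programmed sequence,
    position by position; return (correct, incorrect) counts.
    Any touch that does not match its programmed position counts
    as incorrect."""
    correct = sum(1 for want, got in zip(programmed, touched) if want == got)
    return correct, len(touched) - correct
```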
  • In both react and sequence drills, a time delay can be set between the appearance of a visual indicator and the receipt of a user response (e.g., a correct touch or an incorrect touch). For example, a 2-3 second delay may be added before the user responds to the visual indicator.
  • After a user finishes a drill, controller 120 may save the result of the drill. In some embodiments, controller 120 may provide comparative data about the current result and previous results of the user, thereby showing whether the user's performance has improved over time. The comparative data may be displayed on display panel 202 using a tabular view and/or a graphical view. The tabular view may include the percentage improvement or decrease, in addition to the current result and past results. The graphical view may include bar/circle/curve graphical representations of the result comparisons.
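The percentage improvement or decrease in the tabular view reduces to a relative change against the previous result. A minimal sketch (for count-style results where a higher number is better; the interpretation flips for timed results):

```python
def percent_change(previous, current):
    """Relative change of the current result vs. the previous one,
    as a percentage (positive = higher, negative = lower)."""
    if previous == 0:
        raise ValueError("previous result must be nonzero")
    return (current - previous) / previous * 100.0
```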
  • Drills may be performed using either lower limbs (legs) or upper limbs (hands). In lower limb mode, sensor board 110 can be used to sense the user's movements/touches. In upper limb mode, the user may perform the drill by touching display panel 202 directly using his/her fingers.
  • Agility drills may be performed to test and/or improve the agility of a user. In an agility drill, the user responds to a sequence of arrows displayed on display panel 202 and moves in the direction of the arrow. FIG. 6 shows an exemplary setup screen for performing an agility drill in an arrow sequence mode. In FIG. 6, eight arrows represent eight directions identified by numbers 1 to 8. In the arrow sequence mode, the user is able to program the sequence of the drill by inputting the sequence of numbers. Number 9 can be used as a wild card. Alternatively, in an arrow react mode, arrows may appear randomly.
  • The length of the agility drill can be programmed by time (e.g., 1 second to 60 minutes) or by arrow count (e.g., 1 to 20 arrows). The delay time between the appearance of two consecutive arrows can also be programmed (e.g., from 1 to 20 seconds in increments of 0.5 seconds).
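One way to implement the wild-card behavior described above (number 9 standing in for a random direction 1-8) is a simple expansion pass over the programmed sequence. This sketch is an assumption, with an injectable random source for reproducibility; the disclosure does not describe an implementation.

```python
import random

def expand_arrow_sequence(sequence, rng=random):
    """Replace each wild-card 9 in a programmed arrow sequence with
    a random direction 1-8; other entries pass through unchanged."""
    return [rng.randint(1, 8) if n == 9 else n for n in sequence]
```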
  • Vertical drills may be performed to assess the height of a user's jump. In a vertical drill, the user stands on one or more of the sensor units, jumps, and lands on the same sensor unit(s). Controller 120 can detect the duration of time between the user's lift-off from the sensor unit (e.g., sensor unit released) and the user's next contact with the sensor unit (e.g., sensor unit compressed again). Contact time may also be determined by, for example, method 500 illustrated in FIG. 5. The body weight of the user can be taken into account to improve accuracy. As with count drills, the number of jumps and the time (spent and/or remaining) can be displayed on display panel 202 in real time.
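The vertical drill measures flight time (lift-off to landing) rather than height directly. A common way to convert one to the other, not specified in the disclosure, is the standard kinematic relation h = g·t²/8, which assumes the body is in free fall while airborne and rises for half the flight time:

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(flight_seconds, g=G):
    """Estimate jump height (meters) from the time between lift-off
    and landing: h = g * t^2 / 8."""
    return g * flight_seconds ** 2 / 8.0
```

For example, a 0.5-second flight time corresponds to roughly 0.31 m of vertical rise.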
  • For every type of drill, the user can program the length of the drill by specifying either the time duration of the drill or the target number of touches. The target number of touches may include the number of correct touches, the number of total touches (both correct and incorrect), or the number of visual indicators appearing on the display panel.
  • Videos of a user performing a drill can be recorded. For example, a video can be recorded using camera 216 (FIG. 2), which is built into controller 120. The recorded video may be provided to the user along with other performance data collected, for example, from sensor pad 110.
  • In some embodiments, sensor pad 110 may be replaced by a high speed camera. For example, the high speed camera may be set up to synchronize with controller 120 to monitor the user performing a drill. The user can place any type of targets on the ground, e.g., a rubber mat, rubber dots, spray painted dots, etc., for the high speed camera to capture and register the user's touch/contact actions or movements. In some embodiments, the camera can be used for gait testing and analysis, training and rehabilitation for high performance athletes, or general rehabilitation patients.
  • In some embodiments, a user may create a user profile. The user profile may include user data such as name, gender, age, height, weight, sports, position, injuries, etc. Performance data and videos of pre-set and customized drills may be saved based on user profiles. In some embodiments, performance data may be exported (e.g., as .csv files).
  • In some embodiments, an online database (e.g., database 172) may be provided to users (e.g., for an annual fee or other suitable fee structures). The database may allow users to upload their saved drill results for comparison with other users. Databases can be developed for general consumers, high school sports teams, collegiate sports teams, professional teams, Olympic athletes, physical therapy clinics, sports medicine clinics, etc. Controller 120 may upload saved drill results and the user profile data in order to compare the results. A user profile may include fields for segregating, filtering, or targeting certain user information, such as sport, position, professional level, or demographic information, to specific drill data and settings. Drills listed in the online database may contain demo videos to demonstrate how the drills should be performed.
  • Exemplary online databases include: (1) a performance database for individuals, high schools and colleges to compare results to other users around the world; (2) a rehabilitation database for physical therapy clinics, collegiate/professional sports medicine staffs and sports medicine clinics; and (3) a high level athletic performance and rehabilitation database for professional sports teams or Olympic level training facilities that includes normative data of athletes who possess similar performance abilities. The high level database may isolate normative data from professional leagues, e.g., NBA, NFL, MLB, NHL, etc.
  • In the foregoing description of exemplary embodiments, various features are grouped together in a single embodiment for purposes of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the following claims are hereby incorporated into this description of the exemplary embodiments, with each claim standing on its own as a separate embodiment of the invention.
  • Moreover, it will be apparent to those skilled in the art from consideration of the specification and practice of the present disclosure that various modifications and variations can be made to the disclosed systems and methods without departing from the scope of the disclosure, as claimed. Thus, it is intended that the specification and examples be considered as exemplary only, with a true scope of the present disclosure being indicated by the following claims and their equivalents.

Claims (16)

1-20. (canceled)
21. A system for providing physical training routines for a user, the system comprising:
a sensor pad, the sensor pad including a plurality of sensor units for registering foot touches by said user, the sensor pad configured to generate a touch signal when a foot touch is received by a sensor unit;
a general-purpose portable computing device, the general-purpose portable computing device comprising a memory, a controller, and an associated display panel,
the memory storing program instructions for physical training routines,
the controller operatively connected to the sensor units of the sensor pad for receiving the touch signals generated by the sensor pad, the controller including a processor communicatively connected to the memory, wherein the program instructions, when executed by the processor, cause the processor to perform operations including:
controlling the display panel to display visual indicators on the display panel, each of the visual indicators arranged to correspond to a position of one of the sensor units of the sensor pad for use in directing said user to touch a sequence of sensor units,
the program instructions programmed to allow said user to select from at least three separate drills, the drills comprising a count drill, a sequence drill, and a react drill, wherein
in the count drill, the display indicates one or more of the plurality of sensors to touch, and the processor counts the number of times said user touches the sensors to touch, without taking into account erroneous touches,
in the sequence drill, the display repetitively displays a cycle of a pre-determined sequence of which sensors to touch, and the processor counts the number of times the user touches the sensors to touch, taking into account erroneous touches of sensors, and
in the react drill, the display displays a random series of sensors from among the plurality of sensors, and the processor counts the number of times the user touches the sensors to touch, taking into account erroneous touches of sensors.
22. The system of claim 21, wherein the processor calculates the number of times said user touches the sensors to touch in a selected period of time.
23. The system of claim 21, wherein the processor terminates a selected drill when the number of times the user touches the sensors to touch matches a pre-selected number of touches.
24. The system of claim 21, wherein the general-purpose computing device is a tablet device and the associated display panel is integral to the tablet device.
25. The system of claim 21, wherein the display panel is selectively interconnected to the general-purpose computing device.
26. The system of claim 21, wherein the touch signals are generated based on a change of resistance of the sensor units when a foot touch is received.
27. The system of claim 21, wherein, in react and sequence drills, the processor changes the visual indicators to any of a plurality of different colors, and the sensors to touch are indicated by at least one pre-selected color from the plurality of different colors to thereby test neurocognitive ability.
28. The system of claim 27, wherein the sensors to touch are indicated by a plurality of different pre-selected colors from the plurality of different colors, to thereby slow down reaction time and test neurocognitive ability.
29. The system of claim 21, wherein the sequence and react drills include a selectable flash mode in which the visual indicator to be touched illuminates for a pre-determined time duration and then turns off, and if said user fails to touch the corresponding sensor unit to be touched during the pre-determined time duration, said user's failure to touch is counted as an error.
30. The system of claim 21, wherein the sequence and react drills include a selectable delay mode in which indicators to be touched are illuminated after a pre-determined delay, to thereby adjust the system for users of different abilities.
31. The system of claim 21, wherein in the count drill, the sequence drill, and the react drill, said user can select which sensors from among the plurality of sensors will be used during the drill.
32. The system of claim 21, wherein the at least three separate drills further include a vertical jump drill, wherein the processor calculates a vertical jump height for said user based on the amount of air time between touches of sensors, a longer air time correlating with a higher vertical jump.
33. The system of claim 21, wherein the plurality of sensor units includes at least four peripheral sensor units peripherally spaced from a center of the sensor pad.
34. The system of claim 33, wherein the plurality of sensor units further comprises a center sensor unit in the center of the sensor pad.
35. The system of claim 21, wherein each display of a visual indicator is initiated by said user touching a sensor unit, rather than by the controller.
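The flight-time method referenced in claim 32 follows from standard projectile kinematics: if the jumper is airborne for time t, the time to the apex is t/2 and the jump height is h = g(t/2)²/2 = gt²/8. The sketch below illustrates this relation; the function name and constant are illustrative, not taken from the patent.

```python
# Illustrative sketch (not from the patent): estimating vertical jump
# height from the air time between sensor touches, as in claim 32.
# Assumption: take-off and landing occur with the same body configuration,
# so time to apex is half the air time and h = 0.5 * g * (t/2)**2 = g*t**2/8.

G = 9.81  # gravitational acceleration, m/s^2


def jump_height_m(air_time_s: float) -> float:
    """Vertical jump height in meters from lift-off-to-landing air time."""
    return G * air_time_s ** 2 / 8.0


# A longer air time yields a quadratically higher computed jump;
# e.g., 0.6 s of air time corresponds to roughly 0.44 m.
h = jump_height_m(0.6)
```

In practice the air time would be measured as the interval between the touch signal ending (lift-off) and the next touch signal beginning (landing).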
US16/601,559 2012-02-28 2019-10-14 Physical Training System and Method Abandoned US20200043361A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/601,559 US20200043361A1 (en) 2012-02-28 2019-10-14 Physical Training System and Method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201261604417P 2012-02-28 2012-02-28
US13/780,910 US10446049B2 (en) 2012-02-28 2013-02-28 Physical training system and method
US16/601,559 US20200043361A1 (en) 2012-02-28 2019-10-14 Physical Training System and Method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/780,910 Continuation US10446049B2 (en) 2012-02-28 2013-02-28 Physical training system and method

Publications (1)

Publication Number Publication Date
US20200043361A1 true US20200043361A1 (en) 2020-02-06

Family

ID=49003256

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/780,910 Expired - Fee Related US10446049B2 (en) 2012-02-28 2013-02-28 Physical training system and method
US16/601,559 Abandoned US20200043361A1 (en) 2012-02-28 2019-10-14 Physical Training System and Method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/780,910 Expired - Fee Related US10446049B2 (en) 2012-02-28 2013-02-28 Physical training system and method

Country Status (1)

Country Link
US (2) US10446049B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2808399C1 (en) * 2022-12-02 2023-11-28 Общество с ограниченной ответственностью "А1" Yoga and fitness trainer

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9280586B1 (en) * 2012-02-22 2016-03-08 LookForIt, LLC Electronic database for athletes and coaches for recruiting purposes
US9161708B2 (en) * 2013-02-14 2015-10-20 P3 Analytics, Inc. Generation of personalized training regimens from motion capture data
GB2532287A (en) * 2014-11-17 2016-05-18 Ahmet Denis Exercise apparatus and device
WO2018045319A1 (en) 2016-09-01 2018-03-08 Catalyft Labs, Inc. Multi-functional weight rack and exercise monitoring system for tracking exercise movements
US11590402B2 (en) * 2018-05-31 2023-02-28 The Quick Board, Llc Automated physical training system
US11452916B1 (en) * 2019-01-17 2022-09-27 Dp Technologies, Inc. Monitoring exercise surface system
US11213735B2 (en) * 2019-03-22 2022-01-04 Jason Shrout Exercise apparatus
WO2021108931A1 (en) * 2019-12-02 2021-06-10 Universidad Técnica Federico Santa María A perceptual-cognitive training system and method for the assessment of reactive agility in athletes when faced with different scenarios
CN112137628B (en) * 2020-09-10 2021-08-03 北京津发科技股份有限公司 Three-dimensional space cognition evaluation and training method and system
AU2021236480A1 (en) * 2020-10-20 2022-05-05 Kistler Holding Ag Method and device for measuring forces

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4261563A (en) 1978-12-21 1981-04-14 Goldfarb Adolph E Electronic time reaction game apparatus
CA1233687A (en) 1985-01-29 1988-03-08 Leon A. Malinowski Chisel plow release mechanisms
US5469740A (en) * 1989-07-14 1995-11-28 Impulse Technology, Inc. Interactive video testing and training system
US5277674A (en) 1991-03-30 1994-01-11 Combi Corporation Leg extension apparatus with pivotal foot section for measuring instantaneous power generated by a leg extending force
US5491912A (en) 1993-06-10 1996-02-20 Snabb; John C. Athletic shoes with reverse slope construction
US5752330A (en) 1992-06-10 1998-05-19 Snabb; John C. Athletic shoes with reverse slope sole construction
US5720200A (en) 1995-01-06 1998-02-24 Anderson; Kenneth J. Performance measuring footwear
US5584779A (en) 1995-04-10 1996-12-17 Wendy S. Knecht Step exercising system and method
US5838638A (en) 1997-02-10 1998-11-17 The University Of Tulsa Portable vertical jump measuring device
US6336891B1 (en) * 1997-12-08 2002-01-08 Real Vision Corporation Interactive exercise pad system
KR100280968B1 (en) 1997-12-10 2001-02-01 윤종용 Optical fiber amplifier using a synchronized etalon filter
US5916046A (en) 1998-02-02 1999-06-29 Allred; Dale Device for physical conditioning and coordination development
US6110073A (en) 1999-02-03 2000-08-29 Tread Pad Partners, Llc Physical fitness device
JP3496874B2 (en) * 2000-02-23 2004-02-16 コナミ株式会社 GAME DEVICE, GAME DEVICE CONTROL METHOD, INFORMATION STORAGE MEDIUM, GAME DISTRIBUTION DEVICE, AND GAME DISTRIBUTION METHOD
JP2001269431A (en) * 2000-03-24 2001-10-02 Yamaha Corp Body movement state-evaluating device
US7060000B2 (en) * 2001-10-11 2006-06-13 Carlson Carl A Game and exercise device and method
US20050153265A1 (en) * 2002-12-31 2005-07-14 Kavana Jordan S. Entertainment device
ES2709198T3 (en) 2003-02-26 2019-04-15 Engineering Fitness Int Corp Exercise device and method to use the same
US7604571B2 (en) 2003-06-18 2009-10-20 Scott & Wilkins Enterprises, Llc Exercise device with a user-defined exercise mode
US7572206B2 (en) 2003-06-18 2009-08-11 Scott & Wilkins Enterprises, Llc Exercise device having position verification feedback
FR2862884B1 (en) * 2003-12-02 2008-05-30 Xkpad DEVICE FOR THE PRACTICE OF INTERACTIVE GYMNASTICS OF "STEP" TYPE
US7500260B2 (en) 2004-01-07 2009-03-03 D1Athletes.Com, Inc. Motion video indexing mechanism for athlete recruiting architecture
JP4006008B2 (en) * 2005-03-14 2007-11-14 株式会社コナミスポーツ&ライフ Motion information processing system
GB2465918B (en) * 2005-05-03 2010-08-04 Codemasters Software Co Rhythm action game apparatus and method
US20070079690A1 (en) * 2005-09-29 2007-04-12 Konami Digital Entertainment, Inc. Dance game machine, method for scoring dance game and computer-readable recording medium
CA2635638A1 (en) * 2006-01-20 2007-07-26 Jeffrey Compton Method and system for assessing athletic performance
US20080004111A1 (en) * 2006-06-30 2008-01-03 Logitech Europe S.A. Non Impact Video Game Controller for Dancing Games
US20080102991A1 (en) * 2006-10-27 2008-05-01 Thomas Clark Hawkins Athlete Reaction Training System
US20090221372A1 (en) * 2008-02-29 2009-09-03 Molly Casey Footpad-based game and gaming system
US8253586B1 (en) 2009-04-24 2012-08-28 Mayfonk Art, Inc. Athletic-wear having integral measuring sensors
CA2955632A1 (en) 2010-11-10 2012-05-18 Nike Innovate C.V. Systems and methods for time-based athletic activity measurement and display
US8727885B2 (en) * 2010-11-15 2014-05-20 Google Inc. Social information game system
US9017222B2 (en) 2011-01-10 2015-04-28 Robert Herman Hofeldt Machine for testing and training jumping and reaching ability
JP5782529B2 (en) 2011-02-17 2015-09-24 ナイキ イノベイト シーブイ Selection of physical activity data and association with image data
US9326911B2 (en) 2012-09-14 2016-05-03 Recovery Force, LLC Compression integument
EP3042485B1 (en) 2013-09-05 2020-03-11 NIKE Innovate C.V. Conducting sessions with captured image data of physical activity and uploading using token-verifiable proxy uploader

Also Published As

Publication number Publication date
US20130224708A1 (en) 2013-08-29
US10446049B2 (en) 2019-10-15

Similar Documents

Publication Publication Date Title
US20200043361A1 (en) Physical Training System and Method
US10933281B2 (en) Flight time
JP6454304B2 (en) Fitness monitoring method, system, program product and application thereof
US20210379447A1 (en) Interactive exercise apparatus
US10121065B2 (en) Athletic attribute determinations from image data
US9498679B2 (en) Adjustable fitness arena
WO2016199350A1 (en) Information processing apparatus, information processing system, and insole
US9782116B2 (en) Stability-assessing system
US11534673B2 (en) Interactive exercise and training system
AU2017206218B2 (en) Physical performance assessment
US20160271451A1 (en) Wearable Device used in Various Exercise Devices
WO2015186132A2 (en) Physical training system and methods useful in conjunction therewith
AU2023201742B2 (en) A sensor-enabled platform configured to measure athletic activity
TW201416112A (en) Motion sensing game directing system and method
US20170312575A1 (en) Rehabilitation exercise system
TW201824158A (en) Health management system and method having a display interface, a physical parameter sensor, a communication module and an exercise unit
Brooks et al. Velocity-Based Training: Current Concepts and Future Directions
TW201629883A (en) Sports personnel selection system based on vision and limb response coordination performance and method thereof
CN203196228U (en) Scene synchronizing device capable of being combined with exercise device
KR20240013019A (en) Device providing golf training interface and golf training method using the same
WO2023168498A1 (en) Configuration of hardware to enable multi-modal functional exercise programs at distributed locations
KR20190023396A (en) user-customized exercise management system and method
JP2015013032A (en) Exercise support device

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION