WO2023147548A1 - Training system and method of using same - Google Patents

Training system and method of using same

Info

Publication number
WO2023147548A1
Authority
WO
WIPO (PCT)
Prior art keywords
trainee
training
delivery device
target zone
objects
Application number
PCT/US2023/061577
Other languages
French (fr)
Inventor
Preston Carpenter Cox
Robin Birdwell Cox
Brian Charles Ellis
Original Assignee
Breakout Hitting Llc
Application filed by Breakout Hitting Llc
Publication of WO2023147548A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/40 Stationarily-arranged devices for projecting balls or other bodies
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0021 Tracking a path or terminating locations
    • A63B24/0062 Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A63B43/002 Balls with special arrangements with special configuration, e.g. non-spherical
    • A63B43/008 Balls with special arrangements with means for improving visibility, e.g. special markings or colours
    • A63B63/00 Targets or goals for ball games
    • A63B69/0002 Training appliances or apparatus for special sports for baseball
    • A63B69/0015 Training appliances or apparatus for special sports for cricket
    • A63B69/002 Training appliances or apparatus for special sports for football
    • A63B69/0024 Training appliances or apparatus for special sports for hockey
    • A63B69/0071 Training appliances or apparatus for special sports for basketball
    • A63B69/0095 Training appliances or apparatus for special sports for volley-ball
    • A63B69/3623 Training appliances or apparatus for special sports for golf for driving
    • A63B69/38 Training appliances or apparatus for special sports for tennis
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B71/0669 Score-keepers or score display devices
    • A63B2024/0028 Tracking the path of an object, e.g. a ball inside a soccer pitch
    • A63B2024/0037 Tracking a path or terminating locations on a target surface or at impact on the ground
    • A63B2024/0068 Comparison to target or threshold, previous performance or not real time comparison to other individuals
    • A63B2069/0008 Training appliances or apparatus for special sports for baseball specially adapted for particular training aspects for batting
    • A63B2069/402 Stationarily-arranged devices for projecting balls or other bodies giving spin
    • A63B2071/025 Supports, e.g. poles, on rollers or wheels
    • A63B2071/0625 Emitting sound, noise or music
    • A63B2071/0638 Displaying moving images of recorded environment, e.g. virtual environment
    • A63B2071/0677 Input by image recognition, e.g. video signals
    • A63B2102/02 Tennis
    • A63B2102/14 Lacrosse
    • A63B2102/16 Table tennis
    • A63B2102/18 Baseball, rounders or similar games
    • A63B2102/182 Softball
    • A63B2102/20 Cricket
    • A63B2102/24 Ice hockey
    • A63B2214/00 Training methods
    • A63B2220/806 Video cameras
    • A63B2220/89 Field sensors, e.g. radar systems
    • A63B2225/09 Adjustable dimensions
    • A63B2225/093 Height
    • A63B2225/20 Miscellaneous features of sport apparatus, devices or equipment with means for remote communication, e.g. internet or the like
    • A63B2225/50 Wireless data transmission, e.g. by radio transmitters or telemetry
    • A63B2225/76 Miscellaneous features of sport apparatus, devices or equipment with means enabling use in the dark, other than powered illuminating means
    • A63B2243/0025 Football
    • A63B2243/0037 Basketball
    • A63B2243/007 American football
    • A63B2243/0095 Volleyball

Definitions

  • the present invention relates, in general, to the field of training individuals to improve their performance in their field of work or play. More particularly, present embodiments relate to a system and method for projecting an object toward a target and a trainee (or individual) interacting with the object.
  • a system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions.
  • One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.
  • One general aspect includes a method for training that can include projecting, via a delivery device, one or more objects along one or more pre-determined trajectories toward a target zone, where each of the one or more objects has a diameter that is less than a diameter of a regulation table tennis ball; a trainee attempting to prevent the one or more objects from entering the target zone, and scoring an ability of the trainee to prevent the one or more objects from entering the target zone.
  • Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • One general aspect includes a method for training.
  • the method also includes projecting, via a delivery device, an object along a pre-determined trajectory toward a target zone proximate a trainee, where the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring, via a controller, an ability of the trainee to interact with the object at the target zone.
  • Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • One general aspect includes a method for training that can include operating of a delivery device in response to a first command from a trainee to perform a training; receiving, by the trainee, a response to the first command from the delivery device, where the response is requesting input from the trainee; providing, via a second command from the trainee, the requested input to the delivery device; adjusting, via a controller, one or more parameters of the delivery device based on the second command; projecting, via the delivery device, an object along a pre-determined trajectory toward a target zone proximate the trainee, where the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring an ability of the trainee to interact with the object at the target zone.
  • Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • One general aspect includes a method for training that can include projecting, via a delivery device, a plurality of first objects along a pre-determined trajectory toward a first target zone in a first training field, where each of the plurality of first objects has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a delivery device to deliver the plurality of first objects at a pre-determined location at the first target zone; adjusting one or more parameters of the delivery device based on the first score; projecting, via the delivery device, a plurality of second objects along a pre-determined trajectory toward the first target zone, where each of the plurality of second objects has a diameter less than the diameter of a regulation table tennis ball; and determining, via the controller and the imaging sensor, a second score that indicates an ability of a delivery device to deliver the plurality of second objects at the pre-determined location at the first target zone, where the second score indicates an improved performance of the delivery device compared to the first score.
  • One general aspect includes a method for training that can include projecting, via a delivery device, a first object along a pre-determined trajectory toward a first target zone in a first training field, where the first object has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a trainee to interact with the first object at the first target zone; moving the delivery device to a second training field, with a second target zone; projecting, via the delivery device, a second object along the pre-determined trajectory toward the second target zone, where the second object has a diameter less than the diameter of the regulation table tennis ball; determining, via the controller and the imaging sensor, a second score that indicates an ability of the delivery device to project the second object along the predetermined trajectory to the second target zone; adjusting one or more parameters of the delivery device based on the second score; projecting another object toward the second target zone; determining, via the controller and the imaging sensor, the second score and comparing the second score to a desired score;
  • FIGS. 1A-1C are representative functional diagrams of systems and methods for training a trainee to improve coordination, vision training and/or tracking capabilities, and vision training and/or timing capabilities, in accordance with certain embodiments;
  • FIGS. 1D-1E are representative diagrams of an object used with a delivery device for training a trainee to improve coordination, vision training and/or tracking capabilities, and vision training and/or timing capabilities, in accordance with certain embodiments;
  • FIG. 2 includes a representative functional diagram of a system and method for modifying parameters of the delivery device based on detected characteristics, in accordance with certain embodiments.
  • FIG. 3 is a representative functional block diagram of an object delivery device that can support the systems and methods of the current disclosure, in accordance with certain embodiments;
  • FIG. 4 is a representative perspective view of a friction device for the delivery device, in accordance with certain embodiments.
  • FIG. 5 is a representative partial cross-sectional view along line 7-7 shown in FIG. 4, in accordance with certain embodiments;
  • FIG. 6 is a representative functional block diagram of a control system for a training system, in accordance with certain embodiments.
  • FIG. 7 is a representative functional diagram of a system and method for training a trainee to improve coordination, vision training, and/or tracking capabilities, in accordance with certain embodiments.
  • the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having,” or any other variation thereof, are intended to cover a non-exclusive inclusion.
  • a process, method, article, or apparatus that comprises a list of features is not necessarily limited only to those features but may include other features not expressly listed or inherent to such process, method, article, or apparatus.
  • “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present); A is false (or not present) and B is true (or present); or both A and B are true (or present).
  • FIGS. 1A-1C are representative functional diagrams of a system 10 for training a trainee 8 to improve coordination, vision training, and/or tracking capabilities.
  • a system 10 and method of using the system as disclosed according to the embodiments herein may be particularly suited for sports training.
  • sports may include, without limitation, baseball (FIG. 1A), tennis (FIG. 1B), or hockey (FIG. 1C).
  • Other sports can also benefit from similar training, such as softball, lacrosse, cricket, soccer, table tennis, American football (referred to as “football”), volleyball, basketball, shooting sports, etc.
  • training activities can also benefit from similar training using the systems described in this disclosure such as military training, first responder training, search and rescue training, rehabilitation training (e.g., where the trainee 8 is autistic, recovering from a stroke, recovering from an injury, or has other medical conditions), or other trainees that can benefit from eye-body coordination training provided by the training systems described in this disclosure.
  • Visual skills that can be improved by the training systems in this disclosure include, but are not limited to, the following:
  • FIGS. 1A, 1B, and 1C show a delivery device 20 that can be used to project an object 30 toward a target zone 50 (or a trainee 8).
  • the object 30 may be projected along a trajectory (e.g., 40, 42, 44) in a direction 66 toward the target zone 50 or trainee 8.
  • a “trajectory” is a representation of a flight path of an object through a three-dimensional (3D) X, Y, Z coordinate system space, where each point along the trajectory can be represented by a point in the 3D space.
  • Each point along the trajectory can include a velocity vector that is representative of a velocity and direction of travel of the object at that point along the trajectory.
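  • As an illustration only, the trajectory representation described above maps onto a simple data structure; the sketch below (class and field names are assumptions, not terms from the patent) samples the flight path into discrete points, each carrying a position and a velocity vector:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TrajectoryPoint:
    """One sampled point along a flight path in 3D X, Y, Z space."""
    position: Tuple[float, float, float]  # feet, in a training-field coordinate frame
    velocity: Tuple[float, float, float]  # feet per second; direction of travel at this point

@dataclass
class Trajectory:
    """A predetermined trajectory: an ordered sequence of sampled points."""
    points: List[TrajectoryPoint]

    def speed_at(self, index: int) -> float:
        """Magnitude of the velocity vector at the given sample."""
        vx, vy, vz = self.points[index].velocity
        return (vx ** 2 + vy ** 2 + vz ** 2) ** 0.5
```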
  • the projection of the object 30 along the trajectory (40, 42, 44) may be controlled by one or more controllers 28, 29 (also referred to as “controller 28, 29”) capable of controlling various aspects of the process of projection of the object 30, such that the projection is conducted along a predetermined trajectory 40, 42, or 44.
  • the one or more controllers 28, 29 can include only one controller (28 or 29) that can control the aspects of the delivery device 20 and communicate with internal and external data sources for setting parameters of the delivery device 20 to desired values.
  • the one or more controllers 28, 29 can include an internal controller(s) 28 and an external controller(s) 29 that can control the aspects of the delivery device 20 and communicate with each of the controllers and with internal and external data sources for setting parameters of the delivery device 20 to desired values.
  • a predetermined trajectory can include a trajectory that is estimated (or determined) prior to projection of the object 30.
  • the predetermined trajectory can be selected by the controller 28, 29 which can control one or more components of the delivery device 20 to control the trajectory of the object.
  • the delivery device 20 can include or be communicatively coupled (wired or wirelessly) to the controller 28, 29 that can be configured to control one or more delivery variables associated with delivering the object along a predetermined trajectory 40, 42, or 44.
  • the delivery variables can include: position of the device in 3D space (i.e., position in space according to X, Y, and Z planes); angle of the device relative to an intended target or trainee; distance from a target or trainee; intended velocity of the object along the intended trajectory between the device and the target or trainee; spin of the object along the intended trajectory between the device and the target or trainee; the weight of the object, by selecting an object; surface features of the object, by selecting the object; as well as others. Additional delivery variables (or parameters) are defined in the following description, at least in regard to FIGS. 3-6.
  • these parameters can be: selection of a barrel through which to propel the object; an air pressure supplied to the object to propel the object through the barrel with a center axis; an air volume supplied to the object; an inclination of the barrel; an azimuthal orientation of the barrel; a length of the barrel; an inclination of the friction device which comprises a ramp and a surface material on the ramp; an azimuthal orientation of the friction device around the center axis of the barrel; an azimuthal orientation of the friction device about a longitudinal axis of the friction device; a distance of the friction device from the barrel; the surface material of the friction device; an object launch position from the delivery device, the object launch position being a 3D position in X-Y-Z coordinate space relative to the target or the trainee; an object selection; a distance to the target or the trainee; and a height of the target or the trainee.
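  • As a rough illustration of how the delivery variables listed above might be grouped for the controller 28, 29, the sketch below collects them into a single configuration object; the field names and units are assumptions chosen for readability, not values defined by the patent:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DeliverySettings:
    """Hypothetical grouping of delivery variables a controller could set before a projection."""
    barrel_id: int                   # which barrel to propel the object through
    air_pressure_psi: float          # air pressure supplied to the object
    air_volume_cu_in: float          # air volume supplied to the object
    barrel_inclination_deg: float    # inclination of the barrel
    barrel_azimuth_deg: float        # azimuthal orientation of the barrel
    barrel_length_in: float          # length of the barrel
    friction_inclination_deg: float  # inclination of the friction device (ramp)
    friction_azimuth_deg: float      # azimuth of the friction device about the barrel's center axis
    friction_distance_in: float      # distance of the friction device from the barrel
    launch_position_xyz: Tuple[float, float, float]  # launch point relative to the target or trainee
    object_id: int                   # which object (size, weight, surface features) to select
    target_distance_ft: float        # distance to the target or the trainee
    target_height_ft: float          # height of the target or the trainee
```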
  • the delivery device 20 can be moved horizontally shown by arrows 60, 62, or vertically shown by arrows 64.
  • the height L1 of the object exiting the delivery device 20 can be adjusted by moving the chassis 22 of the delivery device 20 up or down (arrows 64) a desired distance.
  • This 3D movement of the delivery device 20 can allow users (e.g., coach 4, trainer 4, individual 8, trainee 8, or others) to adjust the position that an object 30 exits the delivery device 20.
  • A regulation object can be, e.g., a regulation baseball, a regulation softball, a regulation hockey puck, a regulation tennis ball, a regulation table tennis ball, a regulation lacrosse ball, a regulation cricket ball, a regulation football, or a regulation soccer ball.
  • A real-life delivery source can be, e.g., a pitcher for baseball or softball, a quarterback for football, a skeet delivery device for shooting sports, a soccer player making shots on goal, a hockey player making shots on goal, etc.
  • a “real-life” event refers to a game, practice session, or tactical situation for which the trainee is training to improve performance. The real-life event would be those events that use regulation equipment to perform the sport or tactical operations or situations.
  • the object 30 trajectory can be projected from the delivery device 20 at an appropriate angle relative to a surface 6.
  • a guide 24 can be used to cause the object to exit the delivery device 20 at an angle and cause the object to experience varied resistance when it is ejected from the guide 24.
  • the guide 24 can include a barrel and a friction device for imparting spin and deflection to the object to project the object 30 along a predetermined trajectory.
  • a controller 28, 29 can control the angle and position of the guide 24, as well as select the predetermined (or desired, or expected) trajectory from a plurality of trajectories or define the predetermined trajectory based on collected data from data sources.
  • each predetermined trajectory can include any parameters needed to setup the delivery device 20 to deliver the object 30 along that particular predetermined trajectory (e.g., trajectories 40, 42, 44).
  • the parameters can include an azimuthal direction of the guide 24 to produce a desired azimuthal direction of an object 30 exiting the delivery device 20.
  • the parameters can also include the amount and location of resistance to be applied to the object as the object is propelled toward the exit of the delivery device 20. These will be described in more detail below with regard to the delivery device 20.
  • the parameters can also include the force to be applied to the object 30 that will propel the object 30 from the delivery device 20 and cause the object to travel along the predetermined trajectory (e.g., trajectories 40, 42, 44).
  • the force can be applied to the object 30 via pneumatic, hydraulic, electrical, electro-mechanical, or mechanical power sources that can selectively vary the amount of force applied to the object 30.
  • the parameters can also include which one of a plurality of objects 30 and which one of a plurality of barrels 360 should be chosen to provide the desired trajectory.
  • the plurality of objects 30 can have many different features which are described in more detail below.
  • the controller 28, 29 can select the object 30 that is needed to produce the desired trajectory.
  • the controller 28, 29 can control an alert feature 26 (such as turn ON or OFF a light, turn ON or OFF an audible signal, play a synchronized video of a real-life delivery source, etc.) to indicate that an object 30 is about to be projected from the delivery device 20 toward the target zone 50.
  • the alert feature 26 can be any device that can alert the trainee 8 to be ready for the object 30 to exit the delivery device 20.
  • the object 30 can be a spherical or substantially spherical object used for training purposes.
  • the object 30 may be shaped to represent a desired sport.
  • the object 30 can come in different colors such as white, yellow, orange, red, blue, tan, grey, black, or a luminescent color.
  • the color of the object 30 can be selected for the sport for which the trainee 8 is being trained or for the type of training being used.
  • the object 30 can include a colored pattern (e.g., a red, yellow, white, green, blue, orange, or black pattern).
  • the colored pattern can be used to assist the trainee 8 in focusing intently on the object 30 so that they may pick up and track a particular sports ball quicker.
  • the object may have one or more surface features (e.g., smooth, dimples, bumps, recesses, ridges, grainy texture, etc.) that facilitate delivery along various trajectories.
  • the object 30 can be made from a material such as acrylonitrile butadiene styrene, polylactic acid, calcium carbonate, recycled paper, cotton, foam, plastics, calcites, rubber, a metal such as steel, lead, copper, aluminum, or metal alloys, a plant-based material, or a fungus-based material.
  • the device can include a magazine that may contain a plurality of objects.
  • the objects 30 in the magazine can be substantially the same or at least a portion of the objects 30 can have varied characteristics relative to the other objects 30.
  • Object characteristics can include but are not limited to, shape, size (e.g., longest dimension or length of the object, which in the case of a sphere is the diameter and in the case of a disk is the diameter along a major surface), color, surface features, density, material (e.g., inorganic, organic, metal, polymer, ceramic, or any combination thereof), or any combination thereof.
  • the delivery device 20 can include a first magazine with a first portion of objects having a first object characteristic, and a second magazine with a second portion of objects having a second object characteristic different from the first object characteristic.
  • the device is capable of selecting a single object from the first portion or the second portion.
  • Various parameters may be used to select different objects, which may include, but are not limited to, a method of training (e.g., a preselected training protocol), a measured or scored capability of a trainee, a selection by the trainee, or an instruction from one or more devices (e.g., data input from a sensor, such as a sensor associated with an impact device) communicatively coupled to the controller 28, 29.
  • the object 30 can be sized such that it is significantly smaller than a corresponding regulation object.
  • a corresponding regulation object is determined based upon the intended sport for which the trainee is training. For example, when training for baseball, the corresponding regulation object would be the regulation size of a baseball.
  • the size of a football for professional football can be different than the size of a football for youth football, yet both footballs are regulation size.
  • the object 30 of the current disclosure is significantly smaller than any of the regulation sizes for footballs or any other regulation objects.
  • the delivery device of the current disclosure does not project objects of the same size of a regulation object for the intended sport or activity for which the trainee is training.
  • the difference in size between the object 30 and a corresponding regulation object can be expressed as a value of Lo/Lr, wherein Lo is the largest dimension (i.e., length) of the object 30 and Lr is the largest dimension (i.e., length) of the regulation object.
  • the difference in size (or ratio Lo/Lr) can be less than 0.9 or less than 0.8 or less than 0.7 or less than 0.6 or less than 0.5 or less than 0.4 or less than 0.3 or less than 0.2 or less than 0.1.
  • the difference in size can be at least 0.001 or at least 0.002 or at least 0.004 or at least 0.006 or at least 0.008 or at least 0.01 or at least 0.02 or at least 0.03 or at least 0.05 or at least 0.07 or at least 0.1 or at least 0.15 or at least 0.2 or at least 0.25 or at least 0.3.
  • the difference in size between the object 30 and a corresponding regulation object (Lo/Lr) can be within a range including any of the minimum and maximum values noted above, including, for example, but not limited to at least 0.001 and less than 0.9 or within a range of at least 0.001 and less than 0.5 or within a range of at least 0.002 and less than 0.006.
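  • A quick worked check of the Lo/Lr ratio, using a hypothetical 0.5 inch training object and a regulation baseball (roughly 2.9 inches in diameter) as the corresponding regulation object:

```python
def size_ratio(object_length_in: float, regulation_length_in: float) -> float:
    """Lo/Lr: largest dimension of the training object over that of the regulation object."""
    return object_length_in / regulation_length_in

# A hypothetical 0.5 in training object compared against a regulation baseball
# (roughly 2.9 in in diameter).
ratio = size_ratio(0.5, 2.9)
print(f"Lo/Lr = {ratio:.3f}")  # ~0.172, comfortably below the 0.9 upper bound described above
```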
  • the diameter D1 (see FIGS. 1D, 1E) of the object 30 can be at least 0.05 inches, at least 0.06 inches, at least 0.07 inches, at least 0.08 inches, at least 0.09 inches, at least 0.10 inches, at least 0.110 inches, at least 0.118 inches, at least 0.120 inches, at least 0.125 inches, at least 0.130 inches, at least 0.135 inches, at least 0.140 inches, at least 0.145 inches, at least 0.150 inches, at least 0.20 inches, or at least 0.25 inches.
  • the diameter D1 of the object 30 can be less than 2.0 inches, less than 1.90 inches, less than 1.80 inches, less than 1.70 inches, less than 1.60 inches, less than 1.50 inches, less than 1.40 inches, less than 1.30 inches, less than 1.20 inches, less than 1.10 inches, less than 1.00 inches, less than 0.90 inches, less than 0.85 inches, less than 0.80 inches, less than 0.75 inches, less than 0.70 inches, less than 0.65 inches, less than 0.60 inches, less than 0.59 inches, less than 0.55 inches, less than 0.50 inches, less than 0.45 inches, or less than 0.40 inches.
  • the diameter of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.05 inches and less than 2.0 inches, or within a range of at least 0.05 inches and less than 1.10 inches, or within a range of at least 0.07 inches and less than 1.00 inch.
  • the size of the object 30 can be at least 120 times smaller than a baseball, at least 220 times smaller than a softball, at least 400 times smaller than a soccer ball, at least 25 times smaller than a table tennis ball, at least 90 times smaller than a lacrosse ball, at least 40 times smaller than a hockey puck, at least 70 times smaller than a clay pigeon (for shooting sports), or at least 110 times smaller than a cricket ball.
  • the size of the object 30 can be smaller than a regulation table tennis ball, where the regulation table tennis ball is spherical, and its diameter is 1.57 inches (40 mm), where the diameter D1 of the object 30 can be less than 1.57 inches (40 mm).
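  • Because several of the aspects above turn on the object 30 being smaller than a regulation table tennis ball, a trivial check makes the bound concrete; the 1.57 inch (40 mm) figure is the one stated above:

```python
REGULATION_TABLE_TENNIS_DIAMETER_IN = 1.57  # 40 mm, as stated above

def is_smaller_than_table_tennis_ball(diameter_in: float) -> bool:
    """True when the object's diameter D1 is below the regulation table tennis ball diameter."""
    return diameter_in < REGULATION_TABLE_TENNIS_DIAMETER_IN

print(is_smaller_than_table_tennis_ball(0.5))   # True: a 0.5 in object qualifies
print(is_smaller_than_table_tennis_ball(1.57))  # False: equal to the regulation diameter
```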
  • the weight of the object 30 can be at least 0.001 ounces, at least 0.002 ounces, at least 0.003 ounces, at least 0.004 ounces, at least 0.005 ounces, at least 0.006 ounces, at least 0.007 ounces, at least 0.008 ounces, at least 0.009 ounces, at least 0.010 ounces, at least 0.011 ounces, at least 0.012 ounces, at least 0.013 ounces, at least 0.014 ounces, at least 0.015 ounces, at least 0.20 ounces, at least 0.25 ounces, at least 0.30 ounces, at least 0.35 ounces, at least 0.40 ounces, at least 0.45 ounces, at least 0.50 ounces, at least 0.55 ounces, or at least 0.60 ounces.
  • the weight of the object 30 can be less than 10 ounces, less than 9 ounces, less than 8 ounces, less than 7 ounces, less than 6 ounces, less than 5 ounces, less than 4 ounces, less than 3 ounces, less than 2 ounces, less than 1.5 ounces, less than 1 ounce, less than 0.9 ounces, less than 0.8 ounces, less than 0.7 ounces, less than 0.6 ounces, less than 0.5 ounces, less than 0.4 ounces, less than 0.3 ounces, less than 0.2 ounces, less than 0.1 ounces, less than 0.09 ounces, less than 0.08 ounces, or less than 0.05 ounces.
  • the weight of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.001 ounces and less than 10 ounces, or within a range of at least 0.07 ounces and less than 0.9 ounces, or within a range of at least 0.002 ounces and less than 5 ounces, or within a range of at least 0.002 ounces and less than 1.5 ounces.
  • other sizes and weights of the object 30 can be used with the delivery device 20 to project the object 30 toward the target zone 50.
  • the weight of the object 30 can be adjusted for different training purposes and achieving various predetermined trajectories (e.g., 40, 42, 44).
  • the weight can depend on the size and materials used for the specific object 30 that support different training processes.
  • the variation of weight can result in speed changes of the object 30.
  • the shape of the object 30 can be substantially spherical. In another non-limiting embodiment, the object can be non-spherical, such as spheroidal. In another non-limiting embodiment, the object 30 can also have surface features (e.g., dimples, divots, holes, recesses, ridges, bumps, grainy textures, etc.) for trajectory modification.
  • the shape of the object 30 can be tailored to emulate certain predetermined trajectories such as knuckle ball throws, kicks from a soccer ball, etc.
  • the materials that make up the object 30 can be acrylonitrile butadiene styrene, polylactic acid, calcium carbonate, paper, cotton, or foam, any poly-based plastics, or plastics in general, calcites, metal such as steel, lead, copper or aluminum, rubber, a plant-based material, or a fungus-based material.
  • the object 30 can be coated with glow-in-the-dark colors. This can be used in various training methods for vision training, such as segmenting training and strike zone training (described later).
  • the object 30 can be illuminated by ultraviolet lights, such as black lights, for isolated training processes for vision tracking. Being smaller than regulation objects, the object 30 can be safer than regulation objects; a user may only need to wear safety glasses or a mask.
  • the delivery device 20 can be positioned at a distance L2 from a target zone 50 or trainee 8.
  • the distance L2 can be at least 3 feet, at least 4 feet, at least 5 feet, at least 6 feet, at least 7 feet, at least 8 feet, at least 9 feet, at least 10 feet, at least 11 feet, at least 12 feet, at least 13 feet, at least 14 feet, at least 15 feet, at least 16 feet, at least 17 feet, at least 18 feet, at least 19 feet, at least 20 feet, at least 25 feet, at least 30 feet, at least 35 feet, or at least 40 feet.
  • the distance L2 can be less than 210 feet, less than 205 feet, less than 200 feet, less than 190 feet, less than 180 feet, less than 170 feet, less than 160 feet, less than 150 feet, less than 140 feet, less than 130 feet, less than 120 feet, less than 110 feet, less than 100 feet, less than 90 feet, less than 80 feet, less than 70 feet, less than 60 feet, less than 55 feet, less than 50 feet, less than 45 feet, less than 40 feet, less than 35 feet, less than 30 feet, less than 25 feet, or less than 20 feet.
  • the distance L2 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 5 feet and less than 200 feet, or within a range of at least 5 feet and less than 55 feet, or within a range of at least 15 feet and less than 50 feet, or within a range of at least 15 feet and less than 40 feet, or within a range of at least 5 feet and less than 15 feet, or within a range of at least 10 feet and less than 25 feet.
  • the target zone 50 can be a rectangle defined by a height L5 and a width L4 and can represent a relative position in space, or the target zone 50 can be a physical collection device that captures the objects 30 that enter individual target segments 76.
  • the target can be moved up or down (arrows 68, FIG. 2) to position the target zone 50 at the desired height L3.
  • An imaging sensor 32 can capture imagery of the trainee 8 and communicate the imagery to the controller 28, 29.
  • the imaging sensor 32 can include a camera, a 2D camera, a 3D camera, a LiDAR sensor, a smartphone, a tablet, a laptop, or other video recorders.
  • the target zone 50 can be divided into a plurality of target segments 76 and the controller 28, 29 can initiate the projecting of the object 30 through a predetermined trajectory (e.g., trajectories 40, 42, 44) toward a specific target segment 76 or toward an area outside of the target zone 50 for various training methods.
  • the controller 28, 29 (e.g., via selections from a coach/trainer 4, the trainee 8, or another user) can deliver fast balls along the trajectory 42 that can arrive at the target zone 50 in the center target segment 76 (or any other appropriate segment 76). This can be used to help train the trainee 8 to recognize the object 30 and track it through the trajectory 42 through consistent training using the trajectory 42.
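  • One way to picture the target-segment selection described above is a simple grid laid over the target zone 50; the sketch below computes an aim point for a chosen segment 76 and is purely illustrative (the grid layout, coordinates, and function names are assumptions):

```python
from typing import Tuple

def segment_center(width_ft: float, height_ft: float,
                   cols: int, rows: int,
                   col: int, row: int) -> Tuple[float, float]:
    """Center of one target segment within a target zone of the given width and height.

    The zone is treated as a cols x rows grid of segments; (col, row) = (0, 0)
    is the lower-left segment, and the returned point is measured from the
    lower-left corner of the zone.
    """
    seg_w = width_ft / cols
    seg_h = height_ft / rows
    return ((col + 0.5) * seg_w, (row + 0.5) * seg_h)

# Example: aim at the center segment of a 3 x 3 grid over a 1.5 ft x 2 ft zone.
aim_x, aim_y = segment_center(1.5, 2.0, cols=3, rows=3, col=1, row=1)
print(aim_x, aim_y)  # 0.75 1.0, i.e. the middle of the target zone
```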
  • other trajectories can be selected for additional training.
  • These other trajectories can be designed by the trainee 8, the coach 4, other individual, or controller 28, 29 for the particular training method.
  • These other trajectories can also be designed to mimic at least a portion of the trajectories of a sports object that was projected through one or more game trajectories in a real-life event by a real- life athlete.
  • the trainee 8 can train like they are facing the real-life athlete that projected the sports object along the one or more game trajectories.
  • the scoring can be determined via imagery captured by one or more imaging sensors or by a coach/trainer 4 visually observing the interaction of the trainee 8 with the object 30.
  • the controller 28, 29 can analyze the imagery to determine the performance of the trainee 8 to the training goals or criteria for the training method being performed.
  • the controller 28, 29 can then establish a score for the trainee 8, which can be used to provide feedback to the trainee 8, coach/trainer 4, or other user for improving the trainee’s performance.
  • the score can be compared to previous scores to identify trends in the trainee’s performance.
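  • The scoring and trend comparison described above could be kept as simple session records; the sketch below is a minimal illustration that assumes a score is the fraction of successful interactions, which is not a formula given in this disclosure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TraineeRecord:
    """Minimal per-trainee score history for trend feedback."""
    scores: List[float] = field(default_factory=list)

    def add_session(self, successes: int, attempts: int) -> float:
        """Score a session as the fraction of successful interactions and store it."""
        score = successes / attempts if attempts else 0.0
        self.scores.append(score)
        return score

    def trend(self) -> float:
        """Difference between the latest and the previous score (positive means improvement)."""
        if len(self.scores) < 2:
            return 0.0
        return self.scores[-1] - self.scores[-2]

record = TraineeRecord()
record.add_session(successes=6, attempts=10)  # 0.6
record.add_session(successes=8, attempts=10)  # 0.8
print(record.trend())                         # 0.2: performance improved over the previous session
```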
  • the object 30 can be projected by the delivery device 20 along the trajectory 42.
  • the object 30 can be seen traveling along the trajectory 42 as indicated by the object position 30”.
  • the object 30 can be seen traveling along the trajectory 40, 44 as indicated by positions 30’ and 30’”.
  • FIGS. 1D, 1E are representative side views of an example object 30 which can be various shapes and sizes.
  • the object 30 in FIG. 1D is shown to be a sphere with center axis 31 and diameter D1.
  • the object 30, when projected by the delivery device 20, can have a spin 94 imparted to the object 30 by the delivery device 20.
  • the spin 94 can be in any rotational direction around the axis 31.
  • the object 30 in FIG. 1E is shown to be a spheroid with center axis 31 and diameter D1 that is the shortest diameter of the spheroid shape.
  • the object 30, when projected by the delivery device 20, can have a spin 94 imparted to the object 30 by the delivery device 20.
  • the spin 94 can be in any rotational direction around the axis 31.
  • the spin 94 is shown to rotate the object 30 about the axis 31 similar to a spiral throw of a football. However, the spin 94 can also rotate the object 30 end over end about the axis 31 and any rotational direction in between.
  • the spin 94 can be zero (“0”), at least 1 RPM, at least 2 RPM, at least 3 RPM, at least 4 RPM, at least 5 RPM, at least 10 RPM, at least 20 RPM, at least 50 RPM, at least 100 RPM, at least 200 RPM, or at least 300 RPM.
  • the spin 94 can be less than 120,000 RPM, less than 116,000 RPM, less than 115,000 RPM, less than 110,000 RPM, less than 105,000 RPM, less than 100,000 RPM, less than 90,000 RPM, less than 80,000 RPM, less than 70,000 RPM, less than 60,000 RPM, less than 50,000 RPM, less than 40,000 RPM, less than 30,000 RPM, less than 20,000 RPM, less than 15,000 RPM, less than 14,000 RPM, less than 13,000 RPM, less than 12,000 RPM, less than 11,000 RPM, less than 10,000 RPM, less than 9,000 RPM, less than 8,000 RPM, less than 7,000 RPM, less than 6,000 RPM, or less than 5,000 RPM.
  • the spin 94 of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least zero RPM and less than 11,000 RPM, or within a range of at least 1 RPM and less than 116,000 RPM, or within a range of at least 1 RPM and less than 115,000 RPM, or within a range of at least 100 RPM and less than 10,000 RPM.
  • FIG. 2 is a representative functional diagram of a system 10 for training a trainee 8 to improve eye-body coordination in various sports.
  • the delivery device 20 can be adjusted in various ways to facilitate projecting the object 30 along a predetermined trajectory 40, 42, 44 toward a target zone positioned a distance L2 from the delivery device 20.
  • Distance L2 can be at least 5 feet, or at least 10 feet, or at least 15 feet, or at least 20 feet, or at least 25 feet, or at least 30 feet, or at least 35 feet, or at least 40 feet, or at least 45 feet, or at least 50 feet, at least 55 feet, or at least 75 feet, or up to 100 feet.
  • One or more imaging sensors 32 can be used to capture and record the travel of the object 30 along a predetermined trajectory (e.g., 40, 42, 44).
  • the imaging sensors 32 can be placed at any position around the system 10 with two possible positions indicated in FIG. 2. Users (e.g., a coach 4, trainer 4, trainee 8, individual 8, or others) can also track the object along the predetermined trajectory and score the repeatability of the object 30 to travel along the predetermined trajectory.
  • the imaging sensors 32 can capture and record how the eyes of the trainee 8 track the object 30 along the predetermined trajectory.
  • Imagery collected via the imagery sensors 32 can be analyzed by a local controller 28 or a remotely located controller 29 to determine how the trainee 8 tracks the object 30 along a predetermined trajectory and the controller(s) 28, 29 can score the ability of the trainee 8 to track the object 30 along a predetermined trajectory. The score can be used to improve the capability of the trainee to track the object 30, adjust the delivery device 20, or select subsequent trajectories of another object 30 (such as when the trainee 8 performs well enough to progress to a more difficult trajectory).
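  • The progression step mentioned above (advancing to a more difficult trajectory once the trainee performs well enough) can be pictured as a simple threshold rule; the threshold and difficulty levels below are assumptions chosen for illustration:

```python
def next_difficulty(current_level: int, tracking_score: float,
                    advance_threshold: float = 0.8, max_level: int = 5) -> int:
    """Advance one difficulty level when the trainee's tracking score meets the threshold."""
    if tracking_score >= advance_threshold and current_level < max_level:
        return current_level + 1
    return current_level

print(next_difficulty(current_level=2, tracking_score=0.85))  # 3: progress to a harder trajectory
print(next_difficulty(current_level=2, tracking_score=0.60))  # 2: stay on the current trajectory
```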
  • the imaging sensors 32 can capture physical characteristics of the trainee 8 as well as attributes of a training field 100. Imagery from the imaging sensors 32 can be used by the controller 28, 29 to perform facial recognition of the trainee 8, voice recognition (via audio sensors in the imaging sensors 32), detection and recognition of body movements of the trainee 8, detection and recognition of objects in the training field 100, detection and recognition of body movements of the trainee 8 for controlling the delivery device 20 (e.g., “visual servoing”), detection and recognition of light and color for controlling the delivery device 20, creation of a virtual Grid, and combinations thereof to control operation of the delivery device 20.
  • the detection and recognition of body movements of the trainee 8, and facial recognition of the trainee 8 can be used by the controller 28, 29 to adjust one or more of the parameters of the delivery device 20 to deliver objects to a target zone, where the target zone 50 has been adjusted based on the identified trainee 8.
  • the physical characteristics of the trainee 8 can be retrieved from a database (e.g., trainee database 344 in FIG. 6) once the trainee 8 has been identified. From these physical characteristics, the controller 28, 29 can adjust the target zone 50 accordingly to accommodate the trainee 8.
  • the physical characteristics of the trainee 8 can include one or more of the following: an overall height of the trainee; a height of a knee of the trainee; a height of a shoulder of the trainee; a length of a leg of the trainee; a length of an arm of the trainee; a position of the trainee; and a width of the trainee.
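  • A rough sketch of how the controller 28, 29 might derive a target zone 50 from the retrieved physical characteristics, using a baseball-style zone from the knees to partway up the torso as the example; the proportions and field names are assumptions, not values from the patent:

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class TraineeProfile:
    """Physical characteristics that might be retrieved once a trainee is identified."""
    overall_height_in: float
    knee_height_in: float
    shoulder_height_in: float

def target_zone_for(profile: TraineeProfile, width_in: float = 17.0) -> Dict[str, float]:
    """Return a hypothetical rectangular target zone for a baseball-style drill.

    Bottom of the zone at knee height; top three-quarters of the way from the
    knees to the shoulders. The 17 in default width matches home plate but is
    only an illustrative choice, not a value from the patent.
    """
    bottom = profile.knee_height_in
    top = bottom + (profile.shoulder_height_in - bottom) * 0.75
    return {"bottom_in": bottom, "top_in": top, "width_in": width_in}

zone = target_zone_for(TraineeProfile(overall_height_in=70.0, knee_height_in=20.0, shoulder_height_in=57.0))
print(zone)  # {'bottom_in': 20.0, 'top_in': 47.75, 'width_in': 17.0}
```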
  • the voice recognition can be used by the controller 28, 29 to identify the trainee 8 or the coach 4 and adjust one or more of the parameters of the delivery device 20 to deliver objects to a target zone, where the target zone 50 has been adjusted based on the identified trainee 8.
  • the voice recognition can also be used to identify if the trainee 8 or coach 4 is approved for use of the delivery device 20 and enable operation if approved or disable operation if not approved. If approved, then the trainee 8 or coach 4 can issue voice commands to the delivery device 20 to control operation of the delivery device 20, such as pause, resume, start session, select mode, end session, select training session, provide score, provide training statistics, etc.
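  • By way of non-limiting illustration only, the following sketch shows one way a controller could map recognized voice phrases to delivery device actions and gate operation on user approval. The class, function, and phrase names (e.g., DeliveryDeviceStub, handle_voice_command, APPROVED_USERS) are hypothetical stand-ins and are not part of the disclosed apparatus.

```python
# Minimal sketch (not the patented implementation) of voice-command handling.
# The device stub, phrase names, and the approval list are hypothetical.

class DeliveryDeviceStub:
    """Stand-in for actions the controller 28, 29 performs on the delivery device 20."""
    def __init__(self):
        self.enabled = True
    def pause(self): print("pausing object delivery")
    def resume(self): print("resuming object delivery")
    def start_session(self): print("starting training session")
    def end_session(self): print("ending training session")
    def report_score(self): print("displaying training score")
    def disable(self): self.enabled = False

APPROVED_USERS = {"trainee_8", "coach_4"}   # identities resolved via voice recognition

def handle_voice_command(speaker_id, phrase, device):
    """Map a recognized phrase to a delivery device action, if the speaker is approved."""
    if speaker_id not in APPROVED_USERS:
        device.disable()                     # not approved: disable operation
        return "operation disabled"
    commands = {
        "pause": device.pause,
        "resume": device.resume,
        "start session": device.start_session,
        "end session": device.end_session,
        "provide score": device.report_score,
    }
    action = commands.get(phrase.lower().strip())
    if action is None:
        return f"unrecognized command: {phrase!r}"
    action()
    return f"executed: {phrase}"

# Example: an approved trainee pauses the device.
device = DeliveryDeviceStub()
print(handle_voice_command("trainee_8", "Pause", device))
```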
  • the detection and recognition of objects in the training field 100 can be used by the controller 28, 29 to determine the type of activity for which training is to be administered to the trainee 8, where the type of activity can be for a sport (e.g., baseball, softball, cricket, hockey, tennis, table tennis, football, soccer, lacrosse, handball, racket ball, basketball, shooting sports, etc.), tactical training, trainee rehabilitation training, or any other activity that can benefit from improving eye-body coordination of the trainee 8.
  • If the controller 28, 29 detects a home plate near the target zone area in the imagery from the imaging sensor(s) 32, then baseball or softball may be the type of activity to be trained.
  • the controller 28, 29 can use the imagery from the imaging sensor(s) 32 to determine the type of sport tool 12 to be used, the characteristics of the training field around the target zone 50, gear worn by the trainee 8, equipment held by the trainee 8, etc. to determine the type of activity for which the trainee 8 is training.
  • the various methods of training described in this disclosure can be used for any or all of the types of training activities on which the trainee 8 can be trained.
  • the associated parameters for that type of activity can be retrieved from a database (e.g., activity database 346 in FIG. 6) and used by the delivery device 20 to control delivery of the object 30 toward the target zone.
  • the associated parameters can include parameters to define a target zone 50, training field distances, define screens to avoid when projecting the object 30 (e.g., in segmenting training), etc.
  • the detection and recognition of body movements of the trainee 8 can be used by the controller 28, 29 to provide interactive control of the delivery device 20 by the trainee 8.
  • This can be referred to as “visual servoing” which is a term used to indicate controlling a robot’s movements or actions based on visual actions of the trainee 8. For example, when the type of activity is baseball, then the trainee 8 can raise their hand to indicate to the delivery device 20 to pause projection of the object 30 until the trainee 8 is ready. The trainee 8 can then indicate their readiness by lowering their hand, which can indicate to the delivery device 20 to begin the sequence to deliver the next object 30 toward the target zone 50.
  • Other body movements can also be used by the trainee 8 to interact with the delivery device 20 to control or adjust projection of the object 30 along the pre-determined trajectory.
  • pointing left, right, up, or down can indicate which area of the target zone that the trainee 8 wishes the delivery device 20 to deliver the next object 30; waving can indicate for the delivery device 20 to halt delivering objects 30, increase a speed of the next object 30, or decrease a speed of the next object 30, etc.
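  • By way of non-limiting illustration only, the following sketch shows how recognized body movements could be mapped to pending delivery device actions (“visual servoing”). The gesture labels and the interpret_gesture function are hypothetical illustrations, not the disclosed implementation.

```python
# Minimal sketch of mapping recognized body movements to delivery device actions.
# Gesture labels, state fields, and the mapping rule are hypothetical examples.

def interpret_gesture(gesture: str, device_state: dict) -> dict:
    """Update the pending delivery command based on a recognized gesture."""
    if gesture == "hand_raised":
        device_state["paused"] = True            # wait until the trainee is ready
    elif gesture == "hand_lowered":
        device_state["paused"] = False           # begin the next delivery sequence
    elif gesture in ("point_left", "point_right", "point_up", "point_down"):
        device_state["aim"] = gesture.split("_")[1]   # requested target-zone area
    elif gesture == "wave":
        device_state["halt"] = True              # stop delivering objects
    return device_state

# Example: trainee pauses, signals readiness, then requests the next object to the left.
state = {"paused": False, "halt": False, "aim": "center"}
for g in ["hand_raised", "hand_lowered", "point_left"]:
    state = interpret_gesture(g, state)
print(state)   # {'paused': False, 'halt': False, 'aim': 'left'}
```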
  • the detection and recognition of light and color can be used by the controller 28, 29 to control the delivery device 20.
  • the delivery device 20 can be configured to deliver an object 30 to a specific location in the target zone 50, where the location is determined by where a light illuminates a portion of the target zone 50. If multiple objects 30 are to be sequentially projected to the target zone 50, then causing the light to illuminate various locations in the target zone 50 can cause the delivery device 20 to track to deliver the object 30 to the illuminated location.
  • the controller 28, 29 can detect color or color patterns to control the delivery device 20.
  • the color or color patterns can indicate a type of activity for which the delivery device 20 is to be used for training.
  • the color or color patterns can indicate a skill level required for the training, or a skill level of the trainee 8.
  • the color or color patterns can indicate the areas of the target zone 50 at which the current trainee 8 performs best, performs worst, or somewhere in between. This can be referred to as a “heat map” where the colors in the heat map indicate performance levels of the trainee 8.
  • the delivery device 20 can be controlled by the controller 28, 29 to deliver objects to areas of the target zone 50 indicated by the heat map to be trouble spots for the trainee 8. These heat maps can be generated from previous training sessions and updated after each training session.
  • the “Heat Map” is a type of graph, generally with the same dimensions as a target zone.
  • the graph can be used to portray where a specific batter has a greater percentage of hits within the target zone 50.
  • locations of hits within the strike zone can be represented generally by red, orange, and yellow “hot” colors.
  • Specific areas within the target zone 50 represented by blue can portray locations where the trainee 8 is having the least success. These areas can also be known as “soft contact areas.”
  • These blue or weak areas in a trainee’s heat map may indicate a specific vision deficiency for the trainee, such as depth perception, anticipation timing, speed of visual processing, visual reaction, and response timing.
  • An opponent would prefer to throw into the blue areas to have greater probability of success against the trainee 8.
  • Heat maps can be used in multiple sports where a target zone 50 is used, such as a goalie in Soccer, Hockey, or Lacrosse.
  • the heat map can represent where in the target zone the goalie is most vulnerable and gives up the most goals.
  • Heat Maps can be used in Tennis to determine where a player has the least success in returning serves, volleys, etc., on the court or from which side (e.g., back hand or forehand).
  • Heat maps can also be created and utilized in Tactical training to determine where (location) a trainee 8 has the slowest recognition and reaction times.
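  • As a non-limiting illustration of how a heat map could be generated, the following sketch aggregates recorded outcomes per target-zone cell into success percentages. The 3x3 grid, the build_heat_map function, and the sample data are illustrative assumptions only.

```python
# Minimal sketch of building a "heat map" of trainee performance per target-zone
# cell from recorded outcomes of previous training sessions. Data is illustrative.

from collections import defaultdict

def build_heat_map(outcomes, rows=3, cols=3):
    """outcomes: iterable of (row, col, success) tuples from past sessions.
    Returns a rows x cols grid of success percentages (None where no data exists)."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for r, c, success in outcomes:
        totals[(r, c)] += 1
        hits[(r, c)] += int(success)
    return [[(100.0 * hits[(r, c)] / totals[(r, c)]) if totals[(r, c)] else None
             for c in range(cols)] for r in range(rows)]

# Example session data: (row, col, hit?) for a 3x3 strike zone.
session = [(0, 0, True), (0, 0, True), (2, 2, False), (2, 2, False), (1, 1, True)]
for row in build_heat_map(session):
    print(row)
```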
  • the controller 28, 29 can also create a virtual grid 260 that can be displayed to the trainee 8 via a pair of augmented reality goggles to indicate the portion of the grid to which the next object 30 is going to be projected.
  • the grid can represent a target zone 50 in baseball, softball, soccer, hockey, tennis, etc.
  • the virtual grid 260 can also be for a horizontal playing surface, such as in tennis, table tennis, cricket, etc.
  • Each serve, groundstroke, volley, etc., in a game or practice can be captured via imaging sensors 32 and the controller 28, 29 and stored in a database (e.g., Game statistics database 36 in FIG. 3), according to the specific location on the court (or in the grid) where the ball impacted the court.
  • the delivery device 20 can use the stored grid locations 262 to simulate a real-time game or practice event.
  • the controller 28, 29 can read the stored grid locations 262 and cause the delivery device 20 to project successive objects 30 to the grid locations 262 according to the stored sequence.
  • This particular training method can incorporate various colors where the individual section in the grid turns a specific color depicting the type of stroke that was used. For example, Red for Serves, Yellow for Volleys, Blue for Lobs, etc.
  • the trainee 8 can use Augmented Reality glasses to see the same grid that the delivery device 20 is using and to see where the next stroke will be delivered.
  • the training allows for changing the velocity of strokes to match the skill level of the trainee 8. For example, when a real-time event between world-class professionals is captured, the locations in the grid can remain the same, but the velocity of strokes can be reduced.
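  • As a non-limiting illustration of replaying stored grid locations 262 with an adjustable stroke velocity, the following sketch steps through a recorded sequence and scales each speed to the trainee’s skill level. The record format, the replay_sequence function, and the speed_scale factor are hypothetical assumptions.

```python
# Minimal sketch of replaying stored grid locations 262 in their recorded order
# to simulate a captured game or practice event. Names and data are illustrative.

import time

def replay_sequence(stored_locations, project_fn, speed_scale=1.0, delay_s=0.0):
    """stored_locations: list of dicts with 'grid_cell', 'stroke', 'speed_mph'.
    project_fn(cell, speed) commands the delivery device to project one object."""
    for shot in stored_locations:
        speed = shot["speed_mph"] * speed_scale   # e.g., slow down pro-level strokes
        project_fn(shot["grid_cell"], speed)
        time.sleep(delay_s)                       # optional pacing between deliveries

def fake_project(cell, speed):
    print(f"projecting to cell {cell} at {speed:.0f} MPH")

recorded = [{"grid_cell": (1, 4), "stroke": "serve", "speed_mph": 110},
            {"grid_cell": (3, 2), "stroke": "volley", "speed_mph": 55}]
replay_sequence(recorded, fake_project, speed_scale=0.8)
```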
  • An interactive training method 110 can be used to instruct a trainee 8 throughout the training session.
  • a display 340 can be configured to facilitate interaction between the controller 28, 29 and the trainee 8 during training sessions with the delivery device 20.
  • One way the interactive training method 110 can operate is by using interactive videos displayed on the display 340 to provide instructions during the session, provide reminders (such as reminding the trainee 8 of previous training aspects or current expectations of performance or actions), and offer encouragement during the training session.
  • the interactive videos can be tailored to a trainee’s current skill level. For example, videos for lower skilled trainees can focus on more basic skills training to lay a foundation that can be built upon in further training sessions, and videos for higher skilled trainees can focus on specialized aspects of the training to provide improvements in more and more specialized skills.
  • the interactive training method 110 can include pre-recorded video files that can be retrieved from a database that is coupled to the controller 28, 29 and played back on the display 340 at appropriate times during the training session.
  • a video can instruct the trainee on how to setup the delivery device 20 and how to generally interact with the delivery device 20 during the session.
  • the interactive video can be used to describe how to setup the training field 100 to perform segmenting training, where one or more screens are positioned between the delivery device 20 and the trainee 8 to hide a portion of the trajectory of the object 30 so the amount of time the trainee 8 has to track the object 30 is varied (such as by moving the screens toward or away from the trainee 8).
  • the interactive video can be used to describe how to setup the training field 100 to perform other training methods using the delivery device 20.
  • the interactive video can describe attributes of the training session prior to projecting an object 30 from the delivery device 20 toward the trainee 8 or a target zone 50.
  • the interactive video on the display 340 can display scoring for the trainee related to their interaction with the objects 30. Based on the scoring, the controller 28, 29 via the interactive video can highlight areas of needed improvement as well as instruct the trainee 8 on how to improve.
  • the interactive video can also suggest other training methods for the trainee 8 that may be more focused on those areas of needed improvement.
  • the interactive video can recommend additional training methods or sessions that can build on the strengths of the trainee 8 or identify other areas where the trainee 8 may be weak. Based on these recommendations, the trainee 8 can setup the delivery device 20 to perform other training methods.
  • the controller 28, 29 can then recall interactive videos for the new training and display the new interactive videos during the new training session.
  • the controller 28, 29 can use imaging sensors 32 to capture imagery of the trainee’s performance and, from analysis of the imagery, determine which instructional interactive video to output to the display 340.
  • the controller 28, 29 can play a video on the display 340 to illustrate to the trainee 8 the desired body movements during swinging at the object 30 prior to the delivery device 20 projecting the next object 30 toward the trainee 8 or target zone 50.
  • If the controller 28, 29 detects incorrect positioning of a goalie in front of a hockey net or incorrect defensive movements, then the controller 28, 29 can play a video on the display 340 to illustrate the correct positioning or correct defensive movements prior to the delivery device 20 projecting the next object 30 toward the trainee 8 or target zone 50.
  • the interactive videos may include a demonstration as to why a specific drill is important for vision training, instructions on how to setup the delivery device 20, safety instructions, as well as instructions on how to interact with the delivery device 20 via voice, motion, or other commands (such as via an input device 342).
  • a training curriculum may be several training sessions in length for teaching the trainee 8 full concepts of a training activity or sport with one or more interactive videos initiated throughout the training sessions to instruct or remind the trainee 8.
  • the training curriculum may be used for baseball/softball trainees learning the "Approach" to hitting, plate, or strike zone discipline, 15 seconds of excellence on defense, setting up segmenting screens, etc.
  • the training curriculum may be used for Quick Reaction Drills, where the interactive video can be used to explain how to perform and set up quick reaction drills, how to score the drills, and how to divide into teams.
  • the training curriculum may be used for tennis, where the interactive video can direct a trainee through various ground stroke drills, such as telling the trainee 8 when to switch to a backhand.
  • the interactive videos can be a software format that can be stored in non-transitory memory in the controller 28, 29 and recalled from a database to be displayed on the display 340 for viewing by the trainee 8.
  • the controller 28, 29 can control when the interactive videos are displayed on the display 340 during the training session. For example, the controller 28, 29 can periodically pause operations of the delivery device 20 to allow an interactive video to be played on the display 340. After the video is played, the controller 28, 29 can then resume operations of the delivery device 20 to perform the desired training method.
  • the interactive videos can be recorded by a company that manufactures the delivery device 20 or provides training support, and they can be delivered along with the delivery device 20 or as a separate delivery to the customer. Additionally, the interactive videos can be recorded by a coach, a trainee, a parent or guardian of the trainee, or other individual 4 to insert their own video commands as part of the training sessions. For example, a parent can record an instructional video to be played during the training sessions to encourage or instruct the trainee.
  • a coach can develop their own set of instructional videos to be paired with the various training curriculums and in this way, they can teach multiple players the same instructions without being physically present.
  • the interactive videos can also be used, by the controller 28, 29 to interact with the trainee 8 during training sessions, by requiring inputs from the trainee 8 to progress through the training session.
  • the trainee 8 can command the delivery device 20 to setup for a particular training session by instructing the controller 28, 29, via voice commands, to setup the delivery device 20 for the training session.
  • the delivery device 20 can indicate reception of the voice command by an indicator light or a movement of the delivery device 20.
  • the trainee 8 can then get in position to receive and react to an object 30 projected from the delivery device 20.
  • the trainee 8 can then command the delivery device 20 to begin projecting a sequence of objects from the delivery device 20 by providing a voice command or body movement command to the delivery device 20 (e.g., the controller 28, 29).
  • the trainee 8 can interact with the delivery device 20 throughout the training session via voice commands or body movement commands (e.g., hand movement, head nod, foot movement, sport tool movement, etc.)
  • the trainee 8 can also use voice commands to tailor a training session to project objects 30 to the trainee 8 based on personalized parameters. For example, prior to the start of the training session, the trainee through voice command can ask the delivery device 20 for a “Personal Option.” The delivery device 20 can receive the command and respond to the command by speaking or displaying “What is your personal option?” The trainee 8 may then respond with the personal options for setting up the delivery device 20.
  • the trainee 8 may request that the delivery device 20 project one or more objects 30 as a “righthand slider pitch” at a speed of “80 miles per hour (MPH)”, and to a target zone 50 area “7 (low and away).”
  • the delivery device 20 can respond audibly or visually by asking “Do you have Mask and Safety Glasses on?”
  • the trainee 8 can respond with “Yes” or “No.” If “No”, then the delivery device 20 can pause operation until the trainee 8 responds with “Yes.”
  • the delivery device 20, after recognizing the “yes” response from the trainee 8, can request the trainee 8 to say “Start” when ready to begin.
  • the “Start” can also be communicated to the delivery device 20 via body movements instead of voice commands.
  • the delivery device 20 can then deliver a set of objects 30 to the trainee 8 based on the personalized parameters provided by the trainee 8 (or coach 4). At the end of the set of objects 30, the delivery device 20 can pause operations, display the trainee’s scores for the set of objects 30, and recommend additional training sessions and parameters. The trainee 8 can command the delivery device 20 to continue with the current parameters and project another set of objects 30 one after another toward the trainee 8.
  • the trainee 8 can adjust parameters of the delivery device 20 by again asking the delivery device 20 for a “Personal Option” and repeating the process for commanding the delivery device 20 to setup for delivery of additional objects 30 per the new parameters. Then the trainee 8 can interact with the delivery device 20 via voice commands or body movement commands to progress through the training session.
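  • By way of non-limiting illustration only, the following sketch shows one possible flow for the “Personal Option” voice dialog described above, including the safety-gear confirmation and the “Start” prompt. The prompts, the personal_option_dialog function, and the scripted replies are hypothetical stand-ins for the actual voice interface.

```python
# Minimal sketch of the "Personal Option" dialog flow. Prompts, reply handling,
# and the returned parameter record are hypothetical illustrations only.

def personal_option_dialog(ask):
    """ask(prompt) -> the trainee's reply as a string (voice or keyboard input)."""
    option = ask("What is your personal option?")
    # e.g., "right-hand slider, 80 MPH, zone 7 (low and away)"
    params = {"description": option}
    while ask("Do you have Mask and Safety Glasses on?").strip().lower() != "yes":
        pass                                  # pause operation until safety gear is confirmed
    while ask("Say 'Start' when ready to begin.").strip().lower() != "start":
        pass                                  # wait for the start command
    return params

# Example run with scripted replies instead of a microphone.
replies = iter(["right-hand slider, 80 MPH, zone 7", "no", "yes", "start"])
print(personal_option_dialog(lambda prompt: next(replies)))
```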
  • FIG. 3 is a representative functional block diagram of an object 30 delivery device 20 that can support the systems and methods of the current disclosure, as well as other systems and methods.
  • the delivery device 20 can include a chassis 22 adjustably mounted to a base 18 that can move the delivery device 20 along a surface 6 in directions 166, 168.
  • the delivery device 20 can include one or more local controllers 28 (referred to as controller 28) that can be communicatively coupled to components within the delivery device 20 via a network 35 as well as communicatively coupled to one or more remote controllers 29, one or more imaging sensors 32, and one or more external databases 36 via one or more networks 33a, 33b, 34.
  • the network 35 can include one or more internal networks 35 for communicating to components of the delivery device 20.
  • the controller 28 can be communicatively coupled to a non-transitory memory 37 that can store a delivery device parameter database 38. Sets of delivery device parameters can be stored in the database 38, where each set can be used to configure, via the controller 28, 29, the delivery device 20 to deliver an object 30 along a respective predetermined trajectory. These internal networks 35 can include networks with standard or custom network protocols to transfer data and commands to/from the delivery device 20 components.
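  • As a non-limiting illustration of a delivery device parameter database keyed by trajectory, the following sketch looks up a stored parameter set and applies it to the device. The trajectory identifiers and parameter field names (e.g., air_psi, barrel_elevation_deg) are hypothetical examples, not the actual stored schema.

```python
# Minimal sketch of a parameter database in the spirit of database 38: each entry
# maps a trajectory identifier to the parameter set used to reproduce it.

trajectory_db = {
    "traj_40": {"air_psi": 60, "barrel_elevation_deg": 12,
                "friction_azimuth_deg": 45, "object_type": "30a"},
    "traj_42": {"air_psi": 75, "barrel_elevation_deg": 8,
                "friction_azimuth_deg": 90, "object_type": "30b"},
}

def configure_for_trajectory(traj_id, apply_fn):
    """Look up a stored parameter set and apply each setting to the delivery device."""
    params = trajectory_db[traj_id]
    for name, value in params.items():
        apply_fn(name, value)          # e.g., drive an actuator or set a valve
    return params

print(configure_for_trajectory("traj_40", lambda n, v: print(f"set {n} = {v}")))
```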
  • the one or more remote controllers 29 (referred to as controller 29) can be communicatively coupled to the local controller 28 via a network 33a that communicatively couples the external network 34 to the internal network 35 (with network 33b not connected). In this configuration, the remote controller 29 can command and control the delivery device 20 components directly without direct intervention of the local controller 28.
  • the controller 29 can be communicatively coupled to the controller 28 via the network 33b, which is not directly coupled to the network 34 (with network 33a not connected). In this configuration, the controller 29 can communicate configuration changes (or other commands and data) for the delivery device 20 to the controller 28, which can then carry out these changes to the components of the delivery device 20. It should be understood that, in another configuration, the networks 33a, 33b, 34, 35 can all be connected with the controllers 28, 29 managing the communications over the networks.
  • the delivery device 20 can include a guide 24 that can modify the trajectory and spin of the object 30 as the object 30 is projected toward the target zone 50 or trainee 8.
  • the guide 24 can include a barrel 360 with a center axis 90 through which the object 30 can be projected toward a friction device 200.
  • the friction device 200 can have a center axis 92 and can be rotated about the center axis 92 to alter the engagement of the object 30 when it impacts the friction device 200 at position 30’’’.
  • An object 30 can be received from the object storage area 120 and located at position 30’ in a first end of the barrel 360.
  • a pressurized air source 152 can be fluidically coupled to the first end of the barrel 360 via conduit 158, with delivery of a volume of pressurized air controlled by a valve 154.
  • the valve 154 and the air source 152 can be controlled by the controller 28, 29 to adjust the air pressure applied to the object 30 at position 30’ as well as the volume of air applied.
  • pressurized air is only one possible option for delivering a desired force to the object 30 to project the object 30 through the barrel 360.
  • Pneumatics other than pressurized air can be used as well as hydraulics, electrical, electro-mechanical, or mechanical power sources that can supply the desired force to the object 30 to project the object 30 through the barrel 360.
  • an air pressure can be at least 3 PSI (i.e., pounds per square inch), at least 4 PSI, at least 5 PSI, at least 6 PSI, at least 7 PSI, at least 8 PSI, at least 9 PSI, at least 10 PSI, at least 20 PSI, at least 30 PSI, at least 40 PSI, at least 50 PSI, at least 60 PSI, at least 70 PSI, at least 80 PSI, at least 90 PSI, or at least 100 PSI.
  • the air pressure can be less than 220 PSI, less than 210 PSI, less than 200 PSI, less than 190 PSI, less than 180 PSI, less than 170 PSI, less than 160 PSI, less than 150 PSI, less than 140 PSI, less than 130 PSI, less than 120 PSI, less than 110 PSI, less than 100 PSI, or less than 90 PSI.
  • the air pressure may be within a range including any one of the minimum and maximum values noted above, including for example, but not limited to at least 5 PSI and less than 220 PSI, or within a range of at least 5 PSI and less than 200 PSI, or within a range of at least 10 PSI and less than 200 PSI, or within a range of at least 5 PSI and less than 180 PSI.
  • a length of the barrel 360 can be at least 2 inches, at least 3 inches, at least 4 inches, at least 4.5 inches, at least 5 inches, at least 5.5 inches, at least 6 inches, at least 7 inches, at least 8 inches, at least 9 inches, at least 10 inches, at least 11 inches, or at least 12 inches.
  • the length of the barrel 360 can be less than 48 inches, less than 36 inches, less than 24 inches, less than 23 inches, less than 22 inches, less than 21 inches, less than 20 inches, less than 19 inches, less than 18 inches, less than 17 inches, less than 16 inches, less than 15 inches, less than 14 inches, less than 13 inches, less than 12 inches, less than 11 inches, less than 10 inches, less than 9 inches, less than 8 inches, less than 7 inches, less than 6 inches, or less than 5.5 inches.
  • the length of the barrel 360 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 2 inches and less than 48 inches, or within a range of at least 4.5 inches and less than 24 inches, or within a range of at least 4.5 inches and less than 5.5 inches, or within a range of at least 3 inches and less than 12 inches.
  • a controlled volume of pressurized air (or other pressurized gas) can be delivered to the first end of the barrel 360 for a predetermined length of time to force the object 30 to be propelled through the barrel 360 at a predetermined velocity, such that at position 30” the object 30 achieves a desired velocity vector 174.
  • the velocity vector 174 can range from 25 miles per hour to 135 miles per hour. If the friction device 200 is not in a position to interfere with the trajectory 46 of the object 30 as it is propelled from a second end of the barrel 360, then the object 30 may continue along trajectory 46 and exit the delivery device 20 without having additional spin or deflection imparted to the object 30 by the friction device 200.
  • object 30 can engage (or impact) the friction device 200 at position 30”’, thereby deflecting the object 30 from the axis 90 of the barrel 360 at an angle and imparting a spin 94 to the object. Impacting the friction device 200 can cause the object 30 to begin traveling along a predetermined trajectory 40 with an altered velocity vector 176 at position 30””.
  • the amount of spin 94 and the amount of deflection from trajectory 46 to trajectory 48 can be determined by the velocity vector 174 of the object 30 at position 30”, the spin of the object 30 at position 30”, the azimuthal position of the friction device 200 about its center axis 92, the azimuthal position of the friction device 200 about the center axis 90 of the barrel 360, the incline (arrows 89) of the friction device 200 relative to the center axis 90, the length (arrows 88) of the friction device 200, and the surface material on the friction device 200.
  • the object 30 can then continue along the predetermined trajectory 48 to the target zone 50 or toward the trainee 8.
  • the controller 28, 29 can modify the parameters of the delivery device 20 (such as changing the velocity vector 174 and spin of the object 30 at position 30”, changing the azimuthal position of the friction device 200 about its center axis 92, changing the azimuthal position of the friction device 200 about the center axis 90 of the barrel 360, changing the incline (arrows 89) of the friction device 200 relative to the center axis 90, changing the length (arrows 88) of the friction device 200, or changing the surface material on the friction device 200) to deliver a subsequent object 30 along a new predetermined trajectory 48.
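  • As a non-limiting illustration, the adjustable quantities listed above can be grouped into a single parameter record that the controller modifies between deliveries; the following sketch shows one such grouping. The field names, units, and default values are illustrative assumptions only.

```python
# Minimal sketch grouping the adjustable delivery parameters into one record the
# controller could modify between deliveries. Units and defaults are illustrative.

from dataclasses import dataclass, replace

@dataclass
class DeliveryParameters:
    exit_speed_mph: float               # magnitude of velocity vector 174 at position 30''
    spin_rpm: float                     # spin imparted before the friction device
    friction_azimuth_own_deg: float     # rotation about friction device axis 92
    friction_azimuth_barrel_deg: float  # rotation about barrel center axis 90
    friction_incline_deg: float         # incline relative to axis 90 (arrows 89)
    friction_length_in: float           # engagement length (arrows 88)
    surface_material: str               # e.g., "204", "205", or "208"

baseline = DeliveryParameters(70.0, 1200.0, 30.0, 0.0, 10.0, 4.0, "204")
# A subsequent object can reuse the baseline with a few fields changed:
next_delivery = replace(baseline, friction_azimuth_barrel_deg=15.0,
                        surface_material="208")
print(next_delivery)
```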
  • parameters of the barrel 360 position and the delivery device 20 chassis 22 position can also be used to alter a trajectory of an object 30 so that it travels along a predetermined trajectory (e.g., 40, 42, 44) to a target zone (or trainee 8).
  • Some of these parameters affect the orientation of the barrel 360 within the delivery device 20, while others can affect the orientation and position of the chassis 22 of the delivery device 20 relative to a surface 6, while others affect selecting an object 30 to be propelled from the barrel 360.
  • all these parameters can have an impact on the trajectory of the object 30 as it is projected from the delivery device 20 toward the target zone 50 or trainee 8.
  • the barrel 360 can be rotated (arrows 86) about its center axis 90. This can be beneficial if the barrel 360 includes a non-smooth inner surface, such as an internal bore of the barrel 360 with rifling grooves (i.e., a surface with helically oriented ridges or grooves along the internal bore of the barrel 360) that can impart a spin (clockwise or counterclockwise) to the object 30 as the object 30 travels through the internal bore of the barrel 360.
  • Other surface features can also be used on the internal bore of the barrel 360 to affect the spin of the object 30 as it travels through the barrel 360.
  • the barrel 360 can be rotated (arrows 84) about the axis 91 to adjust the direction of the object 30 as it exits the barrel 360.
  • the barrel 360 can also be moved (arrows 85) to adjust a distance between the exit end of the barrel 360 and the friction device 200.
  • the friction device 200 can be coupled to a structure (e.g., structure 210 via support 202) that can be used to rotate the friction device 200 about the center axis 90 of the barrel 360. This can be used to change the deflection angle imparted to the object 30 when it impacts the friction device 200 at position 30’”.
  • the chassis 22 can be rotationally mounted to a base 18 at pivot point 148.
  • Actuators 144 can be used to rotate the chassis 22 about the X-axis (arrows 81) or the Y-axis (arrows 82) relative to the surface 6 by extending/retracting.
  • the base 18 can rotate the chassis 22 about the Z-axis (arrows 80) relative to the surface 6.
  • the support 142 can be used to raise or lower (arrows 83) the chassis 22 relative to the surface 6.
  • Supports 146 can be used to stabilize the support 142 to the support structure 160.
  • the support structure 160 can have multiple wheels 164 with multiple axles 162 to facilitate moving the support structure 160 along the surface 6 in the X and Y directions (arrows 166, 168).
  • the support structure 160 can house an optional controller 169 for controlling the articulations of the base 18 to orient the chassis 22 in the desired orientation. This controller 169 can be positioned at any location in or on the base 18 as well as in or on the chassis 22. It is not required that the controller 169 be disposed in the support structure 160.
  • the delivery device 20 can include one or more storage bins 150 for storing objects 30 and delivering an object 30 to the barrel 360 at position 30’.
  • storage bin 150a can contain objects 30a with storage bin 150b containing objects 30b.
  • the controller 28, 29 (or coach 4, or trainee 8, or other individual) can select which storage bin 150a, 150b is to provide the object 30 to the barrel 360 at position 30’. If the object 30a is selected, then the storage bin 150a can release one object 30a that can be directed to the position 30’ via a conduit 156.
  • If the object 30b is selected, then the storage bin 150b can release one object 30b that can be directed to the position 30’ via a conduit 156. Only one object 30a or 30b is released at a time in this configuration.
  • the conduit 156 can be a collection conduit that receives each object 30a or 30b and holds them in a chronological order in the conduit 156 as to when they were received at the conduit 156 from the storage bins 150a, 150b.
  • a mechanism 155 can be used to release the next object (30a or 30b) into the barrel 360 at position 30’, thereby delivering the objects 30a, 30b to the barrel 360 in the order they were received at the conduit 156.
  • the mechanism 155 can still be used to prevent the escape of pressurized gas into the conduit 156. However, the mechanism 155 is not required. Other means can be provided to prevent loss of pressurized gas through any other path other than through the barrel 360.
  • FIG. 4 is a representative perspective view of a friction device 200 for the delivery device 20.
  • the friction device 200 can be rotated about its axis 92 (arrows 87) as well as being rotated about the axis 90 of the barrel 360 (arrows 96).
  • the barrel 360 can be rotated about the axis 90 (arrows 86) and moved toward or away from the friction device 200 (arrows 85).
  • the object 30 can exit the barrel 360 with a velocity vector 174 at position 30”.
  • a spin 94 can be imparted to the object 30 as well as deflecting the object 30 substantially by an angle A1 relative to the center axis 90 of the barrel 360.
  • the object 30 can travel along the resulting trajectory 48 from the object 30 impacting the friction device 200.
  • the object 30 can have a resulting velocity vector 176 at position 30””.
  • the velocity vector 174, 176, 178 can be a velocity directed in any 3D direction, with the velocity of the object 30 being at least 4 MPH (i.e., miles per hour), at least 5 MPH, at least 6 MPH, at least 7 MPH, at least 8 MPH, at least 9 MPH, at least 10 MPH, at least 15 MPH, at least 20 MPH, at least 25 MPH, at least 30 MPH, at least 35 MPH, at least 40 MPH, at least 45 MPH, at least 50 MPH, at least 55 MPH, at least 60 MPH, at least 65 MPH, at least 70 MPH, at least 75 MPH, at least 80 MPH, at least 90 MPH, or at least 100 MPH.
  • the velocity vector 174, 176, 178 can be a velocity directed in any 3D direction, with the velocity of the object 30 being less than 220 MPH, less than 210 MPH, less than 200 MPH, less than 190 MPH, less than 180 MPH, less than 170 MPH, less than 160 MPH, less than 150 MPH, less than 145 MPH, less than 140 MPH, less than 135 MPH, less than 130 MPH, less than 125 MPH, less than 120 MPH, less than 115 MPH, less than 110 MPH, less than 105 MPH, less than 100 MPH, less than 95 MPH, less than 90 MPH, less than 85 MPH, less than 80 MPH, less than 75 MPH, less than 70 MPH, less than 65 MPH, less than 60 MPH, less than 55 MPH, less than 50 MPH, less than 45 MPH, or less than 40 MPH.
  • the velocity of the object 30 at the velocity vector 174, 176, 178 may be within a range including any one of the minimum and maximum values noted above, including for example, but not limited to at least 5 MPH and less than 75 MPH, or within a range of at least 15 MPH and less than 100 MPH, or within a range of at least 15 MPH and less than 220 MPH.
  • the friction device 200 can include a ramp 206 with one or more surface materials attached to it.
  • the surface material controls a friction applied to the object 30 when the object 30 impacts the friction device 200. Therefore, it can be beneficial to allow the delivery device 20 to automatically select between various surface materials (e.g., 204, 205, 208).
  • One side of the ramp 206 can have multiple surface materials 204, 205 attached thereto. Moving the friction device 200 axially (arrows 88) can cause the object to impact either the surface material 204 or 205. If the surface materials 204, 205 have different textures or friction coefficients, then impacting one or the other can alter the spin 94 or trajectory 48 of the object 30 when it impacts the friction device 200.
  • the ramp 206 can also have one or more surface materials (e.g., 208) attached to an opposite side of the ramp 206.
  • the ramp 206 can be configured to rotate about the axis 92 such that the surface material 208 is positioned to impact the object 30 at position 30”’.
  • the surface materials 204, 205, 208 can be various wool fibrous materials, plastics, cottons, foam rubbers, metals such as steel, lead, copper, aluminum, or metal alloys, plant-based material, or fungus-based material.
  • the surface material 204, 205, 208 can have a friction coefficient that is at least 0.010, at least 0.015, at least 0.020, at least 0.025, at least 0.030, at least 0.035, at least 0.040, at least 0.045, at least 0.050, at least 0.055, at least 0.060, at least 0.065, at least 0.070, at least 0.075, at least 0.080, at least 0.085, at least 0.090, at least 0.095, at least 0.10, at least 0.15, at least 0.20, at least 0.30, at least 0.40, at least 0.50, at least 0.60, at least 0.70, at least 0.80, at least 0.90, or at least 1.00.
  • the surface material 204, 205, 208 can have a friction coefficient that is less than 1.50, less than 1.45, less than 1.40, less than 1.35, less than 1.30, less than 1.25, less than 1.20, less than 1.15, less than 1.10, less than 1.05, less than 1.00, less than 0.95, or less than 0.90.
  • the friction coefficient may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.20 and less than 1.35, or within a range of at least 0.01 and less than 1.50, or within a range of at least 0.25 and less than 1.35.
  • FIG. 5 is a representative partial cross-sectional view of the friction device 200 along line 5-5 as shown in FIG. 4.
  • the structure 210 can rotate (arrows 95) about the center axis 90 of the barrel 360.
  • With the friction device 200 coupled to the structure 210 via the rotatable support 202, the friction device 200 can be rotated (arrows 96) about the center axis 90.
  • the friction device 200 can be inclined relative to the center axis 90 by being raised up or down (arrows 89) relative to the center axis 90. Therefore, the friction device 200 can be positioned at any azimuthal position about the center axis 90 as well as being rotated about its own axis 92 (arrows 87).
  • the desired spin 94 to the object 30 can also be represented as the desired yaw, pitch, and roll of the object 30.
  • FIG. 6 is a representative functional block diagram of a control system 350 for a training system 10.
  • the local controller 28 can be communicatively coupled to a remotely located controller 29 via network 33b, the game statistics database 36 via network 33a, an input device 342, and a display 340.
  • the input device 342 can provide a human-machine-interface (HMI) that accepts user inputs (trainee 8, coach 4, or others) including voice commands and transmits the user inputs to one or more processors 330 of the controller 28.
  • the input device 342 can be a keyboard, mouse, trackball, virtual reality sensors, a graphical user interface (GUI), a touch screen, mechanical interface panel switches, a button, a stride sensor, a microphone to input voice commands recognized by the controller 28, or video sensors to detect gestures of a trainee 8, coach 4, or other individual.
  • the display 340 can be used to display performance scores to a user (i.e., trainee 8, coach 4, other individual, etc.), GUI interface windows, training trajectory (single or multiple), emulated game trajectory, and player 14 associated with the game trajectory, video of game trajectory, video of training trajectory while or after object is projected to target zone, training statistics and trends, selection criteria for objects 30, selection criteria for training trajectories 40, delivery device 20 parameters and parameters selected by the input device.
  • the display 340 can be used to display a type of pitch, the speed of delivery of object 30 at the target zone 50, a location of delivery of the object 30 at the target zone 50, text messages about the delivered object 30, animations, videos, photos, or alerts about the delivered object 30.
  • the display is intended to provide the trainee 8 or coach 4 immediate feedback about the delivered object 30.
  • the input device 342 and display 340 are shown separate, but they can be integrated together in a device, such as a smartphone, smart tablet, laptop, touchscreen, etc.
  • the network interface 332 can manage network protocols for communicating with external systems (e.g., controller 29, database 36, imagery sensors 32, tracking device 190, etc.) to facilitate communication between the processor(s) 330 and the external systems.
  • These external systems are shown connected to the network 34, but they can also be disconnected and reconnected as needed.
  • the tracking device 190 may not be connected to the network until it is positioned on a docking station for downloading its acquired data.
  • the delivery device 20 may not always be connected to an external network. When it is reconnected to an appropriate external network, the communication between the external systems can again be enabled.
  • the processor(s) 330 can be communicatively coupled to a non-transitory memory storage 37 which can be used to store program instructions 334 and information in databases 38, 336, 338, 344, 346.
  • the processor(s) 330 can store and read instructions 334 from the memory 37 and execute these instructions to perform any of the methods and operations described in this disclosure for the delivery device 20.
  • the delivery device parameters (see parameters described above) for each training trajectory 40 can be stored in the delivery device parameter database 38 in the memory 37.
  • This database 38 can be organized such that each training trajectory 40 that has been defined by a set of delivery device parameters can have a trajectory entry in the database 38. When this trajectory entry is accessed, the set of delivery device parameters can be transferred to the processor(s) 330, which can use the parameters to adjust the delivery device 20 components to deliver the predetermined trajectory defined by the trajectory entry.
  • the processor(s) 330 can assemble the sequence of trajectories including their associated delivery device parameters and store the sequence in the sequential trajectory database 336 as a retrievable set of predetermined trajectories.
  • the sequential trajectory database 336 can deliver the set of predetermined trajectories to the processor(s) 330 including the delivery device parameters.
  • the processor(s) 330 can then sequentially setup the delivery device 20 to sequentially project objects one after another to produce the desired set of predetermined trajectories in the desired order.
  • the memory 37 may also contain a game trajectory database 338 which stores the game parameters of the game trajectories that have been received from other sources (such as the tracking device 190, the game statistics database 36, or user inputs) and can save them for later emulation by the delivery device 20.
  • FIG. 7 is a representative functional diagram of a system and method for tracking movement of a trainee’s eye (or eyes) and comparing it to a trajectory 40 of an object 30.
  • Imaging sensors 32 can be used to collect imagery of the object 30 as it travels along the trajectory 40 as well as tracking eye characteristics of the trainee’s eye.
  • the eye characteristics can include a direction of the center line of sight (or fovea vision or central vision) of an eye of the trainee 8, movement of the eye in an eye socket, movement of an iris of the eye, size of a pupil of the eye, and combinations thereof.
  • the trainee 8 can attempt to track the object 30 with their eyes. As the object 30 continues along the trajectory 40 the trainee 8 can continue to move their eyes to track the object 30.
  • the imaging sensors 32 can be used to capture imagery that contains the trajectory 40 of the object and the movements of the eye (or eyes) and a time stamp of the movements.
  • the imagery can be transmitted to the controller 28, 29 which can be configured to analyze the trajectory 40 to determine the parameters of the trajectory 40, such as the 3D position of the object 30 in space along the trajectory 40 and the velocity vectors (e.g., 176) of the object 30 as it traveled along the trajectory 40.
  • the controller 28, 29 can also be configured to analyze the recorded eye movements of the trainee’s eye to determine the direction from the eye of the center line of sight 250 of the eye.
  • the controller 28, 29 can correlate the object position along the trajectory 40 with the eye movements based on syncing the time of the position of the object 30 along the trajectory 40 (e.g., position 30’) with the time of the eye movements of the trainee 8.
  • the controller 28, 29 can calculate a deviation L9 between the object 30 and the center line of sight 250. Calculating a deviation L9 for multiple positions along the trajectory 40 can be used to score the ability of the trainee 8 to track the object 30 along the trajectory 40.
  • the deviations L9 can be plotted vs. time to display to the user (trainee 8, coach 4, another individual, etc.) for understanding areas of strength or weakness of the trainee 8 in tracking the object 30 along the trajectory 40.
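  • By way of non-limiting illustration only, the following sketch computes a mean deviation between time-synced object positions and gaze positions, one possible way to reduce the deviations L9 to a single tracking score. The sampling format and the tracking_score function are hypothetical assumptions.

```python
# Minimal sketch of scoring how closely the trainee's center line of sight follows
# the object along its trajectory, using time-synced samples. Data is illustrative.

import math

def tracking_score(object_samples, gaze_samples):
    """Each samples list holds (t_seconds, x, y) points in a shared plane.
    Returns the mean deviation (distance L9) over time-matched samples."""
    gaze_by_t = {round(t, 3): (x, y) for t, x, y in gaze_samples}
    deviations = []
    for t, ox, oy in object_samples:
        gaze = gaze_by_t.get(round(t, 3))
        if gaze is None:
            continue                       # no gaze sample at this timestamp
        gx, gy = gaze
        deviations.append(math.hypot(ox - gx, oy - gy))
    return sum(deviations) / len(deviations) if deviations else float("nan")

obj = [(0.0, 0.0, 5.0), (0.1, 3.0, 4.8), (0.2, 6.0, 4.5)]
gaze = [(0.0, 0.1, 5.0), (0.1, 2.5, 4.9), (0.2, 5.0, 4.7)]
print(f"mean deviation: {tracking_score(obj, gaze):.2f}")
```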
  • the method for tracking movement of a trainee’s eye (or eyes) and correlating the eye movement to the positions of the object 30 along the trajectory (e.g., 40) can be used with any of the training systems 10 described in this disclosure.
  • the correlation between the trajectory 40 and the eye movement can be used to score the trainee’s ability to track the object 30, and the visible end portion of the trajectory 40 can be reduced (e.g., barrier 220 and possibly the light source 230 moved to respective positions 220’, 230’) as the score improves, or increased (e.g., barrier 220 and possibly the light source 230 moved back to their original positions) if the score is unchanged or worse.
  • a coach 4 or another individual can score the ability of the trainee 8 to track the object 30 along the trajectory 40 by visually observing the trainee 8 as they attempt to track the object 30. This can be seen as being somewhat less precise than the method of correlating the eye movements to the object positions along the trajectory 40 using the controller 28, 29. However, this manual correlation can also be used to improve the trainee’s ability to track the object 30 along the trajectory 40.
  • the training system 10 in FIG. 7, as well as various other training systems 10 described in this disclosure, can be used to perform strike zone training 119.
  • Strike zone training 119 can be used to improve an ability of the trainee 8 to recognize objects 30 delivered to the target zone 50 (which can be referred to as a strike zone for baseball and softball sports).
  • the trainee 8 can hone their skills for recognizing objects 30 that are strikes and those that are not strikes.
  • the strike zone training 119 can occur when the delivery device 20 sequentially projects objects 30 along a predetermined trajectory (e.g., trajectory 40) and the trainee 8 indicates when they believe the object 30 arrives within the target zone 50 by providing a user input to the controller 28, 29 via an HMI device 170.
  • the target zone 50 can include sensors 51 as previously described. These sensors 51 can detect a location in the target zone at which the object 30 arrives.
  • the trainee 8 can actuate or interact with the HMI device 170 to indicate if they think the object 30 arrived in the target zone 50, and the HMI device can transmit the indication to the controller 28, 29, which can compare the indication with actual arrival location of the object 30.
  • the controller 28, 29 can also determine if the object did not arrive within the target zone 50, either due to a lack of indication from the sensors 51 that the object 30 arrived at the target zone 50 or possibly sensors (not shown) that are positioned outside the target zone 50.
  • a high score can be when the indication is received from the HMI device 170 and the object 30 arrives within the target zone 50, or when the object 30 does not arrive in the target zone 50 and no indication is received from the HMI device 170.
  • a low score can be when the indication is not received from the HMI device 170 and the object 30 arrives within the target zone 50, or when the object 30 does not arrive in the target zone 50 and an indication is received from the HMI device 170.
  • the controller 28, 29 can average the individual scores over a period of time or over multiple objects 30 delivered toward the target zone 50. This average score (as well as the individual scores) can be used to provide feedback to the trainee 8 (or the coach 4, other individual, or the controller 28, 29) for improving the trainee’s performance of recognizing objects 30 arriving in the target zone 50 and those that do not arrive in the target zone 50. Training with the smaller object 30 can allow the trainee 8 to more easily recognize regulation game objects during a real-life event and thereby more easily recognize balls and strikes in the real-life event.
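  • As a non-limiting illustration of the scoring just described, the following sketch credits each delivery when the trainee’s indication matches the sensed arrival of the object in the target zone 50 and averages the results over a session. The function names and example data are hypothetical.

```python
# Minimal sketch of strike zone training scoring: agreement between the trainee's
# indication and the sensed arrival in the target zone 50, averaged per session.

def score_delivery(indicated: bool, arrived_in_zone: bool) -> int:
    """1 when the indication matches the sensed result (high score), else 0 (low score)."""
    return 1 if indicated == arrived_in_zone else 0

def average_score(results):
    """results: list of (indicated, arrived_in_zone) tuples for a session."""
    scores = [score_delivery(i, a) for i, a in results]
    return sum(scores) / len(scores) if scores else 0.0

session = [(True, True), (False, False), (True, False), (False, False)]
print(f"session accuracy: {average_score(session):.0%}")   # 75%
```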
  • This training 119 can be well suited for baseball, softball, cricket, soccer, or any sport with a target area for receiving a game object. However, the strike zone training 119 can also be used for other, less well-suited sports or tactical situations to improve eye-body coordination of a trainee 8.
  • the strike zone training 119 can also be used to improve the trainee’s ability to recognize when the object 30 arrives at the target zone 50. The trainee 8 can send an indication via the HMI device 170 to the controller 28, 29 when they believe the object 30 arrives at the target zone 50.
  • the controller 28, 29, via comparison to the sensor data received from the sensors 51, can determine a score based on the comparison of the time of arrival of the object 30 at the target zone 50 and when the indication is initiated at the HMI device 170 by the trainee 8.
  • the indication from the HMI device 170 can be initiated by: body movement of the trainee 8; eye movement of the trainee 8; hand movement of the trainee 8; leg movement of the trainee 8; arm movement of the trainee 8; head movement of the trainee 8; audible sound signal from the trainee 8; movement of a sports tool 12; actuation of a key on a keyboard; actuation of a switch; trainee 8 interaction with the HMI device 170; or combinations thereof.
  • a good performance of the trainee 8 regarding the object 30 can be when the actual arrival position is inside the target zone and the indication is received from the HMI device 170 within a pre-determined amount of time before the actual arrival time, or when the actual arrival position is outside the target zone and no indication is received from the HMI device within a pre-determined amount of time before the actual arrival time.
  • a bad performance of the trainee 8 regarding the object 30 can be when the actual arrival position is outside the target zone and the indication is received from the HMI device 170, or when the actual arrival position is inside the target zone and no indication is received from the HMI device 170, or when the actual arrival position is inside the target zone and the indication is received from the HMI device 170 past a pre-determined amount of time prior to the actual arrival time.
  • This vision training for the trainee 8 using various methods of training with the delivery device 20 can be used to focus on improving specific visual abilities of the trainee 8, such as depth perception, visual reaction time and response timing, speed of visual processing, and eye-body-coordination.
  • In a target zone training method (e.g., baseball strike zone, soccer goal area, etc.), the delivery device 20 projects the object 30 to specific areas of the target zone 50 to score the trainee’s ability to determine if the object is inside or outside the perimeter of the target zone 50, or to score the ability of a trainee 8 to prevent the object 30 from entering the target zone 50 for a goal at a variety of trajectories and speeds.
  • the visual abilities of the trainee 8 are required to perform at higher and higher levels of efficiency to correctly interact with the object 30 at the target zone 50.
  • a quick reaction training method can also be used to further enhance and improve the visual abilities of the trainee 8.
  • the delivery device 20 is configured to deliver the object 30 toward the trainee 8 in a variety of challenging speeds and trajectories.
  • the object 30 is much smaller than a sport regulation object (e.g., less than the size of a table tennis ball).
  • a goal of the trainee 8 is to prevent the object 30 from entering the target zone 50.
  • the delivery device 20 can continue projecting an object 30 toward the trainee 8 or target zone 50, while the trainee 8 must visually acquire the object 30, track the object 30 along at least a portion of its trajectory to the target zone 50, and cause the body to react in a way (e.g., hand movement, foot movement, head movement, sport tool movement, etc.) as to prevent entry of the object 30 in the target zone 50.
  • the quick reaction training method can also incorporate segmenting training using one or more screens (described previously) to make the reaction and response of the trainee 8 more challenging.
  • When the one or more segmenting screens are in place, the trainee 8 cannot see the delivery device 20. Therefore, the trainee has an increasingly shorter amount of time to visually acquire the object 30 since the trainee 8 cannot see the object 30 until it has passed around, under, over, or through the one or more segmenting screens.
  • the quick reaction training method can score the trainee’s visual reaction time, the trainee’s response timing, and the trainee’s eye-body coordination. The scoring can indicate if the trainee 8 is having difficulty in specific areas of the target zone 50.
  • the cognitive recognition training method can use the delivery device 20 to project objects 30 of different colors toward the trainee 8.
  • the trainee 8 can then attempt to identify the color as soon as it is delivered from the delivery device 20 or as it passes around, under, over, or through the one or more segmenting screens.
  • the trainee 8 can be scored on their ability to quickly identify the object 30 and its color.
  • the cognitive recognition training method works to improve the trainee’s speed of visual processing.
  • Scoring for these visual ability training methods can be stored in a database for later retrieval and analysis.
  • the analysis of the scores can indicate trends in performance of the trainee to the various training methods.
  • the scores can also be used to produce or update an individual “heat map” for the trainee 8 to indicate areas of the target zone that the trainee 8 is strongest, weakest, and average.
  • the heat map can be used to control the delivery device 20 to cause the delivery device 20 to project objects 30 to the weaker areas indicated by the heat map.
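  • By way of non-limiting illustration only, the following sketch biases the selection of target-zone cells toward the weaker (low-success) areas of a trainee’s heat map. The weighting rule, the pick_target_cells function, and the example grid are illustrative assumptions.

```python
# Minimal sketch of choosing target-zone cells for upcoming deliveries by weighting
# the trainee's weakest heat-map cells more heavily. Values are illustrative only.

import random

def pick_target_cells(heat_map, n=5, seed=0):
    """heat_map: 2D list of success percentages (higher = stronger area).
    Returns n (row, col) cells, biased toward low-success (weak) areas."""
    rng = random.Random(seed)
    cells, weights = [], []
    for r, row in enumerate(heat_map):
        for c, pct in enumerate(row):
            cells.append((r, c))
            # Weaker cells get larger weights; unknown cells get a neutral weight.
            weights.append(101.0 - (pct if pct is not None else 50.0))
    return rng.choices(cells, weights=weights, k=n)

heat_map = [[80, 60, 75], [55, 90, 40], [30, 65, 20]]   # % success per cell
print(pick_target_cells(heat_map))
```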
  • the delivery device 20 can be moved to various training locations to perform the training methods. However, it can be desirable to perform a setup and calibration procedure on the delivery device 20 when it is first installed in a new training field 100. The following operations can be performed to move and recalibrate the delivery device 20 at a new training field 100.
  • the delivery device 20 can be removably attached to a tripod base via a quick release tripod plate.
  • the delivery device 20 can be detached from the tripod base, moved to the new location and reattached to another tripod base, an elevator system, a gantry system, a platform, or a mounting surface via the quick release tripod plate.
  • the trainee 8 or other individual 8 can then power up the delivery device 20 by connecting the delivery device 20 to a power source and energizing the delivery device 20 components.
  • the controller 28, 29 can begin a power-up procedure that can be referred to as an awareness protocol.
  • the controller 28, 29 can check to see if the imaging sensors 32 are connected and in communication with the controller 28, 29 as well as checking communication to the other delivery device 20 components.
  • the controller 28, 29 can then perform any required initialization protocols for the delivery device 20 components.
  • Various sensors can be used to sense the environment of the training field 100 as well as the internal conditions of the delivery device 20. For example, the accelerometer can be read to check the orientation of the delivery device 20, and motors can be moved to a pre-determined location to resync the motor positions with the controller 28, 29.
  • the controller 28, 29 can then establish connections with peripheral devices, such as a phone, the internet, Augmented Reality glasses, smart TVs, light sources, sensors at the target zone 50, and impact sensors 56.
  • the controller 28, 29 can determine if there are objects 30 available in the storage bin 120, and if the propulsion device to propel the object has power and is ready to project an object 30 from the delivery device 20.
  • Instructional videos can be played on the display 340 to instruct individuals 4 (which can include the trainee 8) to construct a backdrop at the target zone 50, construct a netting tunnel from pipes and drapes, and position the target zone 50.
  • the individuals 4 can set up these items based on the training method to be performed, or the delivery device 20 can sense the training to be performed and then instruct the individuals 4 to construct the training field 100 accordingly.
  • Before projecting one or more objects 30 from the delivery device 20 to the target zone 50, the delivery device 20 can be calibrated. A laser pointer on the delivery device 20 can be used to adjust the delivery device 20 to be aimed at the general area where the desired target zone 50 will be. The delivery device 20 can then begin projecting an object 30 to the target zone 50 and capturing the trajectory of the object via the imaging sensors 32. Each captured trajectory can be compared to a desired pre-determined trajectory to determine a score of how well or poorly the delivery device 20 is projecting the object along the desired pre-determined trajectory. The controller 28, 29 scores the performance of the delivery device 20 in projecting the object 30 along the pre-determined trajectory and, based on the scoring, can adjust one or more parameters of the delivery device 20 to improve the score.
  • Embodiment 1 A method for training comprising: projecting, via a delivery device, one or more objects along one or more predetermined trajectories toward a target zone, wherein each of the one or more objects has a diameter that is less than a diameter of a regulation table tennis ball; a trainee attempting to prevent the one or more objects from entering the target zone; and scoring an ability of the trainee to prevent the one or more objects from entering the target zone.
  • Embodiment 2 The method of embodiment 1, wherein the diameter of each of the one or more objects is at least 0.05 inches (1.27 mm) and less than 1.4 inches (35.56 mm).
  • Embodiment 3 The method of embodiment 1, wherein the target zone is a baseball strike zone and the trainee is training for baseball.
  • Embodiment 4 The method of embodiment 1, wherein the target zone is a soccer goal and the trainee is training for soccer.
  • Embodiment 5 The method of embodiment 1, wherein the trainee attempts to prevent the one or more objects from entering the target zone by swinging a sport tool at the one or more objects to impact the one or more objects.
  • Embodiment 6 The method of embodiment 1, wherein the trainee attempts to prevent the one or more objects from entering the target zone by intersecting the one or more pre-determined trajectories with a hand, a foot, a head, an arm, or a leg of the trainee.
  • Embodiment 7 The method of embodiment 1, wherein the one or more objects are colored objects, and the method further comprises training the trainee to recognize and identify a color of each of the one or more objects before the one or more objects reach the target zone.
  • Embodiment 8 The method of embodiment 7, further comprising: positioning one or more screens between the delivery device and the trainee, wherein the one or more screens block the trainee from viewing the one or more objects along at least a proximal portion of the one or more pre-determined trajectories; projecting the one or more objects around, under, over, or through the one or more screens; and recognizing and identifying the color of each of the one or more objects as the one or more objects travel along a distal end portion of the one or more pre-determined trajectories.
  • Embodiment 9 The method of embodiment 8, further comprising moving the one or more screens toward the target zone to reduce an amount of time the trainee has to recognize and identify a color along the distal end portion of the one or more pre-determined trajectories.
  • Embodiment 10 The method of embodiment 1, further comprising scoring depth perception, anticipation timing, speed of visual processing, visual reaction timing, response timing, or combinations thereof of the trainee.
  • Embodiment 11 A method for training comprising projecting, via a delivery device, an object along a pre-determined trajectory toward a target zone proximate a trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring, via a controller, an ability of the trainee to interact with the object at the target zone.
  • Embodiment 12 The method of embodiment 11, further comprising: prior to projecting the object, displaying, via a display, a video to a trainee, wherein the video explains setup for the delivery device and a training field; and setting up the delivery device and the training field based on the video.
  • Embodiment 13 The method of embodiment 11, further comprising: displaying the scoring of the trainee on the display; and displaying, via the display, a video to the trainee, wherein the video explains how to improve the ability of the trainee to interact with the object at the target zone.
  • Embodiment 14 The method of embodiment 13, further comprising projecting, via the delivery device, one or more objects toward the target zone proximate the trainee; and scoring an ability of the trainee to interact with the one or more objects at the target zone; and displaying the scoring of the trainee on the display.
  • Embodiment 15 The method of embodiment 13, further comprising: adjusting one or more parameters of the delivery device based on the video; projecting a second object along a second pre-determined trajectory toward a target zone; and scoring the ability of the trainee to interact with the second object at the target zone.
  • Embodiment 16 The method of embodiment 11, further comprising: selecting, via the controller, a training method based on the scoring; selecting a video based on the training method, wherein the video instructs the trainee on how to perform the training method, how to score in the training method, what skills are being targeted by the training method, or combinations thereof; displaying the video to the trainee via the display; executing, via the delivery device, the training method for the trainee; and scoring performance of the trainee.
  • Embodiment 17 A method for training comprising: operating of a delivery device in response to a first command from a trainee to perform a training; receiving, by the trainee, a response to the first command from the delivery device, wherein the response is requesting input from the trainee; providing, via a second command from the trainee, the requested input to the delivery device; adjusting, via a controller, one or more parameters of the delivery device based on the second command; projecting, via the delivery device, an object along a pre-determined trajectory toward a target zone proximate the trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring an ability of the trainee to interact with the object at the target zone.
  • Embodiment 18 The method of embodiment 17, wherein the first command is a voice command, a hand gesture, a head movement, or a body movement of the trainee.
  • Embodiment 19 The method of embodiment 17, wherein the response to the first command is displayed to the trainee via a display, which is communicatively coupled to the controller.
  • Embodiment 20 The method of embodiment 17, further comprising: projecting, via the delivery device, a plurality of objects along one or more predetermined trajectories toward a target zone proximate the trainee; scoring the ability of the trainee to interact with the plurality of objects at the target zone; and displaying one or more videos to the trainee during the projecting of the plurality of objects toward the target zone.
  • Embodiment 21 The method of embodiment 20, wherein the one or more videos are instructional videos to remind the trainee about training goals, to reinforce proper performance techniques for interacting with the plurality of objects at the target zone, to encourage the trainee during the training, to share scores with the trainee during the training, or to compare trainee performance to performance of a professional athlete.
  • Embodiment 22 A method for training comprising: projecting, via a delivery device, a plurality of first objects along a pre-determined trajectory toward a first target zone in a first training field, wherein each of the plurality of first objects has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a delivery device to deliver the plurality of first objects at a pre-determined location at the first target zone; adjusting one or more parameters of the delivery device based on the first score; projecting, via the delivery device, a plurality of second objects along a predetermined trajectory toward the first target zone, wherein each of the plurality of second objects has a diameter less than the diameter of a regulation table tennis ball; and determining, via the controller and the imaging sensor, a second score that indicates an ability of a delivery device to deliver the plurality of second objects at the pre-determined location at the first target zone, wherein the second score indicates an improved performance of the delivery device compared to the first score
  • Embodiment 23 The method of embodiment 22, further comprising: moving the delivery device to a second training field, with a second target zone; and repeating the projecting of the first objects, determining the first score for the delivery device, adjusting the one or more parameters, the projecting of the second objects, and determining the second score for the delivery device for the second target zone instead of the first target zone, wherein the second score for the second target zone indicates an improved performance of the delivery device compared to the first score for the second target zone.
  • Embodiment 24 A method for training comprising: projecting, via a delivery device, a first object along a pre-determined trajectory toward a first target zone in a first training field, wherein the first object has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a trainee to interact with the first object at the first target zone; moving the delivery device to a second training field, with a second target zone; projecting, via the delivery device, a second object along the pre-determined trajectory toward the second target zone, wherein the second object has a diameter less than the diameter of the regulation table tennis ball; determining, via the controller and the imaging sensor, a second score that indicates an ability of the delivery device to project the second object along the pre-determined trajectory to the second target zone; adjusting one or more parameters of the delivery device based on the second score; projecting another object toward the second target zone; determining, via the controller and the imaging sensor, the second score and comparing the second score to a desired score; and repeating the adjusting of the one or more parameters based on the second score, the projecting of the another object, and the determining of the second score until the second score is substantially equal to the desired score.
  • Embodiment 25 A system for training configured to perform any of the methods described in this disclosure.

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

A method can include projecting one or more objects along one or more pre-determined trajectories toward a target zone, with each object having a diameter less than a diameter of a regulation table tennis ball, a trainee attempting to prevent the one or more objects from entering the target zone, and scoring the ability of the trainee. A method can include displaying a first video to a trainee that explains setup for a delivery device, setting up the delivery device based on the first video, projecting an object along a pre-determined trajectory toward a target zone proximate the trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball, and scoring an ability of the trainee to interact with the object at the target zone.

Description

TRAINING SYSTEM AND METHOD OF USING SAME
TECHNICAL FIELD
The present invention relates, in general, to the field of training individuals to improve their performance in their field of work or play. More particularly, present embodiments relate to a system and method for projecting an object toward a target and a trainee (or individual) interacting with the object.
SUMMARY
A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. One general aspect includes a method for training that can include projecting, via a delivery device, one or more objects along one or more pre-determined trajectories toward a target zone, where each of the one or more objects has a diameter that is less than a diameter of a regulation table tennis ball; a trainee attempting to prevent the one or more objects from entering the target zone, and scoring an ability of the trainee to prevent the one or more objects from entering the target zone. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
One general aspect includes a method for training. The method also includes projecting, via a delivery device, an object along a pre-determined trajectory toward a target zone proximate a trainee, where the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring, via a controller, an ability of the trainee to interact with the object at the target zone. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
One general aspect includes a method for training that can include operating of a delivery device in response to a first command from a trainee to perform a training; receiving, by the trainee, a response to the first command from the delivery device, where the response is requesting input from the trainee; providing, via a second command from the trainee, the requested input to the delivery device; adjusting, via a controller, one or more parameters of the delivery device based on the second command; projecting, via the delivery device, an object along a pre-determined trajectory toward a target zone proximate the trainee, where the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring an ability of the trainee to interact with the object at the target zone. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
One general aspect includes a method for training that can include projecting, via a delivery device, a plurality of first objects along a pre-determined trajectory toward a first target zone in a first training field, where each of the plurality of first objects has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a delivery device to deliver the plurality of first objects at a pre-determined location at the first target zone; adjusting one or more parameters of the delivery device based on the first score; projecting, via the delivery device, a plurality of second objects along a pre-determined trajectory toward the first target zone, where each of the plurality of second objects has a diameter less than the diameter of a regulation table tennis ball; and determining, via the controller and the imaging sensor, a second score that indicates an ability of a delivery device to deliver the plurality of second objects at the pre-determined location at the first target zone, where the second score indicates an improved performance of the delivery device compared to the first score. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
One general aspect includes a method for training that can include projecting, via a delivery device, a first object along a pre-determined trajectory toward a first target zone in a first training field, where the first object has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a trainee to interact with the first object at the first target zone; moving the delivery device to a second training field, with a second target zone; projecting, via the delivery device, a second object along the pre-determined trajectory toward the second target zone, where the second object has a diameter less than the diameter of the regulation table tennis ball; determining, via the controller and the imaging sensor, a second score that indicates an ability of the delivery device to project the second object along the predetermined trajectory to the second target zone; adjusting one or more parameters of the delivery device based on the second score; projecting another object toward the second target zone; determining, via the controller and the imaging sensor, the second score and comparing the second score to a desired score; and repeating the adjusting the one or more parameters based on the second score, projecting the another object, and determining the second score until the second score is substantially equal to the desired score. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
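As an illustration of the adjust-project-score loop recited in this aspect, the following Python sketch repeats the parameter adjustment until the score is substantially equal to (here, at or above) a desired score. This is a minimal sketch under stated assumptions: the names DeliveryParameters, measure_landing_error, and score_from_error are hypothetical stand-ins for illustration only and are not part of the disclosed delivery device 20 or controller 28, 29.

```python
import random
from dataclasses import dataclass

@dataclass
class DeliveryParameters:
    barrel_inclination_deg: float
    azimuth_deg: float
    air_pressure_psi: float

def measure_landing_error(params: DeliveryParameters) -> float:
    """Stand-in for the imaging-sensor measurement: distance (inches) between
    the observed landing point and the pre-determined location (assumed model)."""
    # Placeholder physics: error shrinks as the pressure nears a nominal value.
    return abs(params.air_pressure_psi - 42.0) + random.uniform(0.0, 0.5)

def score_from_error(error_in: float) -> float:
    """Convert landing error to a 0-100 score; higher is better (assumed rule)."""
    return max(0.0, 100.0 - 10.0 * error_in)

def calibrate(params: DeliveryParameters, desired_score: float = 95.0,
              max_rounds: int = 20) -> float:
    """Repeat adjust -> project -> score until the score reaches the desired score."""
    score = score_from_error(measure_landing_error(params))
    for _ in range(max_rounds):
        if score >= desired_score:
            break
        # Naive adjustment rule: nudge the pressure toward the nominal value.
        params.air_pressure_psi += 0.5 if params.air_pressure_psi < 42.0 else -0.5
        score = score_from_error(measure_landing_error(params))
    return score

print(calibrate(DeliveryParameters(12.0, 0.0, 38.0)))
```

The convergence criterion and the adjustment rule would, in practice, be whatever the controller 28, 29 implements; the sketch only shows the repeat-until-desired-score structure described above.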
BRIEF DESCRIPTION OF THE DRAWINGS
Features, aspects, and advantages of present embodiments will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIGS. 1A-1C are representative functional diagrams of systems and methods for training a trainee to improve coordination, vision training and/or tracking capabilities, and vision training and/or timing capabilities, in accordance with certain embodiments;
FIGS. 1D-1E are representative diagrams of an object used with a delivery device for training a trainee to improve coordination, vision training and/or tracking capabilities, and vision training and/or timing capabilities, in accordance with certain embodiments;
FIG. 2 includes a representative functional diagram of a system and method for modifying parameters of the delivery device based on detected characteristics, in accordance with certain embodiments; and
FIG. 3 is a representative functional block diagram of an object delivery device that can support the systems and methods of the current disclosure, in accordance with certain embodiments;
FIG. 4 is a representative perspective view of a friction device for the delivery device, in accordance with certain embodiments;
FIG. 5 is a representative partial cross-sectional view along line 7-7 shown in FIG. 4, in accordance with certain embodiments;
FIG. 6 is a representative functional block diagram of a control system for a training system, in accordance with certain embodiments; and
FIG. 7 is a representative functional diagram of a system and method for training a trainee to improve coordination, vision training, and/or tracking capabilities, in accordance with certain embodiments.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)
The following description in combination with the figures is provided to assist in understanding the teachings disclosed herein. The following discussion will focus on specific implementations and embodiments of the teachings. This focus is provided to assist in describing the teachings and should not be interpreted as a limitation on the scope or applicability of the teachings.
As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having,” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of features is not necessarily limited only to those features but may include other features not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive-or and not to an exclusive-or. For example, a condition A or B is satisfied by any one of the following: A is true (or present), and B is false (or not present), A is false (or not present), and B is true (or present), and both A and B are true (or present).
The use of “a” or “an” is employed to describe elements and components described herein. This is done merely for convenience and to give a general sense of the scope of the invention. This description should be read to include one or at least one and the singular also includes the plural, or vice versa, unless it is clear that it is meant otherwise.
The use of the word “about”, “approximately”, or “substantially” is intended to mean that a value of a parameter is close to a stated value or position. However, minor differences may prevent the values or positions from being exactly as stated. Thus, differences of up to ten percent (10%) for the value are reasonable differences from the ideal goal of exactly as described. A significant difference can be when the difference is greater than ten percent (10%).
FIGS. 1A-1C are representative functional diagrams of a system 10 for training a trainee 8 to improve coordination, vision training, and/or tracking capabilities. Such a system 10 and method of using the system as disclosed according to the embodiments herein may be particularly suited for sports training. However, it will be appreciated that other uses may be possible. Such sports may include, without limitation, baseball (FIG. 1A), tennis (FIG. 1B), or hockey (FIG. 1C). Other sports can also benefit from similar training, such as softball, lacrosse, cricket, soccer, table tennis, American football (referred to as “football”), volleyball, basketball, shooting sports, etc. Other training activities can also benefit from similar training using the systems described in this disclosure such as military training, first responder training, search and rescue training, rehabilitation training (e.g., where the trainee 8 is autistic, recovering from a stroke, recovering from an injury, or has other medical conditions), or other trainees that can benefit from eye-body coordination training provided by the training systems described in this disclosure.
Military, first responders, and tactical officers often need to make quick but accurate decisions under stress. By reducing the time to recognize aspects of the field around them, they can more quickly determine risks and identify threats. Search and Rescue personnel can work in difficult, stressful, or poor operating environments. Enhanced visual skills can help reduce the time to recognize dangers, individuals, and risks of the situation. Visual skills that can be improved by the training systems in this disclosure include, but are not limited to:
Dynamic Visual Acuity,
Gaze Stabilization,
Initiation speed,
Peripheral Awareness,
Speed of Visual Processing,
Vision in Dim Illumination,
Visual Discrimination,
Concentration, or
Spatial Awareness.
FIGS. 1A, 1B, and 1C show a delivery device 20 that can be used to project an object 30 toward a target zone 50 (or a trainee 8). According to an embodiment, the object 30 may be projected along a trajectory (e.g., 40, 42, 44) in a direction 66 toward the target zone 50 or trainee 8. As used herein, a “trajectory” is a representation of a flight path of an object through a three-dimensional (3D) X, Y, Z coordinate system space, where each point along the trajectory can be represented by a point in the 3D space. Each point along the trajectory can include a velocity vector that is representative of a velocity and direction of travel of the object at that point along the trajectory.
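The trajectory representation described above lends itself to a simple data structure: an ordered list of 3D points, each paired with a velocity vector. The Python sketch below is illustrative only; the class names, field names, and units are assumptions and are not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TrajectoryPoint:
    position: Tuple[float, float, float]   # X, Y, Z coordinates (assumed units: feet)
    velocity: Tuple[float, float, float]   # velocity vector at this point (assumed ft/s)

@dataclass
class Trajectory:
    points: List[TrajectoryPoint]

    def speed_at(self, index: int) -> float:
        """Magnitude of the velocity vector at a given point along the trajectory."""
        vx, vy, vz = self.points[index].velocity
        return (vx * vx + vy * vy + vz * vz) ** 0.5

# Example: a two-point trajectory from the delivery device toward the target zone.
traj = Trajectory(points=[
    TrajectoryPoint(position=(0.0, 4.5, 0.0), velocity=(60.0, -1.0, 0.0)),
    TrajectoryPoint(position=(20.0, 3.8, 0.1), velocity=(58.0, -3.0, 0.2)),
])
print(round(traj.speed_at(0), 1))  # speed at launch, from the stored velocity vector
```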
In one embodiment, the projection of the object 30 along the trajectory (40, 42, 44) may be controlled by one or more controllers 28, 29 (also referred to as “controller 28, 29”) capable of controlling various aspects of the process of projection of the object 30, such that the projection is conducted along a predetermined trajectory 40, 42, or 44. The one or more controllers 28, 29 can include only one controller (28 or 29) that can control the aspects of the delivery device 20 and communicate with internal and external data sources for setting parameters of the delivery device 20 to desired values. The one or more controllers 28, 29 can include an internal controller(s) 28 and an external controller(s) 29 that can control the aspects of the delivery device 20 and communicate with each of the controllers and with internal and external data sources for setting parameters of the delivery device 20 to desired values.
A predetermined trajectory can include a trajectory that is estimated (or determined) prior to projection of the object 30. The predetermined trajectory can be selected by the controller 28, 29 which can control one or more components of the delivery device 20 to control the trajectory of the object. The delivery device 20 can include or be communicatively coupled (wired or wirelessly) to the controller 28, 29 that can be configured to control one or more delivery variables associated with delivering the object along a predetermined trajectory 40, 42, or 44. In a non-limiting embodiment, the delivery variables can include, position of the device in 3D-space (i.e., position in space according to X, Y, and Z planes), angle of the device relative to an intended target or trainee, distance from a target or trainee, intended velocity of the object along the intended trajectory between the device and the target or trainee, spin of the object along the intended trajectory between the device and the target or trainee, the weight of the object by selecting an object, surface features of the object by selecting the object, as well as others. Additional delivery variables (or parameters) are defined in the following description at least in regard to FIGS. 3-6. In a nonlimiting embodiment, these parameters can be: selection of a barrel through which to propel the object; an air pressure supplied to the object to propel the object through the barrel with a center axis; an air volume supplied to the object; an inclination of the barrel; an azimuthal orientation of the barrel; a length of the barrel; an inclination of the friction device which comprises a ramp and a surface material on the ramp; an azimuthal orientation of the friction device around the center axis of the barrel; an azimuthal orientation of the friction device about a longitudinal axis of the friction device; a distance of the friction device from the barrel; the surface material of the friction device; an object launch position from the delivery device, the object launch position being a 3D position in X-Y-Z coordinate space relative to the target or the trainee; an object selection; a distance to the target or the trainee; and a height of the target or the trainee.
The delivery device 20 can be moved horizontally, as shown by arrows 60, 62, or vertically, as shown by arrows 64. The height L1 of the object exiting the delivery device 20 can be adjusted by moving the chassis 22 of the delivery device 20 up or down (arrows 64) a desired distance. This 3D movement of the delivery device 20 can allow users (e.g., coach 4, trainer 4, individual 8, trainee 8, or others) to adjust the position at which an object 30 exits the delivery device 20. This can allow the exiting object 30 to be positioned so as to emulate a human or other real-life source for delivery of a regulation object (e.g., a regulation baseball, a regulation softball, a regulation hockey puck, a regulation tennis ball, a regulation table tennis ball, a regulation lacrosse ball, a regulation cricket ball, a regulation football, and a regulation soccer ball) such as by a pitcher for baseball or softball, a quarterback for football, a skeet delivery device for shooting sports, a soccer player making shots on goal, a hockey player making shots on goal, etc. As used herein, a “real-life” event refers to a game, practice session, or tactical situation for which the trainee is training to improve performance. The real-life event would be those events that use regulation equipment to perform the sport or tactical operations or situations.
Additionally, the object 30 trajectory can be projected from the delivery device 20 at an appropriate angle relative to a surface 6. A guide 24 can be used to cause the object to exit the delivery device 20 at an angle and cause the object to experience varied resistance when it is ejected from the guide 24. The guide 24 can include a barrel and a friction device for imparting spin and deflection to the object to project the object 30 along a predetermined trajectory. A controller 28, 29 can control the angle and position of the guide 24, as well as select the predetermined (or desired, or expected) trajectory from a plurality of trajectories or define the predetermined trajectory based on collected data from data sources. In a nonlimiting embodiment, each predetermined trajectory (e.g., trajectories 40, 42, 44) can include any parameters needed to setup the delivery device 20 to deliver the object 30 along that particular predetermined trajectory (e.g., trajectories 40, 42, 44). In a non-limiting embodiment, the parameters can include an azimuthal direction of the guide 24 to produce a desired azimuthal direction of an object 30 exiting the delivery device 20. The parameters can also include the amount and location of resistance to be applied to the object as the object is propelled toward the exit of the delivery device 20. These will be described in more detail below with regard to the delivery device 20.
In a non-limiting embodiment, the parameters can also include the force to be applied to the object 30 that will propel the object 30 from the delivery device 20 and cause the object to travel along the predetermined trajectory (e.g., trajectories 40, 42, 44). In a non-limiting embodiment, the force can be applied to the object 30 via pneumatic, hydraulic, electrical, electro-mechanical, or mechanical power sources that can selectively vary the amount of force applied to the object 30. The parameters can also include which one of a plurality of objects 30 and which one of a plurality of barrels 360 should be chosen to provide the desired trajectory. The plurality of objects 30 can have many different features which are described in more detail below. The controller 28, 29 can select the object 30 that is needed to produce the desired trajectory. The controller 28, 29 can control an alert feature 26 (such as turn ON or OFF a light, turn ON or OFF an audible signal, play a synchronized video of a real-life delivery source, etc.) to indicate that an object 30 is about to be projected from the delivery device 20 toward the target zone 50. The alert feature 26 can be any device that can alert the trainee 8 to be ready for the object 30 to exit the delivery device 20.
In a non-limiting embodiment, the object 30 can be a spherical or substantially spherical object used for training purposes. The object 30 may be shaped to represent a desired sport. In a non-limiting embodiment, the object 30 can come in different colors such as white, yellow, orange, red, blue, tan, grey, black, or a luminescent color. The color of the object 30 can be selected for the sport for which the trainee 8 is being trained or for the type of training being used. In a non-limiting embodiment, a colored pattern (e.g., red, yellow, white, green, blue, orange, or black pattern) can be applied on the object 30 to differentiate it from other objects 30. The colored pattern can be used to assist the trainee 8 in focusing intently on the object 30 so that they may pick up and track a particular sports ball quicker. The object may have one or more surface features (e.g., smooth, dimples, bumps, recesses, ridges, grainy texture, etc.) that facilitate delivery along various trajectories. In a nonlimiting embodiment, the object 30 can be made from a material such as acrylonitrile butadiene styrene, polylactic acid, calcium carbonate, recycled paper, cotton, foam, plastics, calcites, rubber, a metal such as steel, lead, copper, aluminum, or metal alloys, a plant-based material, or a fungus-based material.
In at least one embodiment, the device can include a magazine that may contain a plurality of objects. The objects 30 in the magazine can be substantially the same or at least a portion of the objects 30 can have varied characteristics relative to the other objects 30. Object characteristics can include but are not limited to, shape, size (e.g., longest dimension or length of the object, which in the case of a sphere is the diameter and in the case of a disk is the diameter along a major surface), color, surface features, density, material (e.g., inorganic, organic, metal, polymer, ceramic, or any combination thereof), or any combination thereof. In one embodiment, the delivery device 20 can include a first magazine with a first portion of objects having a first object characteristic, and a second magazine with a second portion of objects having a second object characteristic different from the first object characteristic. In one embodiment, the device is capable of selecting a single object from the first portion or the second portion. Various parameters may be used to select different objects, which may include, but is not limited to, a method of training (e.g., a preselected training protocol), a measured or scored capability of a trainee, a selection by the trainee, an instruction from one or more devices (e.g., data input from a sensor, such as a sensor associated with an impact device) communicatively coupled to the controller 28, 29.
In a non-limiting embodiment, it can be desirable for the object 30 to be sized such that it is significantly smaller than a corresponding regulation object. A corresponding regulation object is determined based upon the intended sport for which the trainee is training. For example, when training for baseball, the corresponding regulation object would be the regulation size of a baseball. It should be noted that there can be multiple regulation sizes in a particular sport. For example, the size of a soccer ball for professional soccer can be different than a size of a soccer ball for youth soccer, yet, both soccer balls are regulation size. For example, the size of a football for professional football can be different than a size of a football for youth football, yet, both footballs are regulation size. The object 30 of the current disclosure is significantly smaller than any of the regulation sizes for footballs or any other regulation objects. The delivery device of the current disclosure does not project objects of the same size of a regulation object for the intended sport or activity for which the trainee is training.
In one non-limiting embodiment, the difference in size between the object 30 and a corresponding regulation object can be expressed as a value of Lo/Lr, wherein Lo is the largest dimension (i.e., length) of the object 30 and Lr is the largest dimension (i.e., length) of the regulation object. In at least one embodiment, the difference in size (or ratio Lo/Lr) can be less than 0.9 or less than 0.8 or less than 0.7 or less than 0.6 or less than 0.5 or less than 0.4 or less than 0.3 or less than 0.2 or less than 0.1. Still, in another non-limiting embodiment, the difference in size can be at least 0.001 or at least 0.002 or at least 0.004 or at least 0.006 or at least 0.008 or at least 0.01 or at least 0.02 or at least 0.03 or at least 0.05 or at least 0.07 or at least 0.1 or at least 0.15 or at least 0.2 or at least 0.25 or at least 0.3. It will be appreciated that the difference in size between the object 30 and a corresponding regulation object (Lo/Lr) can be within a range including any of the minimum and maximum values noted above, including, for example, but not limited to at least 0.001 and less than 0.9 or within a range of at least 0.001 and less than 0.5 or within a range of at least 0.002 and less than 0.006.
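The size relationship above is a simple ratio, as the short sketch below shows. The regulation baseball diameter of roughly 2.9 inches used in the example is an assumption for illustration, and the range check corresponds to the broadest range recited above.

```python
def size_ratio(object_length_in: float, regulation_length_in: float) -> float:
    """Ratio Lo/Lr: largest dimension of the training object over that of the regulation object."""
    return object_length_in / regulation_length_in

# Example: a 0.5 inch training object versus a regulation baseball (~2.9 in diameter, assumed).
ratio = size_ratio(0.5, 2.9)
assert 0.001 <= ratio < 0.9   # falls inside the broadest recited range
print(round(ratio, 3))
```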
In a non-limiting embodiment, the diameter D1 (see FIGS. 1D, 1E) of the object 30 can be at least 0.05 inches, at least 0.06 inches, at least 0.07 inches, at least 0.08 inches, at least 0.09 inches, at least 0.10 inches, at least 0.110 inches, at least 0.118 inches, at least 0.120 inches, at least 0.125 inches, at least 0.130 inches, at least 0.135 inches, at least 0.140 inches, at least 0.145 inches, at least 0.150 inches, at least 0.20 inches, or at least 0.25 inches.
In another non-limiting embodiment, the diameter D1 of the object 30 can be less than 2.0 inches, less than 1.90 inches, less than 1.80 inches, less than 1.70 inches, less than 1.60 inches, less than 1.50 inches, less than 1.40 inches, less than 1.30 inches, less than 1.20 inches, less than 1.10 inches, less than 1.00 inches, less than 0.90 inches, less than 0.85 inches, less than 0.80 inches, less than 0.75 inches, less than 0.70 inches, less than 0.65 inches, less than 0.60 inches, less than 0.59 inches, less than 0.55 inches, less than 0.50 inches, less than 0.45 inches, or less than 0.40 inches.
It will be appreciated that the diameter of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.05 inches and less than 2.0 inches, or within a range of at least 0.05 inches and less than 1.10 inches, or within a range of at least 0.07 inches and less than 1.00 inch.
In a non-limiting embodiment, the size of the object 30 can be at least 120 times smaller than a baseball, at least 220 times smaller than a softball, at least 400 times smaller than a soccer ball, at least 25 times smaller than a table tennis ball, at least 90 times smaller than a lacrosse ball, at least 40 times smaller than a hockey puck, at least 70 times smaller than a clay pigeon (for shooting sports), or at least 110 times smaller than a cricket ball.
In a non-limiting embodiment, the size of the object 30 can be smaller than a regulation table tennis ball, where the regulation table tennis ball is spherical, and its diameter is 1.57 inches (40 mm), where the diameter D1 of the object 30 can be less than 1.57 inches (40 mm).
In a non-limiting embodiment, the weight of the object 30 can be at least 0.001 ounces, at least 0.002 ounces, at least 0.003 ounces, at least 0.004 ounces, at least 0.005 ounces, at least 0.006 ounces, at least 0.007 ounces, at least 0.008 ounces, at least 0.009 ounces, at least 0.010 ounces, at least 0.011 ounces, at least 0.012 ounces, at least 0.013 ounces, at least 0.014 ounces, at least 0.015 ounces, at least 0.20 ounces, at least 0.25 ounces, at least 0.30 ounces, at least 0.35 ounces, at least 0.40 ounces, at least 0.45 ounces, at least 0.50 ounces, at least 0.55 ounces, or at least 0.60 ounces.
In another non-limiting embodiment, the weight of the object 30 can be less than 10 ounces, less than 9 ounces, less than 8 ounces, less than 7 ounces, less than 6 ounces, less than 5 ounces, less than 4 ounces, less than 3 ounces, less than 2 ounces, less than 1.5 ounces, less than 1 ounce, less than 0.9 ounces, less than 0.8 ounces, less than 0.7 ounces, less than 0.6 ounces, less than 0.5 ounces, less than 0.4 ounces, less than 0.3 ounces, less than 0.2 ounces, less than 0.1 ounces, less than 0.09 ounces, less than 0.08 ounces, or less than 0.05 ounces.
It will be appreciated that the weight of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.001 ounces and less than 10 ounces, or within a range of at least 0.07 ounces and less than 0.9 ounces, or within a range of at least 0.002 ounces and less than 5 ounces, or within a range of at least 0.002 ounces and less than 1.5 ounces. In a non-limiting embodiment, other sizes and weights of the object 30 can be used with the delivery device 20 to project the object 30 toward the target zone 50.
The weight of the object 30 can be adjusted for different training purposes and achieving various predetermined trajectories (e.g., 40, 42, 44). The weight can depend on the size and materials used for the specific object 30 that support different training processes. The variation of weight can result in speed changes of the object 30.
In a non-limiting embodiment, the shape of the object 30 can be substantially spherical. In another non-limiting embodiment, the object can be non-spherical, such as spheroidal. In another non-limiting embodiment, the object 30 can also have surface features (e.g., dimples, divots, holes, recesses, ridges, bumps, grainy textures, etc.) for trajectory modification. The shape of the object 30 can be tailored to emulate certain predetermined trajectories such as knuckle ball throws, kicks from a soccer ball, etc.
In a non-limiting embodiment, the materials that make up the object 30 can be acrylonitrile butadiene styrene, polylactic acid, calcium carbonate, paper, cotton, or foam, any poly-based plastics, or plastics in general, calcites, metal such as steel, lead, copper or aluminum, rubber, a plant-based material, or a fungus-based material. In a non-limiting embodiment, the object 30 can be coated with glow in the dark colors. This can be used in various training methods for vision training, such as segmenting training and strike zone training (described later).
In a non-limiting embodiment, the object 30 can be illuminated by ultraviolet lights such as black lights for isolated training processes for vision tracking. Being smaller than the regulation objects, the object 30 can be safer than regulation objects. A user may need to only wear safety glasses or a mask.
The delivery device 20 can be positioned at a distance L2 from a target zone 50 or trainee 8. In a non-limiting embodiment, the distance L2 can be at least 3 feet, at least 4 feet, at least 5 feet, at least 6 feet, at least 7 feet, at least 8 feet, at least 9 feet, at least 10 feet, at least 11 feet, at least 12 feet, at least 13 feet, at least 14 feet, at least 15 feet, at least 16 feet, at least 17 feet, at least 18 feet, at least 19 feet, at least 20 feet, at least 25 feet, at least 30 feet, at least 35 feet, or at least 40 feet.
In another non-limiting embodiment, the distance L2 can be less than 210 feet, less than 205 feet, less than 200 feet, less than 190 feet, less than 180 feet, less than 170 feet, less than 160 feet, less than 150 feet, less than 140 feet, less than 130 feet, less than 120 feet, less than 110 feet, less than 100 feet, less than 90 feet, less than 80 feet, less than 70 feet, less than 60 feet, less than 55 feet, less than 50 feet, less than 45 feet, less than 40 feet, less than 35 feet, less than 30 feet, less than 25 feet, or less than 20 feet.
It will be appreciated that the distance L2 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 5 feet and less than 200 feet, or within a range of at least 5 feet and less than 55 feet, or within a range of at least 15 feet and less than 50 feet, or within a range of at least 15 feet and less than 40 feet, or within a range of at least 5 feet and less than 15 feet, or within a range of at least 10 feet and less than 25 feet.
However, farther distances are achievable with increased power projecting the object 30 toward the target zone 50. In a non-limiting embodiment, the target zone 50 can be a rectangle defined by a height L5 and a width L4 and can represent a relative position in space, or the target zone 50 can be a physical collection device that captures the objects 30 that enter individual target segments 76. The target can be moved up or down (arrows 68, FIG. 2) to position the target zone 50 at the desired height L3. An imaging sensor 32 can capture imagery of the trainee 8 and communicate the imagery to the controller 28, 29. In a non-limiting embodiment, the imaging sensor 32 can include a camera, a 2D camera, a 3D camera, a LiDAR sensor, a smartphone, a tablet, a laptop, or other video recorders. The target zone 50 can be divided into a plurality of target segments 76 and the controller 28, 29 can initiate the projecting of the object 30 through a predetermined trajectory (e.g., trajectories 40, 42, 44) toward a specific target segment 76 or toward an area outside of the target zone 50 for various training methods. For example, as in baseball or softball training, in the beginning of a training session, the controller 28, 29 (e.g., via selections from a coach/trainer 4, the trainee 8 or another user) can deliver fast balls along the trajectory 42 that can arrive at the target zone 50 in the center target segment 76 (or any other appropriate segment 76). This can be used to help train the trainee 8 to recognize the object 30 and track it through the trajectory 42 through consistent training using the trajectory 42.
When scoring of this activity indicates that the trainee 8 has mastered tracking the object 30 through at least a portion of the trajectory 42, then other trajectories can be selected for additional training. These other trajectories can be designed by the trainee 8, the coach 4, other individual, or controller 28, 29 for the particular training method. These other trajectories can also be designed to mimic at least a portion of the trajectories of a sports object that was projected through one or more game trajectories in a real-life event by a real- life athlete. In this type of training, the trainee 8 can train like they are facing the real-life athlete that projected the sports object along the one or more game trajectories. The scoring can be determined via imagery captured by one or more imaging sensors or by a coach/trainer 4 visually observing the interaction of the trainee 8 with the object 30. The controller 28, 29 can analyze the imagery to determine the performance of the trainee 8 to the training goals or criteria for the training method being performed. The controller 28, 29 can then establish a score for the trainee 8, which can be used to provide feedback to the trainee 8, coach/trainer 4, or other user for improving the trainee’s performance. The score can be compared to previous scores to identify trends in the trainee’s performance.
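The comparison of new scores to previous scores to identify trends, described above, can be sketched with a simple moving-average rule. The windowed comparison below is an illustrative assumption about how a trend might be computed; the disclosure does not prescribe a particular trend calculation.

```python
from statistics import mean
from typing import List

def trend(scores: List[float], window: int = 3) -> str:
    """Compare the mean of the most recent `window` scores to the mean of the
    preceding `window` scores and report the direction of change."""
    if len(scores) < 2 * window:
        return "insufficient data"
    recent = mean(scores[-window:])
    earlier = mean(scores[-2 * window:-window])
    if recent > earlier:
        return "improving"
    if recent < earlier:
        return "declining"
    return "flat"

print(trend([62, 65, 64, 70, 72, 75]))  # -> "improving"
```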
For a fast ball simulation, the object 30 can be projected by the delivery device 20 along the trajectory 42. The object 30 can be seen traveling along the trajectory 42 as indicated by the object position 30”. For other trajectories, such as 40, 44 (which can be more complex trajectories), the object 30 can be seen traveling along the trajectory 40, 44 as indicated by positions 30’ and 30’”.
FIGS. 1D, 1E are representative side views of an example object 30 which can be various shapes and sizes. In a non-limiting embodiment, the object 30 in FIG. 1D is shown to be a sphere with center axis 31 and diameter D1. The object 30, when projected by the delivery device 20, can have a spin 94 imparted to the object 30 by the delivery device 20. The spin 94 can be in any rotational direction around the axis 31. In another non-limiting embodiment, the object 30 in FIG. 1E is shown to be a spheroid with center axis 31 and diameter D1 that is the shortest diameter of the spheroid shape. The object 30, when projected by the delivery device 20, can have a spin 94 imparted to the object 30 by the delivery device 20. The spin 94 can be in any rotational direction around the axis 31. The spin 94 is shown to rotate the object 30 about the axis 31 similar to a spiral throw of a football. However, the spin 94 can also rotate the object 30 end over end about the axis 31, or in any rotational direction in between.
In a non-limiting embodiment, the spin 94 can be “0” zero, at least 1 RPM, at least 2 RPMS, at least 3 RPMS, at least 4 RPMS, at least 5 RPMS, at least 10 RPMS, at least 20 RPMS, at least 50 RPMS, at least 100 RPMS, at least 200 RPMS, or at least 300 RPMS.
In a non-limiting embodiment, the spin 94 can be less than 120,000 RPMs, less than 116,000 RPMs, less than 115,000 RPMs, less than 110,000 RPMs, less than 105,000 RPMs, less than 100,000 RPMs, less than 90,000 RPMs, less than 80,000 RPMs, less than 70,000 RPMs, less than 60,000 RPMs, less than 50,000 RPMs, less than 40,000 RPMs, less than 30,000 RPMs, less than 20,000 RPMs, less than 15,000 RPMs, less than 14,000 RPMs, less than 13,000 RPMs, less than 12,000 RPMs, less than 11,000 RPMs, less than 10,000 RPMs, less than 9,000 RPMs, less than 8,000 RPMs, less than 7,000 RPMs, less than 6,000 RPMs, or less than 5,000 RPMs.
It will be appreciated that the spin 94 of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least “0” zero RPMs and less than 11,000 RPMs, or within a range of at least 1 RPM and less than 116,000 RPMs, or within a range of at least 1 RPM and less than 115,000 RPMs, or within a range of at least 100 RPMs and less than 10,000 RPMs.
FIG. 2 is a representative functional diagram of a system 10 for training a trainee 8 to improve eye-body coordination in various sports. The delivery device 20 can be adjusted in various ways to facilitate projecting the object 30 along a predetermined trajectory 40, 42, 44 toward a target zone positioned a distance L2 from the delivery device 20. Distance L2 can be at least 5 feet, or at least 10 feet, or at least 15 feet, or at least 20 feet, or at least 25 feet, or at least 30 feet, or at least 35 feet, or at least 40 feet, or at least 45 feet, or at least 50 feet, at least 55 feet, or at least 75 feet, or up to 100 feet.
One or more imaging sensors 32 can be used to capture and record the travel of the object 30 along a predetermined trajectory (e.g., 40, 42, 44). The imaging sensors 32 can be placed at any position around the system 10 with two possible positions indicated in FIG. 2. Users (e.g., a coach 4, trainer 4, trainee 8, individual 8, or others) can also track the object along the predetermined trajectory and score the repeatability of the object 30 to travel along the predetermined trajectory. The imaging sensors 32 can capture and record how the eyes of the trainee 8 track the object 30 along the predetermined trajectory. Imagery collected via the imagery sensors 32 can be analyzed by a local controller 28 or a remotely located controller 29 to determine how the trainee 8 tracks the object 30 along a predetermined trajectory and the controller(s) 28, 29 can score the ability of the trainee 8 to track the object 30 along a predetermined trajectory. The score can be used to improve the capability of the trainee to track the object 30, adjust the delivery device 20, or select subsequent trajectories of another object 30 (such as when the trainee 8 performs well enough to progress to a more difficult trajectory).
In a non-limiting embodiment, the imaging sensors 32 can capture physical characteristics of the trainee 8 as well as attributes of a training field 100. Imagery from the imaging sensors 32 can be used by the controller 28, 29 to perform facial recognition of the trainee 8, voice recognition (via audio sensors in the imaging sensors 32), detection and recognition of body movements of the trainee 8, detection and recognition of objects in the training field 100, detection and recognition of body movements of the trainee 8 for controlling the delivery device 20 (e.g., “visual servoing”), detection and recognition of light and color for controlling the delivery device 20, creation of a virtual Grid, and combinations thereof to control operation of the delivery device 20.
The detection and recognition of body movements of the trainee 8, and facial recognition of the trainee 8 can be used by the controller 28, 29 to adjust one or more of the parameters of the delivery device 20 to deliver objects to a target zone, where the target zone 50 has been adjusted based on the identified trainee 8. The physical characteristics of the trainee 8 can be retrieved from a database (e.g., trainee database 344 in FIG. 6) once the trainee 8 has been identified. From these physical characteristics, the controller 28, 29 can adjust the target zone 50 accordingly to accommodate the trainee 8.
The physical characteristics of the trainee 8 can include one or more of the following: an overall height of the trainee; a height of a knee of the trainee; a height of a shoulder of the trainee; a length of a leg of the trainee; a length of an arm of the trainee; a position of the trainee; and a width of the trainee. The voice recognition can be used by the controller 28, 29 to identify the trainee 8 or the coach 4 and adjust one or more of the parameters of the delivery device 20 to deliver objects to a target zone, where the target zone 50 has been adjusted based on the identified trainee 8. The voice recognition can also be used to identify if the trainee 8 or coach 4 is approved for use of the delivery device 20 and enable operation if approved or disable operation if not approved. If approved, then the trainee 8 or coach 4 can issue voice commands to the delivery device 20 to control operation of the delivery device 20, such as pause, resume, start session, select mode, end session, select training session, provide score, provide training statistics, etc.
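As one hedged example of adjusting the target zone 50 from stored physical characteristics, the sketch below derives a baseball-style target zone from a trainee's knee and shoulder heights. The specific formula and the 17-inch width (home plate width) are assumptions for illustration; the disclosure only states that the controller 28, 29 adjusts the target zone to the identified trainee 8.

```python
from dataclasses import dataclass

@dataclass
class TraineeProfile:
    overall_height_in: float
    knee_height_in: float
    shoulder_height_in: float

@dataclass
class TargetZone:
    bottom_in: float   # heights above the playing surface, in inches
    top_in: float
    width_in: float

def baseball_target_zone(profile: TraineeProfile, width_in: float = 17.0) -> TargetZone:
    # Assumed rule: bottom at the knee, top midway between knee and shoulder.
    top = (profile.knee_height_in + profile.shoulder_height_in) / 2.0
    return TargetZone(bottom_in=profile.knee_height_in, top_in=top, width_in=width_in)

zone = baseball_target_zone(TraineeProfile(70.0, 20.0, 57.0))
print(zone)  # -> TargetZone(bottom_in=20.0, top_in=38.5, width_in=17.0)
```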
The detection and recognition of objects in the training field 100 can be used by the controller 28, 29 to determine the type of activity for which training is to be administered to the trainee 8, where the type of activity can be for a sport (e.g., baseball, softball, cricket, hockey, tennis, table tennis, football, soccer, lacrosse, handball, racket ball, basketball, shooting sports, etc.), tactical training, trainee rehabilitation training, or any other activity that can benefit from improving eye-body coordination of the trainee 8. For example, if the controller 28, 29 detects a home plate near the target zone area in the imagery from the imaging sensor(s) 32, then baseball or softball may be the type of activity to be trained. For example, if the controller 28, 29 detects a hockey goal near the target zone area in the imagery from the imaging sensor(s) 32, then hockey can be the training activity.
The controller 28, 29 can use the imagery from the imaging sensor(s) 32 to determine the type of sport tool 12 to be used, the characteristics of the training field around the target zone 50, gear worn by the trainee 8, equipment held by the trainee 8, etc. to determine the type of activity for which the trainee 8 is training. The various methods of training described in this disclosure can be used for any or all of the types of training activities on which the trainee 8 can be trained. When the type of activity is determined, then the associated parameters for that type of activity can be retrieved from a database (e.g., activity database 346 in FIG. 6) and used by the delivery device 20 to control delivery of the object 30 toward the target zone. The associated parameters can include parameters to define a target zone 50, training field distances, define screens to avoid when projecting the object 30 (e.g., in segmenting training), etc.
The detection and recognition of body movements of the trainee 8 can be used by the controller 28, 29 to provide interactive control of the delivery device 20 by the trainee 8. This can be referred to as “visual servoing” which is a term used to indicate controlling a robot’s movements or actions based on visual actions of the trainee 8. For example, when the type of activity is baseball, then the trainee 8 can raise their hand to indicate to the delivery device 20 to pause projection of the object 30 until the trainee 8 is ready. The trainee 8 can then indicate their readiness by lowering their hand, which can indicate to the delivery device 20 to begin the sequence to deliver the next object 30 toward the target zone 50. Other body movements, such as another hand gesture, a voice command, a head nod, etc., can also be used by the trainee 8 to interact with the delivery device 20 to control or adjust projection of the object 30 along the pre-determined trajectory. For example, pointing left, right, up, or down, can indicate which area of the target zone that the trainee 8 wishes the delivery device 20 to deliver the next object 30; waving can indicate for the delivery device 20 to halt delivering objects 30, increase a speed of the next object 30, or decrease a speed of the next object 30, etc.
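A minimal sketch of such visual servoing logic, assuming hypothetical gesture labels produced by a gesture-recognition model and hypothetical command names for the delivery device 20, might look like the following.

```python
from typing import Optional

# Hypothetical mapping of recognized trainee gestures to delivery device
# commands; the gesture labels would come from a gesture-recognition model
# analyzing imagery from the imaging sensors 32.
GESTURE_COMMANDS = {
    "hand_raised":  "pause_delivery",
    "hand_lowered": "deliver_next_object",
    "point_left":   "aim_left_of_target_zone",
    "point_right":  "aim_right_of_target_zone",
    "point_up":     "aim_high_in_target_zone",
    "point_down":   "aim_low_in_target_zone",
    "wave":         "halt_delivery",
}

def command_for_gesture(gesture: str) -> Optional[str]:
    """Return the delivery device command for a recognized gesture,
    or None if the gesture is not mapped (the device keeps its current state)."""
    return GESTURE_COMMANDS.get(gesture)

print(command_for_gesture("hand_raised"))   # -> pause_delivery
```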
The detection and recognition of light and color can be used by the controller 28, 29 to control the delivery device 20. For example, the delivery device 20 can be configured to deliver an object 30 to a specific location in the target zone 50, where the location is determined by where a light illuminates a portion of the target zone 50. If multiple objects 30 are to be sequentially projected to the target zone 50, then causing the light to illuminate various locations in the target zone 50 can cause the delivery device 20 to track to deliver the object 30 to the illuminated location.
Additionally, the controller 28, 29 can detect color or color patterns to control the delivery device 20. The color or color patterns can indicate a type of activity for which the delivery device 20 is to be used for training. The color or color patterns can indicate a skill level required for the training, or a skill level of the trainee 8. The color or color patterns can indicate the areas of the target zone 50 at which the current trainee 8 performs best, performs worst, or somewhere in between. This can be referred to as a “heat map” where the colors in the heat map indicate performance levels of the trainee 8. The delivery device 20 can be controlled by the controller 28, 29 to deliver objects to areas of the target zone 50 indicated by the heat map to be trouble spots for the trainee 8. These heat maps can be generated from previous training sessions and updated after each training session.
The “Heat Map” is a type of graph, generally with the same dimensions as a target zone. The graph can be used to portray where a specific batter has a greater percentage of hits within the target zone 50. Although there are a variety of types of “Heat Maps,” locations of hits within the strike zone can generally be represented by red, orange, and yellow “hot” colors. Specific areas within the target zone 50 represented by blue can portray locations where the trainee 8 is having the least success. These areas can also be known as “soft contact areas.”
These blue or weak areas in a trainee’s heat map may indicate a specific vision deficiency for the trainee, such as depth perception, anticipation timing, speed of visual processing, visual reaction, and response timing. An opponent would prefer to throw into the blue areas to have greater probability of success against the trainee 8.
Heat maps can be used in multiple sports where a target zone 50 is used, such as a goalie in Soccer, Hockey, or Lacrosse. The heat map can represent where in the target zone the goalie is most vulnerable and gives up the most goals. Heat Maps can be used in Tennis to determine where a player has the least success in returning serves, volleys, etc., on the court or from which side (e.g., back hand or forehand). Heat maps can also be created and utilized in Tactical training to determine where (location) a trainee 8 has the slowest recognition and reaction times.
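One possible, purely illustrative way to represent and use such a heat map is sketched below in Python: per-cell success rates over an assumed 3x3 grid of the target zone 50, with the weakest ("blue") cells selected as candidate locations for subsequent deliveries. The grid size, class, and method names are assumptions, not part of the disclosed system.

```python
import random

class HeatMap:
    """Per-cell success rates over an assumed 3x3 grid of the target zone."""

    def __init__(self, rows: int = 3, cols: int = 3):
        self.hits = [[0] * cols for _ in range(rows)]
        self.attempts = [[0] * cols for _ in range(rows)]

    def record(self, row: int, col: int, success: bool) -> None:
        # Called after each delivered object with the outcome for that cell.
        self.attempts[row][col] += 1
        if success:
            self.hits[row][col] += 1

    def success_rate(self, row: int, col: int) -> float:
        attempts = self.attempts[row][col]
        return self.hits[row][col] / attempts if attempts else 0.0

    def weakest_cells(self, n: int = 3):
        """Cells with the lowest success rate (the 'blue' soft-contact areas)."""
        cells = [(self.success_rate(r, c), r, c)
                 for r in range(len(self.hits))
                 for c in range(len(self.hits[0]))]
        return sorted(cells)[:n]

# The controller could bias subsequent deliveries toward the weakest cells.
heat_map = HeatMap()
heat_map.record(0, 2, success=False)
heat_map.record(1, 1, success=True)
_, row, col = random.choice(heat_map.weakest_cells())
print(f"deliver next object to grid cell ({row}, {col})")
```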
The controller 28, 29 can also create a virtual grid 260 that can be displayed to the trainee 8 via a pair of augmented reality goggles to indicate the portion of the grid to which the next object 30 is going to be projected. The grid can represent a target zone 50 in baseball, softball, soccer, hockey, tennis, etc. By providing the anticipated destination of the next object 30, the trainee 8 can focus on interacting with the object 30 without as much emphasis being required for determining its destination and then reacting to location.
The virtual grid 260 can also be for a horizontal playing surface, such as in tennis, table tennis, cricket, etc. Each serve, groundstroke, volley, etc., in a game or practice can be captured via imaging sensors 32 and the controller 28, 29 and stored in a database (e.g., Game statistics database 36 in FIG. 3), according to the specific location on the court (or in the grid) where the ball impacted the court. By capturing the grid locations 262 for each successive serve or volley, the delivery device 20 can use the stored grid locations 262 to simulate a real-time game or practice event. The controller 28, 29 can read the stored grid locations 262 and cause the delivery device 20 to project successive objects 30 to the grid locations 262 according to the stored sequence.
This particular training method can incorporate various colors where the individual section in the grid turns a specific color depicting the type of stroke that was used. For example, Red for Serves, Yellow for Volleys, Blue for Lobs, etc. The trainee 8 can use Augmented Reality glasses to see the same grid that the delivery device 20 is using and to see where the next stroke will be delivered. The training allows for changing the velocity of strokes to match the skill level of the trainee 8. For example, when replaying a real-time event captured between world-class professionals, the locations in the grid can remain the same, but the velocity of strokes can be reduced.
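A simplified sketch of replaying captured grid locations 262 is shown below; the record fields, stroke labels, and velocity-scaling factor are hypothetical and only illustrate the concept of reproducing a captured rally at a reduced speed.

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class GridShot:
    # One captured stroke: where it landed in the virtual grid, the stroke
    # type (e.g., serve, volley, lob), and the captured velocity.
    grid_row: int
    grid_col: int
    stroke: str
    speed_mph: float

def replay(shots: Iterable[GridShot], speed_scale: float = 1.0):
    """Yield delivery commands that reproduce a captured rally in order,
    optionally scaling velocity down to match the trainee's skill level."""
    for shot in shots:
        yield {
            "grid_cell": (shot.grid_row, shot.grid_col),
            "stroke": shot.stroke,
            "speed_mph": round(shot.speed_mph * speed_scale, 1),
        }

captured_rally = [
    GridShot(0, 3, "serve", 110.0),
    GridShot(2, 1, "volley", 62.0),
]
for command in replay(captured_rally, speed_scale=0.75):
    print(command)
```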
An interactive training method 110 can be used to instruct a trainee 8 throughout the training session. A display 340 can be configured to interact between the controller 28, 29, and the trainee 8 during training sessions with the delivery device 20. In one approach, the interactive training method 110 can use interactive videos displayed on the display 340 to provide instructions during the session, provide reminders (such as reminding the trainee 8 of previous training aspects or current expectations of performance or actions), and offer encouragement during the training session. The interactive videos can be tailored to a trainee’s current skill level. For example, videos for lower skilled trainees can focus on more basic skills training to lay a foundation that can be built upon in further training sessions, and videos for higher skilled trainees can focus on specialized aspects of the training to provide improvements in more and more specialized skills.
The interactive training method 110 can include pre-recorded video files that can be retrieved from a database that is coupled to the controller 28, 29 and played back on the display 340 at appropriate times during the training session. At the beginning of the training session, a video can instruct the trainee on how to setup the delivery device 20 and how to generally interact with the delivery device 20 during the session. The interactive video can be used to describe how to setup the training field 100 to perform segmenting training, where one or more screens are positioned between the delivery device 20 and the trainee 8 to hide a portion of the trajectory of the object 30 so the amount of time the trainee 8 has to track the object 30 is varied (such as by moving the screens toward or away from the trainee 8). The interactive video can be used to describe how to setup the training field 100 to perform other training methods using the delivery device 20.
The interactive video can describe attributes of the training session prior to projecting an object 30 from the delivery device 20 toward the trainee 8 or a target zone 50. After the delivery device 20 projects one or more objects 30 along predetermined trajectories, then the interactive video on the display 340 can display scoring for the trainee related to their interaction with the objects 30. Based on the scoring, the controller 28, 29 via the interactive video can highlight areas of needed improvement as well as instruct the trainee 8 on how to improve. The interactive video can also suggest other training methods for the trainee 8 that may be more focused on those areas of needed improvement.
If the trainee 8 scoring is above a pre-determined value or score, then the interactive video can recommend additional training methods or sessions that can build on the strengths of the trainee 8 or identify other areas where the trainee 8 may be weak. Based on these recommendations, the trainee 8 can setup the delivery device 20 to perform other training methods. The controller 28, 29 can then recall interactive videos for the new training and display the new interactive videos during the new training session. The controller 28, 29 can use imaging sensors 32 to capture imagery of the trainee’s performance and, from analysis of the imagery, determine which instructional interactive video to output to the display 340.
For example, if a batter drops their shoulder at the incorrect time during their swing to impact the object 30, then the controller 28, 29 can play a video on the display 340 to illustrate to the trainee 8 the desired body movements during swinging at the object 30 prior to the delivery device 20 projecting the next object 30 toward the trainee 8 or target zone 50. For another example, if the controller 28, 29 detected incorrect positioning of a goalie in front of a hockey net or incorrect defensive movements, then the controller 28, 29 can play a video on the display 340 to illustrate the correct positioning or correct defensive movements prior to the delivery device 20 projecting the next object 30 toward the trainee 8 or target zone 50.
In a non-limiting embodiment, the interactive videos may include a demonstration as to why a specific drill is important for vision training, instructions on how to set up the delivery device 20, safety instructions, as well as instructions on how to interact with the delivery device 20 via voice, motion, or other commands (such as via an input device 342). A training curriculum may be several training sessions in length for teaching the trainee 8 full concepts of a training activity or sport, with one or more interactive videos initiated throughout the training sessions to instruct or remind the trainee 8. For example, the training curriculum may be used for baseball/softball trainees learning the "Approach" to hitting, plate or strike zone discipline, 15 seconds of excellence on defense, setting up segmenting screens, etc. For example, the training curriculum may be used for Quick Reaction Drills, where the interactive video can be used to explain how to perform and set up quick reaction drills, how to score the drills, and how to divide into teams. For example, the training curriculum may be used for tennis, where the interactive video can direct a trainee 8 through various ground stroke drills, as well as telling the trainee 8 when to switch to a backhand stroke.
The interactive videos can be stored in a software format in non-transitory memory in the controller 28, 29 and recalled from a database to be displayed on the display 340 for viewing by the trainee 8. The controller 28, 29 can control when the interactive videos are displayed on the display 340 during the training session. For example, the controller 28, 29 can periodically pause operations of the delivery device 20 to allow an interactive video to be played on the display 340. After the video is played, the controller 28, 29 can then resume operations of the delivery device 20 to perform the desired training method.
The interactive videos can be recorded by a company that manufactures the delivery device 20 or provides training support, and they can be delivered along with the delivery device 20 or as a separate delivery to the customer. Additionally, the interactive videos can be recorded by a coach, a trainee, a parent or guardian of the trainee, or other individual 4 to insert their own video commands as part of the training sessions. For example, a parent can record an instructional video to be played during the training sessions to encourage or instruct the trainee.
For example, a coach can develop their own set of instructional videos to be paired with the various training curriculums and in this way, they can teach multiple players the same instructions without being physically present.
The interactive videos can also be used by the controller 28, 29 to interact with the trainee 8 during training sessions by requiring inputs from the trainee 8 to progress through the training session. The trainee 8 can command the delivery device 20 to set up for a particular training session by instructing the controller 28, 29, via voice commands, to set up the delivery device 20 for the training session. The delivery device 20 can indicate reception of the voice command by an indicator light or a movement of the delivery device 20. The trainee 8 can then get in position to receive and react to an object 30 projected from the delivery device 20. The trainee 8 can then command the delivery device 20 to begin projecting a sequence of objects from the delivery device 20 by providing a voice command or body movement command to the delivery device 20 (e.g., the controller 28, 29). The trainee 8 can interact with the delivery device 20 throughout the training session via voice commands or body movement commands (e.g., hand movement, head nod, foot movement, sport tool movement, etc.).
The trainee 8 can also use voice commands to tailor a training session to project objects 30 to the trainee 8 based on personalized parameters. For example, prior to the start of the training session, the trainee 8 can ask the delivery device 20, through a voice command, for a “Personal Option.” The delivery device 20 can receive the command and respond to the command by speaking or displaying “What is your personal option?” The trainee 8 may then respond with the personal options for setting up the delivery device 20. For example, the trainee 8 may request that the delivery device 20 project one or more objects 30 as a “righthand slider pitch” at a speed of “80 miles per hour (MPH)”, and to a target zone 50 area “7 (low and away).” The delivery device 20 can respond audibly or visually by asking “Do you have Mask and Safety Glasses on?” The trainee 8 can respond with “Yes” or “No.” If “No”, then the delivery device 20 can pause operation until the trainee 8 responds with “Yes.” The delivery device 20, after recognizing the “yes” response from the trainee 8, can request the trainee 8 to say “Start” when ready to begin. The “Start” can also be communicated to the delivery device 20 via body movements instead of voice commands. The delivery device 20 can then deliver a set of objects 30 to the trainee 8 based on the personalized parameters provided by the trainee 8 (or coach 4). At the end of the set of objects 30, the delivery device 20 can pause operations, display the trainee’s scores for the set of objects 30, and recommend additional training sessions and parameters. The trainee 8 can command the delivery device 20 to continue with the current parameters and project another set of objects 30 one after another toward the trainee 8.
Alternatively, or in addition to, the trainee 8 can adjust parameters of the delivery device 20 by again asking the delivery device 20 for a “Personal Option” and repeating the process for commanding the delivery device 20 to setup for delivery of additional objects 30 per the new parameters. Then the trainee 8 can interact with the delivery device 20 via voice commands or body movement commands to progress through the training session.
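The “Personal Option” exchange can be viewed as a small, safety-gated dialogue loop. The following Python sketch is a hypothetical illustration only: the listen, speak, and deliver_set functions stand in for real speech-recognition, audio/display output, and delivery device 20 interfaces.

```python
def personal_option_session(listen, speak, deliver_set):
    """Minimal dialogue loop for a 'Personal Option' session.

    `listen()` returns the trainee's next spoken phrase, `speak(text)` outputs
    audio or text on the display, and `deliver_set(options)` projects a set of
    objects with the requested parameters.  All three are placeholders."""
    speak("What is your personal option?")
    options = listen()                      # e.g., "right-hand slider, 80 MPH, zone 7"

    speak("Do you have mask and safety glasses on?")
    while listen().strip().lower() != "yes":
        speak("Operation paused. Say 'yes' when your mask and glasses are on.")

    speak("Say 'start' when ready to begin.")
    while listen().strip().lower() != "start":
        pass                                # wait for the start command (or a body-movement equivalent)

    deliver_set(options)
    speak("Set complete. Scores are on the display.")

# Example wiring with canned responses instead of a real microphone.
replies = iter(["right-hand slider, 80 MPH, zone 7", "yes", "start"])
personal_option_session(lambda: next(replies), print, lambda o: print("delivering:", o))
```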
FIG. 3 is a representative functional block diagram of an object 30 delivery device 20 that can support the systems and methods of the current disclosure, as well as other systems and methods. The delivery device 20 can include a chassis 22 adjustably mounted to a base 18 that can move the delivery device 20 along a surface 6 in directions 166, 168. The delivery device 20 can include one or more local controllers 28 (referred to as controller 28) that can be communicatively coupled to components within the delivery device 20 via a network 35 as well as communicatively coupled to one or more remote controllers 29, one or more imaging sensors 32, and one or more external databases 36 via one or more networks 33a, 33b, 34. In some network configurations, the network 35 can include one or more internal networks 35 for communicating to components of the delivery device 20. The controller 28 can be communicatively coupled to a non-transitory memory 37 that can store a delivery device parameter database 38. Sets of delivery device parameters can be stored in the database 38, where each set can be used to configure, via the controller 28, 29, the delivery device 20 to deliver an object 30 along a respective predetermined trajectory. These internal networks 35 can include networks with standard or custom network protocols to transfer data and commands to/from the delivery device 20 components. The one or more remote controllers 29 (referred to as controller 29) can be communicatively coupled to the local controller 28 via a network 33a that communicatively couples the external network 34 to the internal network 35 (with network 33b not connected). In this configuration, the remote controller 29 can command and control the delivery device 20 components directly without direct intervention of the local controller 28. However, in a preferred embodiment, the controller 29 can be communicatively coupled to the controller 28 via the network 33b, which is not directly coupled to the network 34 (with network 33a not connected). In this configuration, the controller 29 can communicate configuration changes (or other commands and data) for the delivery device 20 to the controller 28, which can then carry out these changes to the components of the delivery device 20. It should be understood that, in another configuration, the networks 33a, 33b, 34, 35 can all be connected with the controllers 28, 29 managing the communications over the networks.
In a non-limiting embodiment, the delivery device 20 can include a guide 24 that can modify the trajectory and spin of the object 30 as the object 30 is projected toward the target zone 50 or trainee 8. The guide 24 can include a barrel 360 with a center axis 90 through which the object 30 can be projected toward a friction device 200. The friction device 200 can have a center axis 92 and can be rotated about the center axis 92 to alter the engagement of the object 30 when it impacts the friction device 200 at position 30’’’. An object 30 can be received from the object storage area 120 and located at position 30’ in a first end of the barrel 360. A pressurized air source 152 can be fluidically coupled to the first end of the barrel 360 via conduit 158, with delivery of a volume of pressurized air controlled by a valve 154. The valve 154 and the air source 152 can be controlled by the controller 28, 29 to adjust the air pressure applied to the object 30 at position 30’ as well as the volume of air applied. It should be understood that pressurized air is only one possible option for delivering a desired force to the object 30 to project the object 30 through the barrel 360. Pneumatics other than pressurized air can be used as well as hydraulics, electrical, electro-mechanical, or mechanical power sources that can supply the desired force to the object 30 to project the object 30 through the barrel 360.
In a non-limiting embodiment, an air pressure can be at least 3 PSI (i.e., pounds per square inch), at least 4 PSI, at least 5 PSI, at least 6 PSI, at least 7 PSI, at least 8 PSI, at least 9 PSI, at least 10 PSI, at least 20 PSI, at least 30 PSI, at least 40 PSI, at least 50 PSI, at least 60 PSI, at least 70 PSI, at least 80 PSI, at least 90 PSI, or at least 100 PSI.
In another non-limiting embodiment, the air pressure can be less than 220 PSI, less than 210 PSI, less than 200 PSI, less than 190 PSI, less than 180 PSI, less than 170 PSI, less than 160 PSI, less than 150 PSI, less than 140 PSI, less than 130 PSI, less than 120 PSI, less than 110 PSI, less than 100 PSI, or less than 90 PSI.
It will be appreciated that the air pressure may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 5 PSI and less than 220 PSI, or within a range of at least 5 PSI and less than 200 PSI, or within a range of at least 10 PSI and less than 200 PSI, or within a range of at least 5 PSI and less than 180 PSI.
In a non-limiting embodiment, a length of the barrel 360 can be at least 2 inches, at least 3 inches, at least 4 inches, at least 4.5 inches, at least 5 inches, at least 5.5 inches, at least 6 inches, at least 7 inches, at least 8 inches, at least 9 inches, at least 10 inches, at least 11 inches, or at least 12 inches.
In another non-limiting embodiment, the length of the barrel 360 can be less than 48 inches, less than 36 inches, less than 24 inches, less than 23 inches, less than 22 inches, less than 21 inches, less than 20 inches, less than 19 inches, less than 18 inches, less than 17 inches, less than 16 inches, less than 15 inches, less than 14 inches, less than 13 inches, less than 12 inches, less than 11 inches, less than 10 inches, less than 9 inches, less than 8 inches, less than 7 inches, less than 6 inches, less than 5.5 inches.
It will be appreciated that the length of the barrel 360 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 2 inches and less than 48 inches, or within a range of at least 4.5 inches and less than 24 inches, or within a range of at least 4.5 inches and less than 5.5 inches, or within a range of at least 3 inches and less than 12 inches.
When the valve 154 is actuated, a controlled volume of pressurized air (or other pressurized gas) can be delivered to the first end of the barrel 360 for a predetermined length of time to force the object 30 to be propelled through the barrel 360 at a predetermined velocity, such that at position 30” the object 30 achieves a desired velocity vector 174. The velocity vector 174 can range from 25 miles per hour to 135 miles per hour. If the friction device 200 is not in a position to interfere with the trajectory 46 of the object 30 as it is propelled from a second end of the barrel 360, then the object 30 may continue along trajectory 46 and exit the delivery device 20 without having additional spin or deflection imparted to the object 30 by the friction device 200. This may be used for delivering “fast balls” along the predetermined trajectory 42 since the object does not engage the friction device 200 before it exits the delivery device 20. However, if the friction device 200 is positioned to interfere with the object 30 as it is propelled from the second end of the barrel 360, then the object 30 can engage (or impact) the friction device 200 at position 30”’, thereby deflecting the object 30 from the axis 90 of the barrel 360 at an angle and imparting a spin 94 to the object. Impacting the friction device 200 can cause the object 30 to begin traveling along a predetermined trajectory 48 with an altered velocity vector 176 at position 30””. The amount of spin 94 and the amount of deflection from trajectory 46 to trajectory 48 can be determined by the velocity vector 174 of the object 30 at position 30”, the spin of the object 30 at position 30”, the azimuthal position of the friction device 200 about its center axis 92, the azimuthal position of the friction device 200 about the center axis 90 of the barrel 360, the incline (arrows 89) of the friction device 200 relative to the center axis 90, the length (arrows 88) of the friction device 200, and the surface material on the friction device 200. The object 30 can then continue along the predetermined trajectory 48 to the target zone 50 or toward the trainee 8.
If another trajectory is desired, then the controller 28, 29 can modify the parameters of the delivery device 20 (such as changing the velocity vector 174 and spin of the object 30 at position 30”, changing the azimuthal position of the friction device 200 about its center axis 92, changing the azimuthal position of the friction device 200 about the center axis 90 of the barrel 360, changing the incline (arrows 89) of the friction device 200 relative to the center axis 90, changing the length (arrows 88) of the friction device 200, or changing the surface material on the friction device 200) to deliver a subsequent object 30 along a new predetermined trajectory 48.
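Conceptually, each predetermined trajectory corresponds to a set of delivery device parameters that the controller 28, 29 can swap between objects. The sketch below is a hypothetical illustration; the parameter names, units, and values are assumptions and do not define the actual delivery device 20 interface.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DeliveryParameters:
    # Parameters the controller can vary between objects (names and units assumed).
    air_pressure_psi: float          # drives the launch velocity through the barrel
    barrel_elevation_deg: float      # barrel angle about its pivot axis
    friction_device_engaged: bool    # False -> "fast ball" straight out of the barrel
    friction_azimuth_deg: float      # position of the friction device about the barrel axis
    friction_incline_deg: float      # incline of the friction device relative to the barrel axis
    surface_material: str            # which surface material the object impacts

fastball = DeliveryParameters(60.0, 2.0, False, 0.0, 0.0, "none")

# A subsequent object along a new predetermined trajectory: engage the
# friction device and change its orientation to impart deflection and spin.
breaking = replace(fastball,
                   friction_device_engaged=True,
                   friction_azimuth_deg=135.0,
                   friction_incline_deg=8.0,
                   surface_material="wool")
print(breaking)
```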
In a non-limiting embodiment, in addition to these parameters mentioned above, there are also parameters of the barrel position and delivery device 20 chassis 22 position that can be used to alter a trajectory of an object 30 to travel along a predetermined trajectory (e.g., 40, 42, 44) to a target zone (or trainee 8). Some of these parameters affect the orientation of the barrel 360 within the delivery device 20, while others can affect the orientation and position of the chassis 22 of the delivery device 20 relative to a surface 6, while others affect selecting an object 30 to be propelled from the barrel 360. In a non-limiting embodiment, all these parameters can have an impact on the trajectory of the object 30 as it is projected from the delivery device 20 toward the target zone 50 or trainee 8.
The barrel 360 can be rotated (arrows 86) about its center axis 90. This can be beneficial if the barrel 360 includes a non-smooth inner surface, such as an internal bore of the barrel 360 with rifling grooves (i.e., a surface with helically oriented ridges or grooves along the internal bore of the barrel 360) that can impart a spin (clockwise or counterclockwise) to the object 30 as the object 30 travels through the internal bore of the barrel 360. Other surface features can also be used on the internal bore of the barrel 360 to affect the spin of the object 30 as it travels through the barrel 360.
The barrel 360 can be rotated (arrows 84) about the axis 91 to adjust the direction of the object 30 as it exits the barrel 360. The barrel 360 can also be moved (arrows 85) to adjust a distance between the exit end of the barrel 360 and the friction device 200.
The friction device 200 can be coupled to a structure (e.g., structure 210 via support 202) that can be used to rotate the friction device 200 about the center axis 90 of the barrel 360. This can be used to change the deflection angle imparted to the object 30 when it impacts the friction device 200 at position 30’”.
The chassis 22 can be rotationally mounted to a base 18 at pivot point 148. Actuators 144 can be used to rotate the chassis 22 about the X-axis (arrows 81) or the Y-axis (arrows 82) relative to the surface 6 by extending/retracting. There can be four actuators 144 positioned circumferentially about the center axis 93. The base 18 can rotate the chassis 22 about the Z-axis (arrows 80) relative to the surface 6. The support 142 can be used to raise or lower (arrows 83) the chassis 22 relative to the surface 6. Supports 146 can be used to stabilize the support 142 to the support structure 160. The support structure 160 can have multiple wheels 164 with multiple axles 162 to facilitate moving the support structure 160 along the surface 6 in the X and Y directions (arrows 166, 168). The support structure 160 can house an optional controller 169 for controlling the articulations of the base 18 to orient the chassis 22 in the desired orientation. This controller 169 can be positioned at any location in or on the base 18 as well as in or on the chassis 22. It is not required that the controller 169 be disposed in the support structure 160.
In a non-limiting embodiment, the delivery device 20 can include one or more storage bins 150 for storing objects 30 and delivering an object 30 to the barrel 360 at position 30’. In the example shown in FIG. 3, there are two storage bins 150a, 150b, but it should be understood that more or fewer storage bins 150 can be used in keeping with the principles of this disclosure. Storage bin 150a can contain objects 30a with storage bin 150b containing objects 30b. The controller 28, 29 (or coach 4, or trainee 8, or other individual) can select which storage bin 150a, 150b is to provide the object 30 to the barrel 360 at position 30’. If the object 30a is selected, then the storage bin 150a can release one object 30a that can be directed to the position 30’ via a conduit 156. If the object 30b is selected, then the storage bin 150b can release one object 30b that can be directed to the position 30’ via a conduit 156. Only one object 30a or 30b is released at a time in this configuration. However, the conduit 156 can be a collection conduit that receives each object 30a or 30b and holds them in a chronological order in the conduit 156 as to when they were received at the conduit 156 from the storage bins 150a, 150b. A mechanism 155 can be used to release the next object (30a or 30b) into the barrel 360 at position 30’, thereby delivering the objects 30a, 30b to the barrel 360 in the order they were received at the conduit 156. Even if only one object 30a, 30b is released to the conduit 156, the mechanism 155 can still be used to prevent the escape of pressurized gas into the conduit 156. However, the mechanism 155 is not required. Other means can be provided to prevent loss of pressurized gas through any other path other than through the barrel 360.
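The chronological handling of released objects in the conduit 156 behaves like a first-in, first-out queue. A minimal, purely illustrative sketch (with hypothetical method names) follows.

```python
from collections import deque

class ObjectConduit:
    """Holds objects released from the storage bins in the order received
    (chronological first-in, first-out), releasing one at a time into the barrel."""

    def __init__(self):
        self._queue = deque()

    def receive(self, obj_id: str) -> None:
        # Called when a storage bin releases an object into the conduit.
        self._queue.append(obj_id)

    def release_next(self):
        # Called by the release mechanism to load the next object into the barrel.
        return self._queue.popleft() if self._queue else None

conduit = ObjectConduit()
conduit.receive("object-30a")
conduit.receive("object-30b")
print(conduit.release_next())   # -> object-30a (first received, first delivered)
```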
FIG. 4 is a representative perspective view of a friction device 200 for the delivery device 20. As similarly described above, the friction device 200 can be rotated about its axis 92 (arrows 87) as well as being rotated about the axis 90 of the barrel 360 (arrows 96). A support (e.g., support 202) can be used to support and rotate the friction device 200 about the axis 92. The barrel 360 can be rotated about the axis 90 (arrows 86) and moved toward or away from the friction device 200 (arrows 85). The object 30 can exit the barrel 360 with a velocity vector 174 at position 30”. If the object 30 impacts the friction device 200 at position 30”’, then a spin 94 can be imparted to the object 30 as well as deflecting the object 30 substantially by an angle A1 relative to the center axis 90 of the barrel 360. The object 30 can travel along the resulting trajectory 48 from the object 30 impacting the friction device 200. The object 30 can have a resulting velocity vector 176 at position 30””.
In a non-limiting embodiment, the velocity vector 174, 176, 178 can be a velocity directed in any 3D direction, with the velocity of the object 30 being at least 4 MPH (i.e., miles per hour), at least 5 MPH, at least 6 MPH, at least 7 MPH, at least 8 MPH, at least 9 MPH, at least 10 MPH, at least 15 MPH, at least 20 MPH, at least 25 MPH, at least 30 MPH, at least 35 MPH, at least 40 MPH, at least 45 MPH, at least 50 MPH, at least 55 MPH, at least 60 MPH, at least 65 MPH, at least 70 MPH, at least 75 MPH, at least 80 MPH, at least 90 MPH, or at least 100 MPH.
In another non-limiting embodiment, the velocity vector 174, 176, 178 can be a velocity directed in any 3D direction, with the velocity of the object 30 being less than 220 MPH, less than 210 MPH, less than 200 MPH, less than 190 MPH, less than 180 MPH, less than 170 MPH, less than 160 MPH, less than 150 MPH, less than 145 MPH, less than 140 MPH, less than 135 MPH, less than 130 MPH, less than 125 MPH, less than 120 MPH, less than 115 MPH, less than 110 MPH, less than 105 MPH, less than 100 MPH, less than 95 MPH, less than 90 MPH, less than 85 MPH, less than 80 MPH, less than 75 MPH, less than 70 MPH, less than 65 MPH, less than 60 MPH, less than 55 MPH, less than 50 MPH, less than 45 MPH, or less than 40 MPH.
It will be appreciated that the velocity of the object 30 at the velocity vector 174, 176, 178 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 5 MPH and less than 75 MPH, or within a range of at least 15 MPH and less than 100 MPH, or within a range of at least 15 MPH and less than 220 MPH.
In a non-limiting embodiment, the friction device 200 can include a ramp 206 with one or more surface materials attached to it. The surface material controls a friction applied to the object 30 when the object 30 impacts the friction device 200. Therefore, it can be beneficial to allow the delivery device 20 to automatically select between various surface materials (e.g., 204, 205, 208). One side of the ramp 206 can have multiple surface materials 204, 205 attached thereto. Moving the friction device 200 axially (arrows 88) can cause the object to impact either the surface material 204 or 205. If the surface materials 204, 205 have different textures or friction coefficients, then impacting one or the other can alter the spin 94 or trajectory 48 of the object 30 when it impacts the friction device 200. The ramp 206 can also have one or more surface materials (e.g., 208) attached to an opposite side of the ramp 206. The ramp 206 can be configured to rotate about the axis 92 such that the surface material 208 is positioned to impact the object 30 at position 30”’. The surface materials 204, 205, 208 can be various wool fibrous materials, plastics, cottons, foam rubbers, metals such as steel, lead, copper, aluminum, or metal alloys, plant-based material, or fungus-based material.
In a non-limiting embodiment, the surface material 204, 205, 208 can have a friction coefficient that is at least 0.010, at least 0.015, at least 0.020, at least 0.025, at least 0.030, at least 0.035, at least 0.040, at least 0.045, at least 0.050, at least 0.055, at least 0.060, at least 0.065, at least 0.070, at least 0.075, at least 0.080, at least 0.085, at least 0.090, at least 0.095, at least 0.10, at least 0.15, at least 0.20, at least 0.30, at least 0.40, at least 0.50, at least 0.60, at least 0.70, at least 0.80, at least 0.90, or at least 1.00.
In another non-limiting embodiment, the surface material 204, 205, 208 can have a friction coefficient that is less than 1.50, less than 1.45, less than 1.40, less than 1.35, less than 1.30, less than 1.25, less than 1.20, less than 1.15, less than 1.10, less than 1.05, less than 1.00, less than 0.95, less than 0.90.
It will be appreciated that the friction coefficient may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.20 and less than 1.35, or within a range of at least 0.01 and less than 1.50, or within a range of at least 0.25 and less than 1.35.
FIG. 5 is a representative partial cross-sectional view of the friction device 200 along line 5-5 as shown in FIG. 4. The structure 210 can rotate (arrows 95) about the center axis 90 of the barrel 360. With the friction device 200 coupled to the structure 210 via the rotatable support 202, the friction device 200 can be rotated (arrows 96) about the center axis 90. The friction device 200 can be inclined relative to the center axis 90 by being raised up or down (arrows 89) relative to the center axis 90. Therefore, the friction device 200 can be positioned at any azimuthal position about the center axis 90 as well as being rotated about its own axis 92 (arrows 87). When the object 30, traveling along the trajectory 46, impacts the friction device 200, it can be deflected from the friction device 200 along a trajectory 48 with a spin 94 at position 30””. The desired spin 94 to the object 30 can also be represented as the desired yaw, pitch, and roll of the object 30.
FIG. 6 is a representative functional block diagram of a control system 350 for a training system 10. The local controller 28 can be communicatively coupled to a remotely located controller 29 via network 33b, the game statistics database 36 via network 33a, an input device 342, and a display 340. The input device 342 can provide a human-machine interface (HMI) that accepts user inputs (trainee 8, coach 4, or others), including voice commands, and transmits the user inputs to one or more processors 330 of the controller 28. In a non-limiting embodiment, the input device 342 can be a keyboard, mouse, trackball, virtual reality sensors, a graphical user interface (GUI), a touch screen, mechanical interface panel switches, a button, a stride sensor, a microphone to input voice commands recognized by the controller 28, or video sensors to detect gestures of a trainee 8, coach 4, or other individual.
In a non-limiting embodiment, the display 340 can be used to display performance scores to a user (i.e., trainee 8, coach 4, other individual, etc.), GUI interface windows, training trajectory (single or multiple), emulated game trajectory, and player 14 associated with the game trajectory, video of game trajectory, video of training trajectory while or after object is projected to target zone, training statistics and trends, selection criteria for objects 30, selection criteria for training trajectories 40, delivery device 20 parameters and parameters selected by the input device. For example, in baseball or softball training, the display 340 can be used to display a type of pitch, the speed of delivery of object 30 at the target zone 50, a location of delivery of the object 30 at the target zone 50, text messages about the delivered object 30, animations, videos, photos, or alerts about the delivered object 30. The display is intended to provide the trainee 8 or coach 4 immediate feedback about the delivered object 30. The input device 342 and display 340 are shown separate, but they can be integrated together in a device, such as a smartphone, smart tablet, laptop, touchscreen, etc.
The network interface 332 can manage network protocols for communicating with external systems (e.g., controller 29, database 36, imagery sensors 32, tracking device 190, etc.) to facilitate communication between the processor(s) 330 and the external systems. These external systems are shown connected to the network 34, but they can also be disconnected and reconnected as needed. For example, the tracking device 190 may not be connected to the network until it is positioned on a docking station for downloading its acquired data. Additionally, the delivery device 20 may not always be connected to an external network. When it is reconnected to an appropriate external network, the communication between the external systems can again be enabled.
In a non-limiting embodiment, the processor(s) 330 can be communicatively coupled to a non-transitory memory storage 37 which can be used to store program instructions 334 and information in databases 38, 336, 338, 344, 346. The processor(s) 330 can store and read instructions 334 from the memory 37 and execute these instructions to perform any of the methods and operations described in this disclosure for the delivery device 20. The delivery device parameters (see parameters described above) for each training trajectory 40 can be stored in the delivery device parameter database 38 in the memory 37. This database 38 can be organized such that each training trajectory 40 that has been defined by a set of delivery device parameters can have a trajectory entry in the database 38. When this trajectory entry is accessed, the set of delivery device parameters can be transferred to the processor(s) 330, which can use the parameters to adjust the delivery device 20 components to deliver the predetermined trajectory defined by the trajectory entry.
If a user wishes to define a canned sequence of trajectories, then the processor(s) 330 (based on inputs from the input device) can assemble the sequence of trajectories including their associated delivery device parameters and store the sequence in the sequential trajectory database 336 as a retrievable set of predetermined trajectories. When accessed by the processor(s) 330, the sequential trajectory database 336 can deliver the set of predetermined trajectories to the processor(s) 330 including the delivery device parameters. The processor(s) 330 can then sequentially setup the delivery device 20 to sequentially project objects one after another to produce the desired set of predetermined trajectories in the desired order. The memory 37 may also contain a game trajectory database 338 which stores the game parameters of the game trajectories that have been received from other sources (such as the tracking device 190, the game statistics database 36, or user inputs) and can save them for later emulation by the delivery device 20.
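The interaction between the parameter database 38, the sequential trajectory database 336, and the delivery device 20 can be illustrated with the following simplified sketch; the dictionary structures, entry names, and the configure/project placeholders are assumptions for illustration only and do not reflect the actual database schema.

```python
# Hypothetical in-memory stand-ins for the delivery device parameter
# database 38 and the sequential trajectory database 336.
PARAMETER_DB = {
    "fastball_high_inside": {"air_pressure_psi": 70, "friction_device_engaged": False},
    "slider_low_away":      {"air_pressure_psi": 55, "friction_device_engaged": True},
}

SEQUENCE_DB = {
    "warmup_sequence": ["fastball_high_inside", "slider_low_away", "fastball_high_inside"],
}

def run_sequence(sequence_name: str, configure, project) -> None:
    """Look up a canned sequence and deliver each predetermined trajectory in order.

    `configure(params)` applies a parameter set to the delivery device and
    `project()` launches one object; both are placeholders for the real device interface."""
    for trajectory_name in SEQUENCE_DB[sequence_name]:
        configure(PARAMETER_DB[trajectory_name])
        project()

run_sequence("warmup_sequence",
             configure=lambda params: print("configuring:", params),
             project=lambda: print("object projected"))
```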
FIG. 7 is a representative functional diagram of a system and method for tracking movement of a trainee’s eye (or eyes) and comparing it to a trajectory 40 of an object 30. Imaging sensors 32 can be used to collect imagery of the object 30 as it travels along the trajectory 40 as well as tracking eye characteristics of the trainee’s eye. The eye characteristics can include a direction of the center line of sight (or fovea vision or central vision) of an eye of the trainee 8, movement of the eye in an eye socket, movement of an iris of the eye, size of a pupil of the eye, and combinations thereof.
For example, as the object 30 is projected along the trajectory 40, the trainee 8 can attempt to track the object 30 with their eyes. As the object 30 continues along the trajectory 40 the trainee 8 can continue to move their eyes to track the object 30. The imaging sensors 32 can be used to capture imagery that contains the trajectory 40 of the object and the movements of the eye (or eyes) and a time stamp of the movements. The imagery can be transmitted to the controller 28, 29 which can be configured to analyze the trajectory 40 to determine the parameters of the trajectory 40, such as the 3D position of the object 30 in space along the trajectory 40 and the velocity vectors (e.g., 176) of the object 30 as it traveled along the trajectory 40.
The controller 28, 29 can also be configured to analyze the recorded eye movements of the trainee’s eye to determine the direction from the eye of the center line of sight 250 of the eye. At any position along the trajectory 40 (such as position 30’), the controller 28, 29 can correlate the object position along the trajectory 40 with the eye movements based on syncing the time of the position of the object 30 along the trajectory 40 (e.g., position 30’) with the time of the eye movements of the trainee 8. With the center line of sight 250 correlated to the object position (e.g., 30’), then the controller 28, 29 can calculate a deviation L9 between the object 30 and the center line of sight 250. Calculating a deviation L9 for multiple positions along the trajectory 40 can be used to score the ability of the trainee 8 to track the object 30 along the trajectory 40. The larger the deviation L9, the lower the score. The deviations L9 can be plotted vs. time to display to the user (trainee 8, coach 4, another individual, etc.) for understanding areas of strength or weakness of the trainee 8 in tracking the object 30 along the trajectory 40.
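A simplified, purely illustrative Python sketch of this correlation and scoring is shown below. It computes the deviation L9 as the perpendicular distance from the object position to the eye's line of sight for each time-synced sample, and converts the deviations into a normalized tracking score; the coordinate frame, tolerance, and sample data are assumptions.

```python
import math

def deviation(object_xyz, eye_xyz, sight_direction):
    """Perpendicular distance from the object position to the line of sight
    (the deviation L9), with all positions in a shared coordinate frame."""
    ox, oy, oz = (object_xyz[i] - eye_xyz[i] for i in range(3))
    dx, dy, dz = sight_direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    # Magnitude of the cross product of (eye -> object) with the unit sight direction.
    cx = oy * dz - oz * dy
    cy = oz * dx - ox * dz
    cz = ox * dy - oy * dx
    return math.sqrt(cx * cx + cy * cy + cz * cz)

def tracking_score(samples, full_credit_in: float = 2.0) -> float:
    """Average per-sample score in [0, 1]; larger deviations lower the score."""
    scores = [max(0.0, 1.0 - deviation(obj, eye, sight) / full_credit_in)
              for (obj, eye, sight) in samples]
    return sum(scores) / len(scores)

# Synthetic, time-synced samples: (object position, eye position, sight direction).
samples = [((0.0, 60.0, 36.0), (0.0, 0.0, 60.0), (0.0, 1.0, -0.4)),
           ((0.0, 30.0, 33.0), (0.0, 0.0, 60.0), (0.0, 1.0, -0.8))]
print(round(tracking_score(samples), 2))
```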
The method for tracking movement of a trainee’s eye (or eyes) and correlating the eye movement to the positions of the object 30 along the trajectory (e.g., 40) can be used with any of the training systems 10 described in this disclosure. For example, during segmenting training 118, the correlation between the trajectory 40 and the eye movement can be used to score the trainee’s ability to track the object 30 through the end portion of the trajectory 40 that can be reduced (e.g., barrier 220 and possibly the light source 230 moved to respective positions 220’, 230’) as the score is improved, or increased (e.g., barrier 220 and possibly the light source 230 moved to original positions) as the score is unchanged or worse.
It should also be understood that a coach 4 or another individual can score the ability of the trainee 8 to track the object 30 along the trajectory 40 by visually observing the trainee 8 as they attempt to track the object 30. This can be seen as being somewhat less precise than the method of correlating the eye movements to the object positions along the trajectory 40 using the controller 28, 29. However, this manual correlation can also be used to improve the trainee’s ability to track the object 30 along the trajectory 40.
The training system 10 in FIG. 7, as well as various other training systems 10 described in this disclosure, can be used to perform strike zone training 119. Strike zone training 119 can be used to improve an ability of the trainee 8 to recognize objects 30 delivered to the target zone 50 (which can be referred to as a strike zone for baseball and softball sports). By using the smaller object 30 training as described herein, the trainee 8 can hone their skills for recognizing strikes and those that are not strikes.
The strike zone training 119 can occur when the delivery device 20 sequentially projects objects 30 along a predetermined trajectory (e.g., trajectory 40) and the trainee 8 indicates when they believe the object 30 arrives within the target zone 50 by providing a user input to the controller 28, 29 via an HMI device 170. The target zone 50 can include sensors 51 as previously described. These sensors 51 can detect a location in the target zone at which the object 30 arrives. The trainee 8 can actuate or interact with the HMI device 170 to indicate if they think the object 30 arrived in the target zone 50, and the HMI device can transmit the indication to the controller 28, 29, which can compare the indication with actual arrival location of the object 30. The controller 28, 29 can also determine if the object did not arrive within the target zone 50, either due to a lack of indication from the sensors 51 that the object 30 arrived at the target zone 50 or possibly sensors (not shown) that are positioned outside the target zone 50.
A high score can be when the indication is received from the HMI device 170 and the object 30 arrives within the target zone 50, or when the object 30 does not arrive in the target zone 50 and no indication is received from the HMI device 170. A low score can be when the indication is not received from the HMI device 170 and the object 30 arrives within the target zone 50, or when the object 30 does not arrive in the target zone 50 and an indication is received from the HMI device 170.
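This scoring rule reduces to checking whether the trainee's indication matches the sensed arrival location, as in the following illustrative sketch (the function and argument names are hypothetical).

```python
def score_indication(arrived_in_zone: bool, trainee_indicated: bool) -> int:
    """Score one delivered object for strike zone training.

    High score (1): the indication matches the sensed outcome, i.e., the trainee
    indicated and the object arrived in the target zone, or the trainee did not
    indicate and the object missed the target zone.
    Low score (0): the indication contradicts the sensed outcome."""
    return 1 if trainee_indicated == arrived_in_zone else 0

print(score_indication(arrived_in_zone=True, trainee_indicated=True))    # -> 1 (high)
print(score_indication(arrived_in_zone=False, trainee_indicated=True))   # -> 0 (low)
```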
The controller 28, 29 can average the individual scores over a period of time or over multiple objects 30 delivered toward the target zone 50. This average score (as well as the individual scores) can be used to provide feedback to the trainee 8 (or the coach 4, other individual, or the controller 28, 29) for improving the trainee’s performance of recognizing objects 30 arriving in the target zone 50 and those that do not arrive in the target zone 50. Training with the smaller object 30 can allow the trainee 8 to recognize regulation game objects even more easily during a real-life event and thereby more easily recognize balls and strikes in the real-life event. This training 119 can be well suited for baseball, softball, cricket, soccer, or any sport with a target area for receiving a game object. However, the strike zone training 119 can also be used for other, less well-suited sports or tactical situations to improve eye-body coordination of a trainee 8.
The strike zone training 119 can also be used to improve the trainee’s ability to recognize when the object 30 arrives at the target zone 50. The trainee 8 can send an indication via the HMI device 170 to the controller 28, 29 when they believe the object 30 arrives at the target zone 50. The controller 28, 29, via comparison to the sensor data received from the sensors 51, can determine a score based on the comparison of the time of arrival of the object 30 at the target zone 50 and when the indication is initiated at the HMI device 170 by the trainee 8.
The indication from the HMI device 170 can be initiated by: body movement of the trainee 8; eye movement of the trainee 8; hand movement of the trainee 8; leg movement of the trainee 8; arm movement of the trainee 8; head movement of the trainee 8; audible sound signal from the trainee 8; movement of a sports tool 12; actuation of a key on a keyboard; actuation of a switch; trainee 8 interaction with the HMI device 170; or combinations thereof. Additionally, a good performance of the trainee 8 regarding the object 30 can be when the actual arrival position is inside the target zone and the indication is received from the HMI device 170 within a pre-determined amount of time before the actual arrival time, or when the actual arrival position is outside the target zone and no indication is received from the HMI device within a pre-determined amount of time before the actual arrival time. A bad performance of the trainee 8 regarding the object 30 can be when the actual arrival position is outside the target zone and the indication is received from the HMI device 170, or when the actual arrival position is inside the target zone and no indication is received from the HMI device 170, or when the actual arrival position is inside the target zone and the indication is received from the HMI device 170 past a pre-determined amount of time prior to the actual arrival time.
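A hypothetical sketch of the timing-based classification described above follows; the pre-determined window length and the function interface are assumptions for illustration.

```python
from typing import Optional

def performance(arrived_in_zone: bool,
                indication_time_s: Optional[float],   # None if no indication was received
                arrival_time_s: float,
                window_s: float = 0.5) -> str:
    """Classify one delivered object: an indication only counts as timely if it
    is received within `window_s` seconds before the actual arrival time."""
    timely = (indication_time_s is not None
              and 0.0 <= arrival_time_s - indication_time_s <= window_s)
    if arrived_in_zone and timely:
        return "good"   # correctly recognized an object arriving in the target zone
    if not arrived_in_zone and indication_time_s is None:
        return "good"   # correctly withheld the indication for an object outside the zone
    return "bad"        # missed, late, or false indication

print(performance(True, 9.8, 10.0))    # indication 0.2 s before arrival -> good
print(performance(True, None, 10.0))   # no indication on an in-zone object -> bad
```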
This vision training for the trainee 8 using various methods of training with the delivery device 20 can be used to focus on improving specific visual abilities of the trainee 8, such as depth perception, visual reaction time and response timing, speed of visual processing, and eye-body-coordination.
For example, in a target zone training method (e.g., baseball strike zone, soccer goal area, etc.), the delivery device 20 projects the object 30 to specific areas of the target zone 50 to score the trainee’s ability to determine if the object is inside or outside the perimeter of the target zone 50, or to score the ability of a trainee 8 to prevent the object 30 from entering the target zone 50 for a goal at a variety of trajectories and speeds. The visual abilities of the trainee 8 are required to perform at higher and higher levels of efficiency to correctly interact with the object 30 at the target zone 50.
For example, a quick reaction training method can also be used to further enhance and improve the visual abilities of the trainee 8. In the quick reaction training, the delivery device 20 is configured to deliver the object 30 toward the trainee 8 in a variety of challenging speeds and trajectories. The object 30 is much smaller than a sport regulation object (e.g., less than the size of a table tennis ball). A goal of the trainee 8 is to prevent the object 30 from entering the target zone 50. The delivery device 20 can continue projecting an object 30 toward the trainee 8 or target zone 50, while the trainee 8 must visually acquire the object 30, track the object 30 along at least a portion of its trajectory to the target zone 50, and cause the body to react in a way (e.g., hand movement, foot movement, head movement, sport tool movement, etc.) as to prevent entry of the object 30 in the target zone 50.
The quick reaction training method can also incorporate segmenting training using one or more screens (described previously) to make the reaction and response of the trainee 8 more challenging. When the one or more segmenting screens are in place, the trainee 8 cannot see the delivery device 20. Therefore, the trainee has an increasingly shorter amount of time to visually acquire the object 30 since the trainee 8 cannot see the object 30 until it has passed around, under, over, or through the one or more segmenting screens. As the one or more segmenting screens are progressively placed in closer proximity to the trainee 8, the quick reaction training becomes increasingly more challenging and quicker visual and physical reactions are needed. The quick reaction training method can score the trainee’s visual reaction time, the trainee’s response timing, and the trainee’s eye-body coordination. The scoring can indicate if the trainee 8 is having difficulty in specific areas of the target zone 50.
The cognitive recognition training method can use the delivery device 20 to project objects 30 of different colors toward the trainee 8. The trainee 8 can then attempt to identify the color as soon as the object 30 is delivered from the delivery device 20 or as it passes around, under, over, or through the one or more segmenting screens. The trainee 8 can be scored on their ability to quickly identify the object 30 and its color. The cognitive recognition training method works to improve the trainee’s speed of visual processing.
Scoring for these visual ability training methods can be stored in a database for later retrieval and analysis. The analysis of the scores can indicate trends in performance of the trainee to the various training methods. The scores can also be used to produce or update an individual “heat map” for the trainee 8 to indicate areas of the target zone that the trainee 8 is strongest, weakest, and average. The heat map can be used to control the delivery device 20 to cause the delivery device 20 to project objects 30 to the weaker areas indicated by the heat map.
Referring again to FIGS. 2-6, the delivery device 20 can be moved to various training locations to perform the training methods. However, it can be desirable to perform a setup and calibration procedure on the delivery device 20 when it is first installed in a new training field 100. The following operations can be performed to move and recalibrate the delivery device 20 at a new training field 100.
The delivery device 20 can be removably attached to a tripod base via a quick release tripod plate. The delivery device 20 can be detached from the tripod base, moved to the new location and reattached to another tripod base, an elevator system, a gantry system, a platform, or a mounting surface via the quick release tripod plate.
The trainee 8 or other individual 8 can then power up the delivery device 20 by connecting the delivery device 20 to a power source and energizing the delivery device 20 components. After receiving power, the controller 28, 29 can begin a power-up procedure that can be referred to as an awareness protocol. The controller 28, 29 can check to see if the imaging sensors 32 are connected and in communication with the controller 28, 29 as well as checking communication to the other delivery device 20 components. The controller 28, 29 can then perform any required initialization protocols for the delivery device 20 components. Various sensors, including the imaging sensors 32, can be used to sense the environment of the training field 100 as well as the internal conditions, such as reading the accelerometer to check the orientation of the delivery device 20 and moving motors to a pre-determined location to resync the motor positions with the controller 28, 29. The controller 28, 29 can then establish connections with peripheral devices, such as a phone, the internet, Augmented Reality glasses, smart TVs, light sources, sensors at the target zone 50, and impact sensors 56. The controller 28, 29 can determine if there are objects 30 available in the storage bin 120, and if the propulsion device to propel the object has power and is ready to project an object 30 from the delivery device 20.
Instructional videos can be played on the display 340 to instruct individuals 4 (which can include the trainee 8) to construct a backdrop at the target zone 50, construct a netting tunnel from pipes and drapes, and position the target zone 50. The individuals 4 can set up these items based on the training method to be performed, or the delivery device 20 can sense the training to be performed and then instruct the individuals 4 to construct the training field 100 accordingly.
Before projecting one or more objects 30 from the delivery device 20 to the target zone 50, the delivery device 20 can be calibrated. A laser pointer on the delivery device 20 can be used to aim the delivery device 20 at the general area where the desired target zone 50 will be. The delivery device 20 can then begin projecting an object 30 to the target zone 50 and capturing the trajectory of the object 30 via the imaging sensors 32. Each captured trajectory can be compared to a desired pre-determined trajectory to determine a score of how well or poorly the delivery device 20 is projecting the object 30 along the desired pre-determined trajectory. The controller 28, 29 can score the performance of the delivery device 20 in projecting the object 30 along the pre-determined trajectory and, based on the scoring, can adjust one or more parameters of the delivery device 20 to improve the score. This process can be repeated as often as needed to verify that the delivery device 20 is operating properly. After the delivery device 20 has been calibrated and has an acceptable score for delivering the object 30 along the pre-determined trajectory, the trainee 8 can commence their training sessions.
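The calibration procedure described above amounts to a feedback loop: project, measure, score, adjust, repeat. A minimal sketch of such a loop follows; the device methods, the trajectory representation (2D points), and the scoring formula are assumptions for illustration only, not the controller's actual interface.

```python
def trajectory_score(observed, desired):
    """Score in 0..1; 1.0 means the observed points match the desired points.
    Assumes both trajectories are lists of (x, y) points on a normalized scale."""
    errors = [((ox - dx) ** 2 + (oy - dy) ** 2) ** 0.5
              for (ox, oy), (dx, dy) in zip(observed, desired)]
    mean_error = sum(errors) / max(len(errors), 1)
    return max(0.0, 1.0 - mean_error)

def calibrate(device, desired_trajectory, acceptable_score=0.95, max_attempts=20):
    """Iteratively project, score against the desired trajectory, and adjust.
    `device` and its methods are hypothetical stand-ins for the controller API."""
    for attempt in range(max_attempts):
        device.project_object()
        observed = device.capture_trajectory()              # from the imaging sensors
        score = trajectory_score(observed, desired_trajectory)
        if score >= acceptable_score:
            return score                                    # calibrated; training can begin
        device.adjust_parameters(observed, desired_trajectory)  # e.g. aim, speed, spin
    raise RuntimeError("Delivery device did not reach an acceptable calibration score")
```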
VARIOUS EMBODIMENTS

Embodiment 1. A method for training comprising: projecting, via a delivery device, one or more objects along one or more predetermined trajectories toward a target zone, wherein each of the one or more objects has a diameter that is less than a diameter of a regulation table tennis ball; a trainee attempting to prevent the one or more objects from entering the target zone; and scoring an ability of the trainee to prevent the one or more objects from entering the target zone.
Embodiment 2. The method of embodiment 1, wherein the diameter of each of the one or more objects is at least 0.05 inches (1.27 mm) and less than 1.4 inches (35.56 mm).
Embodiment 3. The method of embodiment 1, wherein the target zone is a baseball strike zone and the trainee is training for baseball.
Embodiment 4. The method of embodiment 1, wherein the target zone is a soccer goal and the trainee is training for soccer.
Embodiment 5. The method of embodiment 1, wherein the trainee attempts to prevent the one or more objects from entering the target zone by swinging a sport tool at the one or more objects to impact the one or more objects.
Embodiment 6. The method of embodiment 1, wherein the trainee attempts to prevent the one or more objects from entering the target zone by intersecting the one or more pre-determined trajectories with a hand, a foot, a head, an arm, or a leg of the trainee.
Embodiment 7. The method of embodiment 1, wherein the one or more objects are colored objects, and the method further comprises training the trainee to recognize and identify a color of each of the one or more objects before the one or more objects reach the target zone.
Embodiment 8. The method of embodiment 7, further comprising: positioning one or more screens between the delivery device and the trainee, wherein the one or more screens block the trainee from viewing the one or more objects along at least a proximal portion of the one or more pre-determined trajectories; projecting the one or more objects around, under, over, or through the one or more screens; and recognizing and identifying the color of each of the one or more objects as the one or more objects travel along a distal end portion of the one or more pre-determined trajectories.

Embodiment 9. The method of embodiment 8, further comprising moving the one or more screens toward the target zone to reduce an amount of time the trainee has to recognize and identify a color along the distal end portion of the one or more pre-determined trajectories.
Embodiment 10. The method of embodiment 1, further comprising scoring depth perception, anticipation timing, speed of visual processing, visual reaction timing, response timing, or combinations thereof of the trainee.
Embodiment 11. A method for training comprising projecting, via a delivery device, an object along a pre-determined trajectory toward a target zone proximate a trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring, via a controller, an ability of the trainee to interact with the object at the target zone.
Embodiment 12. The method of embodiment 11, further comprising: prior to projecting the object, displaying, via a display, a video to a trainee, wherein the video explains setup for the delivery device and a training field; and setting up the delivery device and the training field based on the video.
Embodiment 13. The method of embodiment 11, further comprising: displaying the scoring of the trainee on the display; and displaying, via the display, a video to the trainee, wherein the video explains how to improve the ability of the trainee to interact with the object at the target zone.
Embodiment 14. The method of embodiment 13, further comprising: projecting, via the delivery device, one or more objects toward the target zone proximate the trainee; scoring an ability of the trainee to interact with the one or more objects at the target zone; and displaying the scoring of the trainee on the display.
Embodiment 15. The method of embodiment 13, further comprising: adjusting one or more parameters of the delivery device based on the video; projecting a second object along a second pre-determined trajectory toward a target zone; and scoring the ability of the trainee to interact with the second object at the target zone.
Embodiment 16. The method of embodiment 11, further comprising: selecting, via the controller, a training method based on the scoring; selecting a video based on the training method, wherein the video instructs the trainee on how to perform the training method, how to score in the training method, what skills are being targeted by the training method, or combinations thereof; displaying the video to the trainee via the display; executing, via the delivery device, the training method for the trainee; and scoring performance of the trainee.
Embodiment 17. A method for training comprising: operating a delivery device in response to a first command from a trainee to perform a training; receiving, by the trainee, a response to the first command from the delivery device, wherein the response requests input from the trainee; providing, via a second command from the trainee, the requested input to the delivery device; adjusting, via a controller, one or more parameters of the delivery device based on the second command; projecting, via the delivery device, an object along a pre-determined trajectory toward a target zone proximate the trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring an ability of the trainee to interact with the object at the target zone.
Embodiment 18. The method of embodiment 17, wherein the first command is a voice command, a hand gesture, a head movement, or a body movement of the trainee.
Embodiment 19. The method of embodiment 17, wherein the response to the first command is displayed to the trainee via a display, which is communicatively coupled to the controller.
Embodiment 20. The method of embodiment 17, further comprising: projecting, via the delivery device, a plurality of objects along one or more predetermined trajectories toward a target zone proximate the trainee; scoring the ability of the trainee to interact with the plurality of objects at the target zone; and displaying one or more videos to the trainee during the projecting of the plurality of objects toward the target zone.
Embodiment 21. The method of embodiment 20, wherein the one or more videos are instructional videos to remind the trainee about training goals, to reinforce proper performance techniques for interacting with the plurality of objects at the target zone, to encourage the trainee during the training, to share scores with the trainee during the training, or to compare trainee performance to performance of a professional athlete.
Embodiment 22. A method for training comprising: projecting, via a delivery device, a plurality of first objects along a pre-determined trajectory toward a first target zone in a first training field, wherein each of the plurality of first objects has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a delivery device to deliver the plurality of first objects at a pre-determined location at the first target zone; adjusting one or more parameters of the delivery device based on the first score; projecting, via the delivery device, a plurality of second objects along a predetermined trajectory toward the first target zone, wherein each of the plurality of second objects has a diameter less than the diameter of a regulation table tennis ball; and determining, via the controller and the imaging sensor, a second score that indicates an ability of a delivery device to deliver the plurality of second objects at the pre-determined location at the first target zone, wherein the second score indicates an improved performance of the delivery device compared to the first score.
Embodiment 23. The method of embodiment 22, further comprising: moving the delivery device to a second training field, with a second target zone; and repeating the projecting of the first objects, determining the first score for the delivery device, adjusting the one or more parameters, the projecting of the second objects, and determining the second score for the delivery device for the second target zone instead of the first target zone, wherein the second score for the second target zone indicates an improved performance of the delivery device compared to the first score for the second target zone.
Embodiment 24. A method for training comprising: projecting, via a delivery device, a first object along a pre-determined trajectory toward a first target zone in a first training field, wherein the first object has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a trainee to interact with the first object at the first target zone; moving the delivery device to a second training field, with a second target zone; projecting, via the delivery device, a second object along the pre-determined trajectory toward the second target zone, wherein the second object has a diameter less than the diameter of the regulation table tennis ball; determining, via the controller and the imaging sensor, a second score that indicates an ability of the delivery device to project the second object along the pre-determined trajectory to the second target zone; adjusting one or more parameters of the delivery device based on the second score; projecting another object toward the second target zone; determining, via the controller and the imaging sensor, the second score and comparing the second score to a desired score; and repeating the adjusting the one or more parameters based on the second score, projecting the another object, and determining the second score until the second score is substantially equal to the desired score.
Embodiment 25. A system for training configured to perform any of the methods described in this disclosure.
While the present disclosure may be susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and tables and have been described in detail herein. However, it should be understood that the embodiments are not intended to be limited to the particular forms disclosed. Rather, the disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure as defined by the following appended claims. Further, although individual embodiments are discussed herein, the disclosure is intended to cover all combinations of these embodiments.

Claims

CLAIMS:
1. A method for training comprising: projecting, via a delivery device, one or more objects along one or more predetermined trajectories toward a target zone, wherein each of the one or more objects has a diameter that is less than a diameter of a regulation table tennis ball; a trainee attempting to prevent the one or more objects from entering the target zone; and scoring an ability of the trainee to prevent the one or more objects from entering the target zone.
2. The method of claim 1, wherein the diameter of each of the one or more objects is at least 0.05 inches (1.27 mm) and less than 1.4 inches (35.56 mm).
3. The method of claim 1, wherein the target zone is a baseball strike zone and the trainee is training for baseball.
4. The method of claim 1, wherein the target zone is a soccer goal and the trainee is training for soccer.
5. The method of claim 1, wherein the trainee attempts to prevent the one or more objects from entering the target zone by swinging a sport tool at the one or more objects to impact the one or more objects.
6. The method of claim 1, wherein the trainee attempts to prevent the one or more objects from entering the target zone by intersecting the one or more pre-determined trajectories with a hand, a foot, a head, an arm, or a leg of the trainee.
7. The method of claim 1, wherein the one or more objects are colored objects, and the method further comprises: training the trainee to recognize and identify a color of each of the one or more objects before the one or more objects reach the target zone.
8. The method of claim 7, further comprising: positioning one or more screens between the delivery device and the trainee, wherein the one or more screens block the trainee from viewing the one or more objects along at least a proximal portion of the one or more pre-determined trajectories; projecting the one or more objects around, under, over, or through the one or more screens; and recognizing and identifying the color of each of the one or more objects as the one or more objects travel along a distal end portion of the one or more pre-determined trajectories.
9. The method of claim 8, further comprising: moving the one or more screens toward the target zone to reduce an amount of time the trainee has to recognize and identify a color along the distal end portion of the one or more pre-determined trajectories.
10. The method of claim 1, further comprising: scoring depth perception, anticipation timing, speed of visual processing, visual reaction timing, response timing, or combinations thereof of the trainee.
11. A method for training comprising: projecting, via a delivery device, an object along a pre-determined trajectory toward a target zone proximate a trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring, via a controller, an ability of the trainee to interact with the object at the target zone.
12. The method of claim 11, further comprising: prior to projecting the object, displaying, via a display, a video to a trainee, wherein the video explains setup for the delivery device and a training field; and setting up the delivery device and the training field based on the video.
13. The method of claim 11, further comprising: displaying the scoring of the trainee on the display; and displaying, via the display, a video to the trainee, wherein the video explains how to improve the ability of the trainee to interact with the object at the target zone.
14. The method of claim 13, further comprising: projecting, via the delivery device, one or more objects toward the target zone proximate the trainee; scoring an ability of the trainee to interact with the one or more objects at the target zone; and displaying the scoring of the trainee on the display.
15. The method of claim 13, further comprising: adjusting one or more parameters of the delivery device based on the video; projecting a second object along a second pre-determined trajectory toward a target zone; and scoring the ability of the trainee to interact with the second object at the target zone.
16. The method of claim 11, further comprising: selecting, via the controller, a training method based on the scoring; selecting a video based on the training method, wherein the video instructs the trainee on how to perform the training method, how to score in the training method, what skills are being targeted by the training method, or combinations thereof; displaying the video to the trainee via the display; executing, via the delivery device, the training method for the trainee; and scoring performance of the trainee.
17. A method for training comprising: operating a delivery device in response to a first command from a trainee to perform a training; receiving, by the trainee, a response to the first command from the delivery device, wherein the response requests input from the trainee; providing, via a second command from the trainee, the requested input to the delivery device; adjusting, via a controller, one or more parameters of the delivery device based on the second command; projecting, via the delivery device, an object along a pre-determined trajectory toward a target zone proximate the trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring an ability of the trainee to interact with the object at the target zone.
18. The method of claim 17, wherein the response to the first command is displayed to the trainee via a display, which is communicatively coupled to the controller.
19. The method of claim 17, further comprising: projecting, via the delivery device, a plurality of objects along one or more predetermined trajectories toward a target zone proximate the trainee; scoring the ability of the trainee to interact with the plurality of objects at the target zone; and displaying one or more videos to the trainee during the projecting of the plurality of objects toward the target zone.
20. The method of claim 19, wherein the one or more videos are instructional videos to remind the trainee about training goals, to reinforce proper performance techniques for interacting with the plurality of objects at the target zone, to encourage the trainee during the training, to share scores with the trainee during the training, or to compare trainee performance to performance of a professional athlete.
PCT/US2023/061577 2022-01-31 2023-01-30 Training system and method of using same WO2023147548A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263267373P 2022-01-31 2022-01-31
US63/267,373 2022-01-31

Publications (1)

Publication Number Publication Date
WO2023147548A1 true WO2023147548A1 (en) 2023-08-03

Family

ID=87431325

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/061577 WO2023147548A1 (en) 2022-01-31 2023-01-30 Training system and method of using same

Country Status (2)

Country Link
US (1) US20230241456A1 (en)
WO (1) WO2023147548A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060068365A1 (en) * 2004-09-09 2006-03-30 Kirby Smith Vision training system
KR101435506B1 (en) * 2013-04-05 2014-09-02 권성태 Coaching robot for training trainee to hit object and mehtod for controlling the same
US20170266530A1 (en) * 2016-03-16 2017-09-21 Francis P. Pepe Ball tossing and training device and system
JP2021146193A (en) * 2020-03-17 2021-09-27 エスジーエム・カンパニー・リミテッドSGM Co., Ltd. Virtual sport device, virtual sport system, and method of executing command in virtual sport system
KR102265914B1 (en) * 2020-09-04 2021-06-16 권예찬 Tennis autonomous training system

Also Published As

Publication number Publication date
US20230241456A1 (en) 2023-08-03

Similar Documents

Publication Publication Date Title
US9345929B2 (en) Trajectory detection and feedback system
US7850552B2 (en) Trajectory detection and feedback system
US9821220B2 (en) Sport and game simulation systems with user-specific guidance and training using dynamic playing surface
US11596852B2 (en) Swing alert system and method
US20190126120A1 (en) Tennis training device using virtual targets
JP5965089B1 (en) Screen baseball system competition method
US9358455B2 (en) Method and apparatus for video game simulations using motion capture
US11911682B2 (en) Training systems and methods
US9454825B2 (en) Predictive flight path and non-destructive marking system and method
KR102344429B1 (en) Two-environment game play system
US20230241456A1 (en) Training system and method of using same
US20230241455A1 (en) Training system and method of using same
WO2022032271A1 (en) Training systems and methods
KR102704606B1 (en) Hybrid golf system and method for automatic scoring using user's mobile terminal in the same
KR20220084684A (en) Non-face-to-face billiard match system
KR20180086863A (en) An outdoor golf practice method in which a golf course is displayed, and a portable device/computer

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23747935

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE