US20200086199A1 - Virtual reality sports training systems and methods - Google Patents

Virtual reality sports training systems and methods

Info

Publication number
US20200086199A1
US20200086199A1 (Application US16/690,501)
Authority
US
United States
Prior art keywords
life
baseball
real
user
virtual reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/690,501
Other versions
US10821347B2 (en)
Inventor
Brendan Reilly
Yazhou HUANG
Lloyd CHURCHES
Chris O'Dowd
Sebastien Goisbeault
Mats Johansson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Win Reality LLC
Original Assignee
Win Reality LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/694,770 (published as US20160314620A1)
Assigned to WIN REALITY, LLC (change of name; see document for details). Assignors: EON SPORTS, LLC
Priority to US16/690,501 (granted as US10821347B2)
Application filed by Win Reality LLC
Assigned to EON REALITY SPORTS, LLC (assignment of assignors interest; see document for details). Assignors: CHURCHES, LLOYD; JOHANSSON, MATS; O'DOWD, Chris; REILLY, BRENDAN; GOISBEAULT, SEBASTIEN; HUANG, YAZHOU
Assigned to EON SPORTS, LLC (assignment of assignors interest; see document for details). Assignors: EON REALITY SPORTS, LLC
Publication of US20200086199A1
Priority to US17/087,121 (granted as US11278787B2)
Application granted
Publication of US10821347B2
Assigned to LAGO INNOVATION FUND, LLC (security agreement). Assignors: WIN REALITY, LLC
Priority to US17/700,803 (granted as US11826628B2)
Assigned to LAGO INNOVATION FUND III, LLC (security agreement). Assignors: WIN REALITY, INC.; WIN REALITY, LLC
Assigned to WIN REALITY, LLC (release of IP security interest). Assignors: LAGO INNOVATION FUND, LLC
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/0002 Training appliances or apparatus for special sports for baseball
    • A63B2069/0004 Training appliances or apparatus for special sports for baseball specially adapted for particular training aspects
    • A63B2069/0008 Training appliances or apparatus for special sports for baseball specially adapted for particular training aspects for batting
    • A63B69/0015 Training appliances or apparatus for special sports for cricket
    • A63B69/002 Training appliances or apparatus for special sports for football
    • A63B69/0024 Training appliances or apparatus for special sports for hockey
    • A63B69/0071 Training appliances or apparatus for special sports for basketball
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0636 3D visualisation
    • A63B2071/0647 Visualisation of executed movements
    • A63B2071/0658 Position or arrangement of display
    • A63B2071/0661 Position or arrangement of display arranged on the user
    • A63B2071/0666 Position or arrangement of display arranged on the user worn on the head or face, e.g. combined with goggles or glasses
    • A63B2225/00 Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/50 Wireless data transmission, e.g. by radio transmitters or telemetry
    • A63B2243/00 Specific ball sports not provided for in A63B2102/00 - A63B2102/38
    • A63B2243/0025 Football
    • A63B2243/0033 Handball
    • A63B2243/0037 Basketball
    • A63B2243/0066 Rugby; American football
    • A63B2243/007 American football
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0141 Head-up displays characterised by the informative content of the display
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 Digital output to display device using display panels
    • G06K9/00342
    • G06K9/00671
    • G06K9/00724
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V20/42 Higher-level, semantic clustering, classification or understanding of sport video content
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G09B19/0038 Sports
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 Control arrangements or circuits using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 Control arrangements or circuits characterised by the way in which colour is displayed
    • G09G5/026 Control of mixing and/or overlay of colours in general
    • G09G5/14 Display of multiple viewports
    • G09G5/18 Timing circuits for raster scan displays
    • G09G2340/00 Aspects of display data processing
    • G09G2340/12 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels

Definitions

  • the present invention relates in general to systems and methods for training athletes. More particularly, the invention is directed to virtual reality simulated sports training systems and methods.
  • Virtual reality environments may provide users with simulated experiences of sporting events. Such virtual reality environments may be particularly useful for sports such as American football, in which players may experience many repetitions of plays while avoiding the chronic injuries that might otherwise result on real-world practice fields. However, conventional virtual reality sports simulators may not provide meaningful training experiences or feedback on a player's performance.
  • a virtual reality projection system configured to project a virtual reality baseball environment shown in a first-person perspective of a user training a batting swing in the virtual reality baseball environment; and a graphics engine module connected to the electronic display, the graphics engine module configured to: generate the virtual reality baseball environment; generate a digitized image of a user selected real-life baseball pitcher; retrieve from an electronic database, real-life pitching data of the user selected real-life baseball pitcher; display in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user; display in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and display in the virtual reality environment, a replicated release of a digital baseball from a digital hand of the digitized image from a release point replicating a pitch thrown by the user selected real-
  • a machine readable non-transitory medium storing executable program instructions which when executed cause a data processing system to perform a method comprising: generating a virtual reality baseball environment in an electronic display; generating a digitized image of a user selected real-life baseball pitcher; retrieving from an electronic database, real-life pitching data of the user selected real-life baseball pitcher; displaying in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user; displaying in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and displaying in the virtual reality environment, a replicated release of a digital baseball from the digitized image from a release point positioned relative to a digital body of the digitized image, replicating a real-life pitch thrown by the user selected real-life baseball pitcher, wherein the replicated release
  • a method of simulating a baseball pitcher's pitch comprises: generating a virtual reality baseball environment in an electronic display; generating a digitized image of a user selected real-life baseball pitcher; retrieving from an electronic database, real-life pitching data of the user selected real-life baseball pitcher; displaying in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user; displaying in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and displaying in the virtual reality environment, a replicated release of a digital baseball from the digitized image from a release point based on the real-life pitching data of the user selected real-life baseball pitcher, replicating a real-life pitch thrown by the user selected real-life baseball pitcher, the replicated release of the digital baseball is displayed continuing in a
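  Taken together, the claimed steps amount to a pipeline that retrieves recorded pitching data and replays it through a digitized avatar. A minimal sketch in Python (all class, field, and function names here are illustrative assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class PitchData:
    """Hypothetical record of real-life pitching data (fields are assumptions)."""
    pitcher_name: str
    delivery: str          # "wind-up" or "stretch"
    release_point: tuple   # (x, y, z) relative to the pitcher's body
    velocity_mph: float
    pitch_type: str

def simulate_pitch(database: dict, pitcher_name: str) -> dict:
    """Sketch of the claimed method: retrieve real-life pitching data and
    assemble the display steps a graphics engine would render, in order."""
    pitch = database[pitcher_name]  # retrieve from the electronic database
    frames = [
        ("environment", "virtual reality baseball environment"),
        ("avatar", pitcher_name),            # digitized image of the pitcher
        ("sequence", pitch.delivery),        # wind-up or stretch delivery
        ("release", pitch.release_point),    # replicated release of the ball
    ]
    return {"pitcher": pitcher_name, "frames": frames}

# Hypothetical usage with made-up data
db = {"J. Doe": PitchData("J. Doe", "stretch", (0.6, 1.9, 16.5), 94.0, "fastball")}
result = simulate_pitch(db, "J. Doe")
```

  The ordering mirrors the claim language: environment first, then the avatar, the delivery sequence, and finally the release from the data-driven release point.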
  • FIG. 1 is an exemplary flowchart illustrating a method for implementing a virtual reality sports training program.
  • FIG. 2 is an exemplary flowchart illustrating the calculation of the decision and timing scores.
  • FIG. 3 is an exemplary flowchart illustrating the decision scoring for a football quarterback.
  • FIG. 4 is an exemplary flowchart illustrating the timing scoring for a football quarterback.
  • FIG. 5 is a front, perspective view of a user in an immersive virtual reality environment.
  • FIG. 6 is a side, perspective view of a user wearing a virtual reality head-mounted display showing a virtual reality environment.
  • FIG. 7 is a front, perspective view of a simulated environment of a football game and a handheld game controller for the user to interact with the game.
  • FIG. 8 is a front, perspective view of the simulated environment of a football game just before the initiation of a play.
  • FIG. 9 is a front, perspective view of the simulated environment of a football game immediately after the initiation of a play.
  • FIG. 10 is a front, perspective view of the simulated environment of a football game showing the correct decision.
  • FIG. 11 is a front, perspective view of a simulated environment of a football game showing a multiple choice question presented to the user.
  • FIG. 12 is a front, perspective view of a simulated environment of a football game showing the possible areas from which the user may select.
  • FIG. 13 is a front, perspective view of a simulated environment of a football game showing the possible areas from which the user may select in an embodiment.
  • FIG. 14 is a front, perspective view of a simulated environment of a football game showing the possible areas from which the user may select in an embodiment.
  • FIG. 15 is a front, perspective view of a simulated environment of a football game where the user is asked to read the defense.
  • FIG. 16 is a front, perspective view of the simulated environment of a football game showing possible areas of weakness from which a user may select.
  • FIG. 17 is a front, perspective view of the simulated environment of a football game showing offensive player running patterns.
  • FIG. 18 is a front, perspective view of a user selecting the area of weakness in an embodiment.
  • FIG. 19 is a front, perspective view of a user interacting with a virtual reality environment via a virtual pointer.
  • FIG. 20 is a front, perspective view of a user selecting an audio recording in an embodiment.
  • FIG. 21 is a front, perspective view of a user selecting a lesson with the virtual pointer.
  • FIG. 22 is a front, perspective view of a user selecting from multiple choices using the virtual pointer.
  • FIG. 23 is a front, perspective view of a user receiving the score of performance in an embodiment.
  • FIG. 24 is a front view of a playlist menu in one or more embodiments.
  • FIG. 25 is a front view of a football field diagram showing details of a play in one or more embodiments.
  • FIG. 26 is a front, perspective view of a simulated environment of a football game immediately before a play is executed.
  • FIG. 27 is a front, perspective view of a user selecting a football player with the virtual pointer in one or more embodiments.
  • FIG. 28 is a front, perspective view of a user receiving the score of performance in an embodiment.
  • FIG. 29 is a schematic block diagram illustrating the devices for implementing the virtual reality simulated environment of a sporting event.
  • FIG. 30 is an exemplary flowchart showing the process of implementing the virtual reality simulated environment of a sporting event on a smartphone or tablet.
  • FIG. 31 is an exemplary flowchart showing the process of implementing the virtual reality simulated environment of a sporting event on a mobile device.
  • FIG. 32 is an exemplary flowchart showing the process of implementing the virtual reality simulated environment of a sporting event on a larger system.
  • FIG. 33 is a schematic block diagram illustrating a mobile device for implementing the virtual reality simulated environment of a sporting event.
  • FIG. 34 is a stereoscopic front perspective view of a three dimensional panoramic virtual reality environment system simulating a baseball pitch thrown from a digitized avatar of a real-life pitcher, as seen from a baseball batter's perspective, according to an exemplary embodiment.
  • FIG. 35 is a stereoscopic front perspective view of the system of FIG. 34 with a user wearing tracked stereoscopic glasses and swinging a bat at a simulated pitch according to an exemplary embodiment.
  • FIG. 36 is an enlarged view of a digitized avatar of a real-life pitcher within a three dimensional panoramic virtual reality environment system simulating a baseball pitching sequence according to an exemplary embodiment.
  • FIG. 37 is a perspective view displayed inside a head-mounted display, which shows a digitally generated strike zone location box from a batter's perspective displayed at the end of the pitching sequence of FIG. 36, according to an exemplary embodiment.
  • FIG. 38 is a perspective view displayed inside a head-mounted display, which shows a strike or ball decision graphic with countdown timer and the gaze-controlled crosshair according to an exemplary embodiment.
  • FIG. 39 is an illustration of a digitized real-life pitcher performing a pitching sequence and reference points determined in generating a release point in a virtual reality environment according to an exemplary embodiment.
  • FIG. 40 is an illustration of a digitized real-life pitcher performing several pitch sequences with different heights in the ball release points, and reference points determined for each pitcher in generating respective release points at arbitrary locations in a virtual reality environment according to an exemplary embodiment.
  • FIG. 41 is a side schematic view illustrating before the alignment of the digitized avatar of a pitcher's release point and pitch release relative to a position of a user batter according to an exemplary embodiment.
  • FIG. 42 is a side schematic view illustrating after the alignment of the digitized avatar of a pitcher's release point and pitch release relative to a position of a user batter according to an exemplary embodiment.
  • FIG. 43 is a side schematic view illustrating generating a field background and stadium environment relative to a position of the digitized pitcher's avatar and the user batter according to an exemplary embodiment.
  • FIG. 44 is a side schematic view illustrating the tracking of a user's bat swing to determine if a simulated pitch was hit in the system of FIG. 35.
  • FIG. 45 shows a top view schematic and a side view schematic of a tracked bat swing to determine if the bat reaches a point of contact of the simulated pitch in a plane that intersects a simulated flight path of the simulated pitched baseball.
  • FIG. 46 shows a top view schematic and a side view schematic of a tracked bat swing that was determined to have hit the simulated pitched baseball.
  • FIG. 47 is a schematic view illustrating two algorithms for determining whether a tracked swing of a baseball bat makes contact with a simulated baseball pitched in the system of FIG. 35 according to an exemplary embodiment.
  • FIG. 48 is a flowchart of a method of matching a perspective of a batter user with the generated digitized avatar of a real-life pitcher in a three dimensional virtual reality environment according to an exemplary embodiment.
  • FIG. 49 is a flowchart of a method of simulating a flight path of a simulated pitched baseball from a digitized avatar of a real-life pitcher in a three dimensional virtual reality environment according to an exemplary embodiment.
  • FIG. 50 is a block diagram of a system for generating a three dimensional virtual reality simulation of baseball pitches from digitized avatars of real-life baseball pitchers according to an exemplary embodiment.
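  FIGS. 44-47 concern deciding whether a tracked bat swing reaches the simulated ball in a plane that intersects the pitch's flight path. One plausible geometric test, sketched in Python (the segment-to-point distance check and the radii values are illustrative assumptions, not the patent's actual algorithm):

```python
import math

def bat_contacts_pitch(ball_pos, bat_p1, bat_p2,
                       ball_radius=0.037, bat_radius=0.033):
    """Illustrative contact test: when the simulated ball crosses the contact
    plane at ball_pos, check whether the tracked bat segment (bat_p1 to
    bat_p2, e.g. handle and barrel tip, in meters) passes within the combined
    radii of ball and barrel. Radii here are rough assumed values."""
    ax, ay, az = bat_p1
    bx, by, bz = bat_p2
    px, py, pz = ball_pos
    # Closest point on the bat segment to the ball center
    abx, aby, abz = bx - ax, by - ay, bz - az
    apx, apy, apz = px - ax, py - ay, pz - az
    ab2 = abx * abx + aby * aby + abz * abz
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, (apx * abx + apy * aby + apz * abz) / ab2))
    cx, cy, cz = ax + t * abx, ay + t * aby, az + t * abz
    dist = math.sqrt((px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2)
    return dist <= ball_radius + bat_radius
```

  A real system would evaluate this at the instant the simulated flight path crosses the plane of the tracked swing, which is the framing the figures suggest.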
  • Virtual reality environments provide users with computer-generated virtual objects which create an illusion to the users that they are physically present in the virtual reality environment.
  • Users typically interact with the virtual reality environment by employing some type of device, such as headset goggles, glasses, mobile devices having displays, augmented reality headgear, or a cave automatic virtual environment (CAVE), an immersive virtual reality environment in which projectors project images onto the walls, ceiling, and floor of a cube-shaped room.
  • a sports training simulated environment is contemplated. While in a virtual reality environment, a user views a simulated sporting event.
  • a user acting as a football quarterback in the virtual reality environment may see a pre-snap formation of computer-generated defensive and offensive football players.
  • the virtual football is snapped, the play is initiated, and the offensive and defensive players move accordingly.
  • the user sees several of his virtual teammates cross the field, and the user must decide among his teammates to whom he should throw the ball.
  • the user makes his selection, and the sports training simulated environment scores the user's decisions. Scores may be based on timing (i.e., how quickly the user decides) and/or on selection (i.e., whether the user selected the correct player).
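  The decision and timing scoring described above could be sketched as follows (the weights, time limit, and formula are illustrative assumptions; the patent does not specify them):

```python
def score_decision(selected: str, correct: str, decision_time_s: float,
                   time_limit_s: float = 3.0) -> dict:
    """Hypothetical scoring: a selection score (did the user pick the correct
    player) and a timing score (how quickly the decision was made), combined
    with assumed 50/50 weights."""
    selection_score = 100 if selected == correct else 0
    # Faster decisions score higher; past the limit the timing score is 0.
    remaining = max(0.0, time_limit_s - decision_time_s)
    timing_score = round(100 * remaining / time_limit_s)
    return {"selection": selection_score,
            "timing": timing_score,
            "total": round(0.5 * selection_score + 0.5 * timing_score)}
```

  With this sketch, picking the correct receiver at 1.5 s into a 3 s window yields a selection score of 100, a timing score of 50, and a total of 75.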
  • the user may repeat the virtual play or may move on to additional plays.
  • the scores are stored in cloud-based storage. The user's progress (or regression) over time is tracked and monitored, and the user can access the scores and data via a personalized dashboard in either a webpage or a mobile application.
  • the user may be queried on other aspects of a simulated sporting event. For example, a user acting as a virtual quarterback may be asked to read a defense and identify areas of weakness against a play. In one or more embodiments, multiple possible answers are presented to the user, and the user selects the answer he believes is correct.
  • a comprehensive virtual reality sports training environment is contemplated.
  • a coach may develop customized plays for his team.
  • the players of the team then interact individually with the virtual reality simulated sporting event and have their performances scored.
  • the scores of the players may then be interpreted and reviewed by the coach.
  • One or more embodiments provide a means for improving athlete decision-making. Athletes in a virtual environment may experience an increased number of meaningful play repetitions without the risk of injury. Such environments may maximize effective practice time for users, and help develop better players with improved decision-making skills.
  • Embodiments described herein refer to virtual reality simulated environments. However, it shall be understood that one or more embodiments may employ augmented reality environments comprising both virtual and real world objects.
  • simulated environments may refer to environments or video displays comprising computer-generated virtual objects or computer-generated virtual objects that are added to a display of a real scene, and may include computer-generated icons, images, virtual objects, text, or photographs.
  • Reference made herein to a mobile device is for illustration purposes only and shall not be deemed limiting.
  • Mobile device may be any electronic computing device, including handheld computers, smart phones, tablets, laptop computers, smart devices, GPS navigation units, or personal digital assistants for example.
  • Embodiments described herein make reference to training systems and methods for American football; however, it shall be understood that one or more embodiments may provide training systems and methods for other sports including, but not limited to, soccer, baseball, hockey, basketball, rugby, cricket, and handball for example.
  • a “play” is a plan or action for one or more players to advance the team in the sporting event.
  • Embodiments described herein may employ head mounted displays or immersive systems as specific examples of virtual reality environments. It shall be understood that embodiments may employ head mounted displays, immersive systems, mobile devices, projection systems, or other forms of simulated environment displays.
  • FIG. 1 is an exemplary flowchart illustrating a machine-implemented method 101 for implementing a virtual reality sports training program.
  • the process begins with a 2 dimensional (“2D”) play editor program (step 110 ) in which a coach or another person may either choose an existing simulated play to modify (step 112 ) or else create a completely new simulated play from scratch (step 114 ).
  • the simulated play may be created by a second user such as a coach, where the simulated play defines the pre-determined location of the simulated players and the movements of the simulated players during the play.
  • An existing play may be modified by adjusting player attributes from an existing play (step 116 ).
  • a new play may be created by assigning a player's speed, animation, stance, and other attributes for a given play (step 118 ).
  • the simulated sports training environment provides a simulated environment of a sporting event to a user by one or more computing devices, where the simulated environment depicting the sporting event appears to be in the immediate physical surroundings of the user.
  • the simulated sports training environment generates simulated players in the video display of the sporting event, where each of the simulated players is located in a pre-determined location.
  • the simulated sports training environment initiates a simulated play for a period of time, where one or more simulated players move in response to the play. Assignment of ball movement throughout the play determines the correct answer in the “Challenge mode” (step 120 ).
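The play structure described above (simulated players at pre-determined locations, scripted movements, and a ball-movement assignment that defines the correct answer in Challenge mode) might be sketched as follows. The class and field names are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class SimulatedPlayer:
    """One simulated player: a pre-determined start location plus the
    positions the player moves through when the play is initiated."""
    name: str
    start: tuple
    waypoints: list

@dataclass
class Play:
    """A scripted play; the ball-movement assignment determines the
    correct answer in Challenge mode."""
    players: list
    ball_assignment: str   # name of the player scripted to receive the ball
    release_time: float    # seconds after the snap the ball should be thrown

    def is_correct_choice(self, chosen_player: str) -> bool:
        return chosen_player == self.ball_assignment

play = Play(
    players=[
        SimulatedPlayer("WR1", (10, 0), [(10, 10), (15, 20)]),
        SimulatedPlayer("TE", (-5, 0), [(-5, 8)]),
    ],
    ball_assignment="WR1",
    release_time=2.4,
)
print(play.is_correct_choice("WR1"))  # True
```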
  • a 3D viewing mode supports video game visualization and stereoscopic viewing of the play created (step 124 ).
  • the view of the play is from the individual player's helmet view point (step 126 ).
  • the user may choose the navigation throughout the environment (step 128 ). All of these modes may be viewed in any virtual reality system (step 134 ).
  • the user may be faced with a challenge mode (step 130 ) where the user's interactions with the play are monitored and scored (step 132 ).
  • the sports training environment presents a question or query to the user, receives a response from the user, and scores the response.
  • the ball movement is assigned to a player throughout a play to signify the player who has possession of the ball.
  • the simulated sports training environment may send the scored response to another user such as the coach.
  • the simulated sports training environment may send the scored response to another user to challenge the other user to beat one's score.
  • FIG. 2 is an exemplary flowchart illustrating the method 201 for calculating the decision and timing scores in the challenge mode.
  • the user interacts with a play in a virtual reality environment, and a question or query is posed to the player (step 210 ).
  • the player then interacts with the virtual reality system through a handheld controller, player gestures, or player movements in one or more embodiments (step 212 ).
  • the interaction is monitored by the virtual reality environment which determines if the interaction results in a correct answer (step 214 ), correct timing (step 216 ), incorrect answer (step 218 ) or incorrect timing (step 220 ).
  • Each of these decision and timing results is compared to an absolute score for the correct answer (step 222 ), an absolute score for the correct timing (step 224 ), an absolute score for the incorrect answer (step 226 ), or the absolute score for incorrect timing (step 228 ).
  • the comparison of the absolute scores from the correct answer (step 222 ) and a comparison of the absolute score of the incorrect answer (step 226 ) determines the decision score (step 230 ).
  • the comparison of the absolute score of the correct timing (step 224 ) and the absolute score of the incorrect timing (step 228 ) determines the timing score (step 232 ).
  • the decision score (step 230 ) is assigned a number of stars for providing feedback to the player.
  • a decision score of 60% or less generates no stars (step 234 ), a decision score of 60-70% results in one star (step 236 ), a decision score of 70-90% results in two stars (step 238 ), and a decision score of 90% or greater results in three stars (step 240 ) in one or more embodiments.
  • the timing score (step 232 ) is assigned a number of stars for providing feedback to the player.
  • a timing score of 60% or less generates no stars (step 242 ), a timing score of 60-70% results in one star (step 244 ), a timing score of 70-90% results in two stars (step 246 ), and a timing score of 90% or greater results in three stars (step 248 ) in one or more embodiments.
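The star thresholds for the decision and timing scores (steps 234-248) can be expressed as a single mapping function. Because the stated ranges overlap at their boundaries (60-70%, 70-90%), the handling of an exact 70% or 90% here is an assumption:

```python
def stars_for_score(score: float) -> int:
    """Map a percentage score to 0-3 stars per the thresholds in
    steps 234-248: <=60% no stars, 60-70% one star, 70-90% two
    stars, >=90% three stars."""
    if score >= 90:
        return 3
    elif score >= 70:
        return 2
    elif score > 60:
        return 1
    return 0

print(stars_for_score(95), stars_for_score(75), stars_for_score(65), stars_for_score(50))  # 3 2 1 0
```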
  • FIG. 3 is an exemplary flowchart illustrating a method 301 for determining the decision scoring for a football quarterback in one or more embodiments.
  • a coach or another person creates a play (step 310 ), where, in this example, the wide receiver is chosen as the correct teammate for receiving the football (step 312 ). The wide receiver chosen by the coach as the recipient of the ball is deemed the correct answer for the player in the challenge mode.
  • a play commences in a simulated environment, and the challenge mode is initiated as the player goes through a simulated play where the player interacts with a device and chooses the player to receive the ball (step 314 ). The answer or response is chosen by the player interacting with a gamepad controller, by clicking on an individual player on a tablet, or by positional head tracking.
  • the player decision is monitored, resulting in either a correct answer (step 316 ) or an incorrect answer (step 318 ).
  • the incorrect answer results from the player selecting players other than the player selected by the coach or play creator.
  • the score is calculated by dividing the correct answers by the total number of questions asked (step 320 ).
  • FIG. 4 is an exemplary flowchart illustrating a method 351 for determining the timing scoring for a football quarterback.
  • a coach or another person creates a play (step 352 ), where, in this example, the ball movement is chosen (step 354 ).
  • the coach determines the time that the ball is chosen to move from the quarterback to the wide receiver that will be deemed as the correct timing in the challenge mode.
  • a play commences in a virtual reality environment, and the challenge mode is initiated as the player goes through a simulated play where the player interacts with a device and chooses the player to whom the quarterback will throw the ball (step 356 ).
  • the player timing is monitored, resulting in either a correct timing (step 358 ) or an incorrect timing (step 360 ).
  • the timing is determined from the time of the snap of the football to the point in time that the quarterback throws the ball to the wide receiver.
  • the incorrect timing is a time of release of the football that is inconsistent with the timing established by the coach.
  • the time score is calculated by dividing the correct answers by the total number of questions (step 362 ).
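Both the decision score (step 320) and the time score (step 362) reduce to the ratio of correct responses to total questions. The tolerance window used to judge whether a release time is "consistent" with the coach's assigned timing is an assumption; the text says only that inconsistent timing is scored incorrect:

```python
def decision_score(correct: int, total: int) -> float:
    """Score as a percentage: correct answers divided by total questions."""
    return 100.0 * correct / total if total else 0.0

def timing_is_correct(release_time: float, target_time: float,
                      tolerance: float = 0.25) -> bool:
    """Compare the quarterback's release (measured from the snap) against
    the coach-assigned release time; the +/- tolerance is an assumption."""
    return abs(release_time - target_time) <= tolerance

# Four attempts: (measured release, coach's target), in seconds after the snap.
attempts = [(2.3, 2.4), (3.1, 2.4), (2.5, 2.4), (2.4, 2.4)]
correct = sum(timing_is_correct(r, t) for r, t in attempts)
print(decision_score(correct, len(attempts)))  # 75.0
```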
  • FIG. 5 is a front, perspective view of a user 510 in an immersive virtual reality environment 501 which may be referred to as a CAVE.
  • a user 510 typically stands in a cube-shaped room in which video images are projected onto walls 502 , 503 , and 504 .
  • the real player 510 is watching the virtual player 512 running across a virtual football field.
  • the player 510 will be acting in the role of a quarterback.
  • Players in the CAVE can see virtual players in a virtual football field, and can move around them to get a view of the virtual player from a different perspective.
  • Sensors in the virtual reality environment track markers attached to the user to determine the user's movements.
  • FIG. 6 is a side, perspective view 521 of a user wearing a virtual reality head-mounted display 522 showing a virtual reality environment.
  • the head-mounted display 522 has a display positioned inches away from the eyes of the user.
  • the head-mounted display 522 monitors the movements of the user 510 . These movements are fed back to a computer controller generating the virtual images in the display 522 .
  • the virtual reality images react to the user's 510 motions so that the user perceives himself to be part of the virtual reality experience.
  • a smartphone or mobile device may be employed.
  • FIGS. 7-11 depict a sequence of virtual reality images for testing and training a quarterback to select the teammate in the best position to receive the ball.
  • FIG. 7 is a front, perspective view of a simulated environment 601 of a virtual football game and a handheld controller 610 for the user to interact with the simulated football game 601 .
  • the hand held controller 610 has a joystick 612 and four buttons 614 (labeled as “A”), 616 (“B”), 618 (“X”), and 620 (“Y”).
  • the simulated football game image shows both the offensive and defensive teams.
  • Player 624 is labeled as “A”, player 626 as “B”, player 628 as “X”, and player 630 as “Y.”
  • a user, acting as a quarterback, will see the simulated players move across the field and will be required to decide when to throw a football and which of the simulated players 624 , 626 , 628 , and 630 to select for receiving the ball in an embodiment.
  • a user may select player 624 by pressing the button 614 (“A”), player 626 by pressing the button 616 (“B”), player 628 by pressing button 618 (“X”), and player 630 by pressing button 620 (“Y”).
  • real time decisions are made by the user by pressing buttons on a controller which correspond to the same icon above the head of the player.
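The correspondence between controller buttons and the icons above the players' heads amounts to a simple lookup. The numeric values below are the figure's reference numerals for the players, used here only for illustration:

```python
# Buttons "A", "B", "X", "Y" on controller 610 correspond to the icons
# shown above players 624, 626, 628, and 630 respectively.
BUTTON_TO_PLAYER = {"A": 624, "B": 626, "X": 628, "Y": 630}

def select_receiver(button: str) -> int:
    """Resolve a real-time button press to the receiver the user selected."""
    return BUTTON_TO_PLAYER[button]

print(select_receiver("B"))  # 626
```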
  • FIG. 8 shows the positions of the players just before the initiation of a play.
  • a play is initiated and, in real time, the user must choose the player to whom he will throw the football.
  • the user selects which player will be selected to receive the football.
  • virtual ovals 625 , 627 , 629 , and 631 encircle the players 624 , 626 , 628 , and 630 .
  • oval 627 is cross hatched or has a color different from that of the other ovals to indicate to the user that player 626 was the correct choice. If the user does not decide correctly, the user can redo the play until he makes the correct choice.
  • the score is based on how quickly decisions are made, and the number of correct decisions compared to total testing events.
  • FIGS. 11 and 12 illustrate that the simulated environment 701 can be configured to challenge users with multiple choice questions.
  • FIG. 11 shows a quarterback's perspective of a simulated football game before a play is initiated.
  • the virtual reality environment shows a pop up window 710 presenting a question posed to the user.
  • the virtual reality environment is asking the user to identify the areas of vulnerability of a coverage shell.
  • FIG. 12 the play is initiated and the user is presented with four areas 712 , 714 , 716 , and 718 representing possible choices for the answer to the question posed.
  • the user may select the area by interacting with a hand held controller, or by making gestures or other motions.
  • FIG. 13 is a front, perspective view of a simulated environment 801 of a virtual football game showing the possible areas from which the user may select in an alternative embodiment.
  • the user is presented with a view before the initiation of a play, and a pop-up window 810 appears and asks the user to identify the Mike linebacker (i.e., the middle or inside linebacker who may have specific duties to perform).
  • Three frames 812 , 814 , and 816 appear around three players and the user is given the opportunity to choose the player he believes to be the correct player.
  • FIG. 14 is a front, perspective view of a simulated environment of a football game in an embodiment.
  • the user is presented with a view before the initiation of a play, and a pop-up window 910 appears and asks the user to identify the defensive front.
  • Four frames 912 , 914 , 916 , and 918 have possible answers to the question posed.
  • the user is given the opportunity to choose the answer he believes is correct.
  • the user may choose the correct answer through interacting with a game controller, or by making gestures or other movements.
  • One or more embodiments train users to read a defense.
  • Man and Zone Reads such as picking up Man and Cover 2 defense
  • embodiments may have 20 play packages such as combo, under, curl, and so forth.
  • Embodiments enable users to practice reading Man vs. Zone defenses. Users may recognize each defense against the formation, such as against a cover 2 , and may highlight areas of the field that are exposed because the defense is a cover 2 against that specific play. Players and areas may be highlighted.
  • a training package may consist of 20 plays each against Man and Cover 2 and highlighted teaching point.
  • FIGS. 15-18 are front, perspective views of a simulated environment 901 of a football game where the user is asked to read the defense.
  • the user is asked to identify the defensive coverage, which is cover 2 in this example.
  • the user is asked to identify areas of weakness or exposure against the play, represented as areas 1012 , 1014 , 1016 , 1018 , 1020 , and 1022 .
  • the user is asked to create a mismatch against a zone and expose the areas.
  • the play is represented by players 1030 , 1032 , and 1034 traversing the field as depicted in running patterns 1031 , 1033 , and 1035 .
  • the user may use his hand 1050 to assess and identify the areas of the field that have “weak points” against that coverage.
  • FIGS. 19-23 are front, perspective views of a user interacting with a virtual reality environment 1101 .
  • the user may be wearing a head mounted display as depicted in FIG. 6 , or may be wearing glasses having markers in an immersive virtual reality environment as depicted in FIG. 5 .
  • the user sees an environment 1101 having a window 1110 describing the current lesson, an icon 1112 for activating audio instructions, and a window 1116 describing a virtual pointer 1120 represented here as a virtual crosshairs.
  • the virtual pointer may be fixed with respect to the display of the device.
  • a user may interact with the virtual reality environment by aiming the virtual pointer toward a virtual object.
  • the virtual pointer provides a live, real time ability to interact with a three-dimensional virtual reality environment.
  • the virtual pointer 1120 moves across the virtual reality environment 1101 as the user moves his head. As depicted in FIG. 20 , the user moves the virtual pointer 1120 over the icon 1112 to activate audio instructions for the lesson. The user may then use the virtual pointer 1120 to interact with the virtual reality environment 1101 such as by selecting the player that will receive the ball. As shown in FIG. 21 , the user may then either replay the audio instructions or move to the drill by sweeping the virtual pointer 1120 over and selecting icon 1122 .
  • FIG. 22 is a front, perspective view of a simulated environment 1201 illustrating that the virtual pointer 1120 may enable a user to select answers from a multiple choice test.
  • a window 1208 may pose a question to the user, where the user selects between answers 1210 and 1212 .
  • the user moves virtual pointer 1120 over the selected answer in response to the question.
  • the virtual reality environment generates a score for the user, and the user is able to attempt the test again or move to the next level.
  • FIG. 24 is a front view of a menu 1301 in one or more embodiments.
  • the user is presented with a series of plays in a playlist.
  • the user navigates the application (“app”) by successfully completing a play which then unlocks the next play in the playlist.
  • icons 1310 , 1312 , 1314 , 1316 , 1318 and so forth represent plays the user has successfully completed.
  • Each of the icons may have a series of stars such as star 1311 which represents the score for that play.
  • the icons 1360 , 1362 , 1364 , and 1366 represent the “locked” plays that later become accessible as the user completes the series of plays.
  • FIGS. 25-28 illustrate a training lesson in one or more embodiments.
  • FIG. 25 is a diagram 1401 of a pre-snap formation showing details of a basic play concept shown to the user in one or more embodiments.
  • the user selects the center 1510 with the virtual pointer 1120 to snap the ball.
  • the play is executed and the user decides to which player he will throw the ball.
  • the user selects player 1512 and the user's actions are monitored and scored.
  • the user is presented with his score 1526 as well as star icons 1524 indicating performance. The user may choose between icons 1520 and 1522 with the virtual pointer 1120 to select the next action.
  • FIG. 29 is a schematic block diagram illustrating the system 1601 for implementing the virtual reality simulated sporting event.
  • the system 1601 may comprise a web-based system 1610 having a controller or processor 1611 , a computer system 1612 having another controller or processor 1613 , and website/cloud storage 1616 also having a controller or processor 1617 .
  • Both the web-based system 1610 and the computer system 1612 may be employed for creating, editing, importing, and matching plays, as well as for setting up the interaction/assessment, evaluating the interaction, viewing options, and handling feedback.
  • the web-based system 1610 and the computer system 1612 communicate to the website/cloud storage 1616 through an encoding layer such as Extensible Markup Language (“XML”) converter 1614 .
  • the website/cloud storage 1616 may be employed for storing plays, handling interaction outcomes, playlists, feedback, and analysis of player decisions, timing, location, and position.
  • the website/cloud storage 1616 may interface with several types of virtual reality systems including smartphones and tablets 1634 , native apps 1630 running on mobile viewers 1632 , or other computers 1620 .
  • a USB file transfer/mass storage device 1618 receives data from a computer system 1612 and provides the data to the single computer 1620 .
  • the single computer 1620 may interface with a single projector system 1626 in one or more embodiments.
  • the single computer 1620 may interface with a cluster of multiple computers 1622 , which, in turn, drive an Icube/CAVE projector system 1624 .
  • FIG. 30 is an exemplary flowchart showing the method 1701 of implementing the virtual reality simulated sporting event on a smartphone or tablet.
  • a play is developed on a desktop computer (step 1710 ).
  • the files are then uploaded to a cloud (step 1712 ).
  • the cloud then may download the play onto a mobile device (step 1714 ) such as a smartphone simulator virtual reality headset (step 1716 ), a tablet 3D view (step 1718 ), augmented reality (step 1720 ), or a video game view (step 1722 ).
  • FIG. 31 is an exemplary flowchart showing the method 1801 of implementing the virtual reality simulated sporting event employing a desktop or a web-based platform.
  • a desktop computer may be employed as a play creation and editing tool.
  • the file type is formatted through an encoding layer such as XML, and is saved as a “*.play” file. Once created, the file can be sent to the XML converter.
  • a user or coach may use the web-based version of the editing and play creation tool. The web-based version will also send the file to the XML converter.
  • the desktop (step 1810 ) and the web-based platform (step 1812 ) interact with the XML converter (step 1814 ).
  • the XML converter transfers data to the website (step 1816 ) having the cloud-based play storage (step 1818 ).
  • the play is then able to be stored on the website.
  • This website serves as the cloud based storage facility to host, manage, and categorize the play files.
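Because the disclosure specifies only that play files pass through an XML encoding layer and are saved as "*.play" files, the element and attribute names in the sketch below are assumptions. It shows one plausible round trip: serializing a play for cloud storage and parsing it back as a mobile viewer's app might:

```python
import xml.etree.ElementTree as ET

def play_to_xml(name, ball_assignment, players):
    """Serialize a play to a hypothetical '*.play' XML layout."""
    root = ET.Element("play", name=name, ballAssignment=ball_assignment)
    for pname, x, y in players:
        ET.SubElement(root, "player", name=pname, x=str(x), y=str(y))
    return ET.tostring(root, encoding="unicode")

def play_from_xml(text):
    """Parse the same layout back, recovering the play name and the
    player assigned to receive the ball."""
    root = ET.fromstring(text)
    return root.get("name"), root.get("ballAssignment")

doc = play_to_xml("Curl Right", "WR1", [("WR1", 10, 0), ("TE", -5, 0)])
print(play_from_xml(doc))  # ('Curl Right', 'WR1')
```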
  • the plays are downloaded to a mobile viewer (step 1820 ) where the user interacts with the simulated play (step 1822 ).
  • the website is integrated with a mobile app that automatically updates when new play files are added to the cloud based storage in the website.
  • the mobile viewer employing an app, interprets the play file.
  • the user then can experience the play, and be given the result of their actions within the play. This data is then sent back to the app/website.
  • Data is captured from the user interactions (step 1824 ) and is stored (step 1826 ). Once the data is captured, the system displays it on the app or website so the athlete can monitor progress, learn about his performance, and review his standing among other members of his age group. The data is accessed by the end user, and the scores and progress are tracked (step 1828 ). Data capture is the most important aspect in one or more embodiments. This data can then be used to challenge other users, to invite other users to join the same simulation, and to track and monitor a user's progress throughout his lifetime.
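The capture-store-track loop (steps 1824-1828) might be sketched as below, with an in-memory structure standing in for the cloud storage; the class and method names are assumptions for illustration:

```python
from datetime import date

class ScoreStore:
    """Minimal sketch of cloud-side score capture: scores are stored per
    user and per play so a dashboard can track progress (or regression)
    over time."""
    def __init__(self):
        self.records = []   # (user, play_id, score, when)

    def capture(self, user, play_id, score, when):
        self.records.append((user, play_id, score, when))

    def progress(self, user, play_id):
        """Return one user's score history for one play, oldest first."""
        hist = [(w, s) for (u, p, s, w) in self.records
                if u == user and p == play_id]
        return [s for _, s in sorted(hist)]

store = ScoreStore()
store.capture("qb1", "curl-right", 60, date(2015, 3, 1))
store.capture("qb1", "curl-right", 85, date(2015, 3, 8))
print(store.progress("qb1", "curl-right"))  # [60, 85]
```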
  • FIG. 32 is an exemplary flowchart showing the method 1901 of implementing the virtual reality simulated sporting event on a larger system. Plays are created on a desktop (step 1910 ) and files are sent to an internal network (step 1912 ), a USB mass storage device (step 1914 ), or to the cloud (step 1916 ). The data is then downloaded to a program on a local computer (step 1918 ) and is then forwarded to TV based systems (step 1920 ), projector based systems (step 1922 ), or large immersive displays integrated with motion capture (step 1924 ). Examples of such large immersive displays include Icube/CAVE environments (step 1926 ), Idome (step 1928 ), Icurve (step 1930 ), or mobile Icubes (step 1932 ).
  • FIG. 33 shows an embodiment of a mobile device 2010 .
  • the mobile device has a processor 2032 which controls the mobile device 2010 .
  • the various devices in the mobile device 2010 may be coupled by one or more communication buses or signal lines.
  • the processor 2032 may be a general purpose computing device such as a controller or microprocessor for example.
  • the processor 2032 may be a special purpose computing device such as an Application Specific Integrated Circuit (“ASIC”), a Digital Signal Processor (“DSP”), or a Field Programmable Gate Array (“FPGA”).
  • the mobile device 2010 has a memory 2028 which communicates with the processor 2032 .
  • the memory 2028 may have one or more applications such as the Virtual Reality (“VR”) or Augmented Reality (“AR”) application 2030 .
  • the memory 2028 may reside in a computer or machine readable non-transitory medium 2026 storing instructions which, when executed, cause a data processing system or processor 2032 to perform the operations described herein.
  • the mobile device 2010 has a set of user input devices 2024 coupled to the processor 2032 , such as a touch screen 2012 , one or more buttons 2014 , a microphone 2016 , and other devices 2018 such as keypads, touch pads, pointing devices, accelerometers, gyroscopes, magnetometers, vibration motors for haptic feedback, or other user input devices coupled to the processor 2032 , as well as other input devices such as USB ports, Bluetooth modules, WIFI modules, infrared ports, pointer devices, or thumb wheel devices.
  • the touch screen 2012 and a touch screen controller may detect contact, break, or movement using touch screen technologies such as infrared, resistive, capacitive, surface acoustic wave technologies, as well as proximity sensor arrays for determining points of contact with the touch screen 2012 .
  • microphones for accepting voice commands, a rear-facing or front-facing camera for recognizing facial expressions or actions of the user, accelerometers and magnetometers for detecting motions of the device, and annunciating speakers for tone or sound generation are contemplated in one or more embodiments.
  • the mobile device 2010 may also have a camera 2020 , depth camera, positioning sensors 2021 , and a power source 2022 .
  • the positioning sensors 2021 may include GPS sensors or proximity sensors for example.
  • the power source 2022 may be a battery such as a rechargeable or non-rechargeable nickel metal hydride or lithium battery for example.
  • the processor 2032 may be coupled to an antenna system 2042 configured to transmit or receive voice, digital signals, and media signals.
  • the mobile device 2010 may also have output devices 2034 coupled to the processor 2032 .
  • the output devices 2034 may include a display 2036 , one or more speakers 2038 , vibration motors for haptic feedback, and other output devices 2040 .
  • the display 2036 may be an LCD display device, or OLED display device.
  • the mobile device may be hand-held or head-mounted.
  • aspects of the subject technology may generate a baseball related simulation that may be beneficial for users (as batters) to practice hitting in preparation of competition against real-life pitchers.
  • aspects of the embodiments described below may digitize the image of a real-life baseball pitcher and display the image as a digital avatar within a baseball replicated environment.
  • the act of hitting a pitched baseball is often regarded as perhaps the single most difficult act in all of competitive team sports, and hitting a professionally pitched baseball is more challenging still.
  • the speed of a pitched baseball can often reach somewhere in the range of 90 miles per hour (MPH) to over 100 MPH for today's pitchers.
  • the distance from the pitcher's plate (also known as a pitcher's rubber) to home plate is 60 feet and 6 inches by Major League Baseball (MLB) rules as well as for most baseball levels from high school and up.
  • the typical time for a pitch to reach the plate once it leaves the pitcher's hand is approximately 400 to 500 milliseconds.
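The 400-500 millisecond figure follows directly from the 60 feet 6 inch distance and the stated pitch speeds. A simple constant-velocity estimate (ignoring drag, and ignoring that the ball is actually released a few feet in front of the rubber) reproduces it:

```python
def flight_time_ms(mph: float, distance_ft: float = 60.5) -> float:
    """Approximate pitch flight time in milliseconds, assuming the ball
    travels the full rubber-to-plate distance at constant speed."""
    fps = mph * 5280 / 3600          # miles per hour -> feet per second
    return 1000 * distance_ft / fps

print(flight_time_ms(100))  # ~412.5 ms
print(flight_time_ms(90))   # ~458.3 ms
```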
  • the human eye has been found to recognize the pitched baseball at only a few points along its trajectory from the pitcher's hand to the point of contact near home plate.
  • the common approach for batters is to practice against live pitching or to use a mechanical pitching machine to practice form and timing.
  • this approach is limited by the fact that every pitcher's delivery has its own subtle characteristics that heretofore are not replicated by conventional tools and practices.
  • one pitcher may release a pitch using what is known as a true overhand delivery (from a “12 o'clock to 6 o'clock” release), while some may throw from a three-quarters slot, some from a side arm release, and a few from what is known as a submariner's release where the arm lowers down to below the waist as the arm is swung upward and forward.
  • Some video games can generally replicate a general delivery type as described above and may use a generic pitcher avatar to “simulate” a certain pitcher for a certain team, however an actual real-life pitch and delivery may vary significantly from the video game generated delivery of said pitcher.
  • although some video games include a relative speed of a pitch and calculate whether a user's action (which may be tracked by a game controller) resulted in contact, users often experience a substantial lag (for example, from tracking of said game controller) in the triggering action as well as an unnatural perspective because the pitcher often looks larger than life on a monitor.
  • the release point between pitchers may remain static and becomes easy to predict and is not reflective of the pitcher's real-life release point. So while a user may become proficient at hitting a pitch from a static release point and may physically compensate for the video game system's lag, such practice does not translate into successful use in a real-life situation because the timing and release point recognition is very different.
  • Even amongst pitchers using a similar delivery type (overhead, three-quarters, etc.), pitchers have varying physical attributes (for example, height, arm length, etc.) and delivery mechanics that vary the release point that batters see from pitcher to pitcher. Recognizing and being prepared for delivery from a particular pitcher's release point is a strong tool for a batter's preparation against an opponent. This approach eliminates one of the decision making factors described above that need to be made within the split seconds available during a pitch. However, there remains the daunting task of actually recognizing a pitch type, timing and the trajectory of the pitched ball.
  • aspects of the subject technology provide a first-person perspective of pitches that simulate the actual real-life delivery and trajectory of pitches from real-life pitchers.
  • the user (sometimes referred to generally as the “batter”) may watch a digitized avatar of a real-life pitcher perform their own pitching sequence with a simulated baseball delivered from the pitcher's tracked release point that leaves the pitcher's hand with the spin pattern of a selected pitch type from the pitcher's known pitch types.
  • pitch data is used to simulate the flight path for a pitch type at the spin rate (and spin axis) associated with the pitcher's pitch and within the known range of speed, velocity and acceleration for a pitch type from the selected pitcher.
  • the system may control whether a pitch will enter the strike zone or pass outside the strike zone so the batter can recognize with practice whether some pitches released at various release points for the pitcher will be a strike or ball.
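The pitch-flight simulation described above can be sketched with a constant-acceleration model, in the spirit of the nine-parameter fits (initial position, velocity, and acceleration per axis) used by common pitch-tracking feeds. The parameter names and example values here are illustrative assumptions, not the system's actual data schema.

```python
from dataclasses import dataclass

@dataclass
class PitchParams:
    # Nine-parameter fit at release (feet, feet/s, feet/s^2);
    # the y axis points from the mound toward home plate.
    x0: float; y0: float; z0: float
    vx: float; vy: float; vz: float
    ax: float; ay: float; az: float

def ball_position(p: PitchParams, t: float):
    """Ball position t seconds after release under constant acceleration,
    which bundles gravity plus the average spin-induced (Magnus) force."""
    return (p.x0 + p.vx * t + 0.5 * p.ax * t * t,
            p.y0 + p.vy * t + 0.5 * p.ay * t * t,
            p.z0 + p.vz * t + 0.5 * p.az * t * t)

# Illustrative fastball released about 55 ft from home plate.
fastball = PitchParams(x0=-1.5, y0=55.0, z0=6.0,
                       vx=4.0, vy=-135.0, vz=-5.0,
                       ax=-8.0, ay=25.0, az=-18.0)
position_at_02s = ball_position(fastball, 0.2)
```

Sampling this function at regular intervals would also yield a trail of ball positions along the trajectory.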
  • a virtual reality projection system 2100 (sometimes referred to generally as the “system”) is shown both in illustration and block diagram embodiments.
  • the system 2100 generates a three dimensional, panoramic first person view of a pitcher 2150 and the surrounding environment.
  • the surrounding environment may include a home plate, batters' boxes, the pitching mound, base lines, bases, dirt and grass infield parts, an outfield, an outfield wall, stadium seating, and any other elements one may find for a stadium setting.
  • Some embodiments have images of professional (or otherwise) stadiums that can replicate the scene a player is about to experience. In use, the batter may look around and experience the same visual sensation one would perceive at the actual stadium.
  • the pitcher 2150 may be a digitized avatar of a real-life pitcher with movements created by replicating real-life movements of said pitcher from stored video of past pitching performances.
  • the pitcher 2150 may also be a digitized avatar of a real-life pitcher with movements created by animating the body joints of said avatar through techniques known as key-framing or motion capture.
  • the system 2100 may thus enhance the practice experience by letting the batter see the pitcher in the same environment in which the batter will perform. The processes for simulating the pitcher's appearance in a selected environment are described in more detail further below.
  • the system 2100 generally includes a projection system 2130 coupled to a processor 2160 , and a graphics engine module 2170 .
  • the graphics engine module 2170 may be dedicated hardware, software, or some combination thereof and is generally configured to generate the virtual reality graphics simulating a selected pitcher's delivery including delivery from a calculated release point, simulating the trajectory and spin of a thrown pitch, generating the virtual reality surrounding environment of both the digitized pitcher 2150 and the stadium, any pop-up elements, and any simulated hits of a pitch by the user.
  • the processor 2160 may be similar to any of the processors described in FIGS. 1-33 above and/or may be a dedicated graphics processor.
  • Some embodiments also include a memory bank 2180 for temporary storage and retrieval of data being processed and a database or memory storage 2190 . Some embodiments may also include a camera or set of cameras 2140 connected to the processor 2160 and/or other elements shown. Any data related to a pitcher may be stored and accessed from the database 2190 .
  • the database 2190 may store data associated with a plurality of pitchers. Information retrieved from the database 2190 and used by the system 2100 includes for example initial position (ball release point), initial velocity, initial acceleration, spin rate and other data tracked from real-life pitchers' previous games.
  • the database 2190 may be updated manually or automatically as more pitching data become available, for example as new baseball games are carried out in the season.
  • the pitch data may be streamed via a streaming protocol as a means of the update, for example, the HMD user may choose to stream new pitch practice data remotely via wireless network from a network server or cloud.
  • a user device 2105 may be coupled to the processor 2160 and may vary from embodiment to embodiment.
  • the user device 2105 may be a CAVE type projection room with the projection system 2130 projecting generated virtual reality scene segments onto four walls (front, left, right and floor) 2120 to generate a panoramic virtual reality scene 2110 .
  • the user may wear a pair of stereoscopic glasses 2102 that turn the images on the surrounding walls 2120 into a stereoscopic view.
  • One or more walls may be removed to form a more portable system, for example a system with only the front wall and floor.
  • Cameras 2140 may track the position of the user (and the user's eyes as described in some processes below) in the CAVE as well as any accessories carried by the user (including but not limited to a baseball bat, an interactive controller).
  • once a pitch is thrown, some embodiments will provide a trail 2111 of the pitch's flight path (for example as a series of balls 2101 placed along the pitch's trajectory), with the ball's seam position at each interval along the baseball's flight path shown from the batter's perspective so that the batter may learn to identify one pitch type from another based on the pitch trajectory and the visible rotation of the seam pattern.
  • Some embodiments will provide a mechanism for detecting if the pitch was hit with the tracked baseball bat 2104 .
  • in other embodiments, such as those shown in FIGS. 36-38 , the user device 2105 may be a head mounted display (HMD) unit.
  • the batter may use gaze directed crosshairs to interact with pop-up elements such as a quiz question asking about the type of pitch, location of the pitch on strike zone plane, ball or strike ( FIG. 38 ).
  • the batter's bat swing may also be tracked in relation to the virtual environment and checked to determine whether the pitch was hit.
  • Some embodiments include a sound generation module 2195 that generates an impact sound dependent on the quality of the hit (for example, foul ball, hard/weak ground ball, line drive, pop fly, or home run).
  • a pitcher's profile may be selected from the database 2190 by the user so that the selected pitcher's digitized avatar 2150 is displayed in the virtual reality scene 2110 .
  • the selected pitcher's profile includes video footage of the pitcher's delivery and general body movements that are part of the delivery.
  • FIG. 36 shows an enlarged view of a selected pitcher 2150 in mid-delivery during a wind-up type sequence. Some simulations will show pitchers delivering from a stretch position. As will be appreciated, every pitcher has his own arm/hand positioning as well as step placement during a delivery which affects the timing of delivery. Some pitchers go so far as to intentionally include hitches in their delivery to throw off a batter's timing.
  • the user may also focus on practicing against a specific type of pitch, by filtering out all other types of pitches, and working on the swing timing and/or swing position.
  • the user may also focus on practicing against a specific range for the pitch speed, by filtering out all other ranges of pitch speed, and working on the swing timing.
  • the user may also focus on practicing against a specific opposing pitcher by loading previous real-life at-bats data of such pitcher.
  • the pitcher's profile data also includes for example, each pitch type (including but not limited to two seam fastball, four seam fastball, change of speed, curveball, sinker, slider, split finger fastball, cut fastball, and knuckleball) that the pitcher has thrown and has been recorded along with a range of speed for each pitch type, the release point of each pitch type, trajectory paths, strike zone pitch distribution, and spin rates and spin patterns for each pitch type.
  • the pitch type may be based on the initial grip of the baseball and the manner of hand movement used when releasing the grip (for example, degrees of supination, pronation, etc.).
  • a user may input into the system a setting which repeats the same pitch type over different parts of the strike zone or according to the selected pitcher's statistical preferences, or the user may ask for random pitch types thrown so that he can learn to recognize and distinguish one pitch type from another.
  • as shown in FIGS. 37 and 38 , other embodiments may provide practice against a selected pitcher in determining whether a simulated pitch will be a strike or a ball (outside the strike zone).
  • the system 2100 is able to use the selected pitcher's tracked data to accurately replicate known release points and simulate the same flight paths of pitches.
  • a change in a pitcher's release point may be the difference in whether a same pitch type is either a strike or a ball.
  • a pop-up strike zone field 2115 may be displayed ( FIG. 37 ).
  • Some embodiments may provide a pause in the pitch's flight to present the user a displayed query graphic 2125 which may ask for example, whether the pitch is a strike or a ball.
  • a timer 2135 may be displayed requiring the user to select a choice before the end of the timer. As shown, it will be appreciated that the timer may reflect a real-life time available to decide an aspect of the pitch. While the example is shown with respect to the decision of a ball or strike, it will be understood that other aspects may be presented in the query graphic 2125 , such as asking for the pitch type, the estimated pitch speed, or, in the case of a strike, which of the 9 (3 by 3) strike zone regions the pitch went through.
  • the digitized avatar 2150 for a pitcher may be replicated from actual game footage provided from any number of known third party sources.
  • footage data used may be organized into data entries by pitch type so that the release point for each pitch type a pitcher uses is charted and used in the virtual reality setting rather than just showing for example an average release point for all the pitcher's pitches.
  • the release point for each pitch type may be further organized by strike or ball results so that a batter may practice distinguishing from each for the same pitch type.
  • Each video clip of a pitch thrown by the selected pitcher may be annotated to mark the player's height, the pitching mound apex (adjacent the pitcher's plate), and the height of the baseball from the mound apex level at the point of release for recorded pitch.
  • each pitcher has several video clips per pitch type in the database, with variations in the height of the ball release point recorded depending on the different pitches, as represented by avatars 2150 a , 2150 b , and 2150 c each showing the baseball being released at a different height.
  • the processor 2160 and graphics engine module 2170 may adjust the virtual reality graphics to replicate the release point 2155 based on the annotated data.
  • the system 2100 provides an enhanced realistic experience by simulating a user's environmental experience as close to the real-life experience of batting against a selected pitcher as possible.
  • the virtual reality scene 2110 may be displayed so that the depth perception and scale of the digitized pitcher's avatar 2150 appears natural.
  • each pitch video may be pre-processed to turn all background pixels to full transparency, but the pitcher's image and baseball's image may be left as-is. This is sometimes known as rotoscoping.
  • Each pitch video is annotated to mark the ball release point, pitcher's height, and locations of pitcher mound apex where the pitcher's pivot foot is placed.
  • the ball release point is acquired from the ball pitch flight data, and typically is 50-55 feet away from home plate. Based on the pitch type and also the height of the ball release point for this particular pitch, the best matched pitch video (rotoscoped and annotated) is selected from this specific pitcher's pool of video clips, such that the ball release point from the video is the closest to the actual ball release point for this particular pitch.
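The clip-selection step above amounts to a nearest-match query over the pitcher's annotated clip pool. A minimal sketch, with a hypothetical clip record layout:

```python
def best_matched_clip(clips, pitch_type, target_release_height):
    """Return the clip of the requested pitch type whose annotated release
    height is closest to the simulated pitch's actual release height."""
    candidates = [c for c in clips if c["pitch_type"] == pitch_type]
    if not candidates:
        return None
    return min(candidates,
               key=lambda c: abs(c["release_height"] - target_release_height))

# Illustrative pool; heights are in feet above the mound apex.
clips = [
    {"id": "2150a", "pitch_type": "fastball", "release_height": 5.6},
    {"id": "2150b", "pitch_type": "fastball", "release_height": 6.1},
    {"id": "2150c", "pitch_type": "curveball", "release_height": 6.3},
]
chosen = best_matched_clip(clips, "fastball", 6.0)
```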
  • This pitcher's avatar and associated video may be displayed on a video plane (for example in the form of a rectangular plane geometry), always facing the batter (user “U” as shown in FIG. 41 ), and the video plane is positioned so that the actual ball release point 2155 lies within the video plane.
  • the eye position of the batter may be acquired ( 2210 ) from the motion tracking system which tracks the batter's stereoscopic glasses (for example glasses 2102 of FIGS. 34 and 35 ).
  • the position of batter's eyes is updated ( 2220 ) in real-time.
  • This pitch video plane may be anchored in a way that the pitcher's pivot foot is perceived to be fixated right at the apex of the pitching mound (because of line of sight) only from the perspective of the batter as shown in FIG. 41 .
  • the pitch video plane may be constantly adjusted in real-time to maintain the aforementioned line of sight, so that the pitcher's pivot foot appears to be exactly on top of the pitch's mound apex (essentially on top of pitcher's plate).
  • the video pitcher may be constantly scaled and translated so that from the batter's perspective, the pitch release point 2155 ′ shown on the video plane matches exactly the release point 2155 of the actual pitch being simulated (because of line of sight).
  • the release point 2155 ′ that appears in the video plane may not appear to be in the correct point of release in a natural environment as the perspective is off, because the projection surface may be in actuality closer than the 60 feet 6 inches a real-life pitcher would be.
  • the height and pivot foot location of the perceived pitcher as well as the pitch release point 2155 ′ are calculated in real-time based on the annotation data in the pitch video clip.
  • the pitcher video plane is then scaled ( 2230 ) in a way that the ball is perceived to be released exactly from the actual ball release point 2155 , while keeping pitcher's pivot foot anchored on top of pitcher's plate, as shown in FIG. 42 .
  • the video pitcher is perceived as if he is standing on the pitcher's mound at 60 feet 6 inches away from home plate.
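The anchoring-and-scaling step can be reduced, in a side-view sketch, to similar triangles along the batter's sight line: find where the eye-to-release ray crosses the (closer) video plane, then scale the clip so its annotated release height lands on that crossing while the pivot foot stays anchored. All names and numbers here are illustrative.

```python
def plane_scale(eye, release_actual, anchor_height, plane_dist,
                clip_release_height):
    """Scale factor for the pitcher video plane in a 2-D side view.

    eye, release_actual: (horizontal distance, height) pairs.
    anchor_height: height of the pivot-foot anchor on the plane.
    plane_dist: horizontal distance from the eye to the video plane,
        which may be nearer than the real 60 ft 6 in.
    clip_release_height: annotated release height in the clip,
        measured above the pivot foot."""
    # Height where the eye-to-release sight line crosses the plane.
    t = (plane_dist - eye[0]) / (release_actual[0] - eye[0])
    height_on_plane = eye[1] + t * (release_actual[1] - eye[1])
    # Scale the clip so its release point lies on the sight line,
    # keeping the pivot foot fixed at the anchor height.
    return (height_on_plane - anchor_height) / clip_release_height

# Eye 5.5 ft up at the plate; actual release 6.0 ft up, 55 ft away;
# the video plane sits only 45 ft away with its anchor 0.83 ft up.
scale = plane_scale((0.0, 5.5), (55.0, 6.0), 0.83, 45.0, 5.6)
```

Recomputing this scale every frame as the tracked eye position changes gives the real-time behavior described above.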
  • the exemplary process 2200 for matching perspective of the user also includes adjusting the stadium and the baseball field.
  • the stadium including the field may be captured as a 360 degrees panorama photo or video and applied to a sphere (represented by the arc 2165 ′), or may be geometrically modeled, or may be a hybrid of the two, for example the field is geometrically modeled and the stadium is captured as a 360 degrees panorama photo.
  • the stadium including the field may be proportionally scaled ( 2240 ) in a way that the pitcher appears to be his correct relative height. For example, if the perceived pitcher is scaled to be slightly taller than his actual height (in order to match the actual pitch release point), the stadium may be scaled slightly larger by the same proportion towards a reference point 2166 on the arc 2165 ′.
  • the scaled stadium is represented by the arc 2165 which is scaled up from arc 2165 ′.
  • the perceived pitcher thus does not appear taller than he actually is.
  • Each step from ( 2210 ) to ( 2240 ) may be repeated in real-time for consistency with movement of the user's eyes.
  • the pitcher's avatar is being anchored and scaled in real-time based on batter's eye position, so from the batter's perspective, the pitcher is perceived to be on the pitcher's mound pitching the ball, and the ball release point from the video pitcher exactly matches the actual pitch release point from the pitch database throughout the entire pitch, given that the batter stays within a reasonable range from the home plate.
  • the stadium is being scaled in real-time based on batter's eye position, so from the batter's point of view, the pitcher is perceived to be in accordance to his actual body height relative to the size of the stadium.
  • the exemplary process 2200 may need to handle a special case which is caused by the lack of variations in the height of pitch release points (such variation is depicted in FIG. 40 ). Hypothetically if only one pitch release point at a specific height is available from a particular pitcher's video database, assuming it happens to be a relatively low release point (for example one similar to 2150 a ), when a particular pitch performed by this particular pitcher with a relatively high release point is being simulated, this may cause the process 2200 to scale up the video pitcher so much that he becomes out of proportion. To prevent this issue, the exemplary process 2200 is designed to limit the scaling of the video pitcher (and also the corresponding scaling of the stadium) in a way that the overall virtual reality scene looks natural.
  • the exemplary process 2200 may alter the actual pitch release point 2155 for a particular pitch in order to match the release point 2155 ′ from the scaled video pitcher that is already scaled at the scaling limit.
  • a drawback is that the system may not be able to utilize the pitch data in its 100% original form due to said shifting of the pitch release height; however, an option is added whereby the batter may filter out pitches with such shifts, to be able to utilize 100% original pitch data.
  • the system may also use computer generated (CG) pitcher avatars that are animated (also known as key-framed) either manually or using motion capture data.
  • a CG avatar is geometrically modeled in three dimensional space with articulated bones and body joints.
  • Such CG avatars may eliminate the need for matching the perspective of the batter.
  • CG avatar may be anchored directly on top of the pitching mound with pivot foot placed on pitcher's plate.
  • the CG avatar may be scaled so that the pitch release point from the animated pitch sequence exactly matches the release point from the actual ball release point 2155 .
  • the pitch motion of the CG avatar may also be slightly altered using an algorithm known as Inverse Kinematics (IK), where the joint angles of the CG avatar's arm and/or body are mathematically computed to drive an end-effector (in this case the pitcher's hand) to any feasible location within the three dimensional space; specifically, in this case, to make the CG avatar's hand reach exactly the actual pitch release point 2155 , without the need to alter the scaling of said CG avatar or the stadium.
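For intuition, a planar two-link arm admits a closed-form IK solution; the full-body IK suggested above would involve more joints and constraints, but the idea of solving joint angles so the hand reaches the release point is the same. This is a generic textbook sketch, not the system's actual solver.

```python
import math

def two_link_ik(l1, l2, target_x, target_y):
    """Analytic IK for a planar two-link arm (upper arm l1, forearm l2).
    Returns (shoulder, elbow) angles in radians placing the hand at the
    target, or None if the target is out of reach."""
    d2 = target_x ** 2 + target_y ** 2
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        return None  # target unreachable for this arm
    # Law of cosines gives the elbow bend.
    cos_elbow = (d2 - l1 ** 2 - l2 ** 2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to target minus the offset from the bend.
    shoulder = math.atan2(target_y, target_x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Reach a hypothetical release point 1.2 ft out, 0.9 ft up (arm-local).
angles = two_link_ik(1.0, 1.0, 1.2, 0.9)
```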
  • the CG avatar and the associated variations of animated pitch sequences may be very difficult to acquire, which may pose a huge challenge when deploying such a baseball virtual reality training system where a large quantity and variety of CG pitchers and animated pitch sequences are required. Furthermore, the fidelity of the CG avatars, as well as of the pitch motions, may be subpar compared to pitchers simulated using video sources.
  • Pitch flight parameters are loaded ( 2310 ) in from a database that stores pitching data associated with a selected pitcher.
  • a pitch timer may be started ( 2320 ) at the exact moment when the ball is released from the pitcher's hand, which is associated with the flight time of a virtual pitch.
  • the simulated baseball's position along the trajectory and its rotational property may be updated ( 2340 ).
  • a determination ( 2350 ) may be made whether a user's batting swing made virtual contact with the simulated pitch. If the user missed, the process determines ( 2360 ) whether the pitch (flight path) has terminated. If the user makes contact, a second timer may be started ( 2370 ) tracking the length of time a hit ball has been travelling in flight. Some embodiments may calculate the trajectory, exit velocity, exit angle and distance of a simulated hit depending on the pitch speed, pitch spin, speed of the swing, angle of swing attack, and position of the bat barrel where contact occurred. A determination ( 2380 ) may be made on whether the hit ball's flight timer has elapsed. If not, the flight path of the hit ball may be updated ( 2390 ) for position and rotation until an affirmative determination ( 2395 ) is achieved that the hit ball's flight timer has ended.
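The timer-driven branching above can be sketched as a per-frame loop; `contact_checker` stands in for the bat-ball impact detection described later and is an assumed callback, not an actual system interface.

```python
def run_pitch(pitch_flight_time, contact_checker, dt=1.0 / 330.0):
    """Advance the pitch timer frame by frame (2320/2340), returning
    ("hit", t) when the tracked swing makes contact at time t, or
    ("miss", flight_time) when the flight terminates (2360)."""
    t = 0.0
    while t <= pitch_flight_time:
        if contact_checker(t):
            return ("hit", t)  # would start the hit-ball timer (2370)
        t += dt  # next frame: update ball position and rotation
    return ("miss", pitch_flight_time)
```

For example, `run_pitch(0.4, lambda t: t >= 0.25)` reports a hit shortly after the 0.25 s mark.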
  • FIGS. 44-47 show schematics for determining whether a user makes contact with a simulated pitch.
  • a real bat 2104 may be used and retrofitted with tracking markers. Elements designated by a numeral and (′) represent the position of the element shown in shadow lines which is at a later point in time (either at the point of impact or post-miss) than the position of an element using a regular numbered call out.
  • the swing plane may be tracked using high speed tracking cameras (for example camera(s) 2140 of FIG. 50 ).
  • the bat swing is captured using a high-speed motion capture system running at 330 frames per second or higher. At each frame, the system checks for bat-ball impact.
  • the valid hitting zone on the bat sweeps over a plane, in the shape of a “trapezoid” (see FIGS. 45-47 ) while the ball travels along an approximated “line segment” towards the home plate.
  • the tracking system introduces a small but consistent bat tracking latency (which may be measured in advance). To compensate for this latency, the bat swing may be predicted based on tracking data from previous frames so that the tracked bat (used for bat-ball impact detection) may be made to swing slightly ahead of itself.
  • a different approach to compensating for the said tracking latency is for the system to apply the tracking latency to the pitch ball flight simulation so that the simulated pitch ball (used for bat-ball impact detection) travels slightly ahead of time. Because the entire ball pitch flight can be simulated in advance using the pitch parameters retrieved from the database, the second approach may be simpler to implement, and it may also be more accurate than the first approach.
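The second compensation approach amounts to evaluating the precomputed flight at a time shifted forward by the measured latency. A minimal sketch with a toy one-dimensional flight (the 9 ms latency is an illustrative figure):

```python
def ball_for_impact_detection(flight_fn, t, latency):
    """Evaluate the precomputed pitch flight slightly ahead of time so
    the impact-detection ball lines up with the delayed tracked bat."""
    return flight_fn(t + latency)

# Toy flight: distance from the plate (ft) for a 135 ft/s pitch.
flight = lambda t: 55.0 - 135.0 * t
rendered = flight(0.1)                                  # shown to the user
for_impact = ball_for_impact_detection(flight, 0.1, 0.009)
```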
  • a bat-ball collision detection algorithm #1 detects the scenario of possible impact.
  • the system calculates the minimal distance between the “trapezoid” and the “line segment”. If the minimal distance is less than the sum of the radius of the ball and the radius of the bat, it indicates a possible bat-ball impact, but requires further calculation to confirm; if the minimal distance is larger than the sum of the two, the bat does not make contact with the ball.
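Algorithm #1's broad-phase test can be sketched with a standard segment-to-segment minimal-distance computation, here simplifying the swept "trapezoid" to the bat's valid-zone segment at a single frame against the ball's path segment:

```python
def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _lerp(p, d, s): return tuple(a + s * x for a, x in zip(p, d))

def seg_seg_min_dist(p1, q1, p2, q2):
    """Minimal distance between 3-D segments p1-q1 and p2-q2
    (closest-points-of-approach computation with clamping)."""
    d1, d2, r = _sub(q1, p1), _sub(q2, p2), _sub(p1, p2)
    a, e, f = _dot(d1, d1), _dot(d2, d2), _dot(d2, r)
    b, c = _dot(d1, d2), _dot(d1, r)
    denom = a * e - b * b
    s = max(0.0, min(1.0, (b * f - c * e) / denom)) if denom else 0.0
    t = (b * s + f) / e if e else 0.0
    if t < 0.0:
        t, s = 0.0, (max(0.0, min(1.0, -c / a)) if a else 0.0)
    elif t > 1.0:
        t, s = 1.0, (max(0.0, min(1.0, (b - c) / a)) if a else 0.0)
    diff = _sub(_lerp(p1, d1, s), _lerp(p2, d2, t))
    return _dot(diff, diff) ** 0.5

def possible_impact(bat_p, bat_q, ball_p, ball_q, bat_r, ball_r):
    """Broad phase: impact is possible only if the minimal distance is
    no more than the sum of the bat and ball radii."""
    return seg_seg_min_dist(bat_p, bat_q, ball_p, ball_q) <= bat_r + ball_r
```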
  • for bat-ball collision detection algorithm #2, when a possible bat-ball impact is detected, the system may use a continuous collision detection algorithm to further confirm the impact. Using the tracked bat data from the current frame (t) ( FIG. 46 ) and the previous frame (t−Δt) ( FIG. 45 ), the system calculates the speed of the bat. The speed of the pitched ball at the current frame (t) is also calculated using the pitch parameters (as previously mentioned, the tracking latency may be applied to the timing of the ball).
  • the system may perform an iterative procedure in which it slowly advances the time to t+Δt, calculates the position of the bat and the position of the ball based on their speeds at time (t), and then calculates the minimal distance between the central axis of the bat and the center of the ball. If this minimal distance falls below the sum of the ball radius and the bat radius (at the point of intersection on the bat), the system may confirm that the bat-ball impact actually occurred at time t+Δt.
  • Such continuous collision detection algorithm (#2) may accurately confirm the bat-ball impact, however it may run slower than collision detection algorithm #1.
  • the system may be designed to run algorithm #1 by default to achieve the maximum frame rate, switching to algorithm #2 when a possible bat-ball impact needs to be confirmed.
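The narrow-phase iteration of algorithm #2 can be sketched as sub-frame time stepping; `bat_axis_at` and `ball_at` are assumed callbacks extrapolating the frame-(t) speeds, and the step count is an arbitrary choice:

```python
def point_seg_dist(pt, p, q):
    """Distance from point pt to segment p-q in 3-D."""
    d = tuple(b - a for a, b in zip(p, q))
    w = tuple(b - a for a, b in zip(p, pt))
    dd = sum(x * x for x in d)
    u = max(0.0, min(1.0, sum(x * y for x, y in zip(w, d)) / dd)) if dd else 0.0
    return sum((a + u * x - b) ** 2
               for a, x, b in zip(p, d, pt)) ** 0.5

def confirm_impact(bat_axis_at, ball_at, t0, bat_r, ball_r,
                   frame_dt=1.0 / 330.0, steps=32):
    """Continuous collision detection: advance time in small increments
    from frame t0 and return the first sub-frame time at which the bat's
    central axis comes within (bat_r + ball_r) of the ball center, or
    None if no impact occurs within the frame."""
    sub_dt = frame_dt / steps
    for i in range(steps + 1):
        t = t0 + i * sub_dt
        p, q = bat_axis_at(t)
        if point_seg_dist(ball_at(t), p, q) <= bat_r + ball_r:
            return t  # impact confirmed at this sub-frame time
    return None
```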
  • Such continuous collision detection algorithm (#2) may calculate the post-impact exit velocity, exit angle, ball trajectory and distance of a simulated hit, based on the pitch speed, pitch spin, speed of the bat swing, angle of swing attack, position of the bat barrel where contact occurred, and the overlap (vertical displacement) between the bat and the ball at contact.
  • some embodiments provide an impact sound generated by the sound module 2195 ( FIG. 50 ) when the bat-ball impact is detected.
  • Such sound may be electronically synthesized, for example the system 2100 plays recorded sound effects based on the type of impact through a speaker.
  • Such sound may be generated physically, for example a mechanical trigger is released to hit a wood block in order to generate the sound effect.
  • Such sound may be generated when a mechanical trigger on or inside the bat 2104 is released to hit the bat in order to generate the sound effect as well as causing the bat to vibrate as a form of haptic feedback.
  • the impact sound may vary based on the type of bat, including an impact sound for the wood bat and a different impact sound for a metal bat.
  • the frequency of the impact sound may vary based on the location where the bat barrel made contact with the simulated pitch, and/or the overlap (vertical displacement) between the bat and the simulated ball at contact.
  • An impact sound may be selected for a particular pitch from a pool of impact soundtracks stored in the database or memory storage 2190 , based on various properties for such particular impact, including but not limited to, pitch speed, pitch spin, speed of the bat swing, angle of swing attack, post-impact exit speed and exit angle, position of the bat barrel where contact occurred, the location where the bat barrel made contact with the simulated ball, and the overlap (vertical displacement) between the bat and the simulated ball at contact.
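Such a lookup can be sketched as a small rule-based selection; the file names, thresholds, and quality tiers below are all hypothetical placeholders for the stored soundtrack pool:

```python
def select_impact_sound(bat_type, exit_speed_mph, barrel_offset_in):
    """Pick an impact soundtrack keyed by bat material and contact
    quality (names and thresholds are illustrative)."""
    material = "wood" if bat_type == "wood" else "metal"
    if abs(barrel_offset_in) < 1.0 and exit_speed_mph > 95:
        quality = "barrel"  # squared-up contact on the sweet spot
    elif exit_speed_mph > 70:
        quality = "solid"
    else:
        quality = "weak"
    return f"{material}_{quality}.wav"

sound = select_impact_sound("wood", 100.0, 0.5)
```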
  • bat-ball impact detection may not be real-time, but very close to real-time.
  • the delay may be introduced by the bat tracking latency.
  • the delay may also be introduced by bat-ball impact collision detection algorithm #2, which executes at least 1 frame after the impact.
  • a swing plane ( FIGS. 44-46 ) from the user's bat swing may be plotted as an augmented reality overlay inside the virtual environment, along with a virtual baseball as it cuts through the strike zone plane.
  • the swing plane may be used to visualize if the batter's swing was too high, too low or at the correct height. Colors may be applied to swing planes in order to help visualize such location differences from the bat swings.
  • Bat swings and the corresponding pitches may be recorded to help the batter analyze his swing. Replay of a previous bat swing and the corresponding pitch may be activated in slow motion (frame by frame) to analyze any recorded practice.
  • disclosure employing the terms “processing,” “computing,” “determining,” “calculating,” “receiving images,” “acquiring,” “generating,” “performing” and others refer to a data processing system or other electronic device manipulating or transforming data within the device memories or controllers into other data within the system memories or registers.
  • One or more embodiments may be implemented in computer software, firmware, hardware, digital electronic circuitry, and computer program products which may be one or more modules of computer instructions encoded on a computer readable medium for execution by or to control the operation of a data processing system.
  • the computer readable medium may be a machine readable storage substrate, flash memory, hybrid types of memory, a memory device, a machine readable storage device, random access memory (“RAM”), read-only memory (“ROM”), a magnetic medium such as a hard-drive or floppy disk, an optical medium such as a CD-ROM or a DVD, or a combination thereof, for example.
  • a computer readable medium may reside in or within a single computer program product such as a CD, a hard-drive, or computer system, or may reside within different computer program products within a system or network.
  • the computer readable medium can store software programs that are executable by the processor 2032 and may include operating systems, applications, and related program code.
  • a machine readable non-transitory medium may store executable program instructions which, when executed, will cause a data processing system to perform the methods described herein. When applicable, the ordering of the various steps described herein may be changed, combined into composite steps, or separated into sub-steps to provide the features described herein.
  • Computer programs such as a program, software, software application, code, or script may be written in any computer programming language including conventional technologies, object oriented technologies, interpreted or compiled languages, and can be a module, component, or function. Computer programs may be executed in one or more processors or computer systems.

Abstract

Virtual and augmented reality sports training environments are disclosed. A user interacts with virtual players in a simulated environment of a virtual reality sporting event. In some embodiments, the user's actions and decisions are monitored by the simulated environment. The environment evaluates the user's performance, and provides quantitative scoring based on the user's decisions and timing. Coaches and other users may design customized scenarios or plays to train and test users, and the resultant scores may be reviewed by the coach. In one application, real life pitchers and their pitching data are tracked and replicated in a simulated pitching environment. A team of users may practice against a simulation of a pitcher they are about to compete against, swinging at pitches that they would see in a real game. Such environments may maximize effective practice time for users, and help develop better players with improved decision-making skills.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation of pending U.S. Non-provisional application Ser. No. 15/431,630 filed Feb. 13, 2017.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates in general to systems and methods for training athletes. More particularly, the invention is directed to virtual reality simulated sports training systems and methods.
  • 2. Description of the Related Art
  • Virtual reality environments may provide users with simulated experiences of sporting events. Such virtual reality environments may be particularly useful for sports such as American football in which players may experience many repetitions of plays while avoiding the chronic injuries that may otherwise result on real-world practice fields. However, conventional virtual reality sports simulators may not provide meaningful training experiences and feedback of the performance of a player.
  • Accordingly, a need exists to improve the training of players in a virtual reality simulated sporting environment.
  • SUMMARY OF THE INVENTION
  • In one embodiment, a virtual reality projection system is disclosed. The virtual reality projection system includes an electronic display configured to project a virtual reality baseball environment shown in a first-person perspective of a user training a batting swing in the virtual reality baseball environment; and a graphics engine module connected to the electronic display, the graphics engine module configured to: generate the virtual reality baseball environment; generate a digitized image of a user selected real-life baseball pitcher; retrieve from an electronic database, real-life pitching data of the user selected real-life baseball pitcher; display in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user; display in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and display in the virtual reality environment, a replicated release of a digital baseball from a digital hand of the digitized image from a release point replicating a pitch thrown by the user selected real-life baseball pitcher, wherein the digital hand is positioned in a three-dimensional space of the virtual reality baseball environment from a location associated with the release point when replicating the replicated release of the digital baseball.
  • In another embodiment, a machine readable non-transitory medium storing executable program instructions is disclosed which when executed cause a data processing system to perform a method comprising: generating a virtual reality baseball environment in an electronic display; generating a digitized image of a user selected real-life baseball pitcher; retrieving from an electronic database, real-life pitching data of the user selected real-life baseball pitcher; displaying in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user; displaying in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and displaying in the virtual reality environment, a replicated release of a digital baseball from the digitized image from a release point positioned relative to a digital body of the digitized image, replicating a real-life pitch thrown by the user selected real-life baseball pitcher, wherein the replicated release of the digital baseball is based on real-life pitching data captured in association with the real-life pitch, the replicated release of the digital baseball is displayed continuing in a simulated trajectory of a thrown virtual pitch, from the release point of the user selected real-life baseball pitcher toward a strike zone adjacent the user in the virtual reality baseball environment.
  • In yet another embodiment, a method of simulating a baseball pitcher's pitch is disclosed. The method comprises: generating a virtual reality baseball environment in an electronic display; generating a digitized image of a user selected real-life baseball pitcher; retrieving from an electronic database, real-life pitching data of the user selected real-life baseball pitcher; displaying in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user; displaying in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and displaying in the virtual reality environment, a replicated release of a digital baseball from the digitized image from a release point based on the real-life pitching data of the user selected real-life baseball pitcher, replicating a real-life pitch thrown by the user selected real-life baseball pitcher, the replicated release of the digital baseball is displayed continuing in a simulated trajectory of a thrown virtual pitch, from the release point of the user selected real-life baseball pitcher toward a strike zone adjacent the user in the virtual reality baseball environment.
  • These and other features and advantages of the invention will become more apparent with a description of preferred embodiments in reference to the associated drawings.
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an exemplary flowchart illustrating a method for implementing a virtual reality sports training program.
  • FIG. 2 is an exemplary flowchart illustrating the calculation of the decision and timing scores.
  • FIG. 3 is an exemplary flowchart illustrating the decision scoring for a football quarterback.
  • FIG. 4 is an exemplary flowchart illustrating the timing scoring for a football quarterback.
  • FIG. 5 is a front, perspective view of a user in an immersive virtual reality environment.
  • FIG. 6 is a side, perspective view of a user wearing a virtual reality head-mounted display showing a virtual reality environment.
  • FIG. 7 is a front, perspective view of a simulated environment of a football game and a handheld game controller for the user to interact with the game.
  • FIG. 8 is a front, perspective view of the simulated environment of a football game just before the initiation of a play.
  • FIG. 9 is a front, perspective view of the simulated environment of a football game immediately after the initiation of a play.
  • FIG. 10 is a front, perspective view of the simulated environment of a football game showing the correct decision.
  • FIG. 11 is a front, perspective view of a simulated environment of a football game showing a multiple choice question presented to the user.
  • FIG. 12 is a front, perspective view of a simulated environment of a football game showing the possible areas from which the user may select.
  • FIG. 13 is a front, perspective view of a simulated environment of a football game showing the possible areas from which the user may select in an embodiment.
  • FIG. 14 is a front, perspective view of a simulated environment of a football game showing the possible areas from which the user may select in an embodiment.
  • FIG. 15 is a front, perspective view of a simulated environment of a football game where the user is asked to read the defense.
  • FIG. 16 is a front, perspective view of the simulated environment of a football game showing possible areas of weakness from which a user may select.
  • FIG. 17 is a front, perspective view of the simulated environment of a football game showing offensive player running patterns.
  • FIG. 18 is a front, perspective view of a user selecting the area of weakness in an embodiment.
  • FIG. 19 is a front, perspective view of a user interacting with a virtual reality environment via a virtual pointer.
  • FIG. 20 is a front, perspective view of a user selecting an audio recording in an embodiment.
  • FIG. 21 is a front, perspective view of a user selecting a lesson with the virtual pointer.
  • FIG. 22 is a front, perspective view of a user selecting from multiple choices using the virtual pointer.
  • FIG. 23 is a front, perspective view of a user receiving the score of performance in an embodiment.
  • FIG. 24 is a front view of a playlist menu in one or more embodiments.
  • FIG. 25 is a front view of a football field diagram showing details of a play in one or more embodiments.
  • FIG. 26 is a front, perspective view of a simulated environment of a football game immediately before a play is executed.
  • FIG. 27 is a front, perspective view of a user selecting a football player with the virtual pointer in one or more embodiments.
  • FIG. 28 is a front, perspective view of a user receiving the score of performance in an embodiment.
  • FIG. 29 is a schematic block diagram illustrating the devices for implementing the virtual reality simulated environment of a sporting event.
  • FIG. 30 is an exemplary flowchart showing the process of implementing the virtual reality simulated environment of a sporting event on a smartphone or tablet.
  • FIG. 31 is an exemplary flowchart showing the process of implementing the virtual reality simulated environment of a sporting event on a mobile device.
  • FIG. 32 is an exemplary flowchart showing the process of implementing the virtual reality simulated environment of a sporting event on a larger system.
  • FIG. 33 is a schematic block diagram illustrating a mobile device for implementing the virtual reality simulated environment of a sporting event.
  • FIG. 34 is a stereoscopic front perspective view of a three dimensional panoramic virtual reality environment system simulating a baseball pitch thrown from a digitized avatar of a real-life pitcher, as seen from a baseball batter's perspective, according to an exemplary embodiment.
  • FIG. 35 is a stereoscopic front perspective view of the system of FIG. 34 with a user wearing tracked stereoscopic glasses and swinging a bat at a simulated pitch according to an exemplary embodiment.
  • FIG. 36 is an enlarged view of a digitized avatar of a real-life pitcher within a three dimensional panoramic virtual reality environment system simulating a baseball pitching sequence according to an exemplary embodiment.
  • FIG. 37 is a perspective view displayed inside a head-mounted display, which shows a digitally generated strike zone location box from a batter's perspective displayed at the end of the pitching sequence of FIG. 36, according to an exemplary embodiment.
  • FIG. 38 is a perspective view displayed inside a head-mounted display, which shows a strike or ball decision graphic with countdown timer and the gaze-controlled crosshair according to an exemplary embodiment.
  • FIG. 39 is an illustration of a digitized real-life pitcher performing a pitching sequence and reference points determined in generating a release point in a virtual reality environment according to an exemplary embodiment.
  • FIG. 40 is an illustration of a digitized real-life pitcher performing several pitch sequences with different heights in the ball release points, and reference points determined for each pitcher in generating respective release points at arbitrary locations in a virtual reality environment according to an exemplary embodiment.
  • FIG. 41 is a side schematic view illustrating the digitized pitcher avatar's release point and pitch release relative to a position of a user batter before alignment, according to an exemplary embodiment.
  • FIG. 42 is a side schematic view illustrating the digitized pitcher avatar's release point and pitch release relative to a position of a user batter after alignment, according to an exemplary embodiment.
  • FIG. 43 is a side schematic view illustrating generating a field background and stadium environment relative to a position of the digitized pitcher's avatar and the user batter according to an exemplary embodiment.
  • FIG. 44 is a side schematic view illustrating the tracking of a user's bat swing to determine if a simulated pitch was hit in the system of FIG. 35.
  • FIG. 45 shows a top view schematic and a side view schematic of a tracked bat swing to determine if the bat reaches a point of contact of the simulated pitch in a plane that intersects a simulated flight path of the simulated pitched baseball.
  • FIG. 46 shows a top view schematic and a side view schematic of a tracked bat swing that was determined to have hit the simulated pitched baseball.
  • FIG. 47 is a schematic view illustrating two algorithms for determining whether a tracked swing of a baseball makes contact with a simulated baseball pitched in the system of FIG. 35 according to an exemplary embodiment.
  • FIG. 48 is a flowchart of a method of matching a perspective of a batter user with the generated digitized avatar of a real-life pitcher in a three dimensional virtual reality environment according to an exemplary embodiment.
  • FIG. 49 is a flowchart of a method of simulating a flight path of a simulated pitched baseball from a digitized avatar of a real-life pitcher in a three dimensional virtual reality environment according to an exemplary embodiment.
  • FIG. 50 is a block diagram of a system for generating a three dimensional virtual reality simulation of baseball pitches from digitized avatars of real-life baseball pitchers according to an exemplary embodiment.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The following preferred embodiments are directed to virtual reality sports training systems and methods. Virtual reality environments provide users with computer-generated virtual objects that create the illusion that the users are physically present in the virtual reality environment. Users typically interact with the virtual reality environment through devices such as headset goggles, glasses, mobile devices having displays, or augmented reality headgear, or through a cave automatic virtual environment (CAVE) immersive virtual reality environment in which projectors project images onto the walls, ceiling, and floor of a cube-shaped room.
  • In one or more embodiments, a sports training simulated environment is contemplated. While in a virtual reality environment, a user views a simulated sporting event. In an embodiment, a user acting as a football quarterback in the virtual reality environment may see a pre-snap formation of computer-generated defensive and offensive football players. In an embodiment, the virtual football is snapped, the play is initiated, and the offensive and defensive players move accordingly. The user sees several of his virtual teammates cross the field, and the user must decide among his teammates to whom he should throw the ball. The user makes his selection, and the sports training simulated environment scores the user's decisions. Scores may be based on timing (i.e., how quickly the user decides) and/or on selection (i.e., whether the user selected the correct player). In one or more embodiments, the user may repeat the virtual play or may move on to additional plays. The scores are stored in cloud-based storage. The user's progress (or regression) over time is tracked and monitored. The user can access the scores and data via a personalized dashboard in either a webpage or a mobile application.
  • In one or more embodiments, the user may be queried on other aspects of a simulated sporting event. For example, a user acting as a virtual quarterback may be asked to read a defense and identify areas of weakness against a play. In one or more embodiments, multiple possible answers are presented to the user, and the user selects the answer he believes is correct.
  • In one or more embodiments, a comprehensive virtual reality sports training environment is contemplated. A coach may develop customized plays for his team. The players of the team then interact individually with the virtual reality simulated sporting event and have their performances scored. The scores of the players may then be interpreted and reviewed by the coach.
  • One or more embodiments provide a means for improving athlete decision-making. Athletes in a virtual environment may experience an increased number of meaningful play repetitions without the risk of injury. Such environments may maximize effective practice time for users, and help develop better players with improved decision-making skills.
  • Embodiments described herein refer to virtual reality simulated environments. However, it shall be understood that one or more embodiments may employ augmented reality environments comprising both virtual and real world objects. As used herein and as is commonly known in the art, the terms “simulated environments,” “simulated,” “virtual,” “augmented,” and “virtual reality environment” may refer to environments or video displays comprising computer-generated virtual objects or computer-generated virtual objects that are added to a display of a real scene, and may include computer-generated icons, images, virtual objects, text, or photographs. Reference made herein to a mobile device is for illustration purposes only and shall not be deemed limiting. A mobile device may be any electronic computing device, including handheld computers, smart phones, tablets, laptop computers, smart devices, GPS navigation units, or personal digital assistants for example.
  • Embodiments described herein make reference to training systems and methods for American football; however, it shall be understood that one or more embodiments may provide training systems and methods for other sports including, but not limited to, soccer, baseball, hockey, basketball, rugby, cricket, and handball for example. As used herein and as is commonly known in the art, a “play” is a plan or action for one or more players to advance the team in the sporting event. Embodiments described herein may employ head mounted displays or immersive systems as specific examples of virtual reality environments. It shall be understood that embodiments may employ head mounted displays, immersive systems, mobile devices, projection systems, or other forms of simulated environment displays.
  • FIG. 1 is an exemplary flowchart illustrating a machine-implemented method 101 for implementing a virtual reality sports training program. The process begins with a two-dimensional (“2D”) play editor program (step 110) in which a coach or another person may either choose an existing simulated play to modify (step 112) or create a completely new simulated play from scratch (step 114). The simulated play may be created by a second user such as a coach, where the simulated play defines the pre-determined locations of the simulated players and the movements of the simulated players during the play.
  • An existing play may be modified by adjusting player attributes from an existing play (step 116). A new play may be created by assigning a player's speed, animation, stance, and other attributes for a given play (step 118). The simulated sports training environment provides a simulated environment of a sporting event to a user by one or more computing devices, where the simulated environment depicting the sporting event appears to be in the immediate physical surroundings of the user. The simulated sports training environment generates simulated players in the video display of the sporting event, where each of the simulated players is located in a pre-determined location. In one or more embodiments, the simulated sports training environment initiates a simulated play for a period of time, where one or more simulated players move in response to the play. Assignment of ball movement throughout the play determines the correct answer in the “Challenge mode” (step 120).
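The play data a coach authors in the editor might be represented as a pair of simple records; this is a minimal sketch, and the class names, field names, and route representation are illustrative assumptions rather than the patent's actual data format:

```python
from dataclasses import dataclass, field

@dataclass
class SimulatedPlayer:
    """One simulated player in the 2D play editor; attribute names
    mirror those listed above (speed, stance, etc.) but are otherwise
    assumptions."""
    position: str          # e.g. "QB", "WR"
    speed: float           # movement speed assigned by the coach
    stance: str            # pre-snap stance
    route: list = field(default_factory=list)  # waypoints traversed during the play

@dataclass
class SimulatedPlay:
    """A coach-authored play: starting players plus scripted ball movement."""
    players: list
    ball_sequence: list    # ordered ball possession; defines the Challenge-mode answer
```

The final entry of `ball_sequence` would identify the intended receiver, i.e., the correct answer scored in the challenge mode described below.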
  • Multiple viewing modes are possible to present the created play (step 122). A 3D viewing mode supports video game visualization and stereoscopic viewing of the play created (step 124). In a helmet mode, the view of the play is from the individual player's helmet view point (step 126). In a free decision mode, the user may choose the navigation throughout the environment (step 128). All of these modes may be viewed in any virtual reality system (step 134).
  • The user may be faced with a challenge mode (step 130) where the user's interactions with the play are monitored and scored (step 132). In one or more embodiments, the sports training environment presents a question or query to the user, receives a response from the user, and scores the response. The ball movement is assigned to a player throughout a play to signify the player who has possession of the ball. In one or more embodiments, the simulated sports training environment may send the scored response to another user such as the coach. In one or more embodiments, the simulated sports training environment may send the scored response to another user to challenge the other user to beat one's score.
  • FIG. 2 is an exemplary flowchart illustrating the method 201 for calculating the decision and timing scores in the challenge mode. The user interacts with a play in a virtual reality environment, and a question or query is posed to the player (step 210). The player then interacts with the virtual reality system through a handheld controller, player gestures, or player movements in one or more embodiments (step 212). The interaction is monitored by the virtual reality environment, which determines whether the interaction results in a correct answer (step 214), correct timing (step 216), an incorrect answer (step 218), or incorrect timing (step 220). Each of these decision and timing results is compared to an absolute score for the correct answer (step 222), an absolute score for the correct timing (step 224), an absolute score for the incorrect answer (step 226), or an absolute score for incorrect timing (step 228). The comparison of the absolute scores for the correct answer (step 222) and the incorrect answer (step 226) determines the decision score (step 230). The comparison of the absolute scores for the correct timing (step 224) and the incorrect timing (step 228) determines the timing score (step 232). The decision score (step 230) is assigned a number of stars for providing feedback to the player: a decision score below 60% generates no stars (step 234), a score from 60% up to 70% results in one star (step 236), a score from 70% up to 90% results in two stars (step 238), and a score of 90% or greater results in three stars (step 240) in one or more embodiments. The timing score (step 232) is likewise assigned a number of stars: a timing score below 60% generates no stars (step 242), a score from 60% up to 70% results in one star (step 244), a score from 70% up to 90% results in two stars (step 246), and a score of 90% or greater results in three stars (step 248) in one or more embodiments.
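The star thresholds above reduce to a simple mapping from a percentage score to a star count; this is a sketch, and the function name and the treatment of exact boundary values are assumptions:

```python
def stars(score_pct: float) -> int:
    """Map a percentage score to a 0-3 star rating, per the
    thresholds described in the challenge-mode scoring flow."""
    if score_pct >= 90:
        return 3
    if score_pct >= 70:
        return 2
    if score_pct >= 60:
        return 1
    return 0
```

The same mapping would serve both the decision score (steps 234-240) and the timing score (steps 242-248).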
  • FIG. 3 is an exemplary flowchart illustrating a method 301 for determining the decision scoring for a football quarterback in one or more embodiments. A coach or another person creates a play (step 310), where, in this example, the wide receiver is chosen as the correct teammate for receiving the football (step 312). The wide receiver chosen by the coach as the recipient of the ball is deemed the correct answer for the player in the challenge mode. A play commences in a simulated environment, and the challenge mode is initiated as the player goes through a simulated play where the player interacts with a device and chooses the player to receive the ball (step 314). The answer or response is chosen through a player interacting with a gamepad controller, on a tablet by clicking on an individual, or by positional head tracking. The player decision is monitored, resulting in either a correct answer (step 316) or an incorrect answer (step 318). The incorrect answer results from the player selecting players other than the player selected by the coach or play creator. The score is calculated by dividing the correct answers by the total number of questions asked (step 320).
  • FIG. 4 is an exemplary flowchart illustrating a method 351 for determining the timing scoring for a football quarterback. A coach or another person creates a play (step 352), where, in this example, the ball movement is chosen (step 354). The coach determines the time that the ball is chosen to move from the quarterback to the wide receiver, which is deemed the correct timing in the challenge mode. A play commences in a virtual reality environment, and the challenge mode is initiated as the player goes through a simulated play where the player interacts with a device and chooses the player to whom the quarterback will throw the ball (step 356). The player timing is monitored, resulting in either a correct timing (step 358) or an incorrect timing (step 360). The timing is determined from the time of the snap of the football to the point in time that the quarterback throws the ball to the wide receiver. The incorrect timing is a time of release of the football that is inconsistent with the timing established by the coach. The time score is calculated by dividing the correct answers by the total number of questions (step 362).
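Both the decision score and the timing score reduce to the fraction of correct responses over total questions. The sketch below assumes a fixed tolerance window for judging release timing; the patent states only that a release inconsistent with the coach's timing is scored as incorrect, so the tolerance value and function names are assumptions:

```python
def timing_is_correct(release_time: float, coach_time: float,
                      tolerance: float = 0.25) -> bool:
    """Judge a throw's timing (seconds after the snap) against the
    coach-defined release time, within an assumed tolerance window."""
    return abs(release_time - coach_time) <= tolerance

def score_pct(correct: int, total: int) -> float:
    """Decision and timing scores alike: correct responses divided
    by the total number of questions, as a percentage."""
    return 100.0 * correct / total if total else 0.0
```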
  • FIG. 5 is a front, perspective view of a user 510 in an immersive virtual reality environment 501 which may be referred to as a CAVE. A user 510 typically stands in a cube-shaped room in which video images are projected onto walls 502, 503, and 504. The real player 510 is watching the virtual player 512 running across a virtual football field. In an embodiment, the player 510 will be acting in the role of a quarterback. Players in the CAVE can see virtual players in a virtual football field, and can move around them to get a view of the virtual player from a different perspective. Sensors in the virtual reality environment track markers attached to the user to determine the user's movements.
  • FIG. 6 is a side, perspective view 521 of a user wearing a virtual reality head-mounted display 522 showing a virtual reality environment. The head-mounted display 522 has a display positioned inches away from the eyes of the user. Employing motion and orientation sensors, the head-mounted display 522 monitors the movements of the user 510. These movements are fed back to a computer controller generating the virtual images in the display 522. The virtual reality images react to the motions of the user 510 so that the user perceives himself to be part of the virtual reality experience. In one or more embodiments, a smartphone or mobile device may be employed.
  • FIGS. 7-11 depict a sequence of virtual reality images for testing and training a quarterback to select the teammate in the best position to receive the ball. Several screen shots depict a player interacting with a virtual reality environment where the player responds to a specific play by, for example, receiving a football, responding to the defensive players, deciding one or more actions, and completing the play. FIG. 7 is a front, perspective view of a simulated environment 601 of a virtual football game and a handheld controller 610 for the user to interact with the simulated football game 601. The handheld controller 610 has a joystick 612 and four buttons 614 (labeled as “A”), 616 (“B”), 618 (“X”), and 620 (“Y”). The simulated football game image shows both the offensive and defensive teams. Player 624 is labeled as “A”, player 626 as “B”, player 628 as “X”, and player 630 as “Y.”
  • A user, acting as a quarterback, will see the simulated players move across the field, and will be required to decide when to throw a football and to select which of the simulated players 624, 626, 628, and 630 will be selected for receiving the ball in an embodiment. A user may select player 624 by pressing the button 614 (“A”), player 626 by pressing the button 616 (“B”), player 628 by pressing button 618 (“X”), and player 630 by pressing button 620 (“Y”). In one or more embodiments, real time decisions are made by the user by pressing buttons on a controller which correspond to the same icon above the head of the player.
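The button-to-player correspondence above amounts to a lookup table; the sketch below uses the reference numerals from FIG. 7, with the dictionary name and return convention being illustrative assumptions:

```python
# Mapping from controller buttons to the receivers labeled with the
# same icons above their heads (reference numerals from FIG. 7).
BUTTON_TO_PLAYER = {"A": 624, "B": 626, "X": 628, "Y": 630}

def select_receiver(button: str):
    """Return the player the user selected with a button press,
    or None for an unmapped button."""
    return BUTTON_TO_PLAYER.get(button)
```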
  • FIG. 8 shows the positions of the players just before the initiation of a play. In FIG. 9, a play is initiated and in real-time, the user must choose the player to whom he will throw the football. In FIG. 10, the user selects which player will be selected to receive the football. In one or more embodiments, virtual ovals 625, 627, 629, and 631 encircle the players 624, 626, 628, and 630. In this example, oval 627 is cross hatched or has a color different from that of the other ovals to indicate to the user that player 626 was the correct choice. If the user does not decide correctly, the user can redo the play until he makes the correct choice. The score is based on how quickly decisions are made, and the number of correct decisions compared to total testing events.
  • FIGS. 11 and 12 illustrate that the simulated environment 701 can be configured to challenge users with multiple choice questions. FIG. 11 shows a quarterback's perspective of a simulated football game before a play is initiated. The virtual reality environment shows a pop up window 710 presenting a question posed to the user. In this example, the virtual reality environment is asking the user to identify the areas of vulnerability of a coverage shell. In FIG. 12, the play is initiated and the user is presented with four areas 712, 714, 716, and 718 representing possible choices for the answer to the question posed. In one or more embodiments, the user may select the area by interacting with a hand held controller, or by making gestures or other motions.
  • FIG. 13 is a front, perspective view of a simulated environment 801 of a virtual football game showing the possible areas from which the user may select in an alternative embodiment. The user is presented with a view before the initiation of a play, and a pop-up window 810 appears and asks the user to identify the Mike linebacker (i.e., the middle or inside linebacker who may have specific duties to perform). Three frames 812, 814, and 816 appear around three players and the user is given the opportunity to choose the player he believes to be the correct player.
  • FIG. 14 is a front, perspective view of a simulated environment of a football game in an embodiment. The user is presented with a view before the initiation of a play, and a pop-up window 910 appears and asks the user to identify the defensive front. Four frames 912, 914, 916, and 918 have possible answers to the question posed. The user is given the opportunity to choose the answer he believes is correct. In one or more embodiments, the user may choose the correct answer through interacting with a game controller, or by making gestures or other movements.
  • One or more embodiments train users to read a defense. For Man and Zone Reads such as picking up Man and Cover 2 defense, embodiments may have 20 play packages such as combo, under, curl, and so forth. Embodiments enable users to practice reading Man vs. Zone defenses. Users may recognize each defense against the formation, such as against a cover 2, and may highlight areas of the field that are exposed because the defense is a cover 2 against that specific play. Players and areas may be highlighted. In one or more embodiments, a training package may consist of 20 plays each against Man and Cover 2, with a highlighted teaching point for each.
  • FIGS. 15-18 are front, perspective views of a simulated environment 901 of a football game where the user is asked to read the defense. As a first step, the user is asked to identify the defensive coverage, which is cover 2 in this example. In FIG. 16, the user is asked to identify areas of weakness or exposure against the play, represented as areas 1012, 1014, 1016, 1018, 1020, and 1022. In FIG. 17, the user is asked to create a mismatch against a zone and expose the areas. The play is represented by players 1030, 1032, and 1034 traversing the field as depicted in running patterns 1031, 1033, and 1035. In FIG. 18, the user may use his hand 1050 to assess and identify the areas of the field that have “weak points” against that coverage.
  • FIGS. 19-23 are front, perspective views of a user interacting with a virtual reality environment 1101. In one or more embodiments, the user may be wearing a head mounted display as depicted in FIG. 6, or may be wearing glasses having markers in an immersive virtual reality environment as depicted in FIG. 5. As shown in FIG. 19, the user sees an environment 1101 having a window 1110 describing the current lesson, an icon 1112 for activating audio instructions, and a window 1116 describing a virtual pointer 1120 represented here as a virtual crosshairs. In one or more embodiments, the virtual pointer may be fixed with respect to the display of the device. A user may interact with the virtual reality environment by aiming the virtual pointer toward a virtual object. The virtual pointer provides a live, real time ability to interact with a three-dimensional virtual reality environment.
  • In one or more embodiments, the virtual pointer 1120 moves across the virtual reality environment 1101 as the user moves his head. As depicted in FIG. 20, the user moves the virtual pointer 1120 over the icon 1112 to activate audio instructions for the lesson. The user may then use the virtual pointer 1120 to interact with the virtual reality environment 1101 such as by selecting the player that will receive the ball. As shown in FIG. 21, the user may then either replay the audio instructions or move to the drill by sweeping the virtual pointer 1120 over and selecting icon 1122.
  • FIG. 22 is a front, perspective view of a simulated environment 1201 illustrating that the virtual pointer 1120 may enable a user to select answers from a multiple choice test. A window 1208 may pose a question to the user, where the user selects between answers 1210 and 1212. The user moves virtual pointer 1120 over the selected answer in response to the question. In FIG. 23, the virtual reality environment generates a score for the user, and the user is able to attempt the test again or move to the next level.
  • FIG. 24 is a front view of a menu 1301 in one or more embodiments. In one or more embodiments, the user is presented with a series of plays in a playlist. The user navigates the application (“app”) by successfully completing a play which then unlocks the next play in the playlist. For example, icons 1310, 1312, 1314, 1316, 1318 and so forth, represent plays the user has successfully completed.
  • Each of the icons may have a series of stars such as star 1311 which represents the score for that play. The icons 1360, 1362, 1364, and 1366 represent the “locked” plays that later become accessible as the user completes the series of plays.
  • FIGS. 25-28 illustrate a training lesson in one or more embodiments. FIG. 25 is a diagram 1401 of a pre-snap formation showing details of a basic play concept shown to the user in one or more embodiments. In FIG. 26, the user selects the center 1510 with the virtual pointer 1120 to snap the ball. In FIG. 27, the play is executed and the user decides to which player he will throw the ball. The user selects player 1512 and the user's actions are monitored and scored. In FIG. 28, the user is presented with his score 1526 as well as star icons 1524 indicating performance. The user may choose between icons 1520 and 1522 with the virtual pointer 1120 to select the next action.
  • FIG. 29 is a schematic block diagram illustrating the system 1601 for implementing the virtual reality simulated sporting event. In one or more embodiments, the system 1601 may comprise a web-based system 1610 having a controller or processor 1611, a computer system 1612 having another controller or processor 1613, and website/cloud storage 1616 also having a controller or processor 1617. Both the web-based system 1610 and the computer system 1612 may be employed for creating, editing, importing, and matching plays, as well as for setting up the interaction/assessment, evaluating the interaction, viewing options, and handling feedback. The web-based system 1610 and the computer system 1612 communicate with the website/cloud storage 1616 through an encoding layer such as Extensible Markup Language (“XML”) converter 1614. During this process, the native software application generates a .play file in the XML language. The converter strips away the XML tags, leaving just the remaining code. The mobile viewer is designed to read the remaining code and visualize the data from the code so the virtual simulations can run on the smartphone/tablet. The website/cloud storage 1616 may be employed for storing plays, handling interaction outcomes, playlists, feedback, and analysis of player decisions, timing, location, and position. The website/cloud storage 1616 may interface with several types of virtual reality systems including smartphones and tablets 1634, native apps 1630 running on mobile viewers 1632, or other computers 1620. In one or more embodiments, a USB file transfer/mass storage device 1618 receives data from a computer system 1612 and provides the data to the single computer 1620. The single computer 1620 may interface with a single projector system 1626 in one or more embodiments. The single computer 1620 may interface with a cluster of multiple computers 1622, which, in turn, drive an Icube/CAVE projector system 1624.
  • FIG. 30 is an exemplary flowchart showing the method 1701 of implementing the virtual reality simulated sporting event on a smartphone or tablet. In one or more embodiments, a play is developed on a desktop computer (step 1710). The files are then uploaded to a cloud (step 1712). The cloud then may download the play onto a mobile device (step 1714) such as a smartphone simulator virtual reality headset (step 1716), a tablet 3D view (step 1718), augmented reality (step 1720), or a video game view (step 1722).
  • FIG. 31 is an exemplary flowchart showing the method 1801 of implementing the virtual reality simulated sporting event employing a desktop or a web-based platform. A desktop computer may be employed as a play creation and editing tool. In one or more embodiments, the file is formatted through an encoding layer such as XML, and is saved as a “*.play” file. Once created, the file can be sent to the XML converter. In one or more embodiments, a user or coach may use the web-based version of the editing and play creation tool. The web-based version will also send the file to the XML converter.
  • The desktop (step 1810) and the web-based platform (step 1812) interact with the XML converter (step 1814). The XML converter transfers data to the website (step 1816) having the cloud-based play storage (step 1818). After passing through the XML converter, the play can then be stored on the website. This website serves as the cloud-based storage facility to host, manage, and categorize the play files. The plays are downloaded to a mobile viewer (step 1820) where the user interacts with the simulated play (step 1822). The website is integrated with a mobile app that automatically updates when new play files are added to the cloud-based storage in the website. The mobile viewer, employing an app, interprets the play file. The user can then experience the play and be given the result of his actions within the play. This data is then sent back to the app/website.
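The tag-stripping step performed by the XML converter can be sketched as follows. The actual .play schema is not published, so the element names below are hypothetical; the point is only that the converter walks the XML tree and keeps the non-tag payload for the mobile viewer:

```python
import xml.etree.ElementTree as ET

def strip_play_xml(play_xml: str) -> str:
    """Strip the XML tags from a .play file, leaving only the text
    payload for the mobile viewer (element names here are hypothetical;
    the real .play schema is not published)."""
    root = ET.fromstring(play_xml)
    # itertext() walks the tree in document order, yielding only the
    # text between tags, i.e. "the remaining code without the XML tags".
    return "\n".join(t.strip() for t in root.itertext() if t.strip())

# Hypothetical .play fragment for illustration:
example = "<play><name>Slant Right</name><formation>I-Form</formation></play>"
```

A call such as `strip_play_xml(example)` would then hand the viewer just the payload lines, with all markup removed.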
  • Data is captured from the user interactions (step 1824) and is stored (step 1826). Once the data is captured, the system will display the data on the app or website so the athlete can monitor progress, learn information about his performance, and review his standing among other members of his age group. The data is accessed by the end user and the scores and progress are tracked (step 1828). Data capture is among the most important aspects in one or more embodiments. This data can then be used to challenge other users, to invite other users to join in the same simulation, and to track and monitor a user's progress throughout his lifetime.
  • FIG. 32 is an exemplary flowchart showing the method 1901 of implementing the virtual reality simulated sporting event on a larger system. Plays are created on a desktop (step 1910) and files are sent to an internal network (step 1912), a USB mass storage device (step 1914), or to the cloud (step 1916). The data is then downloaded to a program on a local computer (step 1918) and is then forwarded to TV based systems (step 1920), projector based systems (step 1922), or large immersive displays integrated with motion capture (step 1924). Examples of such large immersive displays include Icube/CAVE environments (step 1926), Idome (step 1928), Icurve (step 1930), or mobile Icubes (step 1932).
  • FIG. 33 shows an embodiment of a mobile device 2010. The mobile device has a processor 2032 which controls the mobile device 2010. The various devices in the mobile device 2010 may be coupled by one or more communication buses or signal lines. The processor 2032 may be a general purpose computing device such as a controller or microprocessor, for example. In an embodiment, the processor 2032 may be a special purpose computing device such as an Application Specific Integrated Circuit (“ASIC”), a Digital Signal Processor (“DSP”), or a Field Programmable Gate Array (“FPGA”). The mobile device 2010 has a memory 2028 which communicates with the processor 2032. The memory 2028 may have one or more applications such as the Virtual Reality (“VR”) or Augmented Reality (“AR”) application 2030. The memory 2028 may reside in a computer- or machine-readable non-transitory medium 2026 storing instructions which, when executed, cause a data processing system or processor 2032 to perform the methods described herein.
  • The mobile device 2010 has a set of user input devices 2024 coupled to the processor 2032, such as a touch screen 2012, one or more buttons 2014, a microphone 2016, and other devices 2018 such as keypads, touch pads, pointing devices, accelerometers, gyroscopes, magnetometers, vibration motors for haptic feedback, or other user input devices coupled to the processor 2032, as well as other input devices such as USB ports, Bluetooth modules, WIFI modules, infrared ports, pointer devices, or thumb wheel devices. The touch screen 2012 and a touch screen controller may detect contact, break, or movement using touch screen technologies such as infrared, resistive, capacitive, or surface acoustic wave technologies, as well as proximity sensor arrays for determining points of contact with the touch screen 2012. Reference is made herein to users interacting with mobile devices such as through displays, touch screens, buttons, or tapping of the side of the mobile devices as non-limiting examples. Other devices for a user to interact with a computing device are contemplated in one or more embodiments, including microphones for accepting voice commands; a rear-facing or front-facing camera for recognizing facial expressions or actions of the user; accelerometers, gyroscopes, magnetometers, and/or other devices for detecting motions of the device; and annunciating speakers for tone or sound generation.
  • The mobile device 2010 may also have a camera 2020, depth camera, positioning sensors 2021, and a power source 2022. The positioning sensors 2021 may include GPS sensors or proximity sensors for example. The power source 2022 may be a battery such as a rechargeable or non-rechargeable nickel metal hydride or lithium battery for example. The processor 2032 may be coupled to an antenna system 2042 configured to transmit or receive voice, digital signals, and media signals.
  • The mobile device 2010 may also have output devices 2034 coupled to the processor 2032. The output devices 2034 may include a display 2036, one or more speakers 2038, vibration motors for haptic feedback, and other output devices 2040. The display 2036 may be an LCD display device or an OLED display device. The mobile device may be hand-held or head-mounted.
  • Referring now to FIGS. 34-50 in general, in another embodiment(s), aspects of the subject technology may generate a baseball related simulation that may be beneficial for users (as batters) to practice hitting in preparation for competition against real-life pitchers. Aspects of the embodiments described below may digitize the image of a real-life baseball pitcher and display the image as a digital avatar within a baseball replicated environment. As may be appreciated by those individuals that have tried hitting a real-life baseball thrown by a pitcher, the act of hitting a pitched baseball is often regarded as perhaps being the single most difficult act in all of competitive team sports. Hitting a professionally pitched baseball is more challenging still. The speed of a pitched baseball can often reach somewhere in the range of 90 miles per hour (MPH) to over 100 MPH for today's pitchers. The distance from the pitcher's plate (also known as a pitcher's rubber) to home plate is 60 feet and 6 inches by Major League Baseball (MLB) rules as well as for most baseball levels from high school and up. The typical time for a pitch to reach the plate once it leaves the pitcher's hand is approximately 400 to 500 milliseconds. The human eye has been found to recognize the pitched baseball at only a few points along the path of trajectory from the pitcher's hand to the point of contact near home plate. If one factors in the need for the batter to recognize not only when the ball will be at the point of contact but also from where the baseball is released by the pitcher in three dimensional space, the trajectory of the baseball's flight path, and whether the baseball will cross over home plate between the upper and lower limits of the batter's strike zone, it is understood that an extreme challenge exists for the batter to decide within split seconds whether or not to swing at the pitch.
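The 400 to 500 millisecond window quoted above can be checked with simple kinematics. This sketch treats the pitch speed as constant over the full 60 feet 6 inches, ignoring drag and the fact that the ball is actually released several feet in front of the rubber, both of which shift the real number slightly:

```python
def time_to_plate_ms(speed_mph: float, distance_ft: float = 60.5) -> float:
    """Approximate flight time of a pitch in milliseconds, assuming
    constant speed over the pitcher's-plate-to-home-plate distance."""
    speed_fps = speed_mph * 5280.0 / 3600.0  # miles per hour -> feet per second
    return distance_ft / speed_fps * 1000.0

# A 90 MPH pitch takes roughly 458 ms and a 100 MPH pitch roughly 413 ms,
# consistent with the 400-500 millisecond window described above.
```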
  • The common approach for batters is to practice against live pitching or to use a mechanical pitching machine to practice form and timing. However, this approach is limited by the fact that every pitcher's delivery has its own subtle characteristics that heretofore are not replicated by conventional tools and practices. For example, one pitcher may release a pitch using what is known as a true overhand delivery (from a “12 o'clock to 6 o'clock” release), while some may throw from a three-quarters slot, some from a side arm release, and a few from what is known as a submariner's release where the arm lowers down to below the waist as the arm is swung upward and forward.
  • Some video games can generally replicate a general delivery type as described above and may use a generic pitcher avatar to “simulate” a certain pitcher for a certain team; however, an actual real-life pitch and delivery may vary significantly from the video game generated delivery of said pitcher. In addition, while some video games include a relative speed of a pitch and calculate whether a user's action (which may be tracked by a game controller) resulted in contact, users often experience a substantial lag (for example, from tracking of said game controller) in the triggering action, as well as an unnatural perspective because the pitcher often looks larger than life on a monitor. Moreover, the release point of a video game pitcher may remain static, becoming easy to predict while not reflecting the pitcher's real-life release point. So while a user may become proficient at hitting a pitch from a static release point and may physically compensate for the video game system's lag, such practice does not translate into successful use in a real-life situation because the timing and release point recognition are very different.
  • Even amongst pitchers using a similar delivery type (overhead, three-quarters, etc.), pitchers have varying physical attributes (for example, height, arm length, etc.) and delivery mechanics that vary the release point that batters see from pitcher to pitcher. Recognizing and being prepared for delivery from a particular pitcher's release point is a strong tool for a batter's preparation against an opponent. This approach eliminates one of the decision making factors described above that need to be made within the split seconds available during a pitch. However, there remains the daunting task of actually recognizing a pitch type, timing and the trajectory of the pitched ball.
  • To date, the closest one could come to practicing against a pitcher's delivery (known as “read the pitch”) was to watch still shots or video footage of a pitcher. However, it is almost impossible to capture still shots and video footage exactly from the batter's perspective; instead, they are typically captured by a photographer/cameraman at a third-person perspective offset from the batter's perspective. As a result, traditional still shots and video footage still fail to provide the batter the immersive experience of facing a real-life pitcher on the baseball field. Moreover, still shots and video do not allow a batter to simulate a swing at a pitch with certainty in the results. One can swing against a video clip and only guess as to whether or not contact was made and how successful the contact was; in other words, did contact result in a foul ball, a weak/strong ground ball, a line drive, or a fly ball? In live action, players are limited to practice swings against a pitcher's delivery while waiting far to the side of home plate (usually in a warm-up circle next to the dugout). From this perspective, the batter is limited to timing pitches but cannot see from the same first-person perspective that occurs in an actual at-bat.
  • In the embodiments disclosed below, aspects of the subject technology provide a first-person perspective of pitches that simulate the actual real-life delivery and trajectory of pitches from real-life pitchers. The user (sometimes referred to generally as the “batter”) may watch a digitized avatar of a real-life pitcher perform their own pitching sequence with a simulated baseball delivered from the pitcher's tracked release point that leaves the pitcher's hand with the spin pattern of a selected pitch type from the pitcher's known pitch types. Depending on the pitch, pitch data is used to simulate the flight path for a pitch type at the spin rate (and spin axis) associated with the pitcher's pitch and within the known range of speed, velocity and acceleration for a pitch type from the selected pitcher. The system may control whether a pitch will enter the strike zone or pass outside the strike zone so the batter can recognize with practice whether some pitches released at various release points for the pitcher will be a strike or ball.
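One common way to simulate a pitch's flight path from tracked data is a constant-acceleration model, in which the stored acceleration vector already folds in gravity plus the spin-induced (Magnus) movement. A minimal sketch follows; the sample pitch values are illustrative only (units in feet, ft/s, and ft/s²), and a real tracked pitch would carry nonzero horizontal and vertical break terms in its acceleration:

```python
def pitch_position(p0, v0, a, t):
    """Position of the ball at time t under a constant-acceleration
    model of tracked pitch data. p0, v0, a are (x, y, z) tuples:
    x horizontal, y distance toward home plate, z height."""
    return tuple(p + v * t + 0.5 * acc * t * t
                 for p, v, acc in zip(p0, v0, a))

# Hypothetical fastball: released 55 ft from the plate at 132 ft/s
# (90 MPH); for this sketch the acceleration is gravity alone.
p0 = (0.0, 55.0, 6.0)
v0 = (0.0, -132.0, 0.0)
a = (0.0, 0.0, -32.17)
```

Sampling `pitch_position` at successive times yields the trajectory points used to draw the ball's path and, in later figures, the trail of its flight.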
  • Referring now to FIGS. 34-38 and 50, a virtual reality projection system 2100 (sometimes referred to generally as the “system”) is shown both in illustration and block diagram embodiments. In an exemplary embodiment, the system 2100 generates a three dimensional, panoramic first person view of a pitcher 2150 and the surrounding environment. The surrounding environment may include a home plate, batters' boxes, the pitching mound, base lines, bases, dirt and grass infield parts, an outfield, an outfield wall, stadium seating, and any other elements one may find for a stadium setting. Some embodiments have images of professional (or otherwise) stadiums that can replicate the scene a player is about to experience. In use, the batter may look around and experience the same visual sensation one would perceive at the actual stadium. The pitcher 2150 may be a digitized avatar of a real-life pitcher with movements created by replicating real-life movements of said pitcher from stored video of past pitching performances. The pitcher 2150 may also be a digitized avatar of a real-life pitcher with movements created by animating the body joints of said avatar through techniques known as key-framing or motion capture. The system 2100 may thus enhance the practice experience by allowing the batter to see the pitcher in the same environment in which the batter will perform. The processes for simulating the pitcher's appearance in a selected environment are described in more detail further below.
  • Referring to FIG. 50 for the moment, the system 2100 generally includes a projection system 2130 coupled to a processor 2160, and a graphics engine module 2170. The graphics engine module 2170 may be dedicated hardware, software, or some combination thereof and is generally configured to generate the virtual reality graphics simulating a selected pitcher's delivery including delivery from a calculated release point, simulating the trajectory and spin of a thrown pitch, generating the virtual reality surrounding environment of both the digitized pitcher 2150 and the stadium, any pop-up elements, and any simulated hits of a pitch by the user. The processor 2160 may be similar to any of the processors described in FIGS. 1-33 above and/or may be a dedicated graphics processor. Some embodiments also include a memory bank 2180 for temporary storage and retrieval of data being processed and a database or memory storage 2190. Some embodiments may also include a camera or set of cameras 2140 connected to the processor 2160 and/or other elements shown. Any data related to a pitcher may be stored and accessed from the database 2190. The database 2190 may store data associated with a plurality of pitchers. Information retrieved from the database 2190 and used by the system 2100 includes, for example, initial position (ball release point), initial velocity, initial acceleration, spin rate, and other data tracked from real-life pitchers' previous games. The database 2190 may be updated manually or automatically as more pitching data become available, for example as new baseball games are played during the season. The pitch data may be streamed via a streaming protocol as a means of updating; for example, the HMD user may choose to stream new pitch practice data remotely via a wireless network from a network server or cloud.
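The per-pitch fields listed above suggest a record layout along the following lines. Every field name and sample value here is an assumption for illustration, not the actual schema of the database 2190:

```python
from dataclasses import dataclass

@dataclass
class PitchRecord:
    """One tracked pitch from a real-life game (field names assumed)."""
    pitcher_id: str
    pitch_type: str              # e.g. "four-seam fastball", "slider"
    release_point: tuple         # (x, y, z) in feet
    initial_velocity: tuple      # ft/s components at release
    initial_acceleration: tuple  # ft/s^2, folding in gravity and spin effects
    spin_rate_rpm: float
    spin_axis_deg: float
    result: str                  # "strike" or "ball"

# Hypothetical entry as it might be retrieved from the database:
sample = PitchRecord("pitcher_001", "four-seam fastball",
                     (0.0, 55.0, 6.1), (0.0, -132.0, 0.0),
                     (0.0, 2.0, -20.0), 2300.0, 210.0, "strike")
```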
  • A user device 2105 may be coupled to the processor 2160 and may vary from embodiment to embodiment. For instance, in the exemplary embodiment shown in FIGS. 34 and 35, the user device 2105 may be a CAVE type projection room with the projection system 2130 projecting generated virtual reality scene segments onto four walls (front, left, right and floor) 2120 to generate a panoramic virtual reality scene 2110. The user may wear a pair of stereoscopic glasses 2102 that turn the images on the surrounding walls 2120 into a stereoscopic view. One or more walls may be removed to form a more portable system, for example a system with only the front wall and floor. Cameras 2140 may track the position of the user (and the user's eyes as described in some processes below) in the CAVE as well as any accessories carried by the user (including but not limited to a baseball bat or an interactive controller). Once a pitch is thrown, some embodiments will provide a trail 2111 of the pitch's flight path (for example as a series of balls 2101 placed along the pitch's trajectory), with the ball's seam position at each interval along the baseball's flight path shown from the batter's perspective so that the batter may learn to identify one pitch type from another based on the pitch trajectory and rotation of the visible seam pattern. Some embodiments will provide a mechanism for detecting whether the pitch was hit with the tracked baseball bat 2104. While the embodiment shown and described is in the form of a CAVE, some embodiments may be adjusted to operate in a head mounted display (HMD) unit (FIGS. 36-38) so that the movement of the batter's HMD is tracked in relation to the virtual environment, and the flight path of the pitched baseball is adjusted in the field of view of the HMD. The batter may use gaze directed crosshairs to interact with pop-up elements such as a quiz question asking about the type of pitch, the location of the pitch on the strike zone plane, or ball versus strike (FIG. 38).
The batter's bat swing may also be tracked in relation to the virtual environment and checked to determine whether the pitch was hit. Some embodiments include a sound generation module 2195 that generates an impact sound dependent on the quality of the hit (for example, foul ball, hard/weak ground ball, line drive, pop fly, or home run).
  • In use, a pitcher's profile may be selected from the database 2190 by the user so that the selected pitcher's digitized avatar 2150 is displayed in the virtual reality scene 2110. The selected pitcher's profile includes video footage of the pitcher's delivery and general body movements that are part of the delivery. FIG. 36 shows an enlarged view of a selected pitcher 2150 in mid-delivery during a wind-up type sequence. Some simulations will show pitchers delivering from a stretch position. As will be appreciated, every pitcher has his own arm/hand positioning as well as step placement during a delivery which affects the timing of delivery. Some pitchers go so far as to intentionally include hitches in their delivery to throw off a batter's timing. Being able to prepare against an accurate replication of a pitcher's delivery is highly beneficial to a consistent swing against that pitcher. The user may also focus on practicing against a specific type of pitch, by filtering out all other types of pitches, and working on the swing timing and/or swing position. The user may also focus on practicing against a specific range for the pitch speed, by filtering out all other ranges of pitch speed, and working on the swing timing. The user may also focus on practicing against a specific opposing pitcher by loading previous real-life at-bats data of such pitcher.
  • The pitcher's profile data also includes for example, each pitch type (including but not limited to two seam fastball, four seam fastball, change of speed, curveball, sinker, slider, split finger fastball, cut fastball, and knuckleball) that the pitcher has thrown and has been recorded along with a range of speed for each pitch type, the release point of each pitch type, trajectory paths, strike zone pitch distribution, and spin rates and spin patterns for each pitch type. The pitch type may be based on the initial grip of the baseball and the manner of hand movement used when releasing the grip (for example, degrees of supination, pronation, etc.). Thus a user may input into the system a setting which repeats the same pitch type over different parts of the strike zone or according to the selected pitcher's statistical preferences, or the user may ask for random pitch types thrown so that he can learn to recognize and distinguish one pitch type from another.
  • Referring to FIGS. 37 and 38, other embodiments may provide practice against a selected pitcher for determining whether a simulated pitch will be a strike or a ball (outside the strike zone). The system 2100 is able to use the selected pitcher's tracked data to accurately replicate known release points and simulate the same flight paths of pitches. A change in a pitcher's release point may be the difference between whether the same pitch type is a strike or a ball. In some embodiments, a pop-up strike zone field 2115 (FIG. 37) may be displayed for the batter's reference. Some embodiments may provide a pause in the pitch's flight to present the user a displayed query graphic 2125 which may ask, for example, whether the pitch is a strike or a ball. A timer 2135 may be displayed requiring the user to select a choice before the end of the timer. As shown, it will be appreciated that the timer may reflect a real-life time available to decide an aspect of the pitch. While the example is shown with respect to the decision of a ball or strike, it will be understood that other aspects may be presented in the query graphic 2125, such as asking for the pitch type, the estimated pitch speed, or, in the case of a strike, which of the nine (3-by-3) strike zone regions the pitch passed through.
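The ball/strike and nine-zone queries imply classifying the point where the pitch crosses the front plane of the strike zone. A minimal sketch follows; the zone limits and (row, col) numbering are illustrative assumptions, since the real strike zone depends on the batter's stance:

```python
def strike_zone_cell(x_ft, z_ft, half_width=0.83, bottom=1.5, top=3.5):
    """Return the (row, col) of the 3x3 strike zone cell the pitch
    crossed, with (0, 0) at the top-left from the batter's view, or
    None for a ball. x_ft is horizontal offset from the plate's center
    line; z_ft is height. Zone limits here are illustrative defaults."""
    if abs(x_ft) > half_width or not (bottom <= z_ft <= top):
        return None  # outside the zone: a ball
    col = min(2, int((x_ft + half_width) / (2 * half_width / 3)))
    row = min(2, int((top - z_ft) / ((top - bottom) / 3)))
    return (row, col)
```

A pitch down the middle at belt height maps to the center cell (1, 1), while anything wider than the plate or below the knees returns None and would be scored a ball.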
  • Referring now to FIGS. 39 and 40, illustrations are shown of how the release point 2155 for a selected pitcher may be determined. The digitized avatar 2150 for a pitcher may be replicated from actual game footage provided from any number of known third party sources. As described earlier, footage data used may be organized into data entries by pitch type so that the release point for each pitch type a pitcher uses is charted and used in the virtual reality setting rather than just showing for example an average release point for all the pitcher's pitches. In addition, the release point for each pitch type may be further organized by strike or ball results so that a batter may practice distinguishing each for the same pitch type. Each video clip of a pitch thrown by the selected pitcher may be annotated to mark the player's height, the pitching mound apex (adjacent the pitcher's plate), and the height of the baseball from the mound apex level at the point of release for each recorded pitch. As shown in FIG. 40, each pitcher has several video clips per pitch type in the database, with variations in the height of the ball release point recorded depending on the different pitches, as represented by avatars 2150 a, 2150 b, and 2150 c each showing the baseball being released at a different height. The processor 2160 and graphics engine module 2170 may adjust the virtual reality graphics to replicate the release point 2155 based on the annotated data.
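Selecting the annotated clip whose release height best matches the pitch being simulated reduces to a nearest-neighbor lookup. In this sketch the clip list (field names and values are hypothetical) is assumed to have already been filtered to the selected pitcher and pitch type:

```python
def best_matched_clip(clips, target_release_height):
    """Pick the rotoscoped, annotated clip whose ball release height
    is closest to the release height of the pitch being simulated.
    `clips` is a list of dicts with a 'release_height' key in feet
    (the annotation format is an assumption)."""
    return min(clips,
               key=lambda c: abs(c["release_height"] - target_release_height))

# Hypothetical pool of clips, as suggested by avatars 2150a-2150c
# releasing the ball at different heights:
clips = [
    {"id": "2150a", "release_height": 5.2},
    {"id": "2150b", "release_height": 5.8},
    {"id": "2150c", "release_height": 6.3},
]
```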
  • Referring now again to FIGS. 34 and 45 along with FIGS. 41-43, in operation, it will be further appreciated that the system 2100 provides an enhanced realistic experience by simulating a user's environmental experience as close to the real-life experience of batting against a selected pitcher as possible. The virtual reality scene 2110 may be displayed so that the depth perception and scale of the digitized pitcher's avatar 2150 appears natural.
  • Referring also to FIG. 48 along with FIGS. 41-43, an exemplary process 2200 for matching the perspective of the user with a selected pitcher's avatar is shown. Elements of the flowchart in the process 2200 are designated by placement in parentheses while numerals without parentheses refer to elements in the diagrams of FIGS. 41-43. Each pitch video may be pre-processed to turn all background pixels to full transparency, while the pitcher's image and the baseball's image are left as-is. This is sometimes known as rotoscoping. Each pitch video is annotated to mark the ball release point, the pitcher's height, and the location of the pitching mound apex where the pitcher's pivot foot is placed. To simulate a particular pitch from the ball pitch database, first the ball release point is acquired from the ball pitch flight data, which typically is 50-55 feet away from home plate. Based on the pitch type and also the height of the ball release point for this particular pitch, the best matched pitch video (rotoscoped and annotated) is selected from this specific pitcher's pool of video clips, such that the ball release point from the video is the closest to the actual ball release point for this particular pitch. The pitcher's avatar and associated video may be displayed on a video plane (for example in the form of a rectangular plane geometry), always facing the batter (user “U” as shown in FIG. 41), and the video plane is positioned so that the actual ball release point 2155 lies within the video plane. The eye position of the batter may be acquired (2210) from the motion tracking system which tracks the batter's stereoscopic glasses (for example glasses 2102 of FIGS. 34 and 35). The position of the batter's eyes is updated (2220) in real-time. This pitch video plane may be anchored in a way that the pitcher's pivot foot is perceived to be fixated right at the apex of the pitching mound (because of line of sight) only from the perspective of the batter as shown in FIG. 41.
More specifically, when the position of the user's eyes changes, the pitch video plane may be constantly adjusted in real-time to maintain the aforementioned line of sight, so that the pitcher's pivot foot appears to be exactly on top of the pitching mound's apex (essentially on top of the pitcher's plate). In addition, the video pitcher may be constantly scaled and translated so that from the batter's perspective, the pitch release point 2155′ shown on the video plane matches exactly the release point 2155 of the actual pitch being simulated (because of line of sight). More specifically, without translating and scaling of the video pitcher, the release point 2155′ that appears in the video plane may not appear at the correct point of release in a natural environment because the perspective is off, since the projection surface may in actuality be closer than the 60 feet 6 inches away a real-life pitcher would be. The height and pivot foot location of the perceived pitcher as well as the pitch release point 2155′ are calculated in real-time based on the annotation data in the pitch video clip. The pitcher video plane is then scaled (2230) in a way that the ball is perceived to be released exactly from the actual ball release point 2155, while keeping the pitcher's pivot foot anchored on top of the pitcher's plate, as shown in FIG. 42. The video pitcher is perceived as if he is standing on the pitcher's mound 60 feet 6 inches away from home plate.
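The anchoring and scaling described above is essentially a similar-triangles construction: the release point drawn on the video plane must sit on the line of sight from the batter's eyes to the actual three-dimensional release point. A simplified two-dimensional sketch (depth and height only; the numbers are illustrative, not taken from the patent figures):

```python
def project_to_plane(eye, point, plane_depth):
    """Intersect the eye -> point line of sight with a vertical video
    plane at the given depth. Coordinates are (depth_ft, height_ft).
    The pitcher image would then be scaled so its annotated release
    pixel lands at the returned height while the pivot foot stays
    anchored at the mound apex's apparent position."""
    eye_d, eye_h = eye
    pt_d, pt_h = point
    t = (plane_depth - eye_d) / (pt_d - eye_d)  # fraction of the way to the point
    return (plane_depth, eye_h + t * (pt_h - eye_h))

# Eyes ~5.5 ft high at the plate, actual release point 55 ft away at
# 6.0 ft: on a video plane only 10 ft away, the release point must be
# drawn at roughly 5.59 ft so the line of sight still passes through it.
```

Re-running this projection every frame as the tracked eye position changes reproduces the real-time adjustment of steps (2210) through (2230).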
  • The exemplary process 2200 for matching the perspective of the user also includes adjusting the stadium and the baseball field. The stadium including the field may be captured as a 360 degree panorama photo or video and applied to a sphere (represented by the arc 2165′), or may be geometrically modeled, or may be a hybrid of the two, for example where the field is geometrically modeled and the stadium is captured as a 360 degree panorama photo. The stadium including the field may be proportionally scaled (2240) so that the pitcher appears to remain the same relative height. For example, if the perceived pitcher is scaled to be slightly taller than his actual height (in order to match the actual pitch release point), the stadium may be scaled slightly larger by the same proportion toward a reference point 2166 on the arc 2165′. The scaled stadium is represented by the arc 2165 which is scaled up from arc 2165′. The perceived pitcher thus does not appear taller than he actually is. Each step from (2210) to (2240) may be repeated in real-time for consistency with movement of the user's eyes. As a result, the pitcher's avatar is anchored and scaled in real-time based on the batter's eye position, so from the batter's perspective, the pitcher is perceived to be on the pitcher's mound pitching the ball, and the ball release point from the video pitcher exactly matches the actual pitch release point from the pitch database throughout the entire pitch, given that the batter stays within a reasonable range of home plate. Also, the stadium is scaled in real-time based on the batter's eye position, so from the batter's point of view, the pitcher is perceived to be in accordance with his actual body height relative to the size of the stadium.
  • The exemplary process 2200 may need to handle a special case caused by a lack of variation in the height of pitch release points (such variation is depicted in FIG. 40). Hypothetically, if only one pitch release point at a specific height is available from a particular pitcher's video database, and it happens to be a relatively low release point (for example, one similar to 2150a), then when a pitch performed by this particular pitcher with a relatively high release point is being simulated, process 2200 may scale up the video pitcher so much that he becomes out of proportion. To prevent this issue, the exemplary process 2200 is designed to limit the scaling of the video pitcher (and also the corresponding scaling of the stadium) so that the overall virtual reality scene looks natural. The exemplary process 2200 may alter the actual pitch release point 2155 for a particular pitch in order to match the release point 2155′ of the video pitcher already scaled at the scaling limit. A drawback is that the system may not be able to utilize the pitch data in its 100% original form due to said shifting of the pitch release height; however, an option is added whereby the batter may filter out pitches with such shifts, in order to utilize only 100% original pitch data.
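The scaling limit and release-point shift can be sketched as a small clamping helper. The limit value, return convention, and flag are assumptions for illustration only:

```python
def limit_pitcher_scale(requested_scale, actual_release_y, video_release_y_at_limit,
                        max_scale=1.15):
    """Clamp video-pitcher scaling (the 1.15 limit is a hypothetical value).
    When clamped, shift the simulated release height to the scaled video's
    release point and flag the pitch so the batter can filter it out."""
    if requested_scale <= max_scale:
        return requested_scale, actual_release_y, False
    return max_scale, video_release_y_at_limit, True
```

The returned flag supports the filtering option described above: pitches whose release height was shifted can be excluded when only 100% original pitch data is wanted.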
  • The system may also use computer generated (CG) pitcher avatars that are animated (also known as key-framed) either manually or using motion capture data. Unlike the video pitcher described in process 2200, a CG avatar is geometrically modeled in three-dimensional space with articulated bones and body joints. Such CG avatars may eliminate the need for matching the perspective of the batter. In such a case, the CG avatar may be anchored directly on top of the pitching mound with its pivot foot placed on the pitcher's plate. The CG avatar may be scaled so that the pitch release point from the animated pitch sequence exactly matches the actual ball release point 2155. The pitch motion of the CG avatar may also be slightly altered using an algorithm known as Inverse Kinematics (IK), whereby the joint angles of the CG avatar's arm and/or body are mathematically computed so as to drive an end-effector (in this case the pitcher's hand) to any feasible location within the three-dimensional space (specifically, in this case, to make the CG avatar's hand reach exactly the actual pitch release point 2155), without the need to alter the scaling of the said CG avatar or the stadium. The drawbacks of using a CG avatar may include the following: the CG avatar and the associated variations of animated pitch sequences may be very difficult to acquire, which may pose a huge challenge when deploying such a baseball virtual reality training system, where a large quantity and variety of CG pitchers and animated pitch sequences are required; and the fidelity of the CG avatars, as well as of the pitch motions, may be subpar compared to pitchers simulated from video sources.
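As one concrete instance of IK, the classic analytic solution for a planar two-link chain (law of cosines) computes joint angles that place the end-effector at a target. A full pitching arm would need more joints and a numerical solver, but the idea is the same; this is a minimal sketch, not the system's solver:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar two-link IK (shoulder-elbow): joint angles that place the
    end-effector (the pitcher's hand) at (x, y), given link lengths l1, l2.
    A minimal stand-in for the full-body IK adjustment described above."""
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target release point unreachable")
    # law of cosines gives the elbow angle; the shoulder angle then aims
    # the combined chain at the target
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Driving the hand to the actual release point 2155 this way leaves both the avatar's scale and the stadium's scale untouched, as the text notes.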
  • Referring now to FIG. 49, a process 2300 for simulating a pitched ball's flight trajectory in a virtual reality scene is shown according to an exemplary embodiment. Pitch flight parameters are loaded (2310) from a database that stores pitching data associated with a selected pitcher. Once the virtual pitcher has performed his motion and delivery, a pitch timer may be started (2320) at the exact moment the ball is released from the pitcher's hand; this timer is associated with the flight time of the virtual pitch. During display of the simulated pitch and its flight path, the simulated baseball's position along the trajectory and its rotational property (shown by the position of the seams) may be updated (2340). A determination (2350) may be made whether a user's batting swing made virtual contact with the simulated pitch. If the user missed, the process determines (2360) whether the pitch (flight path) has terminated. If the user makes contact, a second timer may be started (2370) tracking the length of time the hit ball has been travelling in flight. Some embodiments may calculate the trajectory, exit velocity, exit angle and distance of a simulated hit depending on the pitch speed, pitch spin, speed of the swing, angle of swing attack, and position on the bat barrel where contact occurred. A determination (2380) may be made whether the hit ball's flight timer is still running; while it is, the flight path of the hit ball may be updated (2390) for position and rotation, until an affirmative determination (2395) is made that the hit ball's flight timer has ended.
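The timer-driven flight loop of process 2300 might look like the following sketch. The Pitch fields are hypothetical placeholders for the stored pitch parameters, and the trajectory here is gravity-only (the spin-induced movement is omitted for brevity):

```python
from dataclasses import dataclass

@dataclass
class Pitch:
    release: tuple      # actual release point 2155 (x, y, z) in metres
    velocity: tuple     # initial velocity at release
    flight_time: float  # seconds until the pitch terminates

G = 9.81  # gravity; Magnus deflection from spin is omitted in this sketch

def ball_position(pitch, t):
    """Ball position t seconds after release (update step 2340)."""
    x0, y0, z0 = pitch.release
    vx, vy, vz = pitch.velocity
    return (x0 + vx * t, y0 + vy * t - 0.5 * G * t * t, z0 + vz * t)

def simulate(pitch, dt=1.0 / 90.0):
    """Pitch timer loop: starts at release (2320) and yields frames until
    the flight time elapses (termination check 2360)."""
    t = 0.0
    while t <= pitch.flight_time:
        yield t, ball_position(pitch, t)
        t += dt
```

A hit would branch out of this loop into the second timer (2370) and a new hit-ball trajectory, mirroring the determinations (2350) through (2395).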
  • FIGS. 44-47 show schematics for determining whether a user makes contact with a simulated pitch. In some embodiments, a real bat 2104 may be used, retrofitted with tracking markers. Elements designated by a numeral and (′) represent the position of an element, shown in shadow lines, at a later point in time (either at the point of impact or post-miss) than the position of the element designated by the regular numbered call-out. As shown in FIG. 47, the swing plane may be tracked using high speed tracking cameras (for example camera(s) 2140 of FIG. 50). The bat swing is captured using a high-speed motion capture system running at 330 frames per second or higher. At each frame, the system checks for bat-ball impact. Between two consecutive frames, the valid hitting zone on the bat sweeps over a plane in the shape of a "trapezoid" (see FIGS. 45-47), while the ball travels along an approximated "line segment" towards home plate. The tracking system introduces a small but consistent bat tracking latency (which may be measured in advance). To compensate for this latency, the bat swing may be predicted from previous frames' tracking data, so that the tracked bat (used for bat-ball impact detection) swings slightly ahead of itself. A different approach to compensating for the said tracking latency is for the system to apply the tracking latency to the pitched ball's flight simulation, so that the simulated pitch ball (used for bat-ball impact detection) travels slightly ahead of time. Because the entire pitch flight can be simulated in advance using the pitch parameters retrieved from the database, the second approach may be simpler to implement, and it may also be more accurate than the first approach.
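The second latency-compensation approach amounts to sampling the pre-computed trajectory slightly ahead of the tracked frame time. A minimal sketch, with a hypothetical latency value:

```python
def ball_for_impact_test(trajectory, t, latency=0.009):
    """Sample the pre-computed pitch trajectory `latency` seconds ahead of
    frame time t, so the simulated ball (used for bat-ball impact
    detection) lines up with the late-arriving tracked bat frames.
    The 9 ms value stands in for the measured tracking latency."""
    return trajectory(t + latency)
```

This works precisely because, as the text notes, the whole pitch flight is known in advance from the stored pitch parameters; there is nothing to predict, only a shifted lookup.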
  • Referring to FIG. 47, bat-ball collision detection algorithm #1 detects the scenario of a possible impact. At each frame, the system calculates the minimal distance between the "trapezoid" and the "line segment". If the minimal distance is less than the sum of the radius of the ball and the radius of the bat, it indicates a possible bat-ball impact, which requires further calculation to confirm; if the minimal distance is larger than the sum of the two, the bat does not make contact with the ball.
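A sampled approximation of the algorithm #1 distance query can be sketched as follows. An exact closed-form quad-to-segment distance would likely be used in practice; sampling keeps the sketch short. The corner layout and radii are hypothetical:

```python
import numpy as np

def min_dist_trapezoid_segment(quad, a, b, n=17):
    """Approximate the minimal distance between the 'trapezoid' swept by
    the bat's hitting zone between frames (quad = its four corners) and
    the ball's 'line segment' a->b, by sampling both."""
    p0, p1, p2, p3 = (np.asarray(p, float) for p in quad)
    u = np.linspace(0.0, 1.0, n)
    # bilinear samples over the swept quad
    surf = np.array([(1 - s) * ((1 - t) * p0 + t * p1)
                     + s * ((1 - t) * p3 + t * p2)
                     for s in u for t in u])
    a, b = np.asarray(a, float), np.asarray(b, float)
    seg = np.array([(1 - t) * a + t * b for t in u])   # samples on ball path
    return np.linalg.norm(surf[:, None, :] - seg[None, :, :], axis=2).min()

def possible_impact(quad, a, b, r_ball=0.037, r_bat=0.033):
    """Broad-phase test: a possible impact iff distance < r_ball + r_bat."""
    return min_dist_trapezoid_segment(quad, a, b) < r_ball + r_bat
```

As in the text, a True result only flags a candidate; algorithm #2 still has to confirm the actual contact.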
  • Referring to FIG. 47, bat-ball collision detection algorithm #2 confirms a detected impact. When a possible bat-ball impact is detected, the system may use a continuous collision detection algorithm to further confirm the impact. Using the tracked bat data from the current frame (t) (FIG. 46) and the previous frame (t−Δt) (FIG. 45), the system calculates the speed of the bat. The speed of the pitched ball at the current frame (t) is also calculated using the pitch parameters (as previously mentioned, the tracking latency may be applied to the timing of the ball). The system may then perform an iterative procedure in which it slowly advances the time t+Δt and calculates the position of the bat and the position of the ball based on their speeds at time (t), then calculates the minimal distance between the central axis of the bat and the center of the ball. If this minimal distance falls below the sum of the ball radius and the bat radius (at the bat intersection), the system may confirm that the bat-ball impact actually occurred at time t+Δt.
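The iterative confirmation of algorithm #2 can be sketched as follows. Here `bat_segment` and `ball_pos` stand in for the frame-(t) extrapolations described above, and the step size, search horizon, and radii are hypothetical values:

```python
import numpy as np

def point_segment_dist(p, a, b):
    """Distance from the ball centre p to the bat's central-axis segment a-b."""
    p, a, b = (np.asarray(v, float) for v in (p, a, b))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def confirm_impact(bat_segment, ball_pos, t0, step=0.001, horizon=0.02,
                   r_ball=0.037, r_bat=0.033):
    """Iterative continuous check (algorithm #2): advance time in small
    steps from frame time t0, extrapolating bat and ball from their
    frame-(t) speeds, and report the first time the axis-to-centre
    distance drops below the sum of the radii."""
    t = t0
    while t <= t0 + horizon:
        a, b = bat_segment(t)
        if point_segment_dist(ball_pos(t), a, b) < r_ball + r_bat:
            return t          # impact confirmed at this time
        t += step
    return None               # no contact within the search window
```

Returning the confirmed contact time also gives the contact position along the bat axis, which feeds the post-impact exit velocity and angle calculations mentioned below.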
  • As will be appreciated, such a continuous collision detection algorithm (#2) may accurately confirm the bat-ball impact; however, it may run slower than collision detection algorithm #1. In order to achieve the fastest bat tracking result and a low-latency immersive virtual reality experience, the system may be designed to run algorithm #1 by default to achieve maximum frame rate, and switch to algorithm #2 when a possible bat-ball impact needs to be confirmed. The continuous collision detection algorithm (#2) may calculate the post-impact exit velocity, exit angle, ball trajectory and distance of a simulated hit, based on the pitch speed, pitch spin, speed of the bat swing, angle of swing attack, position on the bat barrel where contact occurred, and the overlap (vertical displacement) between the bat and the ball at contact. Based on the results of the bat-ball impact detection, some embodiments provide an impact sound generated by the sound module 2195 (FIG. 50) when the bat-ball impact is detected. Such sound may be electronically synthesized, for example when the system 2100 plays recorded sound effects through a speaker based on the type of impact. Such sound may be generated physically, for example when a mechanical trigger is released to hit a wood block in order to generate the sound effect. Such sound may also be generated when a mechanical trigger on or inside the bat 2102 is released to strike the bat, generating the sound effect and causing the bat to vibrate as a form of haptic feedback. The impact sound may vary based on the type of bat, including one impact sound for a wood bat and a different impact sound for a metal bat. The frequency of the impact sound may vary based on the location where the bat barrel made contact with the simulated pitch, and/or the overlap (vertical displacement) between the bat and the simulated ball at contact.
An impact sound may be selected for a particular pitch from a pool of impact soundtracks stored in the database or memory storage 2190, based on various properties of that particular impact, including but not limited to, pitch speed, pitch spin, speed of the bat swing, angle of swing attack, post-impact exit speed and exit angle, position on the bat barrel where contact occurred, the location where the bat barrel made contact with the simulated ball, and the overlap (vertical displacement) between the bat and the simulated ball at contact.
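Property-based selection from a sound pool might be sketched like this; the pool keys, sweet-spot threshold, and playback-rate formula are invented for illustration:

```python
def select_impact_sound(bat_type, barrel_offset, overlap, sound_pool):
    """Choose a stored impact soundtrack plus a playback-rate (frequency)
    tweak. sound_pool maps (bat type, contact zone) to a clip id; the
    5 cm sweet-spot window and rate coefficients are hypothetical."""
    zone = 'sweet' if abs(barrel_offset) < 0.05 else 'off'
    clip = sound_pool[(bat_type, zone)]
    # shift the pitch of the clip with contact location and bat-ball overlap
    rate = 1.0 + 0.5 * abs(barrel_offset) + 0.3 * abs(overlap)
    return clip, rate
```

A richer implementation would key the pool on more of the listed properties (exit speed, swing angle, and so on), but the lookup structure stays the same.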
  • Note that the bat-ball impact detection may not be real-time, but it is very close to real-time. The delay may be introduced by the bat tracking latency. The delay may also be introduced by bat-ball collision detection algorithm #2 being executed at least one frame after the impact.
  • As will be appreciated, a swing plane (FIGS. 44-46) from the user's bat swing may be plotted as an augmented reality overlay inside the virtual environment, along with a virtual baseball as it cuts through the strike zone plane. The swing plane may be used to visualize whether the batter's swing was too high, too low or at the correct height. Colors may be applied to swing planes to help visualize such location differences between bat swings. Bat swings and the corresponding pitches may be recorded to help the batter analyze his swing. Replay of a previous bat swing and the corresponding pitch may be activated in slow motion (frame by frame) to analyze any recorded practice session.
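The color coding of swing planes could be as simple as comparing the swing height against the strike-zone bounds; the specific colors here are a hypothetical choice:

```python
def swing_plane_color(swing_height, zone_low, zone_high):
    """Color-code the overlaid swing plane so the batter can see at a
    glance whether a swing was too low, too high, or at the correct
    height relative to the strike zone."""
    if swing_height < zone_low:
        return 'blue'    # swing too low
    if swing_height > zone_high:
        return 'red'     # swing too high
    return 'green'       # correct height
```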
  • Although the invention has been discussed with reference to specific embodiments, it is apparent and should be understood that the concept can be otherwise embodied to achieve the advantages discussed. The preferred embodiments above have been described primarily as simulated environments for sports training of athletes. In this regard, the foregoing description of the simulated environments is presented for purposes of illustration and description. Furthermore, the description is not intended to limit the invention to the form disclosed herein. Accordingly, variants and modifications consistent with the following teachings, skill, and knowledge of the relevant art, are within the scope of the present invention. The embodiments described herein are further intended to explain modes known for practicing the invention disclosed herewith and to enable others skilled in the art to utilize the invention in equivalent, or alternative embodiments and with various modifications considered necessary by the particular application(s) or use(s) of the present invention.
  • Unless specifically stated otherwise, it shall be understood that disclosure employing the terms “processing,” “computing,” “determining,” “calculating,” “receiving images,” “acquiring,” “generating,” “performing” and others refer to a data processing system or other electronic device manipulating or transforming data within the device memories or controllers into other data within the system memories or registers.
  • One or more embodiments may be implemented in computer software, firmware, hardware, digital electronic circuitry, and computer program products, which may be one or more modules of computer instructions encoded on a computer readable medium for execution by, or to control the operation of, a data processing system. The computer readable medium may be a machine readable storage substrate, flash memory, hybrid types of memory, a memory device, a machine readable storage device, random access memory ("RAM"), read-only memory ("ROM"), a magnetic medium such as a hard-drive or floppy disk, an optical medium such as a CD-ROM or a DVD, or a combination thereof, for example. A computer readable medium may reside in or within a single computer program product such as a CD, a hard-drive, or a computer system, or may reside within different computer program products within a system or network. The computer readable medium can store software programs that are executable by the processor 2032 and may include operating systems, applications, and related program code. A machine readable non-transitory medium may store executable program instructions which, when executed, cause a data processing system to perform the methods described herein. When applicable, the ordering of the various steps described herein may be changed, combined into composite steps, or separated into sub-steps to provide the features described herein.
  • Computer programs such as a program, software, software application, code, or script may be written in any computer programming language including conventional technologies, object oriented technologies, interpreted or compiled languages, and can be a module, component, or function. Computer programs may be executed in one or more processors or computer systems.

Claims (20)

What is claimed is:
1. A virtual reality system, comprising:
an electronic display configured to project a virtual reality baseball environment shown in a first-person perspective of a user training a batting swing in the virtual reality baseball environment; and
a graphics engine module connected to the electronic display, the graphics engine module configured to:
generate the virtual reality baseball environment;
generate a digitized image of a user selected real-life baseball pitcher;
retrieve from an electronic database, real-life pitching data of the user selected real-life baseball pitcher;
display in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user;
display in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and
display in the virtual reality environment, a replicated release of a digital baseball from a digital hand of the digitized image from a release point replicating a pitch thrown by the user selected real-life baseball pitcher, wherein the digital hand is positioned in a three-dimensional space of the virtual reality baseball environment from a location associated with the release point when replicating the replicated release of the digital baseball.
2. The virtual reality projection system of claim 1, wherein the location associated with the release point is based on capturing a real-life release of a baseball from a video clip of a real-life pitch thrown by the user selected real-life baseball pitcher.
3. The virtual reality projection system of claim 1, wherein the graphics engine module is also configured to scale the digitized image of the user selected real-life baseball pitcher to replicate an appearance of a perceived distance of the real-life baseball pitcher from the user's first-person perspective.
4. The virtual reality projection system of claim 1, wherein the digital hand and the replicated release of the digital baseball from the release point are constantly scaled during the displayed pitching sequence.
5. The virtual reality projection system of claim 1, wherein the graphics engine module is also configured to generate a visible swing plane overlay graphic in the virtual reality baseball environment representing a path of a user's swing.
6. The virtual reality projection system of claim 1, wherein the electronic display is a projector system or a head mounted display.
7. The virtual reality projection system of claim 1, wherein the release point is based on the real-life pitching data of the user selected real-life baseball pitcher.
8. A machine readable non-transitory medium storing executable program instructions which when executed cause a data processing system to perform a method comprising:
generating a virtual reality baseball environment in an electronic display;
generating a digitized image of a user selected real-life baseball pitcher;
retrieving from an electronic database, real-life pitching data of the user selected real-life baseball pitcher;
displaying in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user;
displaying in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and
displaying in the virtual reality environment, a replicated release of a digital baseball from the digitized image from a release point positioned relative to a digital body of the digitized image, replicating a real-life pitch thrown by the user selected real-life baseball pitcher, wherein the replicated release of the digital baseball is based on real-life pitching data captured in association with the real-life pitch, the replicated release of the digital baseball is displayed continuing in a simulated trajectory of a thrown virtual pitch, from the release point of the user selected real-life baseball pitcher toward a strike zone adjacent the user in the virtual reality baseball environment.
9. The machine readable non-transitory medium storing executable program instructions which when executed cause the data processing system to perform the method of claim 8, wherein a location associated with the release point is based on capturing a real-life release of a baseball from a video clip of a real-life pitch thrown by the user selected real-life baseball pitcher.
10. The machine readable non-transitory medium storing executable program instructions which when executed cause the data processing system to perform the method of claim 8, further comprise scaling the digitized image of the user selected real-life baseball pitcher to replicate an appearance of a perceived distance of the real-life baseball pitcher from the user's first-person perspective.
11. The machine readable non-transitory medium storing executable program instructions which when executed cause the data processing system to perform the method of claim 10, further comprise constantly scaling a replicated digital hand of the digitized image through the pitching sequence and constantly scaling the digital baseball from the release point to the strike zone.
12. The machine readable non-transitory medium storing executable program instructions which when executed cause the data processing system to perform the method of claim 8, further comprise generating a visible swing plane overlay graphic in the virtual reality baseball environment representing a path of a user's swing.
13. The machine readable non-transitory medium storing executable program instructions which when executed cause the data processing system to perform the method of claim 8, wherein the release point is based on the real-life pitching data of the user selected real-life baseball pitcher.
14. A method of simulating a baseball pitcher's pitch, comprising:
generating a virtual reality baseball environment in an electronic display;
generating a digitized image of a user selected real-life baseball pitcher;
retrieving from an electronic database, real-life pitching data of the user selected real-life baseball pitcher;
displaying in the virtual reality environment, the digitized image of the user selected real-life baseball pitcher with an appearance replicating the real-life baseball pitcher in real-life from the first-person perspective of the user;
displaying in the virtual reality environment, a pitching sequence by the digitized image of the user selected real-life baseball pitcher, from either a wind-up or a stretch delivery; and
displaying in the virtual reality environment, a replicated release of a digital baseball from the digitized image from a release point based on the real-life pitching data of the user selected real-life baseball pitcher, replicating a real-life pitch thrown by the user selected real-life baseball pitcher, the replicated release of the digital baseball is displayed continuing in a simulated trajectory of a thrown virtual pitch, from the release point of the user selected real-life baseball pitcher toward a strike zone adjacent the user in the virtual reality baseball environment.
15. The method of claim 14, wherein a location associated with the release point is based on capturing a real-life release of a baseball from a video clip of a real-life pitch thrown by the user selected real-life baseball pitcher.
16. The method of claim 15, further comprising scaling the digitized image of the user selected real-life baseball pitcher to replicate an appearance of a perceived distance of the real-life baseball pitcher from the user's first-person perspective.
17. The method of claim 16, further comprising constantly scaling a replicated digital hand of the digitized image through the pitching sequence and constantly scaling the digital baseball from the release point to the strike zone.
18. The method of claim 14, further comprising generating a visible swing plane overlay graphic in the virtual reality baseball environment representing a path of a user's swing.
19. The method of claim 14, wherein the electronic display is a projector system or a head mounted display.
20. The method of claim 14, further comprising generating a three-dimensional panoramic view of a stadium surrounding the digitized image from the first-person perspective of the user.
US16/690,501 2015-04-23 2019-11-21 Virtual reality sports training systems and methods Active US10821347B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/690,501 US10821347B2 (en) 2015-04-23 2019-11-21 Virtual reality sports training systems and methods
US17/087,121 US11278787B2 (en) 2015-04-23 2020-11-02 Virtual reality sports training systems and methods
US17/700,803 US11826628B2 (en) 2015-04-23 2022-03-22 Virtual reality sports training systems and methods

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US14/694,770 US20160314620A1 (en) 2015-04-23 2015-04-23 Virtual reality sports training systems and methods
US201662294195P 2016-02-11 2016-02-11
US15/431,630 US10300362B2 (en) 2015-04-23 2017-02-13 Virtual reality sports training systems and methods
US16/404,313 US10486050B2 (en) 2015-04-23 2019-05-06 Virtual reality sports training systems and methods
US16/690,501 US10821347B2 (en) 2015-04-23 2019-11-21 Virtual reality sports training systems and methods

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/404,313 Continuation US10486050B2 (en) 2015-04-23 2019-05-06 Virtual reality sports training systems and methods

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/087,121 Continuation-In-Part US11278787B2 (en) 2015-04-23 2020-11-02 Virtual reality sports training systems and methods

Publications (2)

Publication Number Publication Date
US20200086199A1 true US20200086199A1 (en) 2020-03-19
US10821347B2 US10821347B2 (en) 2020-11-03

Family

ID=58776708

Family Applications (3)

Application Number Title Priority Date Filing Date
US15/431,630 Active US10300362B2 (en) 2015-04-23 2017-02-13 Virtual reality sports training systems and methods
US16/404,313 Active US10486050B2 (en) 2015-04-23 2019-05-06 Virtual reality sports training systems and methods
US16/690,501 Active US10821347B2 (en) 2015-04-23 2019-11-21 Virtual reality sports training systems and methods

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US15/431,630 Active US10300362B2 (en) 2015-04-23 2017-02-13 Virtual reality sports training systems and methods
US16/404,313 Active US10486050B2 (en) 2015-04-23 2019-05-06 Virtual reality sports training systems and methods

Country Status (1)

Country Link
US (3) US10300362B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112883883A (en) * 2021-02-26 2021-06-01 北京蜂巢世纪科技有限公司 Information presentation method, information presentation device, information presentation medium, glasses, and program product
US20220264075A1 (en) * 2021-02-17 2022-08-18 flexxCOACH VR 360-degree virtual-reality system for dynamic events

Families Citing this family (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10438448B2 (en) * 2008-04-14 2019-10-08 Gregory A. Piccionielli Composition production with audience participation
US9573039B2 (en) 2011-12-30 2017-02-21 Nike, Inc. Golf aid including heads up display
US9597574B2 (en) * 2011-12-30 2017-03-21 Nike, Inc. Golf aid including heads up display
US9610489B2 (en) 2014-05-30 2017-04-04 Nike, Inc. Golf aid including virtual caddy
US10434396B2 (en) * 2015-11-30 2019-10-08 James Shaunak Divine Protective headgear with display and methods for use therewith
US10071306B2 (en) * 2016-03-25 2018-09-11 Zero Latency PTY LTD System and method for determining orientation using tracking cameras and inertial measurements
US10650590B1 (en) * 2016-09-07 2020-05-12 Fastvdo Llc Method and system for fully immersive virtual reality
US20180096506A1 (en) * 2016-10-04 2018-04-05 Facebook, Inc. Controls and Interfaces for User Interactions in Virtual Spaces
US10363472B2 (en) 2016-11-02 2019-07-30 Makenna Noel Bentley Training system and method for cuing a jumper on a jump over a crossbar
US20190279428A1 (en) * 2016-11-14 2019-09-12 Lightcraft Technology Llc Team augmented reality system
US10409363B1 (en) 2017-03-07 2019-09-10 vGolf, LLC Mixed-reality golf tracking and simulation
US10661149B2 (en) 2017-03-07 2020-05-26 vSports, LLC Mixed-reality sports tracking and simulation
US10525324B2 (en) 2017-03-07 2020-01-07 vSports, LLC Mixed-reality kick tracking and simulation
WO2018165284A1 (en) 2017-03-07 2018-09-13 vSports, LLC Mixed reality sport simulation and training system
WO2018165278A1 (en) 2017-03-07 2018-09-13 vGolf, LLC Mixed reality golf simulation and training system
JP6874448B2 (en) * 2017-03-17 2021-05-19 株式会社デンソーウェーブ Information display system
US10369446B2 (en) * 2017-04-27 2019-08-06 TrinityVR, Inc. Baseball pitch simulation and swing analysis using virtual reality device and system
CN107193372B (en) * 2017-05-15 2020-06-19 杭州一隅千象科技有限公司 Projection method from multiple rectangular planes at arbitrary positions to variable projection center
US10279269B2 (en) 2017-06-22 2019-05-07 Centurion VR, LLC Accessory for virtual reality simulation
JP7000050B2 (en) * 2017-06-29 2022-01-19 キヤノン株式会社 Imaging control device and its control method
US20190037184A1 (en) * 2017-07-28 2019-01-31 Ravi Gauba Projection Display Apparatus
KR102367889B1 (en) * 2017-08-10 2022-02-25 엘지전자 주식회사 Mobile terminal
US10617933B2 (en) 2017-08-14 2020-04-14 International Business Machines Corporation Sport training on augmented/virtual reality devices by measuring hand-eye coordination-based measurements
WO2019051492A1 (en) * 2017-09-11 2019-03-14 Cubic Corporation Immersive virtual environment (ive) tools and architecture
US11410564B2 (en) 2017-11-07 2022-08-09 The Board Of Trustees Of The University Of Illinois System and method for creating immersive interactive application
CN109783112A (en) * 2017-11-13 2019-05-21 深圳市创客工场科技有限公司 The exchange method of virtual environment and physical hardware, device and storage medium
US10139899B1 (en) * 2017-11-30 2018-11-27 Disney Enterprises, Inc. Hypercatching in virtual reality (VR) system
US10864422B1 (en) 2017-12-09 2020-12-15 Villanova University Augmented extended realm system
US11113887B2 (en) * 2018-01-08 2021-09-07 Verizon Patent And Licensing Inc Generating three-dimensional content from two-dimensional images
US11409402B1 (en) * 2018-01-09 2022-08-09 Amazon Technologies, Inc. Virtual reality user interface communications and feedback
US10546425B2 (en) 2018-01-30 2020-01-28 Disney Enterprises, Inc. Real physical objects interacting with augmented reality features
JP7246649B2 (en) * 2018-02-14 2023-03-28 成典 田中 tactical analyzer
JP7032948B2 (en) * 2018-02-14 2022-03-09 成典 田中 Tactical analyzer
US10445940B2 (en) 2018-03-15 2019-10-15 Disney Enterprises, Inc. Modeling interactions between simulated characters and real-world objects for more realistic augmented reality
WO2019188229A1 (en) * 2018-03-26 2019-10-03 ソニー株式会社 Information processing device, information processing method, and program
JP2019187501A (en) * 2018-04-18 2019-10-31 美津濃株式会社 Swing analysis system and swing analysis method
US10909372B2 (en) 2018-05-28 2021-02-02 Microsoft Technology Licensing, Llc Assistive device for the visually-impaired
US20190388791A1 (en) * 2018-06-22 2019-12-26 Jennifer Lapoint System and method for providing sports performance data over a wireless network
US20190391647A1 (en) * 2018-06-25 2019-12-26 Immersion Corporation Real-world haptic interactions for a virtual reality user
CN109200582A (en) * 2018-08-02 2019-01-15 腾讯科技(深圳)有限公司 The method, apparatus and storage medium that control virtual objects are interacted with ammunition
US20210339111A1 (en) * 2018-10-17 2021-11-04 Sphery Ag Training module
CN109483540B (en) * 2018-11-21 2022-02-25 南京邮电大学 Optimization method of humanoid robot layered kicking optimization model based on Gaussian punishment
US11032607B2 (en) * 2018-12-07 2021-06-08 At&T Intellectual Property I, L.P. Methods, devices, and systems for embedding visual advertisements in video content
US10984575B2 (en) 2019-02-06 2021-04-20 Snap Inc. Body pose estimation
US11691071B2 (en) 2019-03-29 2023-07-04 The Regents Of The University Of Michigan Peripersonal boundary-based augmented reality game environment
US11406887B2 (en) 2019-04-01 2022-08-09 Flyingtee Tech, Llc Multiplayer, multisport indoor game system and method
US20220223067A1 (en) * 2019-06-06 2022-07-14 Rubicon Elite Performance, Inc. System and methods for learning and training using cognitive linguistic coding in a virtual reality environment
US20210027652A1 (en) * 2019-07-26 2021-01-28 Krystian Sands Stretching facility and method for stretching, strengthening, and balancing muscles
JP7339053B2 (en) * 2019-07-30 2023-09-05 美津濃株式会社 Evaluation methods, evaluation systems, and evaluation programs
US11210856B2 (en) 2019-08-20 2021-12-28 The Calany Holding S. À R.L. System and method for interaction-level based telemetry and tracking within digital realities
CA3151925A1 (en) * 2019-08-21 2021-02-25 Flyingtee Tech, Llc Multiplayer, multisport indoor game system and method
KR20190104928A (en) * 2019-08-22 2019-09-11 LG Electronics Inc. Extended reality device and method for controlling the extended reality device
US10894198B1 (en) * 2019-10-01 2021-01-19 Strikezone Technologies, LLC Systems and methods for dynamic and accurate pitch detection
CN110827602A (en) * 2019-11-19 2020-02-21 State Grid Corporation of China Cable joint manufacturing and operation and maintenance skill training device and method based on VR and AR technology
US11816757B1 (en) * 2019-12-11 2023-11-14 Meta Platforms Technologies, Llc Device-side capture of data representative of an artificial reality environment
CN111068272A (en) * 2019-12-27 2020-04-28 上海玩星信息技术有限公司 Interactive shooting machine and control method thereof
CN111249721B (en) * 2020-01-20 2023-09-01 Zheng Yuyang Game design method and device based on virtual reality (VR) technology
AU2021265107A1 (en) * 2020-04-30 2022-11-24 Joseph Fields Systems and methods for augmented-or virtual reality-based decision-making simulation
CA3182568A1 (en) * 2020-05-08 2021-11-11 Sumitomo Pharma Co., Ltd. Three-dimensional cognitive ability evaluation system
US11210843B1 (en) * 2020-07-15 2021-12-28 Disney Enterprises, Inc. Virtual-world simulator
CN111950407B (en) * 2020-07-30 2023-12-05 Zhejiang University Immersion-based shuttlecock trajectory analysis method and system
US20220092498A1 (en) * 2020-09-24 2022-03-24 International Business Machines Corporation Virtual reality simulation activity scheduling
US11660022B2 (en) 2020-10-27 2023-05-30 Snap Inc. Adaptive skeletal joint smoothing
US11615592B2 (en) 2020-10-27 2023-03-28 Snap Inc. Side-by-side character animation from realtime 3D body motion capture
CN112286355B (en) * 2020-10-28 2022-07-26 杭州天雷动漫有限公司 Interactive method and system for immersive content
US11896882B2 (en) 2020-11-04 2024-02-13 Xreps, Llc Extended reality system for training and evaluation
US11213733B1 (en) * 2020-11-05 2022-01-04 Liteboxer Technologies, Inc. Surface interactions in a virtual reality (VR) environment
US11450051B2 (en) * 2020-11-18 2022-09-20 Snap Inc. Personalized avatar real-time motion capture
US11734894B2 (en) 2020-11-18 2023-08-22 Snap Inc. Real-time motion transfer for prosthetic limbs
US11748931B2 (en) 2020-11-18 2023-09-05 Snap Inc. Body animation sharing and remixing
US11763527B2 (en) * 2020-12-31 2023-09-19 Oberon Technologies, Inc. Systems and methods for providing virtual reality environment-based training and certification
TWI770874B (en) 2021-03-15 2022-07-11 楷思諾科技服務有限公司 Method for displaying simulation images through clicking and rolling operations
CA3213537A1 (en) * 2021-04-01 2022-10-06 Tawnia K. Weiss Experiential virtual reality event host delivery dais
US11210844B1 (en) 2021-04-13 2021-12-28 Dapper Labs Inc. System and method for creating, managing, and displaying 3D digital collectibles
US11099709B1 (en) 2021-04-13 2021-08-24 Dapper Labs Inc. System and method for creating, managing, and displaying an interactive display for 3D digital collectibles
USD991271S1 (en) 2021-04-30 2023-07-04 Dapper Labs, Inc. Display screen with an animated graphical user interface
US11227010B1 (en) 2021-05-03 2022-01-18 Dapper Labs Inc. System and method for creating, managing, and displaying user owned collections of 3D digital collectibles
US11533467B2 (en) * 2021-05-04 2022-12-20 Dapper Labs, Inc. System and method for creating, managing, and displaying 3D digital collectibles with overlay display elements and surrounding structure display elements
US11170582B1 (en) 2021-05-04 2021-11-09 Dapper Labs Inc. System and method for creating, managing, and displaying limited edition, serialized 3D digital collectibles with visual indicators of rarity classifications
US20220398936A1 (en) * 2021-06-15 2022-12-15 Richard Parker Aircraft training aid systems and processes
US11880947B2 (en) 2021-12-21 2024-01-23 Snap Inc. Real-time upper-body garment exchange
CN115624735B (en) * 2022-10-12 2023-06-27 杭州欣禾圣世科技有限公司 Auxiliary training system for ball games and working method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5443260A (en) 1994-05-23 1995-08-22 Dynamic Sports Technology Virtual reality baseball training and amusement system
US6164973A (en) 1995-01-20 2000-12-26 Vincent J. Macri Processing system method to provide users with user controllable image for use in interactive simulated physical movements
US20030040381A1 (en) * 1999-02-04 2003-02-27 Richings Richard J. Accurate, multi-axis, computer-controlled object-projection machine and manufacturing process employing the accurate, multi-axis, computer-controlled object-projection machine
US20070072705A1 (en) * 2005-09-26 2007-03-29 Shoich Ono System for pitching of baseball
US8406906B2 (en) 2009-05-23 2013-03-26 Dream Big Baseball, Inc. Baseball event outcome prediction method and apparatus
US8579734B2 (en) * 2009-07-02 2013-11-12 Stephen Joseph Stemle Throwing target, system, and method
JP6366128B2 (en) * 2011-02-15 2018-08-01 Axon Sports, LLC Program
US9345957B2 (en) * 2011-09-30 2016-05-24 Microsoft Technology Licensing, Llc Enhancing a sport using an augmented reality display
US20130244211A1 (en) * 2012-03-15 2013-09-19 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for measuring, analyzing, and providing feedback for movement in multidimensional space
US20140274486A1 (en) * 2013-03-15 2014-09-18 Wilson Sporting Goods Co. Ball sensing
US9339709B2 (en) * 2014-08-09 2016-05-17 Les Lagier Guide arm machine
US9889358B2 (en) * 2015-06-04 2018-02-13 Jeffrey Kyle Greenwalt Systems and methods utilizing a ball including one or more sensors to improve pitching performance

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220264075A1 (en) * 2021-02-17 2022-08-18 flexxCOACH VR 360-degree virtual-reality system for dynamic events
US11622100B2 (en) * 2021-02-17 2023-04-04 flexxCOACH VR 360-degree virtual-reality system for dynamic events
US12041220B2 (en) 2021-02-17 2024-07-16 flexxCOACH VR 360-degree virtual-reality system for dynamic events
CN112883883A (en) * 2021-02-26 2021-06-01 北京蜂巢世纪科技有限公司 Information presentation method, information presentation device, information presentation medium, glasses, and program product

Also Published As

Publication number Publication date
US20170151484A1 (en) 2017-06-01
US10300362B2 (en) 2019-05-28
US10821347B2 (en) 2020-11-03
US20190255419A1 (en) 2019-08-22
US10486050B2 (en) 2019-11-26

Similar Documents

Publication Publication Date Title
US10821347B2 (en) Virtual reality sports training systems and methods
US11826628B2 (en) Virtual reality sports training systems and methods
US11642047B2 (en) Interactive training of body-eye coordination and reaction times using multiple mobile device cameras
US11132533B2 (en) Systems and methods for creating target motion, capturing motion, analyzing motion, and improving motion
US10695644B2 (en) Ball game training
US11783721B2 (en) Virtual team sport trainer
US20160314620A1 (en) Virtual reality sports training systems and methods
Miles et al. A review of virtual environments for training in ball sports
TWI377055B (en) Interactive rehabilitation method and system for upper and lower extremities
US9345957B2 (en) Enhancing a sport using an augmented reality display
US8418085B2 (en) Gesture coach
JP2019535347A (en) Method and system for using sensor of control device for game control
US20170036106A1 (en) Method and System for Portraying a Portal with User-Selectable Icons on a Large Format Display System
US20140078137A1 (en) Augmented reality system indexed in three dimensions
US20180339215A1 (en) Virtual reality training system for team sports
Yeo et al. Augmented learning for sports using wearable head-worn and wrist-worn devices
US20240245975A1 (en) Mixed reality simulation and training system
US11331551B2 (en) Augmented extended realm system
Bhandari Influence of Perspective in Virtual Reality
Miles An Advanced Virtual Environment for Rugby Skills Training

Legal Events

Date Code Title Description
AS Assignment

Owner name: EON REALITY SPORTS, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REILLY, BRENDAN;HUANG, YAZHOU;CHURCHES, LLOYD;AND OTHERS;SIGNING DATES FROM 20170210 TO 20170213;REEL/FRAME:051077/0113

Owner name: WIN REALITY, LLC, TEXAS

Free format text: CHANGE OF NAME;ASSIGNOR:EON SPORTS, LLC;REEL/FRAME:051077/0259

Effective date: 20181227

Owner name: EON SPORTS, LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EON REALITY SPORTS, LLC;REEL/FRAME:051077/0203

Effective date: 20181026

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: LAGO INNOVATION FUND, LLC, GEORGIA

Free format text: SECURITY AGREEMENT;ASSIGNOR:WIN REALITY, LLC;REEL/FRAME:057247/0112

Effective date: 20210819

AS Assignment

Owner name: LAGO INNOVATION FUND III, LLC, GEORGIA

Free format text: SECURITY AGREEMENT;ASSIGNORS:WIN REALITY, INC.;WIN REALITY, LLC;REEL/FRAME:066141/0332

Effective date: 20231228

Owner name: WIN REALITY, LLC, TEXAS

Free format text: RELEASE OF IP SECURITY INTEREST;ASSIGNOR:LAGO INNOVATION FUND, LLC;REEL/FRAME:066141/0444

Effective date: 20231228

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 4