WO2022056315A1 - System and method for capture and analysis of sporting performance data and broadcast of the same - Google Patents


Info

Publication number
WO2022056315A1
WO2022056315A1 (PCT/US2021/049950)
Authority
WO
WIPO (PCT)
Prior art keywords
impact
ball
velocity vector
sporting equipment
equipment
Prior art date
Application number
PCT/US2021/049950
Other languages
French (fr)
Inventor
Bernhard Wilhelm Benjamin RICHTER
Jacob Van Reenen PRETORIUS
Original Assignee
Richter Bernhard Wilhelm Benjamin
Pretorius Jacob Van Reenen
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Richter Bernhard Wilhelm Benjamin, Pretorius Jacob Van Reenen filed Critical Richter Bernhard Wilhelm Benjamin
Publication of WO2022056315A1 publication Critical patent/WO2022056315A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/215Motion-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30221Sports video; Sports image
    • G06T2207/30224Ball; Puck
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use

Definitions

  • This invention relates to systems and methods for capture, display and analysis of the performance of sporting equipment, players, and the broadcasting of sporting performance data in a manner that can be displayed in conjunction with, or separately from, video coverage of the sporting activity.
  • the object of the game is to deliver a certain outcome with the ball using the body, a bat, club, racket, or other mechanisms/techniques as permitted by the rules of the particular sport.
  • the outcome is usually measured as scoring a goal in soccer or hockey, a home run in baseball, a boundary in cricket, an ace in tennis, or any other metric(s) as defined in the rules of the associated sport.
  • The final position or outcome of the game ball, puck, shuttlecock, etc. has determined the degree of effectiveness of the shot and/or the player (if at all).
  • the outcome of the shot is, however, not always a true representation of the skill of the player(s) executing a shot as other players (such as fielders), nets, posts, and/or other obstacles that are part of the sport can sometimes inhibit the outcome of the action, and mask the effectiveness and/or skill of a player’s action.
  • For example, badly hit shots can still reach a boundary in cricket, a miscue can still reach the goal, and a badly executed shot can still go past a player to win a point.
  • Such devices may also fail to comply with the strict regulations that apply to bats, rackets, clubs, and other hitting devices as permitted by the specific rules of the respective sports.
  • Some devices, such as the device(s) described in U.S. Patent No. 10,527,487, entitled SYSTEM AND METHOD FOR SENSING HIGH-FREQUENCY VIBRATIONS ON SPORTING EQUIPMENT, filed 5/30/2017, which is incorporated herein by reference as useful background information, overcome some, but not all, of these disadvantages.
  • all these data gathering devices require physical adhesion/fastening of a device to the player’s equipment, thereby interfering with the player in some manner or form, or potentially being in breach of the rules of the sport.
  • Some performance data/metrics are presently derived from third party sensing devices that do not require (are free of) attachment to, or interference with, the associated sporting equipment.
  • optical, radar, and acoustic based systems are ubiquitous at professional sporting events such as golf, baseball, soccer, tennis, and cricket where the ball is tracked to inform viewers of the trajectory, position of impact with the ground, or distance of a shot.
  • various sports utilize a form of these technologies to increase the accuracy of umpiring decisions such as goal-line crossing, strike zone penetration, ball location and its position relative to features of the sport, bat contact, or leg-before-wicket decisions to name a few.
  • This invention overcomes disadvantages of the prior art by providing a system and method for generating independent, repeatable, high fidelity, and verifiable information regarding the outcome of a player’s action with respect to a sport ball, particularly in sporting activities that employ equipment to engage/hit the ball — for example bat-on-ball sports, in a manner that does not interfere with/is free of interference and/or distraction relative to the player and/or does not require/is free of any involvement by the player or the player’s equipment in the data gathering. Moreover, operation of the system and method does not implicate any restrictions based upon the rules or regulations of the sport.
  • the system and method operates to render relatively complex measurements of captured audio, visual and/or radar information into simple, human understandable scalars, which, in turn, can be used to create informatics such as infographics, live metrics used in broadcasting, and/or near real-time metric dashboards.
  • the system and method can be applied effectively to, but not limited for use in, summarizing common phenomena in hitting ball sports such as baseball, cricket, golf, hockey, and tennis.
  • this system and method presents a set of metrics that comparatively summarize player performance and how to efficiently compute them from audio, visual, and/or radar data collected from sports broadcasting equipment without the need to physically contact the equipment.
  • the system and method also advantageously allows for manipulation of computational parameters in order to produce ideal metrics which account for human bias and the desire for familiar info-metric scales (such as a scale from 1 to 100). More generally, the system and method herein overcomes the deficiencies of the prior art by utilizing readily available, non-contact, unobtrusive, and, in some cases, already implemented sensors and sensing technology to deliver metrics such as ball-speed off the bat, amount of energy delivered to the ball by the player, equipment speed, as well as the effectiveness of the power transfer between equipment and ball.
  • a system and method for obtaining velocity vectors of a first sporting equipment in conjunction with engagement by the equipment with a second sporting equipment is provided.
  • the system and method provides one or more of remote sensing modalities that generate two respective forms of data free of contact with the first and the second sporting equipment, and combines the respective forms of data to provide the velocity vectors.
  • a post-contact velocity vector of the first sporting equipment is determined by visual ball tracking using a camera assembly that generates a succession of image frame data.
  • the post-contact velocity vector by the first sporting equipment can be determined by radar and LIDAR ball-tracking.
  • a post-contact velocity vector can be determined by a combination of ball-tracking and audio recording.
  • a difference between a velocity vector of the first sporting equipment before and after contact with the second sporting equipment can also be determined by the system and method. Additionally, the system and method can calculate a change in kinetic energy of the first sporting equipment based upon the difference in the velocity vector, and/or a change in momentum of the first sporting equipment based upon the difference in the velocity vector. The system and method can further calculate power delivered to the first sporting equipment based upon the difference in velocity vector in combination with the duration of contact between the first sporting equipment and the second sporting equipment during engagement therebetween. Illustratively, the system and method can determine the duration of contact from an audio recording of the impact.
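The quantities described above follow from standard kinematics. As a sketch (with m the mass of the first sporting equipment, v_pre and v_post the velocity vectors before and after contact, and Δt the audio-derived contact duration, all assumed known):

```latex
\Delta \vec{p} = m\,\bigl(\vec{v}_{\text{post}} - \vec{v}_{\text{pre}}\bigr)
\qquad
\Delta E_k = \tfrac{1}{2}\,m\,\bigl(\lVert\vec{v}_{\text{post}}\rVert^{2} - \lVert\vec{v}_{\text{pre}}\rVert^{2}\bigr)
\qquad
P = \frac{\Delta E_k}{\Delta t}
```

These relations are elementary; the contribution of the system is obtaining the vectors and the contact duration without contacting the equipment.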
  • a system and method for tracking a pre-impact velocity vector and a post-impact velocity vector of a pair of sporting equipment coming into contact with each other is provided.
  • Motion information from at least one non-contacting sensor generating a discrete type of motion data is received, and the data from the non-contacting sensor is used to provide information relative to the pre-impact velocity vector and the post-impact velocity vector.
  • the system and method can determine the pre-impact velocity vector and the post-impact velocity vector based upon visual ball and equipment tracking.
  • the system and method can also determine the pre-impact velocity vector and the post-impact velocity vector using radar and LIDAR ball and equipment tracking.
  • the post-impact velocity vector of the sporting equipment can be determined based upon a combination of ball-tracking and audio recording.
  • the system and method can calculate a change in kinetic energy of the sporting equipment based upon a change between the pre-impact velocity vector and the post-impact velocity vector, and/or the efficiency of transfer of energy during the impact based upon a ratio of the change in kinetic energy between each piece of the sporting equipment.
  • the system and method can also calculate a pre-impact kinetic energy of the sporting equipment based upon the pre-impact velocity vector.
  • one piece of the sporting equipment is a ball
  • the system and method can calculate the efficiency of transfer of the kinetic energy between each piece of the sporting equipment based upon a ratio of a difference in pre-impact kinetic energy and post-impact kinetic energy of a ball and the pre-impact velocity of the piece of sporting equipment impacting the ball.
  • the other piece of sporting equipment is a bat, club, racket or other hand-held implement for impacting and/or imparting kinetic energy to the ball.
  • FIG. 1A is a diagram showing exemplary images of various sports during engagement of a ball with associated information captured by video cameras and tracking systems for use in the embodiments herein;
  • FIGs. 2A-2E are diagrams of exemplary images depicting various infographics presented by broadcasters as derived from auxiliary sensors according to the embodiments herein;
  • FIG. 3A is a diagram showing a generalized system for acquiring various forms of data with respect to a sporting activity using remote sensing devices, and an associated processing arrangement for generating meaningful data that can be displayed and manipulated relative to the activity;
  • Fig. 3B is a diagram depicting different, exemplary traces and vectors associated with ball flight pre-impact and post-impact with sporting equipment;
  • FIG. 4 is a flow diagram showing a process utilizing video and acoustic information to determine performance metrics in sports using the exemplary arrangement of Fig. 1;
  • FIGs. 5A and 5B are diagrams of exemplary images of information captured by synchronized microphone and video sensor systems according to the arrangement of Fig. 1;
  • FIGs. 6A and 6B are diagrams showing an example of the use of captured acoustic and video sensor data to determine the point of and result of impact of sport equipment according to embodiments herein;
  • Fig. 7 is a graph showing an example of a captured acoustic trace of the impact of sports equipment according to the embodiments herein;
  • FIG. 8 is a flow diagram showing a process for utilizing acoustic and video information to determine performance metrics in sports according to embodiments herein;
  • Figs. 9A and 9B are graphs showing an exemplary series of captured positions and the actual velocity of a bat over time during a swing at, and impact with, a ball, analyzed using the arrangement of Fig. 1 and processes of the embodiments herein.
  • the higher data capture rate of the high-speed video cameras has also enabled system providers to track the flight of the ball during plays. Flight tracking is currently performed by either radar systems or high-speed cameras, or both, depending upon the speed of the ball or the specific sport.
  • the term “ball”, as used herein, can refer to any sport equipment that is used to determine the outcome of the sport. Therefore, the term “ball” can include, but is not limited to, a puck, chuck, shuttlecock, and any other device that experiences a velocity and/or direction change due to the impact of equipment that are attached to/manipulated by a player or due to impact by the player themselves.
  • Radar-based systems, such as the above-noted Trackman, can also deliver player-based metrics based on the speed of the club and the speed of the ball. Disadvantageously, these metrics are limited to sports where the ball is initially at rest before impact, such as in golf. Some of the systems currently in use in sports, such as baseball, are able to detect the velocity vector of a moving ball before it comes into contact with the equipment, as well as after it has made contact with the equipment. The process of capturing that information is well-known to those skilled in the art. Also disadvantageously, non-contact based systems are currently unable to determine the speed of the equipment, such as a cricket bat, before making contact with the ball.
  • the present invention can perform the task of accurately measuring the ball velocity vector before and after contact with the hitting equipment, as well as the velocity of the hitting equipment. Furthermore, the invention can utilize this information, as well as other non-contact information such as sound recordings, to establish the time and point of contact between ball and hitting equipment.
  • the information contained in the measured velocity vectors, contact time, and contact point can be sufficient to calculate and produce a plurality of values, measurements, and metrics that are novel and have not been implemented in prior art arrangements, and can thereby provide a non-contact approach for calculating various unique measurements and values provided by visual and acoustic sensors herein, as well as values and measurements that have been implemented by prior art implementations.
  • Fig. 1 depicts an example of captured screen images 100 of an optical ball tracking system utilized in various sports, for example sports employing a bat 113, such as baseball (images 110, 112) and cricket (images 114, 116).
  • mathematical formulas such as Kalman filters are able to calculate the position of the ball in each frame of the video.
  • This position vector is fed to a digital overlay system 360 that is able to plot the position of the ball on the video and produce flight track 101 as shown in the images 110, 112, 114 and 116 of Fig. 1.
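The text names Kalman filtering but no implementation; the following is a minimal, illustrative constant-velocity Kalman filter over per-frame pixel detections. All matrix choices, noise values, and function names are assumptions, not taken from the patent.

```python
import numpy as np

def make_cv_kalman(dt, q=1.0, r=4.0):
    # State: [x, y, vx, vy]; measurement: detected pixel position [x, y].
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)   # constant-velocity transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # we observe position only
    Q = q * np.eye(4)                           # process noise (model slack)
    R = r * np.eye(2)                           # measurement noise (pixel jitter)
    return F, H, Q, R

def kalman_track(measurements, dt=1 / 250):
    """Filter a sequence of (x, y) ball detections into smoothed states."""
    F, H, Q, R = make_cv_kalman(dt)
    x = np.array([measurements[0][0], measurements[0][1], 0.0, 0.0])
    P = np.eye(4) * 100.0                       # large initial uncertainty
    states = []
    for z in measurements:
        # Predict the next state from the motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Correct with the detected ball position.
        y = np.asarray(z, dtype=float) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        states.append(x.copy())
    return np.array(states)
```

The filtered states carry both a smoothed position (for plotting flight track 101) and a velocity estimate per frame.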
  • Fig. 2A illustrates another addition to the ball tracking system depicted generally in Fig. 1.
  • The position vector and time stamp of each frame in the video are utilized to calculate the velocity vector and speed of the ball at different points during flight.
  • Fig. 2A thus depicts a captured image 210 of a ball bowled by a bowler in a cricket match, and how the speed of the ball changes from release point 221 to when it reaches the batsman 218 after it has come into contact with the ground.
  • The release point 221 speed is significantly reduced by aerodynamic drag and by the friction of the ball coming into contact with the ground, reducing the position 221 speed by almost 16% to attain the batsman's position 222 speed.
  • the image 210 illustrates the ability of current available optical technology to measure and compute the velocity of the ball at different points of flight as well as compute track 101.
  • Fig. 2B shows an image 230 that further illustrates the information captured by a ball tracking system before and after impact with sports hitting equipment.
  • Pre-impact ball tracking positions 223 show how rich the information captured by the camera is. It should be clear to those of skill that the ball position and speed can be calculated frame by frame and then compared with the predicted position. Errors in the prediction can then be corrected by tweaking parameters of the prediction algorithm in a process called calibration, well known to those skilled in the art. Previous testing has shown that current prediction systems need only (e.g.) four (4) frames of ball trajectory to be able to converge coefficients to the point that the predicted flight of the ball is within the margin of error of the ball tracking system itself.
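The four-frame convergence claim can be illustrated with a least-squares ballistic (quadratic-in-time) fit per axis; with four exact samples a quadratic is recovered exactly. The function names are hypothetical and not from the patent.

```python
import numpy as np

def fit_trajectory(times, positions):
    """Least-squares quadratic fit per axis: p(t) = a*t^2 + b*t + c."""
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)   # shape (N, dims)
    return [np.polyfit(times, positions[:, d], 2)
            for d in range(positions.shape[1])]

def predict(coeffs, t):
    """Extrapolate the fitted trajectory to time t."""
    return np.array([np.polyval(c, t) for c in coeffs])
```

A real tracker would refit (recalibrate) as each new frame arrives and compare the prediction against the next detection.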
  • Fig. 2C depicts an image 240, which illustrates ball tracking after the ball has impacted sporting hitting equipment (bat 113) controlled by the player.
  • the embodiments herein desirably operate to compute and/or capture the velocity vector of the ball immediately post-impact with hitting equipment.
  • Fig. 2D depicts an exemplary image 250 where the average speed of the ball after impact with the hitting equipment has been measured.
  • Post-impact ball speed 224 displays the magnitude of the velocity vector, but lacks the direction of the vector as known to those skilled in the art.
  • The direction of the vector is, however, readily available from trajectory 226, which is computed from a plurality of discrete camera captures.
  • Fig. 2E depicts an image 260, showing an example of the types of visual infographic and metrics 227, which are captured by sensing systems with regard to player performance, and which are currently displayed by broadcasters to augment audience viewing experience. These metrics are measured by sensing systems that are in physical contact with the player or player’s equipment.
  • The present embodiments are, therefore, adapted to deliver similar metrics, but aim to capture them with sensing systems that are not in contact with the player or player's equipment, and that are currently deployed at professional sporting events.
  • the arrangement 300 includes various sensing devices, including, but not limited to, one or more cameras 310 and 312 (shown in phantom).
  • the camera(s) provide video data 314 to, and receive control signals from, the tracking process(or) 340 (described further below).
  • One or more audio sensors (e.g., microphones with or without appropriate sound-focusing structures) provide audio data 322 to the tracking process(or) 340.
  • Other sensors 326 such as radar, laser range finders, LIDAR, time of flight devices, etc., can be employed to deliver appropriate data 328 to the process(or) 340.
  • the tracking process(or) 340 can be implemented using a variety of processing modalities, including, but not limited to, a general purpose computer — such as a server, PC, laptop, tablet, etc., an ASIC, FPGA, GPU-accelerated parallel processing platform, or other custom-built data processor. Appropriate capture cards, etc. can also be employed for the particular form of input data in a manner clear to those of skill.
  • the process(or) 340 can define a plurality of functional processes/ors and/or modules. These modules can include, but are not limited to, a video process(or) 342, that receives and analyzes video data 314, and/or an audio process(or) 344 that receives and analyzes audio data 322.
  • the other sensors 326 can transmit data 328 (and receive appropriate control signals) to an appropriate sensor process(or) 346 that is adapted to handle the particular data provided by the sensor(s) 326.
  • a generalized infographic process(or) 348 receives processed data from each of the process(ors) 342, 344, 346, and other data stores (not shown) to generate appropriate infographic and related performance data in a manner described below.
  • An artificial intelligence system 349 can serve the tracking process(or) 340, the video process(or) 342, the audio process(or) 344, the other sensor process(or) 346, and the infographic generator 348 individually, or collectively.
  • Artificial intelligence system 349 can provide machine learning, deep learning, and machine vision techniques, including foreground detection, object tracking, optical flow, semantic segmentation, and other artificial intelligence functions, in a manner generally known to those skilled in the art. Included in these artificial intelligence functions are algorithms/processors, decision trees, classification trees, databases, optical vectors, masks, and training sets, among others.
  • The data can be displayed on an appropriate display, for example, a computer screen 350, mobile devices 353, and/or other receiving systems 354 that interface with other broadcast (and/or streaming) resources as shown, and/or can be provided directly to broadcast applications and/or digital overlay systems 360.
  • the data can be distributed to and/or stored on other destinations such as the cloud 351, local storage 352, mobile devices 353, and/or other receiving systems 354 for storage, distribution, and consumption, in a manner generally known to those of skill.
  • Video data 314, audio data 322, and other sensor data 328 can also be retrieved from recordings of events that happened in the past and have been stored on devices such as the cloud 351, local storage 352, mobile devices 353, other receiving systems 354, and/or broadcast applications and/or digital overlay systems 360.
  • the results and output of video process(or) 342, audio process(or) 344, or other sensor process(or) of an event or events can be stored and delivered to infographic generator 348 at a later date than when the event(s) occurred.
  • the tracking processor 340 can perform the functions of the illustrative embodiments described here on live, real-time, historical, archived, and/or otherwise stored video (and other) data/media.
  • Fig. 3B is a diagram showing a graphical representation 330, by way of non-limiting example, of an embodiment for graphically/mathematically solving the performance metrics described and contemplated above.
  • vectors are defined as magnitudes in different axes — for example three orthogonal, Cartesian axes, that can be labelled x, y and z.
  • The depicted axes are an x axis 331, which is horizontal and in the direction of the ball traveling toward the player; a y axis 332, which is horizontal and perpendicular to the travel of the ball toward the player; and a z axis 333, which is parallel to gravity, or vertical.
  • Pre-impact ball trajectory 334 intersects player hitting equipment (e.g., bat) 113 (Fig. 1A, etc.) at impact point 336, and defines an incoming velocity vector 337 at impact point 336.
  • the impact with equipment 113 caused the ball to change direction and follow post-impact ball trajectory 335, with outgoing velocity vector 338 at impact point 336.
  • Pre-impact ball trajectory 334 and post-impact ball trajectory 335 are determined by the systems and methods described above, which are known to those of skill. Once pre-impact trajectory 334 and post-impact trajectory 335 are known, their intersection can be calculated by known mathematical techniques. This intersection is, thus, defined in time and space as impact point 336. Once impact point 336 is known, incoming velocity vector 337 of pre-impact trajectory 334 and outgoing velocity vector 338 of post-impact trajectory 335 can be calculated.
  • The magnitude of velocity vectors 337 and 338 is defined as the square root of the sum of the squares of the velocity components along the axes (x 331, y 332 and z 333), as described by the following equation:

    |V| = sqrt(Vx^2 + Vy^2 + Vz^2)

    where V is the velocity along the axis indicated by the subscript.
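As a direct transcription of this magnitude computation (the helper name is illustrative):

```python
import math

def speed(vx, vy, vz):
    # Magnitude of a velocity vector from its three axis components.
    return math.sqrt(vx ** 2 + vy ** 2 + vz ** 2)
```

For example, components of (3, 4, 12) give a speed of 13 in the same units.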
  • Fig. 4 is a flow diagram showing a procedure 400 for determining the change in kinetic energy and associated efficiency.
  • visual tracking in the above example of calculating metrics can be replaced by other tracking mechanisms such as LIDAR, radar, and any other tracking system that can track a moving object and capture its position in time.
  • A microphone 410 can provide sound inputs from the sports play/ball impact to an impact identifier 412.
  • A front-on oriented camera 420 delivers at least (e.g.) four (4) pre-impact and post-impact successive image frames of the (e.g.) bat and ball action to a collection process 422, which also receives the impact identification information 412 as a trigger.
  • One or more side-on camera(s) 424 deliver at least four pre-impact and post-impact image frames (typically in time-synchronization with the front-on camera 420) to the collection process 422.
  • These image frames are transmitted by both cameras 420, 424 to a pre-impact bat frame data store 426.
  • the impact identification process 412 transmits the audio data to a contact duration process 428 that determines the time through which the ball is contacted by the bat. This data is then transmitted to a power calculation process 432, which uses this information as one of the factors (described further below) to compute the power transferred to the ball by the bat or other equipment piece.
  • the collected frames are transmitted from the collection process 422 to a ball location identification process 430 that identifies the location of the ball in each of the image frames.
  • This process provides the location information to an incoming ball trajectory determination process 434 and outgoing ball trajectory determination process 436.
  • These two pieces of data are then provided to an impact point calculation process 440.
  • the overall data is then provided to an incoming vector calculation process 442 and an outgoing vector calculation process 444.
  • the calculated, incoming and outgoing vectors are then provided to a vector velocity change calculation process 446.
  • This data is provided to calculate the change in momentum in the process 450.
  • the information from the velocity vector change process 446 is also provided to a change in kinetic energy calculation process 454.
  • The computed change in kinetic energy is used, along with the above-described contact duration time (process 428), to calculate the power delivered to the ball by the bat/equipment (process 432).
  • the change in kinetic energy from the process 454 is used to calculate energy transfer efficiency (process 456).
  • the pre-impact frame data is used to determine pre-impact bat speed (process 460), and bat kinetic energy (462), which is provided as a second input to the kinetic energy transfer efficiency calculation process 456.
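The chain of calculation processes above can be sketched as one function. Masses, vectors, the contact duration, and the bat speed are assumed inputs from the earlier stages; the mapping of lines to process numbers is indicative only.

```python
import numpy as np

def impact_metrics(v_in, v_out, ball_mass, contact_s, bat_speed, bat_mass):
    """Compute the Fig. 4 metrics from measured vectors (illustrative sketch)."""
    v_in, v_out = np.asarray(v_in, float), np.asarray(v_out, float)
    dv = v_out - v_in                                      # process 446
    dp = ball_mass * dv                                    # process 450
    dke = 0.5 * ball_mass * (v_out @ v_out - v_in @ v_in)  # process 454
    power = dke / contact_s                                # process 432
    bat_ke = 0.5 * bat_mass * bat_speed ** 2               # process 462
    efficiency = dke / bat_ke                              # process 456
    return {"delta_v": dv, "delta_p": dp, "delta_ke": dke,
            "power_w": power, "efficiency": efficiency}
```

With an incoming ball at 40 m/s, an outgoing ball near 47 m/s, and a 1 ms contact, the delivered power lands in the tens of kilowatts, which is why the short audio-derived contact duration matters.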
  • FIGs. 5A and 5B are images of an exemplary use of acoustic capture technology, which is another widely available sensor technology deployed in sport.
  • Use of a microphone arrangement to perform acoustic capture of a scene is widely used to enhance viewer experience since microphones can be placed strategically, and directionally oriented, near players, spectators, and umpires to capture the sounds and noises associated with the game and bring those additional experiences to a viewer. In some sports, like cricket, sound is also utilized to aid in umpire decision making. Acoustic capture provides additional, richer data sets since it is often captured at rates that far exceed those of visual systems.
  • Standard acoustic capturing can occur between 10 kHz and 80 kHz, or 10,000 to 80,000 samples per second, compared to 250 frames per second (fps) of typical visual information, thereby providing detail at a far higher rate than other sensing equipment typically deployed during these sporting events.
  • Figs. 5A and 5B depict an example (successive image frames) of a technique whereby the visual frame-capturing of the cameras is synchronized with the audio capture.
  • The cameras from the front (images 510, 512) and from the side (images 520, 522) of the batsman 530 in cricket are synchronized with each other, as well as with a microphone placed in close proximity to the batsman.
  • Acoustic signature box 551 displays the sound captured between two consecutive video frame captures.
  • Video capture can occur between 60 and 1,000 fps.
  • the video is captured at 250 fps, and thus, there is a 4 millisecond (ms) delay between video frames.
  • the sound in this example is recorded at 44,200 Hz. Consequently, there are 176 sound samples taken between the consecutive video frames in the video sample.
  • Acoustic signature box 551 plots all 176 sound samples associated with the time between the capturing of the two frames.
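The sample count in the text is simple arithmetic: the audio rate divided by the frame rate gives the number of whole audio samples per frame interval.

```python
def samples_between_frames(audio_hz, fps):
    # Whole audio samples captured between two consecutive video frames.
    return int(audio_hz / fps)
```

At 44,200 Hz and 250 fps this yields 176 samples per 4 ms frame interval, matching the figure quoted above.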
  • acoustic signature box 551 shows no sound recorded since ball 552 has not reached bat 113.
  • contact noise 554 has been recorded and displayed in acoustic signature box 551.
  • The acoustic signature is useful in a number of ways. Firstly, if the illustrative intersecting/combinatory procedure 400 described above and visualized in Fig. 4 is utilized, contact noise 554 can indicate and mark the frame within which contact happens, and thereby identify the frame and point in time that contains impact point 336, incoming velocity vector 337, and outgoing velocity vector 338 (Fig. 3B). Secondly, when the acoustic signature is properly synchronized with the video recording, another exemplary procedure to determine incoming and outgoing velocity vectors 337 and 338 can be implemented. Hence, referring to Fig. 6A:
  • signature box 551 contains the acoustic signature between two consecutive video frames with contact noise 554 indicating that contact between ball and hitting equipment bat 113 has occurred in the time interval between the two consecutive frames.
  • Pre-impact frame A 661 shows the location of the ball in the video frame before contact, and post-impact frame B 662 shows the location of the ball after impact.
  • Pre-impact trajectory 334 shows the trajectory of the ball up to pre-impact frame A and incoming vector 337 is the vector of the ball at pre-impact frame A.
  • Pre-impact trajectory 334 and incoming vector 337 can be determined from the immediately preceding contiguous video frames, as illustrated in Fig. 1, by those skilled in the art.
  • Any impact that occurs between the capture of consecutive frame A 661 and frame B 662 will have a t1 663 and a t2 664 associated with it that add up to 0.004 seconds for a frame rate of 250 fps, or, more generally, to the time between consecutive video frame captures, as shown in Fig. 6A.
  • Frame A 661 is, therefore, a pre-impact capture and frame B 662 a post-impact capture.
  • Impact noise 554 can be detected by various techniques known to those skilled in the art. For example, a basic amplitude threshold can be utilized. Alternatively, a high-pass, low-pass, or band-pass filter can be used to remove unwanted noise from the signal, after which a threshold trigger can be used. These and other procedures that can be used to detect the time of impact are well known to those skilled in the art.
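The amplitude-threshold variant of this detection can be sketched as follows. This is a minimal illustration only; the 44.1 kHz sample rate, the 250 fps frame interval, the function names, and the sample values are assumptions, not parameters of the described system:

```python
# Sketch of amplitude-threshold impact detection between two video frames.
# All rates, names, and sample values here are illustrative assumptions.

def detect_impact_sample(samples, threshold):
    """Index of the first sample whose magnitude exceeds the threshold,
    or None when no impact noise is present between the two frames."""
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            return i
    return None

def impact_times(samples, sample_rate, frame_interval, threshold=0.5):
    """Split the inter-frame interval into t1 (frame A to impact) and
    t2 (impact to frame B); t1 + t2 equals the frame interval."""
    idx = detect_impact_sample(samples, threshold)
    if idx is None:
        return None
    t1 = idx / sample_rate
    return t1, frame_interval - t1

# 176 samples between consecutive frames, impact noise at sample 88
audio = [0.01] * 88 + [0.9, -0.7, 0.4] + [0.02] * 85
t1, t2 = impact_times(audio, sample_rate=44_100, frame_interval=1 / 250)
```

A band-pass pre-filter, as mentioned above, would simply transform `samples` before the same threshold test is applied.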
  • impact point 336 can be calculated by multiplying t1 by incoming velocity vector 337 to obtain distance vector 665 that the ball has travelled to impact point 336. Adding distance vector 665 to the known captured position of pre-impact frame A 661 will fix impact point 336 in space. The difference in position between impact point 336 and the post-impact capture of frame B provides the outgoing distance vector, and dividing that vector by t2 produces outgoing velocity vector 338. Similarly, t1 can be utilized with pre-impact trajectory 334 to establish impact point 336 in a manner clear to those skilled in the art.
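The arithmetic of this step can be sketched as follows, assuming two-dimensional ball positions from a 250 fps camera; all numeric values and names are illustrative assumptions:

```python
# Sketch of the impact-point geometry: distance vector 665 is t1 times the
# incoming velocity; the outgoing velocity divides the frame-B offset from
# the impact point by t2. All values below are illustrative assumptions.

def impact_point(pos_a, v_in, t1):
    """Frame-A ball position plus t1 * incoming velocity vector."""
    return tuple(p + t1 * v for p, v in zip(pos_a, v_in))

def outgoing_velocity(pos_b, point, t2):
    """(Frame-B ball position - impact point) / t2."""
    return tuple((p - q) / t2 for p, q in zip(pos_b, point))

t1, t2 = 0.0015, 0.0025            # seconds; sum is the 0.004 s frame gap
pos_a = (0.0, 1.0)                 # ball position in pre-impact frame A (m)
v_in = (40.0, -2.0)                # incoming velocity vector (m/s)
pt = impact_point(pos_a, v_in, t1)
pos_b = (-0.04, 1.05)              # ball position in post-impact frame B (m)
v_out = outgoing_velocity(pos_b, pt, t2)   # outgoing velocity vector
```

Extending the tuples to three components covers the two-camera, three-dimensional case in the same way.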
  • outgoing velocity vector 338 can be calculated as described above.
  • the example described above illustrates the method for a two-dimensional vector utilizing a set of images from one camera. Adding information from another camera that captures Frame A 661 and Frame B 662 from a different angle will provide information such that outgoing velocity vector 338 can be determined in three directions as will be known to those skilled in the art.
  • impact point 336 is the location of the ball at either frame A, when t1 663 is zero, or at frame B, when t2 664 is zero.
  • the position of the ball for frame A 661, as well as the accompanying pre-impact trajectory 334 and incoming velocity vector 337, should be obtained from the frame immediately preceding frame A for the calculation of outgoing velocity vector 338.
  • where impact point 336 coincides with frame B, it should be clear that the position of the ball in the frame immediately following frame B should be used in the calculation of outgoing vector 338.
  • Fig. 6B further depicts a subset of different scenarios for which this illustrative method will accomplish the calculation of outgoing velocity vector 338.
  • Three different traces in audio box 551 are shown, with different times of impact in the box for each scenario.
  • in scenario 601 the impact occurs closer to the middle of the audio box and thus t1 and t2 are almost the same length.
  • incoming vector 337 has a different magnitude and direction than in Fig. 6A so as to illustrate that the method can still be used to calculate the impact point and therefore outgoing velocity vector 338 as described above.
  • t1 663 is smaller than t2 664 in audio box 551.
  • distance vector 665 will be smaller for the same incoming velocity vector 337 and therefore impact point 336 will be at a different location, influencing both the magnitude and direction of outgoing velocity vector 338 as shown.
  • the ball location in frame B and the subsequent outgoing velocity vector 338 are substantially different from the previous scenarios, as t1 663 is substantially smaller than t2 664.
  • the influence of the changes in these variables as illustrated in these three scenarios (601-603) is captured in the change in magnitude and direction of post-impact velocity vector 338, as shown in Fig. 6B.
  • FIG. 7 depicts a graph 700 of a typical signature acoustic trace of a ball-and-bat impact captured by a microphone.
  • There are several features of trace 554 that are of interest. Firstly, there is a rapid rise in acoustic sound level when the ball first makes contact with the bat. After several high-frequency oscillations, the trace dampens out to almost zero.
  • the start of acoustic signature 772 and end of acoustic signature 773 are two points in time that can easily be identified by standard methods known to those skilled in the art.
  • the difference between these two points is an indication of the duration of contact between the bat and the ball and can be calculated as contact time 774.
  • Contact time 774 can be used in conjunction with the change in kinetic energy to calculate the performance metric of the amount of power delivered by the bat to the ball.
  • Power delivered to the ball can be calculated by the following equation:

    P_Ball = ΔE_Ball / t_contact

    where ΔE_Ball is the change in kinetic energy of the ball due to the impact and t_contact is contact time 774.
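A minimal sketch of this power computation, assuming the ball's pre- and post-impact speeds and the contact time have already been measured; the mass and speed values below are illustrative assumptions:

```python
# Sketch: power = change in the ball's kinetic energy divided by the
# bat-ball contact time. Mass and speeds are illustrative assumptions.

def kinetic_energy(mass, speed):
    return 0.5 * mass * speed ** 2

def power_delivered(mass, v_pre, v_post, contact_time):
    """Change in the ball's kinetic energy over the contact duration."""
    delta_e = kinetic_energy(mass, v_post) - kinetic_energy(mass, v_pre)
    return delta_e / contact_time

# 0.156 kg cricket ball, 30 m/s in, 35 m/s out, 1 ms bat-ball contact
p_ball = power_delivered(0.156, 30.0, 35.0, 0.001)   # about 25.35 kW
```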
  • Fig. 8 is a flow diagram showing a procedure 800 for determining the exit velocity vector and then the change in momentum and kinetic energy as well as the power delivered to the ball with the illustrative hybrid system described above. As this procedure is similar to the procedure 400 in Fig. 4, any similar steps and processes have the same reference numbers, and only differing steps/processes are described. More particularly, the identified impact (process 442) is used to determine times t1 and t2 (as described above) in process 820, which, in turn, is used to calculate the impact point in process 822.
  • the incoming velocity vector is computed in process 830 based upon inputs from the pre-impact and post-impact frame collection process 422 and the location of the ball in the image frames (process 430).
  • the incoming velocity vector value is provided to the outgoing vector computation process 832 along with the impact point that is computed in the process 822.
  • the outgoing velocity vector from process 832 is provided to the change in velocity vector computation process 840, which provides outputs to compute the change in momentum (process 842) and the change in kinetic energy (process 844).
  • the procedure 800 determines the power delivered to the ball (process 850).
  • the change in kinetic energy output from process 844 is also used, along with the computed bat kinetic energy (process 462), to determine the energy transfer efficiency in process 860.
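The momentum, kinetic-energy, and efficiency computations of processes 842, 844, and 860 can be sketched as follows; the masses, velocity vectors, and bat energy value are illustrative assumptions:

```python
# Sketch of processes 842 (change in momentum), 844 (change in kinetic
# energy), and 860 (energy-transfer efficiency). All numeric values are
# illustrative assumptions.

def momentum_change(mass, v_in, v_out):
    """Per-component delta-p = m * (v_out - v_in)."""
    return tuple(mass * (o - i) for i, o in zip(v_in, v_out))

def kinetic_energy_change(mass, v_in, v_out):
    """Delta-KE = 1/2 * m * (|v_out|^2 - |v_in|^2)."""
    def sq(v):
        return sum(c * c for c in v)
    return 0.5 * mass * (sq(v_out) - sq(v_in))

m_ball = 0.156                                    # kg
v_in, v_out = (40.0, -2.0), (-35.0, 20.0)         # m/s, 2-D for brevity
dp = momentum_change(m_ball, v_in, v_out)         # process 842
dke = kinetic_energy_change(m_ball, v_in, v_out)  # process 844
e_bat = 120.0                                     # bat kinetic energy (J)
efficiency = dke / e_bat                          # process 860
```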
  • Fig. 9A is a three-axis graphical representation of a cricket bat being swung through 3D space and the position of the bat captured at certain discrete time intervals. As bat 113 travels through space, its orientation and speed are constantly altered and manipulated by the player holding the bat.
  • the bat tip speed at any point can be calculated by determining the displacement vector between two consecutive video frames of the bat tip and then dividing it by the time it takes to advance a frame, as in the equation below:

    Speed_tip = |Pos(t+1) - Pos(t)| / Δt_Frame
  • Pos is the position of the bat in a frame
  • t is the time of the first frame
  • t+1 is the time of the following frame
  • Δt_Frame is the difference in time of the two consecutive frames as determined by the frames per second of the capturing device.
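The finite-difference bat-tip speed described by these definitions can be sketched as follows; the positions and frame rate are illustrative assumptions:

```python
# Sketch of tip speed = |Pos(t+1) - Pos(t)| / dt_frame, dt_frame = 1/fps.
# The 3-D positions and 250 fps frame rate are illustrative assumptions.
import math

def tip_speed(pos_t, pos_t1, fps):
    """Bat-tip speed between two consecutive frames (m/s)."""
    return math.dist(pos_t, pos_t1) * fps   # displacement / (1 / fps)

# Tip advances 0.12 m between consecutive 250 fps frames (30 m/s)
speed = tip_speed((0.0, 0.0, 1.0), (0.12, 0.0, 1.0), fps=250)
```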
  • Fig. 9B is a graph of an exemplary, actual measurement of bat speed versus time of a bat impacting a ball.
  • the largest displacement between discrete time intervals is between positions 992 and 993. This represents peak pre-impact bat tip speed 997 as shown in Fig. 9B.
  • the bat speed just before impacting the ball (pre-impact speed 999) can be calculated by determining the difference in position of the bat in the last two pre-ball impact frames 994 and 995, and dividing that by the time it takes to advance a frame.
  • Pre-impact speed 999 is a useful metric since it provides an indication of the amount of kinetic energy available in the bat before impact with the ball.
  • an estimate of the kinetic energy of the bat can be made by the following equation:

    E_Bat = (1/2) · m_Bat · (|Pos(t+1) - Pos(t)| / Δt_Frame)^2
  • m Bat is the mass of the bat
  • t is the time of a frame
  • t+1 is the next frame in a series
  • Δt_Frame is the time between frames.
  • pre-impact kinetic energy of the bat can be calculated.
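A sketch of this kinetic-energy estimate, under the simplifying assumption that the whole bat moves at the tip speed; the bat mass and positions are illustrative assumptions:

```python
# Sketch of E = 1/2 * m_bat * v^2, with v the tip speed from the last two
# pre-impact frames. Treating the whole bat as moving at the tip speed is
# a simplification of this sketch; mass and positions are illustrative.
import math

def bat_kinetic_energy(m_bat, pos_t, pos_t1, fps):
    speed = math.dist(pos_t, pos_t1) * fps   # |Pos(t+1) - Pos(t)| / dt
    return 0.5 * m_bat * speed ** 2

# 1.2 kg bat whose tip moves 0.10 m between 250 fps frames (25 m/s)
e_bat = bat_kinetic_energy(1.2, (0.0, 0.0, 1.0), (0.10, 0.0, 1.0), 250)
```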
  • the amount of energy delivered from the bat to the ball during the impact can also be determined by utilizing the following formula:

    E_Delivered = ΔE_Ball = (1/2) · m_Ball · (|v_post|^2 - |v_pre|^2)
  • another important performance metric for bat-on-ball sports, namely the efficiency of the shot, can be calculated. This metric is sometimes referred to as the quality of the shot, the sweetness of the shot, hitting the shot in the sweet spot of the bat, and so forth.
  • the efficiency of a shot relates to the change in kinetic energy of the ball versus the amount of effort the player has put into the hit. Since the kinetic energy of the bat and the change in kinetic energy of the ball can be determined by the methods described above, the efficiency of the shot can be calculated by dividing the change in kinetic energy of the ball by the available kinetic energy of the bat, as shown in the following equation:

    η_shot = ΔE_Ball / E_Bat
  • η_shot is the efficiency of the shot.
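This efficiency ratio can be sketched as follows; the energy values below are illustrative assumptions:

```python
# Sketch of the shot-efficiency metric: the fraction of the bat's
# available kinetic energy gained by the ball. Values are illustrative.

def shot_efficiency(delta_ke_ball, ke_bat):
    """eta_shot = delta-KE_ball / KE_bat."""
    return delta_ke_ball / ke_bat

# Ball gains 45 J from a bat carrying 150 J of kinetic energy
eta_shot = shot_efficiency(45.0, 150.0)   # 0.30, i.e. 30% of the bat's KE
```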
  • A schematic representation of this illustrative process to establish the velocity and kinetic energy of the bat, utilizing either of the illustrative means of determining the kinetic energy change in the ball, is shown in Fig. 4 and Fig. 8.
  • the above was a description of a technique to resolve player performance metrics from sensors that are not in contact with and not interfering with the player or player’s equipment, and that do not need any input or intervention from the player to be operational. Furthermore, the methods described above can be utilized with information already captured by sensors, sensing devices, cameras, and microphones that are presently deployed at most professional, collegiate, and club games by broadcasters.
  • the data obtained by the foregoing can also be used to calculate a number of additional metrics. For example, prediction of the outcome of the shot, distance the ball will travel, probable impact location of the ball, probability of interference of the ball by another player or object in the game, potential to and amount of score, and other metrics can be calculated.
  • the system and method described above can form the basis for further analysis of the game by comparing these metrics for a specific player over a period of time, as well as between players, to assess how for instance playing conditions, opponents, time-of-day, and other external factors captured as meta-data influence the ability and performance of players.
  • Such information can be invaluable to team managers and selectors in picking the best team for a particular situation.
  • these methods can be applied to historical audio and visual content that was generated and stored for games that happened in the past. This will allow the generation of informatics of previous players, players in previous seasons or games, and the creation of a database of such information.
  • the information captured in said database can be used to compare current players with players of a previous generation, compare player’s technique and outcomes with their technique and outcomes in previous games and seasons, as well as comparing players to each other.
  • the information in said database can be used to train machine learning and AI algorithms to identify features, techniques, and other relative information that can serve to improve players’ performance and coaching, identify players for scouting and selection, provide additional information for commentators, allow researchers to test and verify hypotheses, provide fans with additional data, create additional metrics for fantasy leagues, as well as other insights.
  • processor should be taken broadly to include a variety of electronic hardware and/or software based functions and components (and can alternatively be termed functional “modules” or “elements”). Moreover, a depicted process or processor can be combined with other processes and/or processors or divided into various sub-processes or sub-processors. Such sub-processes and/or subprocessors can be variously combined according to embodiments herein. Likewise, it is expressly contemplated that any function, process and/or processor herein can be implemented using electronic hardware, software consisting of a non-transitory computer-readable medium of program instructions, or a combination of hardware and software.
  • hitting equipment and bat are indicative of sporting equipment utilized by players in the sport and can take on the meaning of club, bat, racket, stick, hurl, boot, cleat, and others.
  • ball and projectile are used to indicate sporting equipment that are impacted by hitting equipment and can mean ball, shuttlecock, puck, sliotar, or any other equipment used to determine the outcome of the sport.

Abstract

This invention provides a system and method for transmuting complex measurements of optical, audio and/or radar data into simple, human understandable scalars, which, in turn, can be used in informatics such as infographics, performance analysis, player assessment, coaching, and team strategy and management, to name a few. The system and method are particularly suited to, though not limited to, summarizing common phenomena in bat-and-ball sports such as baseball, cricket, golf, hockey and tennis, to name a few. Moreover, this invention describes a set of metrics which comparatively and objectively summarize player performance and how to efficiently compute them from video, radar, LIDAR, and microphone data collected from sports equipment. The invention further describes how to manipulate computational parameters in order to produce ideal and custom-tailored metrics.

Description

SYSTEM AND METHOD FOR CAPTURE AND ANALYSIS OF SPORTING PERFORMANCE DATA AND BROADCAST OF THE SAME
FIELD OF THE INVENTION
[0001] This invention relates to systems and methods for capture, display and analysis of the performance of sporting equipment, players, and the broadcasting of sporting performance data in a manner that can be displayed in conjunction with, or separately from, video coverage of the sporting activity.
BACKGROUND OF THE INVENTION
[0002] In ball sports the object of the game is to deliver a certain outcome with the ball using the body, a bat, club, racket, or other mechanisms/techniques as permitted by the rules of the particular sport. The outcome is usually measured as scoring a goal in soccer or hockey, a home run in baseball, a boundary in cricket, an ace in tennis, or any other metric(s) as defined in the rules of the associated sport. [0003] To date, the outcome of the position of the game ball, puck, shuttlecock, etc., has determined the degree of effectiveness of the shot and/or the player (if at all). The outcome of the shot is, however, not always a true representation of the skill of the player(s) executing a shot as other players (such as fielders), nets, posts, and/or other obstacles that are part of the sport can sometimes inhibit the outcome of the action, and mask the effectiveness and/or skill of a player’s action. Conversely, badly hit shots can still reach a boundary in cricket, a mis-cue can still reach the goal, and a badly executed shot can still go past a player to win a point.
[0004] It is increasingly desirable in the highly competitive world of sports for players, coaches, owners, talent scouts, and equipment sellers to be able to objectively assess a player’s ability to execute actions during a sports game, as well as the effectiveness of relevant sporting equipment. Such assessment capability can enable effective team management, strategy development, talent acquisitions, target skills to develop, and proper training techniques. Furthermore, with the increasing value of sports broadcasting contracts, it is increasingly desirable for broadcasters to provide new modalities to engage and captivate their audiences by presenting even more statistics and information regarding the abilities, effectiveness, and quality of players and teams. [0005] Recent technological developments have been made in the area of sports performance assessment across a wide range of sports and player roles. In bat-on-ball sports or games, a variety of data-gathering devices/systems that attach directly to sports equipment have been proposed and/or deployed in the field. However, most of these devices/systems have certain disadvantages including, but not limited to, obtrusiveness, communications issues, inaccurate data, inability to remain attached to the equipment, difficulty to ensure high fidelity adhesion and contact between the sensors and the equipment, device weight, battery life, ergonomic fit with the equipment, ease of use, and others as are known to those with skill. All these devices also require some sort of action from the player to enable the device, keep it in working order such as charging the device’s battery, and making sure that it remains permanently attached to the equipment. Such devices may also fail to comply with the strict regulations that apply to bats, rackets, clubs, and other hitting devices as permitted by the specific rules of the respective sports. Some devices, such as the device(s) described in U.S. Patent No.
10,527,487, entitled SYSTEM AND METHOD FOR SENSING HIGH-FREQUENCY VIBRATIONS ON SPORTING EQUIPMENT, filed 5/30/2017, and which is incorporated herein by reference as useful background information, overcome some, but not all, of these disadvantages. However, all these data gathering devices require physical adhesion/fastening of a device to the player’s equipment, thereby interfering with the player in some manner or form, or potentially being in breach of the rules of the sport.
[0006] Some performance data/metrics are presently derived from third party sensing devices that do not require (are free of) attachment to, or interference with, the associated sporting equipment. For example, optical, radar, and acoustic based systems are ubiquitous at professional sporting events such as golf, baseball, soccer, tennis, and cricket where the ball is tracked to inform viewers of the trajectory, position of impact with the ground, or distance of a shot. Currently, various sports utilize a form of these technologies to increase the accuracy of umpiring decisions such as goal-line crossing, strike zone penetration, ball location and its position relative to features of the sport, bat contact, or leg-before-wicket decisions to name a few. However, to date, none of these optical, radar, or acoustic based systems used in bat-on-ball sports, racket sports, or other equipment-on-ball sports have been able to assess the performance metrics of the player and his/her performances with the bat, club, stick, racket, or other hitting equipment.
SUMMARY OF THE INVENTION
[0007] This invention overcomes disadvantages of the prior art by providing a system and method for generating independent, repeatable, high fidelity, and verifiable information regarding the outcome of a player’s action with respect to a sport ball, particularly in sporting activities that employ equipment to engage/hit the ball — for example bat-on-ball sports, in a manner that does not interfere with/is free of interference and/or distraction relative to the player and/or does not require/is free of any involvement by the player or the player’s equipment in the data gathering. Moreover, operation of the system and method does not implicate any restrictions based upon the rules or regulations of the sport. More particularly, the system and method operates to render relatively complex measurements of captured audio, visual and/or radar information into simple, human understandable scalars, which, in turn, can be used to create informatics such as infographics, live metrics used in broadcasting, and/or near real-time metric dashboards. The system and method can be applied effectively to, but not limited for use in, summarizing common phenomena in hitting ball sports such as baseball, cricket, golf, hockey, and tennis. Moreover, this system and method presents a set of metrics that comparatively summarize player performance and how to efficiently compute them from audio, visual, and/or radar data collected from sports broadcasting equipment without the need to physically contact the equipment. The system and method also advantageously allows for manipulation of computational parameters in order to produce ideal metrics which account for human bias and the desire for familiar info-metric scales (such as a scale from 1 to 100). 
More generally, the system and method herein overcomes the deficiencies of the prior art by utilizing readily available, non-contact, unobtrusive, and, in some cases, already implemented sensors and sensing technology to deliver metrics such as ball-speed off the bat, amount of energy delivered to the ball by the player, equipment speed, as well as the effectiveness of the power transfer between equipment and ball.
[0008] In an illustrative embodiment, a system and method for obtaining velocity vectors of a first sporting equipment in conjunction with engagement by the equipment with a second sporting equipment is provided. The system and method provides one or more of remote sensing modalities that generate two respective forms of data free of contact with the first and the second sporting equipment, and combines the respective forms of data to provide the velocity vectors. Illustratively, a post-contact velocity vector by the first sporting equipment is determined by visual ball tracking using a camera assembly that generates a succession of image frame data. Alternatively, or additionally, the post-contact velocity vector by the first sporting equipment can be determined by radar and LIDAR ball-tracking. A post-contact velocity vector can be determined by a combination of ball-tracking and audio recording. A difference between a velocity vector of the first sporting equipment before and after contact with the second sporting equipment can also be determined by the system and method. Additionally, the system and method can calculate a change in kinetic energy of the first sporting equipment based upon the difference in the velocity vector, and/or a change in momentum of the first sporting equipment based upon the difference in the velocity vector. The system and method can further calculate power delivered to the first sporting equipment based upon the difference in velocity vector in combination with the duration of contact between the first sporting equipment and the second sporting equipment during engagement therebetween. Illustratively, the system and method can determine the duration of contact from an audio recording of the impact.
[0009] In an illustrative embodiment, a system and method for tracking a pre-impact velocity vector and a post-impact velocity vector of a pair of sporting equipment coming into contact with each other is provided. Motion information from at least one non-contacting sensor generating a discrete type of motion data is received, and the data from the non-contacting sensor is used to provide information relative to the pre-impact velocity vector and the post-impact velocity vector. Illustratively, the system and method can determine the pre-impact velocity vector and the post-impact velocity vector based upon visual ball and equipment tracking. The system and method can also determine the pre-impact velocity vector and the post-impact velocity vector using radar and LIDAR ball and equipment tracking. The post-impact velocity vector of the sporting equipment can be determined based upon a combination of ball-tracking and audio recording. The system and method can calculate a change in kinetic energy of the sporting equipment based upon a change in the pre-impact velocity vector and the post-impact velocity vector, and/or the efficiency of transfer of energy during the impact based upon a ratio of change in kinetic energy between each piece of the sporting equipment. The system and method can also calculate a pre-impact kinetic energy of the sporting equipment based upon the pre-impact velocity vector. In exemplary embodiments, one piece of the sporting equipment is a ball, and the system and method can calculate the efficiency of transfer of the kinetic energy between each piece of the sporting equipment based upon a ratio of a difference in pre-impact kinetic energy and post-impact kinetic energy of a ball and the pre-impact velocity of the piece of sporting equipment impacting the ball.
In various embodiments, the other piece of sporting equipment is a bat, club, racket or other hand-held implement for impacting and/or imparting kinetic energy to the ball.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The invention description below refers to the accompanying drawings, of which:
[0011] Fig. 1A is a diagram showing exemplary images of various sports during engagement of a ball with associated information captured by video cameras and tracking systems for use in the embodiments herein;
[0012] Figs. 2A-2E are diagrams of exemplary images depicting various infographics presented by broadcasters as derived from auxiliary sensors according to the embodiments herein;
[0013] Fig. 3A is a diagram showing a generalized system for acquiring various forms of data with respect to a sporting activity using remote sensing devices and an associated processing arrangement for generating meaningful data that can be displayed and manipulated relative to the activity;
[0014] Fig. 3B is a diagram depicting different, exemplary traces and vectors associated with ball flight pre-impact and post-impact with sporting equipment;
[0015] Fig. 4 is a flow diagram showing a process utilizing video and acoustic information to determine performance metrics in sports using the exemplary arrangement of Fig. 1;
[0016] Figs. 5A and 5B are diagrams of exemplary images of information captured by synchronized microphone and video sensor systems according to the arrangement of Fig. 1;
[0017] Figs. 6A and 6B are diagrams showing an example of the use of captured acoustic and video sensor data to determine the point of and result of impact of sport equipment according to embodiments herein; [0018] Fig. 7 is a graph showing an example of a captured acoustic trace of the impact of sports equipment according to the embodiments herein;
[0019] Fig. 8 is a flow diagram showing a process for utilizing acoustic and video information to determine performance metrics in sports according to embodiments herein; and
[0020] Figs. 9A and 9B are graphs showing an exemplary series of captured positions and the actual velocity of a bat in time during a swing at and impact of a ball analyzed using the arrangement of Fig. 1 and processes of the embodiments herein.
DETAILED DESCRIPTION
[0021] I. General Considerations
[0022] There exist a variety of mechanisms and procedures by which data regarding the state-of-play can be collected and recorded in professional, collegiate, club, school, and amateur level sports. Most commonly, cameras are utilized to capture the visual content of the sport action, and often from different points of view/angles. These captured visual images are typically supplemented by audio recording microphones placed at different locations to capture the audio content of the players, umpires, and spectators.
[0023] Recently video cameras and microphones have been augmented by one or more high-speed and high-definition camera(s), as well as radar equipment that are able to capture even more information, the particulars of which are invisible to the naked eye during the sporting event. Slow-motion replays often reveal interesting details about the sporting event, and/or assist in the application of the rules of the sport e.g. in determining if a player was offside during soccer, or faint bat contact with a ball to give a player out or not in cricket.
[0024] The higher data capture rate of the high-speed video cameras has also enabled system providers to track the flight of the ball during plays. Flight tracking is currently performed by either radar systems or high-speed cameras, or both, depending upon the speed of the ball or the specific sport. Note that the term “ball”, as used herein, can refer to any sport equipment that is used to determine the outcome of the sport. Therefore, the term “ball” can include, but is not limited to, a puck, shuttlecock, and any other device that experiences a velocity and/or direction change due to the impact of equipment that are attached to/manipulated by a player or due to impact by the player themselves. Several companies such as Trackman, Top-Golf, Animation Research Limited, and HawkEye, to name a few, have provided systems to capture the flight of a ball and overlay a digital tracking line on the TV screen for viewer enjoyment. Metrics, such as ball speed, distance travelled, ball trajectory and others, can be continually displayed on visual coverage of sporting events.
[0025] Radar-based systems, such as the above-noted Trackman, can also deliver player based metrics based on the speed of the club and the speed of the ball. Disadvantageously, these metrics are limited to sports where the ball is initially at rest before impact, such as in golf. Some of the systems currently in use in sports, such as baseball, are able to detect the velocity vector of a moving ball before it comes into contact with the equipment, as well as after it has made contact with the equipment. The process of capturing that information is well-known to those skilled in the art. [0026] Also disadvantageously, non-contact based systems are currently unable to determine the speed of the equipment, such as a cricket bat, before making contact with the ball. Several machine vision techniques, including foreground detection, object tracking, optical flow, semantic segmentation, learning and deep learning, and others known to those skilled in the art, can be applied to achieve this determination. Note that such learning and deep learning can be either supervised or unsupervised (among other characteristics) in various embodiments. However, effective tracking of the hitting equipment has not been implemented to date.
[0027] In the various embodiments contemplated herein, the present invention can perform the task of accurately measuring the ball velocity vector before and after contact with the hitting equipment, as well as the velocity of the hitting equipment. Furthermore, the invention can utilize this information, as well as other non-contact information such as sound recordings, to establish the time and point of contact between ball and hitting equipment. The information contained in the measured velocity vectors, contact time, and contact point can be sufficient to calculate and produce a plurality of values, measurements, and metrics that are novel and have not been implemented in prior art arrangements, and can thereby provide a non-contact approach for calculating various unique measurements and values provided by visual and acoustic sensors herein, as well as values and measurements that have been implemented by prior art implementations. By way of background, such measurements and values are described in published Patent Cooperation Treaty (PCT) Application No. WO/2020/086909, entitled SYSTEM AND METHOD FOR DETERMINING SCALAR VALUES FROM MEASUREMENTS AND TELEMETRY RELATED TO A STRUCTURE SUBJECTED TO MOTION AND FORCE, published 04/30/2020, incorporated herein by reference. The following list is an example of some of the metrics that can be calculated and produced by this incorporated PCT application:
1. The kinetic energy of the ball before impact.
2. The kinetic energy of the ball after impact.
3. The change in momentum and kinetic energy of the ball due to the impact.
4. The estimated kinetic energy of the hitting equipment before impact.
5. The change in momentum and kinetic energy of the hitting equipment.
6. The power delivered to the ball by the hitting equipment.
7. The efficiency of energy transfer between the hitting equipment and the ball.
8. The efficiency in movement and correctness of form used by the player to manipulate the hitting equipment.
[0028] Fig. 1, thus, depicts an example of captured screen images 100 of an optical ball tracking system utilized in various sports — for example sports employing a bat 113, such as baseball (images 110, 112) and cricket (images 114, 116). By identifying and capturing the position of the ball from different angles, with (e.g.) video cameras of sufficient frame rate and resolution that are synchronized in time, mathematical techniques, such as Kalman filters, are able to calculate the position of the ball in each frame of the video. This position vector is fed to a digital overlay system 360 that is able to plot the position of the ball on the video and produce flight track 101 as shown in the images 110, 112, 114 and 116 of Fig. 1. Furthermore, since the system captures the position vector and the time to reach that position vector, the system is able to calculate the velocity vector of the ball at any point during the flight. Ball speed is often displayed to the viewer as an infographic 102. The details of the operation of such tracking systems are described in UK Patent No. GB2496428B, entitled AN APPARATUS, METHOD AND SYSTEM FOR DETECTING THE POSITION OF A SPORTING PROJECTILE, dated November 11, 2011, the teachings of which should be clear to those of skill.
[0029] Fig. 2A illustrates another addition to the ball tracking system depicted generally in Fig. 1. Here the position vector and time stamp of each frame in the video are utilized to calculate the velocity vector and speed of the ball at different points during flight. Fig. 2A, thus, depicts a captured image 210 of a ball bowled by a bowler in a cricket match and how the speed of the ball changes from release point 221 to when it reaches the batsman 218 after it has come into contact with the ground. Release point 221 speed is significantly reduced by aerodynamic drag and by the friction of the ball coming into contact with the ground, reducing position 221 speed by almost 16% by the time the ball attains the batsman's position 222 speed. Hence, the image 210 illustrates the ability of currently available optical technology to measure and compute the velocity of the ball at different points of flight as well as compute track 101.
[0030] Fig. 2B shows an image 230 that further illustrates the information captured by a ball tracking system before and after impact with sports hitting equipment. Pre-impact ball tracking positions 223 show how rich the information captured by the camera is. It should be clear to those of skill that the ball position and speed can be calculated frame by frame and then compared with the predicted position. Errors in the prediction can then be corrected by tweaking parameters of the prediction algorithm in a process called calibration, well known to those skilled in the art. Previous testing has shown that current prediction systems need only (e.g.) four (4) frames of ball trajectory to be able to converge coefficients to the point that the predicted flight of the ball is within the margin of error of the ball tracking system itself.
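To make the convergence idea above concrete, the following is a minimal sketch, not part of the patent disclosure, of fitting a constant-acceleration (quadratic) trajectory model to a handful of tracked ball positions and predicting the next position. The function names and the use of frame indices as time units are illustrative assumptions; production trackers typically use Kalman filters, as noted above.

```python
def fit_quadratic(times, values):
    """Least-squares fit of p(t) = a*t^2 + b*t + c via the normal equations.

    Four or more (time, position) samples over-determine the three
    coefficients, mirroring the roughly-four-frame convergence noted above.
    """
    # Sums of t^0 .. t^4 needed for the 3x3 normal equations.
    s = [sum(t ** k for t in times) for k in range(5)]
    rhs = [sum(v * t ** k for t, v in zip(times, values)) for k in (2, 1, 0)]
    m = [[s[4], s[3], s[2]],
         [s[3], s[2], s[1]],
         [s[2], s[1], s[0]]]
    # Gauss-Jordan elimination on the small fixed-size system.
    for i in range(3):
        piv = m[i][i]
        m[i] = [x / piv for x in m[i]]
        rhs[i] /= piv
        for j in range(3):
            if j != i:
                f = m[j][i]
                m[j] = [a - f * b for a, b in zip(m[j], m[i])]
                rhs[j] -= f * rhs[i]
    return rhs  # coefficients [a, b, c]

def predict(coeffs, t):
    """Evaluate the fitted trajectory model at time t."""
    a, b, c = coeffs
    return a * t * t + b * t + c
```

Fitting one such model per axis (x, y, z) yields a predicted flight path that can be compared frame by frame against measured positions, as described above.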
[0031] Fig. 2C depicts an image 240, which illustrates ball tracking after the ball has impacted sporting hitting equipment (bat 113) controlled by the player. Here it is clear that sufficient information is in fact available from post-impact positions 224 to compute the position and velocity vector after the ball has come into contact with the hitting equipment. However, to date, broadcasters and technology providers have not been calculating and presenting the immediate post-impact velocity vector data. The embodiments herein, thus, desirably operate to compute and/or capture the velocity vector of the ball immediately post-impact with hitting equipment.
[0032] Fig. 2D depicts an exemplary image 250 where the average speed of the ball after impact with the hitting equipment has been measured. Post-impact ball speed 224 displays the magnitude of the velocity vector, but lacks the direction of the vector, as known to those skilled in the art. However, it should be clear to those of skill from Fig. 2D that the information to calculate the velocity vector is readily available from trajectory 226, which is computed from a plurality of discrete camera captures.

[0033] Fig. 2E depicts an image 260, showing an example of the types of visual infographics and metrics 227, which are captured by sensing systems with regard to player performance, and which are currently displayed by broadcasters to augment the audience viewing experience. These metrics are measured by sensing systems that are in physical contact with the player or player's equipment. The present embodiments are, therefore, adapted to deliver similar metrics, but aim to capture them with sensing systems that are not in contact with the player or player's equipment, and that are currently deployed at professional sporting events.
[0034] II. System Overview and Operation
[0035] Reference is made to the diagram showing a generalized system/arrangement 300 for capturing, analyzing, and displaying performance data in Fig. 3A. The arrangement 300 includes various sensing devices, including, but not limited to, one or more cameras 310 and 312 (shown in phantom). The camera(s) provide video data 314 to, and receive control signals from, the tracking process(or) 340 (described further below). Additionally, one or more audio sensors (e.g. microphones with or without appropriate sound-focusing structures) 320 can deliver audio data 322 in various forms to the process(or) 340. Other sensors 326, such as radar, laser range finders, LIDAR, time of flight devices, etc., can be employed to deliver appropriate data 328 to the process(or) 340.
[0036] The tracking process(or) 340 can be implemented using a variety of processing modalities, including, but not limited to, a general purpose computer — such as a server, PC, laptop, tablet, etc., an ASIC, FPGA, GPU-accelerated parallel processing platform, or other custom-built data processor. Appropriate capture cards, etc. can also be employed for the particular form of input data in a manner clear to those of skill. The process(or) 340 can define a plurality of functional processes/ors and/or modules. These modules can include, but are not limited to, a video process(or) 342 that receives and analyzes video data 314, and/or an audio process(or) 344 that receives and analyzes audio data 322. The other sensors 326 can transmit data 328 (and receive appropriate control signals) to an appropriate sensor process(or) 346 that is adapted to handle the particular data provided by the sensor(s) 326. A generalized infographic process(or) 348 receives processed data from each of the process(ors) 342, 344, 346, and other data stores (not shown) to generate appropriate infographic and related performance data in a manner described below. An artificial intelligence system 349 can serve the tracking process(or) 340, the video process(or) 342, the audio process(or) 344, the other sensor process(or) 346, and the infographic generator 348 individually, or collectively. The artificial intelligence system 349 can provide machine learning, deep learning, and machine vision techniques, including foreground detection, object tracking, optical flow, semantic segmentation, and other artificial intelligence functions, in a manner generally known to those skilled in the art. Included in these artificial intelligence functions are algorithms/processors, decision trees, classification trees, databases, optical vectors, masks, and training sets, among others.
[0037] The data can be displayed on an appropriate display, for example, a computer screen 350, mobile devices 353, and/or other receiving systems 354 that interface with other broadcast (and/or streaming) resources as shown, and/or can be provided directly to broadcast applications and/or digital overlay systems 360. Similarly, the data can be distributed to and/or stored on other destinations such as the cloud 351, local storage 352, mobile devices 353, and/or other receiving systems 354 for storage, distribution, and consumption, in a manner generally known to those of skill.
[0038] Video data 314, audio data 322, and other sensor data 328 can also be retrieved from recordings of events that happened in the past and have been stored on devices such as the cloud 351, local storage 352, mobile devices 353, other receiving systems 354, and/or broadcast applications and/or digital overlay systems 360. Similarly, the results and output of video process(or) 342, audio process(or) 344, or other sensor process(or) of an event or events can be stored and delivered to infographic generator 348 at a later date than when the event(s) occurred.
[0039] These types of data transfers can be implemented via protocols, interfaces, and other mechanisms, such as APIs, as is well known by those skilled in the art.
[0040] Thus, the tracking processor 340 can perform the functions of the illustrative embodiments described here on live, real-time, historical, archived, and/or otherwise stored video (and other) data/media.
[0041] Fig. 3B is a diagram showing a graphical representation 330, by way of non-limiting example, of an embodiment for graphically/mathematically solving the performance metrics described and contemplated above. As is known to those of skill, vectors are defined as magnitudes in different axes — for example three orthogonal, Cartesian axes, that can be labelled x, y and z. Thus, in Fig. 3B, the depicted axes are depicted as an x axis 331, which is horizontal and in the direction of the ball traveling toward the player; a y axis 332, which is the horizontal direction perpendicular to the travel of the ball toward the player; and a z axis 333, which is parallel to gravity or vertical. Pre-impact ball trajectory 334 intersects player hitting equipment (e.g., bat) 113 (Fig. 1, etc.) at impact point 336, and defines an incoming velocity vector 337 at impact point 336. The impact with equipment 113 causes the ball to change direction and follow post-impact ball trajectory 335, with outgoing velocity vector 338 at impact point 336. Pre-impact ball trajectory 334 and post-impact ball trajectory 335 are determined by the systems and methods described above, which are known to those of skill. Once pre-impact trajectory 334 and post-impact trajectory 335 are known, their intersection can be calculated by known mathematical techniques. This intersection is, thus, defined in time and space as impact point 336. Once impact point 336 is known, incoming velocity vector 337 of pre-impact trajectory 334 and outgoing velocity vector 338 of post-impact trajectory 335 can be calculated.
[0042] The magnitude of velocity vectors 337 and 338 is defined as the square root of the sum of the squares of the velocity components along the axes (x 331, y 332, and z 333), as described by the following equation:

V_n = \sqrt{V_{n,x}^2 + V_{n,y}^2 + V_{n,z}^2} (1)

where n indicates the velocity vector number (337 and 338 in this case), and V is the velocity in the axis indicated by the subscript.

[0043] The change of velocity vector of the ball is given by the difference between incoming velocity vector 337 and outgoing velocity vector 338:

\Delta V = V_{337} - V_{338} (2)

With the change in velocity calculated, a number of useful informatics can be derived. The change in momentum of the ball is given by

\Delta M = m_{Ball} \cdot \Delta V (3)

where m_{Ball} is the mass of the ball/puck/etc., as regulated by the sport. The mechanical energy added to the ball is given by the change in kinetic energy (outgoing minus incoming), as described by the following relation:

\Delta E_{Ball} = \tfrac{1}{2} m_{Ball} \left( V_{338}^2 - V_{337}^2 \right) (4)
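By way of a non-limiting illustration, equations (1) through (4) reduce to a few lines of code. The function names and the example ball mass used in testing are assumptions for illustration only:

```python
import math

def speed(v):
    """Equation (1): magnitude (speed) of a velocity vector (Vx, Vy, Vz)."""
    return math.sqrt(sum(c * c for c in v))

def ball_impact_metrics(v_in, v_out, m_ball):
    """Equations (2)-(4) for a single ball impact.

    v_in:   incoming velocity vector 337 (before impact)
    v_out:  outgoing velocity vector 338 (after impact)
    m_ball: ball mass in kg, as regulated by the sport
    """
    delta_v = tuple(a - b for a, b in zip(v_in, v_out))   # eq. (2)
    delta_m = tuple(m_ball * c for c in delta_v)          # eq. (3)
    # eq. (4): change in kinetic energy, outgoing minus incoming
    delta_e = 0.5 * m_ball * (speed(v_out) ** 2 - speed(v_in) ** 2)
    return delta_v, delta_m, delta_e
```

The same routine applies unchanged to 2D vectors, since it operates per component.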
[0044] Fig. 4 is a flow diagram showing a procedure 400 for determining the change in kinetic energy and associated efficiency. As will be appreciated by those skilled in the art, visual tracking in the above example of calculating metrics can be replaced by other tracking mechanisms such as LIDAR, radar, and any other tracking system that can track a moving object and capture its position in time. In block 410, a microphone can provide sound inputs from the sports play/ball impact to an impact identifier 412. Concurrently, a front-on oriented camera 420 delivers at least (e.g.) four (4) pre-impact and post-impact successive image frames of the (e.g.) bat and ball action to a collection process 422, which also receives the impact identification information 412 as a trigger. Concurrently, one or more side-on camera(s) 424 deliver at least four pre-impact and post-impact image frames (typically in time-synchronization with the front-on camera 420) to the collection process 422. Concurrently, these image frames are transmitted by both cameras 420, 424 to a pre-impact bat frame data store 426. The impact identification process 412 transmits the audio data to a contact duration process 428 that determines the time through which the ball is contacted by the bat. This data is then transmitted to a power calculation process 432, which uses this information as one of the factors (described further below) to compute the power transferred to the ball by the bat or other equipment piece.
[0045] In the procedure 400, the collected frames are transmitted from the collection process 422 to a ball location identification process 430 that identifies the location of the ball in each of the image frames. This process provides the location information to an incoming ball trajectory determination process 434 and an outgoing ball trajectory determination process 436. These two pieces of data are then provided to an impact point calculation process 440. The overall data is then provided to an incoming vector calculation process 442 and an outgoing vector calculation process 444. The calculated incoming and outgoing vectors are then provided to a vector velocity change calculation process 446. This data is provided to calculate the change in momentum in the process 450. The information from the velocity vector change process 446 is also provided to a change in kinetic energy calculation process 454. The computed change in kinetic energy is used, along with the above-described contact duration time (process 428), to calculate the power delivered to the ball by the bat/equipment (process 432).

[0046] The change in kinetic energy from the process 454 is used to calculate energy transfer efficiency (process 456). The pre-impact frame data is used to determine pre-impact bat speed (process 460) and bat kinetic energy (process 462), which is provided as a second input to the kinetic energy transfer efficiency calculation process 456.
[0047] Figs. 5A and 5B are images of an exemplary use of acoustic capture technology, which is another widely available sensor technology deployed in sport. A microphone arrangement performing acoustic capture of a scene is widely used to enhance viewer experience, since microphones can be placed strategically, and directionally oriented, near players, spectators, and umpires to capture the sounds and noises associated with the game and bring those additional experiences to a viewer. In some sports, like cricket, sound is also utilized to aid in umpire decision making. Acoustic capture provides additional, richer data sets since it is often captured at rates that far exceed those of visual systems. Standard acoustic capturing can occur between 10 kHz and 80 kHz, or 10,000 to 80,000 samples per second, compared to 250 frames per second (fps) of typical visual information, thereby providing detail at a far higher rate than other sensing equipment typically deployed during these sporting events. More particularly, Figs. 5A and 5B depict an example (successive image frames) of a technique whereby the visual frame-capturing of the cameras is synchronized with the audio capture. Here the cameras from the front (images 510, 512) and from the side (images 520, 522) of the batsman 530 in cricket are synchronized with each other as well as with a microphone placed in close proximity to the batsman. Acoustic signature box 551 displays the sound captured between two consecutive video frame captures. Video capture can occur between 60 and 1,000 fps. In the example of Figs. 5A and 5B, the video is captured at 250 fps, and thus, there is a 4 millisecond (ms) delay between video frames. The sound in this example is recorded at 44,200 Hz. Consequently, there are 176 sound samples taken between the consecutive video frames in the video sample. Acoustic signature box 551 plots all 176 sound samples associated with the time between the capturing of the two frames.
As can be seen in Fig. 5A, acoustic signature box 551 shows no sound recorded, since ball 552 has not reached bat 113. In Fig. 5B, ball 552 has reached bat 113, and contact noise 554 has been recorded and displayed in acoustic signature box 551.
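The audio/video rate relationship described above is simple arithmetic; a brief sketch follows, with illustrative function names that are not from the source:

```python
def samples_between_frames(audio_rate_hz, video_fps):
    """Number of whole audio samples captured between consecutive video frames."""
    return int(audio_rate_hz / video_fps)

def frame_interval_s(video_fps):
    """Elapsed time between consecutive video frames, in seconds."""
    return 1.0 / video_fps
```

At the rates in the example above (44,200 Hz audio, 250 fps video), this yields 176 whole samples per 4 ms frame interval, matching the figures quoted.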
[0048] The acoustic signature is useful in a number of ways. Firstly, if the illustrative intersecting/combinatory procedure 400 described above and visualized in Fig. 4 is utilized, contact noise 554 can indicate and mark the frame within which contact happens, and thereby identify the frame and point in time that contains impact point 336, incoming velocity vector 337, and outgoing velocity vector 338 (Fig. 3B).

[0049] Secondly, when the acoustic signature is properly synchronized with the video recording, another exemplary procedure to determine incoming and outgoing velocity vectors 337 and 338 can be implemented. Hence, referring to Fig. 6A, signature box 551 contains the acoustic signature between two consecutive video frames, with contact noise 554 indicating that contact between ball and hitting equipment bat 113 has occurred in the time interval between the two consecutive frames. Pre-impact frame A 661 shows the location of the ball in the video frame before contact, and post-impact frame B the location of the ball after impact. Pre-impact trajectory 334 shows the trajectory of the ball up to pre-impact frame A, and incoming vector 337 is the vector of the ball at pre-impact frame A. Pre-impact trajectory 334 and incoming vector 337 can be determined from the immediately preceding contiguous video frames, as illustrated in Fig. 1, by those skilled in the art.
[0050] For this illustrative example it is assumed that video is captured at 250 fps and the acoustic signature is captured at 44,200 Hz. Acoustic capture can be synchronized with, and overlaid on, the video capture such that the elapsed time between any two (2) consecutive video frames will be 0.004 seconds. When impact occurs at the time that pre-impact frame A 661 is captured, t1 663 will be zero and t2 664 will be 0.004 seconds. Alternatively, if impact occurs right at the time that post-impact frame B 662 is captured, then t1 663 will be 0.004 seconds and t2 664 will be zero (0). Therefore, any impact that occurs between the capture of consecutive frame A 661 and frame B 662 will have a t1 663 and t2 664 associated with it that will add up to 0.004 seconds for a frame rate of 250 fps, or the time between consecutive video frame captures as shown in Fig. 6A. Frame A 661 is, therefore, a capture pre-impact and frame B 662 a capture post-impact. Contact noise 554 can be detected by various techniques known to those skilled in the art. For example, a basic amplitude threshold can be utilized. Alternatively, a high-pass, low-pass, or band-pass filter can be used to remove unwanted noise from the signal, after which a threshold trigger can be used. These and other procedures that can be used to detect the time of impact are well known to those skilled in the art.
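One minimal way to implement the amplitude-threshold trigger described above is sketched below, using a crude first-difference high-pass in place of a full band-pass filter; the function name and threshold value are illustrative assumptions:

```python
def detect_impact_sample(samples, threshold, rate_hz):
    """Return (sample_index, time_offset_s) of the first sample whose
    high-pass energy exceeds the threshold, or None if no impact is found.

    A first-difference filter acts as a crude high-pass to suppress
    low-frequency crowd noise before the amplitude-threshold trigger.
    """
    prev = samples[0]
    for i, s in enumerate(samples[1:], start=1):
        if abs(s - prev) > threshold:
            return i, i / rate_hz
        prev = s
    return None
```

The returned time offset, measured from the first sample of the inter-frame window, corresponds to t1 in the discussion that follows.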
[0051] Since incoming velocity vector 337 is known, having been established as described above, impact point 336 can be calculated by multiplying t1 by incoming velocity vector 337 to obtain distance vector 665 that the ball has travelled to impact point 336. Adding this distance vector 665 to the known captured position of pre-impact frame A 661 will fix impact point 336 in space. The difference in position of impact point 336 and post-impact capture frame B provides the outgoing distance vector, and dividing that vector by t2 produces outgoing velocity vector 338. Similarly, t1 can be utilized with pre-impact trajectory 334 to establish impact point 336 in a manner clear to those skilled in the art. Again, by establishing the position of impact point 336, and utilizing the position of the ball in post-impact capture frame B and t2, outgoing velocity vector 338 can be calculated as described above. The example described above illustrates the method for a two-dimensional vector utilizing a set of images from one camera. Adding information from another camera that captures frame A 661 and frame B 662 from a different angle will provide information such that outgoing velocity vector 338 can be determined in three directions, as will be known to those skilled in the art.
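The geometric steps of this hybrid method can be summarized in a short sketch; the function name is illustrative, and the routine works per component for 2D or 3D vectors:

```python
def impact_point_and_outgoing_vector(pos_a, pos_b, v_in, t1, t2):
    """Hybrid audio/visual reconstruction of the impact point and exit vector.

    pos_a: ball position at pre-impact frame A
    pos_b: ball position at post-impact frame B
    v_in:  incoming velocity vector at frame A
    t1:    time from frame A to the acoustically detected impact
    t2:    time from the impact to frame B (t1 + t2 = frame interval)
    """
    # Impact point: frame-A position advanced along the incoming vector for t1.
    impact = tuple(p + v * t1 for p, v in zip(pos_a, v_in))
    # Outgoing vector: displacement from impact point to frame-B position,
    # divided by the remaining time t2.
    v_out = tuple((q - i) / t2 for q, i in zip(pos_b, impact))
    return impact, v_out
```

Combining the same computation from a second camera angle resolves the outgoing vector in all three axes.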
[0052] In the case that t1 663 or t2 664 is zero (0), and impact point 336 coincides with either frame A 661 or frame B 662 respectively, it should be clear to those skilled in the art that impact point 336 is the location of the ball at either frame A when t1 663 is zero or at frame B when t2 664 is zero. In the case of impact point 336 coinciding with frame A, the position of the ball for frame A 661, as well as the accompanying pre-impact trajectory 334 and incoming velocity vector 337, should be obtained from the frame immediately preceding frame A for the calculation of outgoing velocity vector 338. Alternatively, if impact point 336 coincides with frame B, it should be clear that the position of the ball in the frame immediately following frame B should be used in the calculation of outgoing vector 338.
[0053] Fig. 6B further depicts a subset of different scenarios for which this illustrative method will accomplish the calculation of outgoing velocity vector 338. Three different traces in audio box 551 are shown, with different times of impact in the box for each scenario. In scenario 601 the impact occurs closer to the middle of the audio box, and thus t1 and t2 are almost the same length. In scenario 601, incoming vector 337 has a different magnitude and direction than in Fig. 6A so as to illustrate that the method can still be used to calculate the impact point and therefore outgoing velocity vector 338 as described above. In scenario 602, t1 663 is smaller than t2 664 in audio box 551. This means that distance vector 665 will be smaller for the same incoming velocity vector 337, and therefore impact point 336 will be at a different location, influencing both the magnitude and direction of outgoing velocity vector 338 as shown. In scenario 603, the ball location in frame B and the subsequent outgoing velocity vector 338 are substantially different from the previous scenarios, as t1 663 is substantially smaller than t2 664. The influence of the changes in these variables as illustrated in these three scenarios (601-603) is captured in the change in magnitude and direction of post-impact velocity vector 338, as shown in Fig. 6B. It should be clear to those skilled in the art how the measured values of incoming velocity vector 337, pre-impact ball location 661, post-impact ball location 662, and contact noise 554 drive the computation of impact point 336 and outgoing velocity vector 338 in this illustrative example.
[0054] From the above it should be clear that all four variables of t1, t2, location of the ball in frame A, and location of the ball in frame B are used to accurately determine impact point 336, and thus the exit velocity vector, with this hybrid method that combines the optical and audio information captured by cameras and microphones. Once the outgoing velocity vector has been calculated, it can be used to determine the change in momentum and kinetic energy as described in equations (1) through (4) above.
[0055] A third example of the application of an acoustic signature to performance determination systems and methods herein is provided with reference to Fig. 7, which depicts a graph 700 of a typical signature acoustic trace of a ball-and-bat impact captured by a microphone. There are several features of trace 554 that are of interest. Firstly, there is a rapid rise in acoustic sound level when the ball first makes contact with the bat. After several high frequency oscillations, the trace dampens out to almost zero. The start of acoustic signature 772 and end of acoustic signature 773 are two points in time that can easily be identified by standard methods known to those skilled in the art. The difference between these two points is an indication of the duration of contact between the bat and the ball, and can be calculated as contact time 774. Contact time 774 can be used in conjunction with the change in kinetic energy to calculate the performance metric of the amount of power delivered by the bat to the ball. Power delivered to the ball can be calculated by the following equation:

P_{Ball} = \Delta E_{Ball} / t_{Contact} (5)

where \Delta E_{Ball} is the change in kinetic energy calculated from the change in velocity vectors as described in equation (4), and t_{Contact} is contact time 774 described above.

[0056] Fig. 8 is a flow diagram showing a procedure 800 for determining the exit velocity vector, and then the change in momentum and kinetic energy, as well as the power delivered to the ball, with the illustrative hybrid system described above. As this procedure is similar to the procedure 400 in Fig. 4, any similar steps and processes have the same reference numbers, and only differing steps/processes are described. More particularly, the identified impact (process 412) is used to determine times t1 and t2 (as described above) in process 820, which, in turn, is used to calculate the impact point in process 822. The incoming velocity vector is computed in process 830 based upon inputs from the pre-impact and post-impact frame collection process 422 and the location of the ball in the image frames (process 430). The incoming velocity vector value is provided to the outgoing vector computation process 832, along with the impact point that is computed in the process 822. The outgoing velocity vector from process 832 is provided to the change in velocity vector computation process 840, which provides outputs to compute the change in momentum (process 842) and the change in kinetic energy (process 844). Based upon the kinetic energy change (process 844) and contact duration (from process 428), the procedure 800 determines the power delivered to the ball (process 850). The change in kinetic energy output from process 844 is also used, along with the computed bat kinetic energy (process 462), to determine the energy transfer efficiency in process 860.
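Equation (5) and the acoustic contact-time extraction it relies on can be sketched as follows; the function names and threshold are illustrative assumptions, and a real implementation would apply the filtering techniques noted above first:

```python
def contact_time(trace, threshold, rate_hz):
    """Contact duration from an acoustic trace: the time between the first
    and last samples whose amplitude exceeds the threshold (the start and
    end of the acoustic signature)."""
    hits = [i for i, s in enumerate(trace) if abs(s) > threshold]
    if not hits:
        return 0.0
    return (hits[-1] - hits[0]) / rate_hz

def power_delivered(delta_e_ball, t_contact):
    """Equation (5): average power delivered to the ball during contact."""
    if t_contact <= 0:
        raise ValueError("contact time must be positive")
    return delta_e_ball / t_contact
```

Because audio is sampled at tens of kilohertz, the contact duration is resolved far more finely than a 250 fps camera could manage, as discussed above.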
[0057] Fig. 9A is a three-axis graphical representation of a cricket bat being swung through 3D space and the position of the bat captured at certain discrete time intervals. As bat 113 travels through space, its orientation and speed are constantly altered and manipulated by the player holding the bat.
[0058] Another useful performance metric is the speed of the bat, which can be measured with several processes/techniques known to those skilled in the art. As an illustrative example, the bat tip speed at any point can be calculated by determining the displacement vector between two consecutive video frames of the bat tip and then dividing it by the time it takes to advance a frame as in the equation below.
V_{Tip} = \left| Pos_{t+1} - Pos_t \right| / \Delta t_{Frame} (6)
where Pos is the position of the bat tip in a frame, t is the time of the first frame, t+1 is the time of the following frame, and \Delta t_{Frame} is the difference in time of the two consecutive frames as determined by the frames per second of the capturing device.

[0059] It is contemplated that a variety of techniques and procedures can be employed to capture and estimate the position of a bat in a video frame, as should be clear to those of skill. First, since the bat is moving, a ubiquitous foreground detection technique, known to those skilled in the art of machine vision, can be utilized. This technique can be set up to detect high speed objects, since the bat is one of the fastest moving objects on the field and easily distinguished from the other fast moving object, the ball. Alternatively, an artificial intelligence (AI), machine learning, or deep learning method (using, for example, a commercially available convolutional neural network (CNN) architecture) can be employed, whereby several images of bats are collected and the system is trained to identify the bat in a manner clear to those of skill. Other methods/processes, such as detecting displacement of straight edges (since the bat is the only moving object with straight edges) or deep learning with the distinctive color of a bat and/or optical flow, can be utilized. Several other techniques to detect and identify a specific object such as a bat are known to those skilled in the art.
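The bat tip speed relation above reduces to a few lines; the sketch below uses illustrative names and assumes tip positions in meters and the capture rate in frames per second:

```python
import math

def bat_tip_speed(pos_t, pos_t_plus_1, fps):
    """Tip speed from the displacement of the bat tip between two
    consecutive frames, divided by the frame interval."""
    dt_frame = 1.0 / fps
    displacement = math.sqrt(sum((b - a) ** 2
                                 for a, b in zip(pos_t, pos_t_plus_1)))
    return displacement / dt_frame
```

Applied frame by frame over a full swing, this yields the speed-versus-time curve of the kind shown in Fig. 9B.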
[0060] Fig. 9B is a graph of an exemplary, actual measurement of bat speed versus time of a bat impacting a ball. Referring to Fig. 9A, the largest displacement between discrete time intervals is between positions 992 and 993. This represents peak pre-impact bat tip speed 997 as shown in Fig. 9B. There is a significant reduction in bat tip speed between positions 995 and 996, indicating contact with the ball 998 as shown in Fig. 9B. The bat speed just before impacting the ball (pre-impact speed 999) can be calculated by determining the difference in position of the bat in the last two pre-ball-impact frames 994 and 995, and dividing that by the time it takes to advance a frame. Pre-impact speed 999 is a useful metric since it provides an indication of the amount of kinetic energy available in the bat before impact with the ball.
[0061] If the mass of the bat is known, an estimate of the kinetic energy of the bat can be made by the following equation:
E_{Bat} = \tfrac{1}{2} m_{Bat} \left( \left| Pos_{t+1} - Pos_t \right| / \Delta t_{Frame} \right)^2 (7)
where m_{Bat} is the mass of the bat, t is the time of a frame, t+1 is the time of the next frame in the series, and \Delta t_{Frame} is the time between frames. Utilizing this information, the pre-impact kinetic energy of the bat can be calculated. The amount of energy delivered from the bat to the ball during the impact can also be determined by utilizing the following formula:
\Delta E_{Bat} = \tfrac{1}{2} m_{Bat} \left( V_{Bat,pre}^2 - V_{Bat,post}^2 \right) (8)
Once the kinetic energy of the bat is calculated, another important performance metric for bat-on-ball sports, namely the efficiency of the shot, can be calculated. This metric is sometimes referred to as the quality of the shot, the sweetness of the shot, or hitting the shot in the sweetspot of the bat, and so forth. Regardless of its name, the efficiency of a shot relates the change in kinetic energy of the ball to the amount of effort the player has put into the hit. Since the kinetic energy of the bat and the change in kinetic energy of the ball can be determined by the methods described above, the efficiency of the shot can be calculated by dividing the change in kinetic energy of the ball by the available kinetic energy of the bat, as shown in the following equation:
\eta_{Shot} = \Delta E_{Ball} / E_{Bat} (9)

where \eta_{Shot} is the efficiency of the shot.
[0062] Another technique by which the efficiency of the shot can be represented is dividing the change in kinetic energy of the ball by the change in kinetic energy of the bat, as shown below:
\eta_{Shot} = \Delta E_{Ball} / \Delta E_{Bat} (10)
However, this method might produce inconsistent results if standard rate capture cameras are used since the change in velocity in the bat happens at such a high rate that the details cannot be captured by slower capture rate cameras. High-speed video cameras will be needed for an accurate estimation of change in bat speed when utilizing the method in equation (10) above. However, if any other means of capturing the velocity of the bat, such as radar, LIDAR, or any other means known to those skilled in the art, is utilized, the rate of capture might be sufficient to resolve the change in velocity of the bat such that equation (10) can produce consistent results. [0063] A schematic representation of this illustrative process to establish the velocity and kinetic energy of the bat utilizing either of the illustrative means of determining the kinetic energy change in the ball is shown in Figure 4 and Figure 8. [0064] The above was a description of a technique to resolve player performance metrics from sensors that are not in contact with and not interfering with the player or player’s equipment, and that do not need any input or intervention from the player to be operational. Furthermore, the methods described above can be utilized with information already captured by sensors, sensing devices, cameras, and microphones that are presently deployed at most professional, collegiate, and club games by broadcasters.
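The bat-energy and shot-efficiency relations in equations (7), (9), and (10) above can be sketched together; the function names and the example bat mass used in testing are illustrative assumptions:

```python
def bat_kinetic_energy(m_bat, tip_speed):
    """Equation (7): pre-impact kinetic energy estimate of the bat,
    given its mass (kg) and measured tip speed (m/s)."""
    return 0.5 * m_bat * tip_speed ** 2

def shot_efficiency(delta_e_ball, e_bat):
    """Equation (9): ball-energy change over available bat energy."""
    return delta_e_ball / e_bat

def shot_efficiency_alt(delta_e_ball, delta_e_bat):
    """Equation (10): ball-energy change over bat-energy change. As noted
    above, this variant needs high-rate capture of post-impact bat speed."""
    return delta_e_ball / delta_e_bat
```

Either efficiency value is dimensionless and can be displayed directly as a "sweetness of shot" infographic of the kind described earlier.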
[0065] In addition to providing information regarding the measured metrics, the data obtained by the foregoing can also be used to calculate a number of additional metrics. For example, the predicted outcome of the shot, the distance the ball will travel, the probable impact location of the ball, the probability of interference with the ball by another player or object in the game, the scoring potential and amount, and other metrics can be calculated. Furthermore, the system and method described above can form the basis for further analysis of the game by comparing these metrics for a specific player over a period of time, as well as between players, to assess how, for instance, playing conditions, opponents, time of day, and other external factors captured as metadata influence the ability and performance of players. Such information can be invaluable to team managers and selectors in picking the best team for a particular situation.
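One of the predicted metrics mentioned above, the distance the ball will travel, can be estimated from the post-impact velocity vector. The sketch below is a deliberately simplified, drag-free projectile model under assumed launch values; a production system would additionally model air drag, spin (Magnus effect), wind, and field geometry.

```python
import math

def predicted_carry_distance(speed_m_s, launch_angle_deg, g=9.81):
    """Drag-free projectile range from launch speed and angle.
    First-order estimate only: ignores drag and spin effects."""
    theta = math.radians(launch_angle_deg)
    return (speed_m_s ** 2) * math.sin(2 * theta) / g

def time_of_flight(speed_m_s, launch_angle_deg, g=9.81):
    """Time until the ball returns to launch height (drag-free)."""
    theta = math.radians(launch_angle_deg)
    return 2 * speed_m_s * math.sin(theta) / g

# Hypothetical post-impact velocity from ball tracking: 45 m/s at 30 degrees.
print(round(predicted_carry_distance(45.0, 30.0), 1))  # → 178.8 (metres)
```

Because drag is ignored, this figure is an upper bound; for a real ball the carry would be considerably shorter, which is why the text treats such predictions as derived metrics rather than measurements.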
[0066] The above-described examples have focused on the capture and analysis of data from professional games utilizing broadcasting equipment. As should be clear to those of skill, video and audio recording equipment capable of capturing the information required and utilized in the example methods above can be purchased off-the-shelf. For instance, 4K video cameras with capture rates of 250 fps, and video recording on cellphones at capture rates of more than 200 fps, are widely available. Furthermore, audio recording equipment capable of sampling rates in excess of 44 kHz is also available on commercial cellphones and computers. Therefore, it is possible for individual players and/or clubs, professional and amateur alike, to capture the data required to implement the methods in the illustrative examples above and to compute the metrics using the methods described. These metrics can then be utilized for player assessment, player improvement, coaching, scouting, team management and strategy, and various other useful applications.
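A quick back-of-the-envelope check shows why the audio channel, rather than standard-rate video, is suited to resolving the impact itself. Assuming a bat-ball contact on the order of 1 ms (an assumed figure for illustration only), one can count how many samples or frames each off-the-shelf device captures within the contact window:

```python
def samples_during_contact(sample_rate_hz, contact_duration_s):
    """Number of samples (or video frames) falling inside an impact window."""
    return sample_rate_hz * contact_duration_s

CONTACT_S = 1e-3  # assumed ~1 ms bat-ball contact, order of magnitude only

# 44 kHz audio: tens of samples per impact, enough to resolve its duration.
print(samples_during_contact(44_000, CONTACT_S))
# 250 fps video: well under one frame per impact, so the contact itself
# falls between frames and cannot be resolved by standard-rate cameras.
print(samples_during_contact(250, CONTACT_S))
```

This is consistent with the discussion of equation (10): standard-rate video can track the ball and bat before and after the hit, while the audio recording is what resolves the contact event.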
[0067] Additionally, these methods can be applied to historical audio and visual content that was generated and stored for games that happened in the past. This will allow the generation of informatics on previous players and on players in previous seasons or games, and the creation of a database of such information. The information captured in said database can be used to compare current players with players of a previous generation, to compare a player's technique and outcomes with his or her technique and outcomes in previous games and seasons, and to compare players to each other. The information in said database can be used to train machine learning and AI algorithms to identify features, techniques, and other relevant information that can serve to improve player performance and coaching, identify players for scouting and selection, provide additional information for commentators, allow researchers to test and verify hypotheses, provide fans with additional data, and create additional metrics for fantasy leagues, as well as other insights.
[0068] III. Conclusion
[0069] It should be clear that the above-described system and method provides an effective approach to determining various performance statistics for players using equipment to engage balls and similar game implements in a manner that is free of the need to apply transmitters or other devices to the sporting equipment, thus avoiding weight increase, potential rules violations and general interference with the state of play. This system and method can employ commercially available audio and video sensors, among others, to derive useful information on the motion and energy of equipment and/or balls by applying novel data handling techniques to the acquired image and audio data.
[0070] The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope of this invention. Features of each of the various embodiments described above may be combined with features of other described embodiments as appropriate, in order to provide a multiplicity of feature combinations in associated new embodiments. Furthermore, while the foregoing describes a number of separate embodiments of the apparatus and method of the present invention, what has been described herein is merely illustrative of the application of the principles of the present invention. For example, as used herein the terms “process” and/or “processor” should be taken broadly to include a variety of electronic hardware and/or software based functions and components (and can alternatively be termed functional “modules” or “elements”). Moreover, a depicted process or processor can be combined with other processes and/or processors or divided into various sub-processes or sub-processors. Such sub-processes and/or sub-processors can be variously combined according to embodiments herein. Likewise, it is expressly contemplated that any function, process and/or processor herein can be implemented using electronic hardware, software consisting of a non-transitory computer-readable medium of program instructions, or a combination of hardware and software. Furthermore, the terms hitting equipment and bat are indicative of sporting equipment utilized by players in the sport and can take on the meaning of club, bat, racket, stick, hurl, boot, cleat, and others. Similarly, the terms ball and projectile are used to indicate sporting equipment that is impacted by hitting equipment and can mean ball, shuttlecock, puck, sliotar, or any other equipment used to determine the outcome of the sport.
Additionally, as used herein various directional and dispositional terms such as “vertical”, “horizontal”, “up”, “down”, “bottom”, “top”, “side”, “front”, “rear”, “left”, “right”, and the like, are used only as relative conventions and not as absolute directions/dispositions with respect to a fixed coordinate space, such as the acting direction of gravity. Additionally, where the term “substantially” or “approximately” is employed with respect to a given measurement, value or characteristic, it refers to a quantity that is within a normal operating range to achieve desired results, but that includes some variability due to inherent inaccuracy and error within the allowed tolerances of the system (e.g. 1-5 percent). Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of this invention.
[0071] What is claimed is:

CLAIMS
1. A method for obtaining velocity vectors of a first sporting equipment in conjunction with engagement by the equipment with a second sporting equipment comprising the steps of: providing one or more forms of remote sensing modalities that generate two respective forms of data free of contact with the first and the second sporting equipment; and combining the respective forms of data to provide the velocity vectors.
2. The method as set forth in claim 1, further comprising, determining a post-contact velocity vector by the first sporting equipment by visual ball tracking using a camera assembly that generates a succession of image frame data.
3. The method as set forth in claim 1, further comprising, determining a post-contact velocity vector by the first sporting equipment by radar and LIDAR ball-tracking.
4. The method as set forth in claim 1, further comprising, determining a post-contact velocity vector by a combination of ball-tracking and audio recording.
5. The method as set forth in claim 1, further comprising, calculating a difference between a velocity vector of the first sporting equipment before and after contact with the second sporting equipment.
6. The method as set forth in claim 5, further comprising, calculating a change in kinetic energy of the first sporting equipment based upon the difference in the velocity vector.
7. The method as set forth in claim 5, further comprising, calculating a change in momentum of the first sporting equipment based upon the difference in the velocity vector.
8. The method as set forth in claim 5, further comprising, calculating a power delivered to the first sporting equipment based upon the difference in velocity vector in combination with the duration of contact between the first sporting equipment and the second sporting equipment during engagement therebetween.
9. The method as set forth in claim 8, further comprising, determining the duration of contact from an audio recording of the impact.
10. A method of tracking a pre-impact velocity vector and a post-impact velocity vector of a pair of sporting equipment coming into contact to impact with each other, comprising the steps of: receiving motion information from at least one non-contacting sensor generating a discrete type of motion data; and using data from the at least one non-contacting sensor to provide information relative to the pre-impact velocity vector and the post-impact velocity vector.
11. The method as set forth in claim 10, further comprising, determining the pre-impact velocity vector and the post-impact velocity vector based upon visual ball and equipment tracking.
12. The method as set forth in claim 10, further comprising, determining the pre-impact velocity vector and the post-impact velocity vector using radar and LIDAR ball and equipment tracking.
13. The method as set forth in claim 10, further comprising, determining the post-impact velocity vector of the sporting equipment based upon a combination of ball-tracking and audio recording.
14. The method as set forth in claim 10, further comprising, calculating a change in kinetic energy of the sporting equipment based upon a change between the pre-impact velocity vector and the post-impact velocity vector.
15. The method as set forth in claim 14, further comprising, calculating an efficiency of transfer of energy during the impact based upon a ratio of change in kinetic energy between each piece of the sporting equipment.
16. The method as set forth in claim 11, further comprising, calculating a pre-impact kinetic energy of the sporting equipment based upon the pre-impact velocity vector.
17. The method as set forth in claim 16, wherein one piece of the sporting equipment is a ball, and further comprising, calculating an efficiency of transfer of the kinetic energy between each piece of the sporting equipment based upon a ratio of a difference in pre-impact kinetic energy and post-impact kinetic energy of a ball and the pre-impact velocity of the piece of sporting equipment impacting the ball.
18. A system for obtaining velocity vectors of a first sporting equipment in conjunction with engagement by the equipment with a second sporting equipment comprising: at least two forms of remote sensing modalities that generate two respective forms of data free of contact with the first and the second sporting equipment; a process that combines the respective forms of data to provide the velocity vectors; and an infographic generator that creates displayable content based upon the velocity vectors.
PCT/US2021/049950 2020-09-10 2021-09-10 System and method for capture and analysis of sporting performance data and broadcast of the same WO2022056315A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063076908P 2020-09-10 2020-09-10
US63/076,908 2020-09-10

Publications (1)

Publication Number Publication Date
WO2022056315A1 true WO2022056315A1 (en) 2022-03-17

Family

ID=78087539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/049950 WO2022056315A1 (en) 2020-09-10 2021-09-10 System and method for capture and analysis of sporting performance data and broadcast of the same

Country Status (1)

Country Link
WO (1) WO2022056315A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9233294B1 (en) * 2013-12-12 2016-01-12 Thomas J. Coyle Baseball technologies
US20170014698A1 (en) * 2015-07-13 2017-01-19 Sports Sensors, Inc. Instrumented, angle-adjustable batting tee
GB2496428B (en) 2011-11-11 2018-04-04 Sony Corp An apparatus, method and system for detecting the position of a sporting projectile
US20180361223A1 (en) * 2017-06-19 2018-12-20 X Factor Technology, LLC Swing alert system and method
US20190209909A1 (en) * 2015-07-16 2019-07-11 Blast Motion Inc. Swing quality measurement system
US10527487B2 (en) 2016-05-31 2020-01-07 Future Technologies In Sport, Inc. System and method for sensing high-frequency vibrations on sporting equipment
WO2020086909A1 (en) 2018-10-24 2020-04-30 Future Technologies In Sport, Inc. System and method for determining scalar values from measurements and telemetry related to a structure subjected to motion and force


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ADAIR R K: "THE PHYSICS OF BASEBALL, PASSAGE", PHYSICS OF BASEBALL, XX, XX, 1 January 1989 (1989-01-01), pages 44 - 107, XP002931160 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21790680; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21790680; Country of ref document: EP; Kind code of ref document: A1)