CN112969513B - System and method for determining reduced athlete performance in a sporting event

System and method for determining reduced athlete performance in a sporting event

Info

Publication number
CN112969513B
Authority
CN
China
Prior art keywords
ball
athlete
dribbling
person
logic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201980057186.5A
Other languages
Chinese (zh)
Other versions
CN112969513A (en)
Inventor
A. W. Marty
J. Carter
C. T. Marty
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pillar Vision Inc
Original Assignee
Pillar Vision Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pillar Vision Inc filed Critical Pillar Vision Inc
Priority to CN202211237654.7A (CN115487484A)
Publication of CN112969513A
Application granted
Publication of CN112969513B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393Score-carding, benchmarking or key performance indicator [KPI] analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/40Acceleration
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/807Photo cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/83Special sensors, transducers or devices therefor characterised by the position of the sensor
    • A63B2220/833Sensors arranged on the exercise apparatus or sports implement
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/83Special sensors, transducers or devices therefor characterised by the position of the sensor
    • A63B2220/836Sensors arranged on the body of the user
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/20Miscellaneous features of sport apparatus, devices or equipment with means for remote communication, e.g. internet or the like
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/04Measuring physiological parameters of the user heartbeat characteristics, e.g. ECG, blood pressure modulations
    • A63B2230/06Measuring physiological parameters of the user heartbeat characteristics, e.g. ECG, blood pressure modulations heartbeat rate only
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/20Measuring physiological parameters of the user blood composition characteristics
    • A63B2230/207P-O2, i.e. partial O2 value
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30221Sports video; Sports image
    • G06T2207/30224Ball; Puck

Abstract

Systems and methods for determining reduced athlete performance during a sporting event are provided. The system may use one or more sensors to capture information about the player's actions, such as shooting, dribbling, kicking, and/or passing, and use at least one processor to analyze such information to assess the player's performance. The system may determine and store at least one parameter indicative of the athlete's performance while performing the action, calculate, based on the at least one stored parameter, at least one value indicative of whether the athlete intentionally underperformed the action, and provide an output indicative of the assessment.

Description

System and method for determining reduced athlete performance in a sporting event
Cross Reference to Related Applications
This application claims the benefit of U.S. Provisional Application No. 62/693,436, filed on July 2, 2018, and entitled "Systems and Methods for Determining Reduced Athlete Performance in a Sporting Event," which is hereby incorporated by reference in its entirety. This application is related to U.S. Patent Application No. 15/839,445, filed in December 2017 and entitled "System and Method for Tracking and Passing Balls in a Sports Environment," which is a continuation-in-part of U.S. Patent Application No. 15/173,245, filed on 3/6/2016 and entitled "System and Method for Tracking and Passing Balls in a Sports Environment," the contents of both of which are hereby incorporated by reference in their entirety.
Background
Athletes often spend countless hours training to improve their skill level so that they can become more competitive in sporting events such as basketball games, football games, hockey games, and other sporting events. To help athletes improve their skill level, systems have been developed that track an athlete's performance while training or participating in a sporting event and then provide feedback indicating the performance. Such feedback may then be evaluated to help the athlete improve his or her skill level. For example, commonly assigned U.S. Patent No. 7,094,164 describes a system that tracks the trajectory of a basketball during a shot so that the shooter can use feedback from the system to improve his or her shooting skill.
Tracking the dribbling or passing performance of a player, such as a basketball, football, or hockey player, presents various challenges that may limit the effectiveness of tracking systems that attempt to assess dribbling or passing performance. For example, the duration of a dribble or pass is typically very short, and the motion may occur at relatively high speed. Furthermore, dribbling may involve frequent changes in the direction and speed of the ball or puck. In addition, when a player dribbles or passes, or defends against a dribble or pass, the ball or puck is often at least temporarily hidden from view by the players' bodies, making it difficult to track the dribble or pass consistently and accurately over time. Moreover, unlike some other activities (e.g., shooting, where an ideal trajectory may be characterized by certain parameters (e.g., entry angle) that do not vary significantly from shot to shot), the characteristics of an ideal dribble or pass may vary dramatically depending on the circumstances of the dribble or pass, making it difficult to accurately assess dribbling or passing performance and skill level.
Because of these and other challenges, few attempts have been made to develop devices that determine when an athlete is intentionally exhibiting a reduced level of performance at a sporting event.
Drawings
The included drawings are for illustrative purposes and serve only to provide examples of possible structures and process steps of the disclosed inventive systems and methods for providing tournament services to remote clients. These drawings in no way limit any changes in form and detail that may be made to the invention by one skilled in the art without departing from the spirit and scope of the invention.
FIG. 1 is a block diagram representing an embodiment of a tracking system.
FIG. 2 is a block diagram of an embodiment of a camera used in the tracking system of FIG. 1.
FIG. 3 is a block diagram of an embodiment of a computing device for use in the tracking system of FIG. 1.
FIG. 4 is an information flow diagram of an embodiment of evaluating a dribbling and/or passing action during a training sequence or a game sequence.
FIG. 5 is a block diagram of an embodiment of an object tracker of the computing device of FIG. 3.
FIG. 6 is an illustration of offensive and defensive basketball players on an athletic playing surface.
FIG. 7 is a block diagram of an embodiment of an augmented reality system for a tracking system.
FIG. 8 is a message flow diagram of an embodiment of evaluating whether an athlete is playing at a reduced performance level during a sporting event.
Detailed Description
Systems and methods are provided for tracking and evaluating shooting, dribbling, and/or passing actions of persons engaged in a training session for a sporting event or in a live sporting event, including a transition from a dribbling action to a passing action (which may be referred to as a dribble-to-pass transition) and a transition from a dribbling action to a shooting action (which may be referred to as a dribble-to-shot transition). In a sporting event, a dribbling maneuver may be associated with a repetitive motion or short trajectory sequence between changes in direction of a ball, puck, or other object used in the sporting event. The repetitive motion or short trajectory sequence of a dribble may involve up-and-down movement of the ball (such as in basketball) or back-and-forth movement of the ball or puck (such as in football or hockey). The shooting action may involve movement of the object from a person toward a goal in order to place the object into the goal. The dribble-to-pass transition may be associated with movement of the object during the end of the person's dribbling motion and the beginning of the person's passing motion. The dribble-to-shot transition may be associated with movement of the object during the end of the person's dribbling motion and the beginning of the person's shooting action. The passing action may be associated with movement of the object between two people during the sporting event in order to transfer possession of the object from one person to another. During the passing action, the movement of the object may be forward, backward, or sideways, depending on the locations of the person passing the object (the passer) and the person receiving the object (the receiver). In one embodiment, systems and methods may also be provided for tracking and evaluating kicking motions of persons engaged in a training session for a sporting event or in a live sporting event (e.g., football or rugby). For example, kicking motions associated with scoring attempts or extra points in football (e.g., kicking the ball between the goal posts) or with shots in soccer (e.g., kicking the ball toward the goal) may be tracked and evaluated. The system may use one or more cameras to capture images of the person shooting, dribbling, kicking, and/or passing an object; capture shot-, dribble-, kick-, and/or pass-related data of the object using one or more sensors; and analyze the images and/or sensor data using at least one processor to determine and evaluate one or more characteristics related to the shooting, dribbling, kicking, and/or passing motion. These characteristics may relate to: types of shots, dribbles, kicks, and/or passes; shooting, dribbling, kicking, and/or passing postures; shot, dribble, kick, and/or pass attributes; and transition attributes.
In addition, to maintain the integrity of a sporting event, it may be desirable to know when an athlete's performance is not at the level typically associated with that athlete. A player may not exhibit his or her usual performance level at a sporting event because the player is injured or under the influence of some substance, or because the player is attempting to change the outcome of the sporting event (or a portion of the sporting event) for illegitimate reasons (e.g., for wagering purposes, such as intentionally altering play to control whether the betting line is covered). To an observer of a sporting event, the results of an athlete's intentional underperformance may be indistinguishable from the results produced by the athlete's natural play. For example, an intentionally missed shot may appear the same as a missed shot that the athlete genuinely attempted to make.
In some embodiments, systems and methods are provided for determining whether an athlete's performance while performing an action during a sporting event is below the athlete's expected performance level for the same action. The athlete's actions (e.g., shots, dribbles, and/or passes) during the sporting event may be evaluated and categorized based on a variety of different factors. Such factors may include the type of action, the corresponding circumstances in the sporting event, the level of defense offered by the defending player, the degree of fatigue of the athlete, and/or other suitable factors associated with the athlete's action (e.g., the athlete is attempting to entice the defending player to foul). For example, the action type may be the shot type of a shooting action, the corresponding circumstance in the sporting event may be a close score near the end of the sporting event, the level of defense may be a high level of defense against the athlete, and the degree of fatigue may be a high degree of fatigue for the athlete.
Once the athlete's action is classified, corresponding characteristics associated with the classified action may be determined based on the athlete's performance of the action. For example, for a shooting action, some characteristics that may be used are the entry angle of the shot, the landing point of the shot (e.g., the depth of the shot and the left-right position of the shot relative to the hoop of the basketball goal), the spin rate of the ball, the spin axis of the ball, the release height of the shot, or the release speed of the shot. The determined characteristics of the athlete's performance may then be evaluated relative to the athlete's expected characteristics for the same (or a substantially similar) classified action. In one embodiment, each determined characteristic may be evaluated against an acceptable range of values for the respective characteristic. A probability that the athlete's action is atypical for the athlete (i.e., that the athlete did not perform in the manner expected of the athlete) may be generated based on the evaluation of the characteristics of the athlete's action.
The generated probability that the athlete's action is atypical may then be compared to a threshold probability to determine whether there is an indication that the athlete's performance level has decreased compared to the athlete's normal performance level for that classified action. The indication may then be further evaluated and/or investigated to determine whether there is reason to suspect that the athlete is actually playing at a reduced performance level compared to the performance level expected of the person. Some reasons why the person may play at a lower performance level include: the person is injured or under the influence of a substance; or the person may be intentionally attempting to alter the natural outcome of the sporting event (or a portion thereof) for illegitimate reasons, such as manipulating wagering outcomes.
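As a rough illustration of the range-based evaluation described above, the following Python sketch scores how atypical a classified action looks by counting how many of its measured characteristics fall outside the player's expected ranges. The characteristic names, value ranges, and threshold shown are invented for the sketch and are not values from this disclosure.

```python
# Illustrative sketch only: names, ranges, and the 0.5 threshold are assumptions.
from dataclasses import dataclass

@dataclass
class ExpectedRange:
    low: float   # lower bound of the player's typical value for this characteristic
    high: float  # upper bound of the player's typical value for this characteristic

def atypicality_probability(measured: dict, expected: dict) -> float:
    """Crude 0..1 score: fraction of measured characteristics outside the expected range."""
    if not measured:
        return 0.0
    out_of_range = sum(
        1 for name, value in measured.items()
        if name in expected and not (expected[name].low <= value <= expected[name].high)
    )
    return out_of_range / len(measured)

# Example: a shot whose entry angle and release speed both fall outside the
# player's historical range for this shot classification gets flagged for review.
measured = {"entry_angle_deg": 38.0, "release_speed_mps": 6.1, "release_height_m": 2.3}
expected = {
    "entry_angle_deg": ExpectedRange(42.0, 47.0),
    "release_speed_mps": ExpectedRange(6.8, 7.4),
    "release_height_m": ExpectedRange(2.2, 2.4),
}
if atypicality_probability(measured, expected) > 0.5:  # illustrative threshold probability
    print("Indication of reduced performance - evaluate further")
```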
In other embodiments, machine learning may be used to evaluate a player's actions (e.g., shooting, dribbling, kicking, and/or passing) during a sporting event. A deep learning process may receive a plurality of inputs, such as video data and/or determined parameters associated with an action (e.g., the value of a shot's entry angle), and generate an output indicating whether the action is atypical for the athlete. The deep learning process may identify parameters based on its inputs and may generate the output based on an evaluation of the identified parameters.
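As a minimal sketch of such a deep learning process (the architecture, input size, and choice of framework are assumptions made for illustration, not details of this disclosure), a small feed-forward network could map a vector of determined action parameters to a probability that the action is atypical for the athlete:

```python
# Assumed sketch using PyTorch: 8 per-action parameters in, one atypicality probability out.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(8, 32),   # 8 input parameters per action (e.g., entry angle, release speed) - illustrative
    nn.ReLU(),
    nn.Linear(32, 1),
    nn.Sigmoid(),       # output in [0, 1]: probability that the action is atypical
)

params = torch.randn(1, 8)           # placeholder for one action's parameter vector
p_atypical = model(params).item()    # would be compared against a tuned threshold
```

In practice such a network would be trained on the athlete's own historical actions, and raw video inputs would typically be handled by a convolutional or similar front end, which is beyond this sketch.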
FIG. 1 illustrates an embodiment of a system 1500 for tracking a shooting, dribbling, and/or passing action (including a transition from a dribbling action to a passing action). For illustrative purposes, the system 1500 will be described in the context of shooting, dribbling, and/or passing a basketball. However, the system may be used for shooting, dribbling, and/or passing in other sports (e.g., football, hockey, or field hockey), and it is particularly useful for tracking objects in sports (e.g., American football, air hockey, table tennis, etc.) that involve repetitive motions or short trajectory sequences between changes in direction of the object used in the sport.
The system 1500 may include at least two depth sensing cameras 1502, or other types of cameras, communicatively coupled to a computing device 1504. The cameras 1502 may be positioned around and/or above a sports playing surface (e.g., a basketball court or other type of playing area 1516) and used to capture images of people shooting, dribbling, and/or passing a basketball (including transitions from dribbling to passing). Various positions of the cameras 1502 are possible. As an example, the cameras 1502 may be positioned on opposite sides of the playing area 1516 such that the basketball is visible to at least one camera 1502 regardless of the direction the person is facing or of the side of the person's body on which the basketball is located. At least one camera 1502 may be mounted to a ceiling or to a structure associated with the basketball goal (e.g., the backboard of the goal, a clock positioned above the backboard, or a pole coupled to the goal).
The cameras 1502 may provide the captured images as camera data to the computing device 1504. The computing device 1504 may analyze the camera data from the cameras 1502 to track a dribbling motion of the ball and determine and/or evaluate one or more characteristics of the dribbling motion, such as the type of dribble (e.g., crossover dribble, behind-the-back dribble, between-the-legs dribble, etc.); the posture of the person performing the dribble (e.g., dribbling stance, body motion, and the hand performing the dribble); and attributes of the dribble (e.g., ball speed, dribble height, repetition rate, power, direction, and errors).
In one embodiment, the type of dribble may refer to a typical dribble associated with a basketball game. A crossover dribble may refer to movement of the ball from one of the player's hands to the other hand with a single dribble (or bounce) of the ball in front of the person. A behind-the-back dribble may refer to movement of the ball from one of the player's hands to the other hand with a single dribble (or bounce) of the ball behind the person's back. A between-the-legs dribble may refer to movement of the ball from one of the player's hands to the other hand with a single dribble (or bounce) of the ball under the person's torso, so that the ball travels "between the legs" of the person. As will be described in more detail below, the computing device 1504 may determine the type of dribble by analyzing the movement of the ball relative to identified body parts (e.g., torso, legs, hands, etc.) of the person dribbling.
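To make the classification concrete, a simplified sketch follows: it labels a hand-to-hand dribble by where the bounce occurs relative to the player's torso, in line with the definitions above. The threshold and the signed-distance input are assumptions made for the sketch.

```python
def classify_dribble(ball_forward_m: float, torso_half_depth_m: float = 0.25) -> str:
    """ball_forward_m: signed distance (meters) of the bounce point in front of (+)
    or behind (-) the player's torso center at the moment the ball crosses from
    one hand to the other; torso_half_depth_m: assumed half-depth of the torso."""
    if ball_forward_m > torso_half_depth_m:
        return "crossover dribble"        # single bounce in front of the person
    if ball_forward_m < -torso_half_depth_m:
        return "behind-the-back dribble"  # single bounce behind the person
    return "between-the-legs dribble"     # bounce under the torso, between the legs
```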
In another embodiment, a dribble attribute may refer to a typical attribute associated with a dribbling action. Ball speed may refer to the speed of the ball as it travels to and from the person's hand. Dribble height may refer to the distance between the person's hand and the athletic playing surface. The repetition rate may refer to the number of times the ball leaves and returns to the person's hand within a defined period of time. Power may refer to the amount of force the person's hand applies to the ball. An error may generally mean that the person dribbling loses control of the ball. As an example, an error may be detected when the ball travels directly from the person dribbling to a player of another team or to an out-of-bounds area.
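For example, two of these attributes could be derived from a time-stamped series of tracked ball heights, as in the hedged sketch below; the data layout and the use of the ball's peak height as a stand-in for hand height are assumptions for the sketch.

```python
def dribble_attributes(samples):
    """samples: list of (timestamp_seconds, ball_height_meters) tuples in time order."""
    if len(samples) < 3:
        return {"repetition_rate_hz": 0.0, "dribble_height_m": 0.0}
    bounces, peaks = 0, []
    for (t0, h0), (t1, h1), (t2, h2) in zip(samples, samples[1:], samples[2:]):
        if h1 <= h0 and h1 <= h2:
            bounces += 1          # local minimum: ball at or near the playing surface
        if h1 >= h0 and h1 >= h2:
            peaks.append(h1)      # local maximum: ball back near the hand
    duration = samples[-1][0] - samples[0][0]
    return {
        "repetition_rate_hz": bounces / duration if duration > 0 else 0.0,
        "dribble_height_m": max(peaks) if peaks else 0.0,
    }
```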
The computing device 1504 may also analyze camera data from the cameras 1502 and/or sensor data from the sensors to track transitions from dribbling to passing and/or to track passing motions. The computing device 1504 may determine and/or evaluate one or more characteristics of the dribble-to-pass transition and/or one or more characteristics of the passing action, such as: the type of pass (e.g., a chest pass, a bounce pass, an overhead pass, a behind-the-back pass, a baseball pass, etc.); the posture of the person performing the pass (e.g., passing stance, body motion, and the hand performing the pass); the type of transition (e.g., from a between-the-legs dribble to a jump pass, from a crossover dribble to a chest pass, etc.); the attributes of the transition (e.g., the time to gain control of the ball after the dribble, the time to release the pass after the dribble, the separation of the person from the defender, and the position of the ball before the pass); and the attributes of the pass (e.g., ball speed, pass height, pass location, pass accuracy, pass distance, release time, power, direction, and errors).
It should be noted that the transition from dribbling to passing may include a dribble (e.g., the dribble just prior to making the pass) and a pass (e.g., the pass just after the player's last dribble). In fact, a good player often initiates a pass while still dribbling or shortly after a dribble, so that the player's dribbling has an effect on the type or quality of the player's pass, and when the player transitions from dribbling to passing, the quality of the dribble-to-pass transition can be measured based on both dribbling parameters and passing parameters. Thus, in evaluating a player's performance in a transition from dribbling to passing, the system may determine a dribbling parameter indicative of the performance of the dribble just prior to making the pass, and may also determine a passing parameter indicative of the performance of the pass when the player initiates the pass from the dribble. The system may then evaluate the dribble-to-pass transition based on both the dribbling and passing parameters (e.g., provide at least one value indicative of the player's performance in transitioning from dribbling to passing). Various techniques for evaluating a player's performance in dribbling, passing, transitioning from dribbling to passing, and performing other maneuvers are described in more detail below; a simple illustrative sketch follows.
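One hedged way to express such an evaluation as a single value is sketched below; the 0-100 scale, the weights, and the assumed maximum useful release time are illustrative choices, not values taken from this disclosure.

```python
def transition_score(dribble_score: float, pass_score: float,
                     release_time_s: float, max_release_time_s: float = 1.0) -> float:
    """dribble_score and pass_score on an assumed 0-100 scale; a quicker release
    after the last dribble is rewarded, up to an assumed maximum useful time."""
    timing = max(0.0, 1.0 - release_time_s / max_release_time_s)  # 1.0 means an instant release
    return 0.4 * dribble_score + 0.4 * pass_score + 20.0 * timing
```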
In one embodiment, the pass type may refer to a typical passing action associated with a basketball game. A chest pass may refer to movement of the ball from the front torso of one player to another player without contacting the playing surface. A behind-the-back pass may refer to movement of the ball from behind one player's back to another player (with or without the ball contacting the playing surface). A bounce pass may refer to movement of the ball from the front torso of one player to another player after contacting the playing surface (i.e., bouncing off the playing surface), typically contacting the playing surface only once. An overhead pass may refer to movement of the ball from over one player's head to another player without contacting the playing surface. A baseball pass may refer to movement of the ball from one player to another player with one hand, at or above the shoulder, without contacting the playing surface. As will be described in greater detail below, the computing device 1504 may determine the type of pass by analyzing the movement of the ball relative to identified body parts (e.g., torso, legs, hands, etc.) of the passer, the playing surface, and the receiver.
In another embodiment, pass attributes may refer to typical attributes associated with a passing action. Ball speed may refer to the speed at which the ball travels into and out of a person's hands. Pass height may refer to: the distance between the ball and the athletic playing surface when the ball leaves the passer's hand; the distance between the ball and the athletic playing surface when the ball reaches the receiver; or the average distance between the ball and the athletic playing surface as the ball travels between the passer and the receiver. Pass location may refer to the position of the ball (relative to the receiver) at the completion of the pass (e.g., the pass is caught by the receiver or the pass goes out of bounds). Pass accuracy may refer to the ability of the passer to deliver the pass to the receiver at a predetermined pass location. Pass distance may refer to the distance the ball travels between the passer and the receiver. Release time may refer to the time to complete the pass and get the ball out of the passer's hand. Power may refer to the amount of force applied by the person's hand to the ball. Direction may refer to the direction (e.g., forward, backward, sideways, etc.) of the ball (relative to the passer) as it travels to the receiver. An error may generally refer to the passer or the receiver losing control of the ball. For example, an error may be detected when the ball travels directly from the passer to a player of another team or to an out-of-bounds area.
In yet another embodiment, a transition attribute may refer to a typical attribute associated with a transition from a dribbling action to a passing action. The transition from a dribbling action to a passing action may refer to the sequence of steps performed to stop the dribbling action, gain control of the ball, and pass. The time to gain control of the ball after the dribble may refer to the period of time from stopping the dribbling action to gaining control of the ball to start the passing action. The time to release the pass after the dribble may refer to the period of time between when the dribble stops and when the ball leaves the passer's hand. The separation of the person from a defensive player may refer to the distance between the passer and the defensive player (if present) when the passing action is initiated. The position of the ball before the pass may refer to the location of the ball (relative to the passer) before the start of the passing action. In one embodiment, the transition from a dribbling action to a passing action may incorporate one or more characteristics of the dribbling action and/or one or more characteristics of the passing action, in addition to characteristics of any intermediate action between the dribbling action and the passing action (insofar as such an intermediate action exists).
Further, the computing device 1504 may also analyze camera data from the cameras 1502 and/or sensor data from the sensors 1514 to track transitions from a dribbling action to a shooting action and/or to track a shooting action. The computing device 1504 may determine and/or evaluate one or more parameters indicative of characteristics of the dribble-to-shot transition and/or one or more parameters indicative of characteristics of the shooting action, such as: the type of shot (e.g., a jump shot, a set shot, a layup, a three-point shot, a floater, a catch-and-shoot shot, a step-back shot, a pull-up shot, a hook shot, a free throw, etc.); the posture of the shooter (e.g., shooting stance, body motion of the shot, and the shooting hand); the type of transition (e.g., a transition from a between-the-legs dribble to a jump shot, a transition from a crossover dribble to a three-point shot, etc.); the attributes of the transition (e.g., the time to gain control of the ball after the dribble, the time to release the shot after the dribble, the separation of the person from the defender, and the position of the ball before the shot); and attributes of the shot (e.g., the entry angle of the shot, the entry velocity of the shot, the trajectory of the shot, the make/miss result of the shot (i.e., whether the ball passed through the hoop during the shot), the landing point of the shot (e.g., the depth of the shot and the left-right position of the shot relative to the hoop), the location of the shot (e.g., the location on the playing surface from which the shot was taken), the height of the shot, the release velocity of the shot, the release height of the shot, and/or the position of the shooter's body or body parts when the shot was taken (e.g., the position of the shooter's feet when the shot was taken)). Various exemplary techniques for evaluating parameters indicative of, or otherwise related to, the characteristics of a shooting action are described in U.S. Patent Application No. 15/684,413, filed in 2017 and entitled "System and Method for Tracking Basketball Player Performance," in a commonly assigned application filed in 2016 and entitled "Stereoscopic Image Capture with Performance Prediction in Sports Environments," and in other commonly assigned patents and applications for monitoring the path of a basketball, the contents of each of which are hereby incorporated by reference in their entirety.
Note that the transition from dribbling to shooting may include a dribble (e.g., the dribble just prior to the shot) and a shot (e.g., the shot just after the player's last dribble). In fact, a good player often initiates a shooting action while still dribbling or shortly after a dribble, so that the player's dribbling has an impact on the type or quality of the player's shot, and when the player transitions from dribbling to shooting, the quality of the dribble-to-shot transition can be measured based on both dribbling parameters and shooting parameters. Thus, in evaluating a player's performance in a transition from dribbling to shooting, the system may determine a dribbling parameter indicative of the performance of the dribble just prior to taking the shot, and may also determine a shooting parameter indicative of the performance of the shot when the player shoots from the dribble. The system may then evaluate the dribble-to-shot transition based on both the dribbling parameter and the shooting parameter (e.g., provide at least one value indicative of the player's performance in transitioning from dribbling to shooting).
The system 1500 may have an input device 1506 and an output device 1508 communicatively coupled to the computing device 1504. The input device 1506 may be any device or mechanism (e.g., a tag) that may be used to identify the ball or the person dribbling the ball. As an example, the input device 1506 may be worn by an athlete and wirelessly communicate with the computing device 1504 to identify the athlete or to provide other information about the athlete. In other examples, the input device 1506 may be configured to receive manual inputs from the athlete and wirelessly transmit information submitted by the athlete to the computing device 1504, such as information identifying the athlete or other information about the athlete. The identification process using the input device 1506 may occur automatically during initialization of the system 1500 or may require one or more actions by the person, such as standing at a predetermined location or performing a predetermined action. The output device 1508 may be a display screen or other similar output device that may provide a person with training or other information relating to a shooting action, a dribbling action, a passing action, and/or a dribble-to-pass transition (e.g., a sequence of dribbles to be repeated followed by a passing action to be performed), as well as the results of the training or testing process.
In one embodiment, input device 1506 and output device 1508 are integrated into a single apparatus (e.g., a smartphone or other mobile device). Prior to playing or training, a user may use such an apparatus to input information and then receive feedback from the computing device 1504 indicating performance results or other training information.
The computing device 1504 may be communicatively coupled to a lighting system 1510 to control lighting effects (e.g., the brightness and direction of light) in the area where a person is dribbling. In one embodiment, the lighting system 1510 may include one or more light sources 1511. The light sources 1511 may include incandescent bulbs, light-emitting diodes (LEDs), or fluorescent lamps assembled into a lamp or lighting fixture. In other embodiments, however, other types of light sources 1511 are possible, including light sources that provide light or radiation that is not visible to the human eye (e.g., infrared or ultraviolet light sources). Depending on the type of light source 1511 used, the cameras 1502 may be selected and/or configured to detect light or radiation from the respective light sources 1511. For example, if the light sources 1511 provide infrared radiation, the cameras 1502 may be equipped with infrared sensors to detect the infrared radiation from the light sources 1511.
The computing device 1504 may be used to control an illumination state (e.g., an on state or an off state), an illumination output aperture position (e.g., fully open so that all light may exit, or partially closed so that a reduced amount of light may exit), and/or an illumination output intensity (e.g., a high-intensity output or a low-intensity output) of a light source 1511 of the lighting system 1510. Further, the light source 1511 may include one or more reflectors that may be adjusted by the computing device 1504 to change the direction of the light output by the light source 1511. Further, the lighting system 1510 may include one or more mechanisms (e.g., rails and motorized carts) for the light sources 1511 to allow the position and/or orientation of the light sources 1511 to be adjusted by the computing device 1504. The computing device 1504 may be configured to submit commands to the lighting system 1510 for controlling the state of the light sources 1511 based on the computing device 1504's analysis of images received from the cameras 1502, in order to achieve better lighting conditions for analyzing the captured images.
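A minimal sketch of this feedback loop is shown below. The lighting-system interface and the brightness band are hypothetical; a real implementation would issue whatever commands the lighting system 1510 actually supports.

```python
import numpy as np

def adjust_lighting(frame: np.ndarray, lighting_system, low: int = 60, high: int = 200) -> None:
    """frame: grayscale camera image as a uint8 array. Raise or lower the light
    output so the mean brightness stays inside an assumed usable band."""
    mean_brightness = float(frame.mean())
    if mean_brightness < low:
        lighting_system.increase_intensity()   # hypothetical command to the lighting system
    elif mean_brightness > high:
        lighting_system.decrease_intensity()   # hypothetical command to the lighting system
```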
The system 1500 may also include calibration marks 1512, such as LED (light emitting diode) lights designed to emit light of a particular color, or objects that have been colored a particular color, which calibration marks 1512 may be used by the computing device 1504 to calibrate (or recalibrate) camera data from the camera 1502. Calibration marks 1512 may be identified in the image and used as reference points corresponding to known locations. To facilitate identifying the mark 1512, the color of the mark may be set to a predetermined color (which may be a color that is rarely found in natural environments) that the computing device 1504 searches for in images received from the camera 1502. Once the computing device finds the markers 1512 in the received image, the computing device 1504 may then use the markers 1512 as reference points in calibrating the camera data from the different cameras 1502. By having known reference points within the images, the computing device 1504 is able to identify pixels in different camera data sets from the camera 1502 that display the same "item" from different fields of view based on the identification of known reference points in different images. In one embodiment, the calibration marks 1512 may be incorporated as a light source 1511 in the illumination system 1510. In other embodiments, other types of markers, such as court markers, may be used as known reference points.
In one example, assume that multiple cameras 1502 capture images of the ball at the same time, and that these images are analyzed by the computing device 1504. Using the markers 1512 in each image from the different cameras 1502, the computing device 1504 may correlate pixels representing one physical location in an image from one camera 1502 with pixels representing the same physical location in an image from another camera 1502. That is, the pixel coordinates from multiple cameras may be synchronized to a global coordinate system using the markers 1512 as a reference. Thus, the position of the ball in space at a given instant can be accurately determined relative to the global coordinate system using an image captured by any camera 1502. Accordingly, as the ball moves in and out of the fields of view of multiple cameras (e.g., as the player changes the dribble or turns his or her body, the ball may be shielded from the field of view of one camera 1502 but visible to another camera 1502), the position of the ball relative to the global coordinate system at any given instant can be accurately determined from the images by the computing device 1504, as long as the ball is in the field of view of at least one camera 1502. Thus, over time, the position of the ball can be accurately and consistently tracked with the plurality of cameras 1502 as the ball enters and leaves the fields of view of the individual cameras.
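A deliberately simplified sketch of this registration idea follows. It assumes each camera's axes are already aligned with the global axes, so registering a camera reduces to a translation derived from the calibration marker; a real system would also solve for rotation (a full extrinsic calibration).

```python
import numpy as np

def camera_offset(marker_in_camera: np.ndarray, marker_global: np.ndarray) -> np.ndarray:
    """Both arguments are 3D points in meters. Returns the translation that maps
    this camera's coordinates into the global coordinate system (rotation ignored)."""
    return marker_global - marker_in_camera

def to_global(point_in_camera: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Express a point seen by one camera in the shared global coordinate system."""
    return point_in_camera + offset
```

With one such offset per camera, a ball position reported by whichever camera currently sees the ball can be expressed in the same global frame.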
One or more sensors 1514, such as accelerometers or other similar types of sensors, may provide position, motion, and/or acceleration information to the computing device 1504 for determining a dribbling action, a dribbling characteristic, a passing characteristic, a transition action, and/or a transition characteristic. In one embodiment, a sensor 1514 may be incorporated into the ball and/or attached to or carried on the person shooting, dribbling, and/or passing the ball, and information from the sensor 1514 may be wirelessly transmitted to the computing device 1504. In another embodiment, the sensors 1514 may also include one or more biometric sensors (e.g., a heart rate monitor or pulse oximeter) that may measure the physical performance of the person shooting, dribbling, or passing.
The play area 1516 may have one or more sensors (which may include sensor 1514) and/or one or more cameras 1502, which may provide information to the computing device 1504 regarding people and shots, dribbling actions, and/or pass actions (including dribbling-to-pass transitions). For example, the playing area 1516 may have one or more sensors embedded in a floor or wall of the playing area 1516, positioned around the perimeter of the playing area 1516, positioned on equipment (e.g., a basketball rim, basketball net, or backboard) in the playing area 1516, or otherwise associated with the playing area 1516. The sensors may include any combination of optical sensors, proximity sensors, infrared sensors, magnetic sensors, touch sensors, height sensors, temperature sensors, pressure sensors, or any other suitable type of sensor. Sensors used with the playing area 1516 may provide information about the position of the person in the playing area 1516 and the position and movement of the ball in the playing area 1516 based on the signals provided by the sensors.
In one embodiment, the playing area 1516 may be a playing area having boundary walls on one (or more) sides of the playing area 1516 to prevent the ball from leaving the playing area 1516 while the system 1500 tracks a dribbling action. Additionally, the walls of the playing area 1516 may be used to evaluate the user's passing performance, which may include a transition from a dribbling action to a passing action. The walls of the playing area 1516 may incorporate one or more sensors (e.g., pressure sensors, proximity sensors, etc.). In one embodiment, a sensor may be embedded in and/or located behind a wall to detect contact by the ball, which may indicate that the user has attempted a pass. To evaluate the user's passing performance, the system 1500 may display a target on one of the walls and may ask the user to "hit" the target with a pass. In one embodiment, multiple targets may be displayed to the user, and the user may be asked to select the "correct" target to receive the pass. The system 1500 may evaluate the user's passing performance (including the selection of the appropriate target) both while the user is dribbling and while the user is not dribbling (e.g., while the user is holding the ball).
The system 1500 may evaluate the dribbling performance, the dribble-to-pass transition (if the user is dribbling), and/or the user's passing performance. For example, the system 1500 may determine and evaluate the speed at which the user completed the transition and passing action (e.g., the time between when the target was displayed on the wall and when the ball contacted the wall) and the user's passing accuracy (e.g., the distance between the target location and the location at which the ball contacted the wall).
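The sketch below illustrates these two measurements for a single wall-target pass; the units, the coordinate convention on the wall plane, and the field names are assumptions made for the sketch.

```python
import math

def evaluate_wall_pass(target_xy, contact_xy, target_shown_t, contact_t):
    """target_xy, contact_xy: (x, y) positions in meters on the wall plane;
    target_shown_t, contact_t: times in seconds."""
    dx = contact_xy[0] - target_xy[0]
    dy = contact_xy[1] - target_xy[1]
    return {
        "transition_and_pass_time_s": contact_t - target_shown_t,  # speed of the transition and pass
        "miss_distance_m": math.hypot(dx, dy),                     # passing accuracy
    }
```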
The target may be located at a fixed location on the wall to provide a "fixed" target to the user, or may be moved horizontally and/or vertically along the wall to provide a "moving" target to the user. When the target is displayed to the user on the wall, the user may perform a passing action to attempt a pass so that the ball reaches the same location as the target. The system 1500 may then evaluate the passing action and determine, using the sensors and/or cameras 1502, the position at which the ball contacted the wall relative to the target. In one embodiment, the target displayed on the wall may be a point, circle, or bubble. The targets may be displayed via lights (e.g., LEDs) located in or behind the walls of the playing area 1516 or via a projector associated with the playing area 1516. In another embodiment, the target may be an image of a person displayed on the wall by a projector or other device. In yet another embodiment, the targets may not be displayed on a wall and may instead be presented to the user via an augmented reality system, as described in more detail below.
The computing device 1504 may be communicatively coupled to a network 1518, such as a Local Area Network (LAN) or a Wide Area Network (WAN), to allow the computing device 1504 to communicate with remote storage systems 1520 and remote devices 1522. In one embodiment, the network 1518 may be the internet. Remote storage system 1520 may be used to remotely store camera data, dribbling action and characteristic information, pass action and characteristic information, conversion information, and other information generated and/or obtained by computing device 1504. The remote device 1522 may be used to display camera data, dribbling actions and characteristics, pass action and characteristics information, and/or conversion information generated or obtained by the computing device 1504. In one embodiment, the remote device 1522 may be a handheld device, such as a smartphone or tablet computer. In another embodiment, remote device 1522 may be used in place of output device 1508.
The computing device 1504 may communicate wirelessly (i.e., by electromagnetic or acoustic waves carrying signals) with the other components of the system 1500, although the computing device 1504 may communicate by conductive media (e.g., wires), optical fibers, or otherwise with the other components of the system 1500. In one embodiment, one or more of the system components (other than computing device 1504) may communicate directly with each other (without communicating with computing device 1504) through a wireless or wired connection. For example, each camera 1502 may communicate directly with each other to synchronize the start time of the cameras 1502 or otherwise synchronize the cameras 1502 or data captured by the cameras 1502. In another example, the camera 1502 may communicate directly with the illumination system 1510 to change the illumination conditions when the camera 1502 detects that less than optimal illumination conditions are reached.
In one embodiment, the camera 1502 and the light source 1511 of the illumination system 1510 may be stationary. However, in other embodiments, one or more of the camera 1502 and the light source 1511 may be portable. Each of the camera 1502 and light source 1511 may be positioned at a particular location relative to the athletic playing surface.
One or more of the cameras 1502 may be automatically rotated or pivoted horizontally and/or vertically to adjust the field of view of the cameras 1502 without changing the position of the cameras 1502. Similarly, one or more light sources 1511 of lighting system 1510 can be automatically rotated or pivoted horizontally and/or vertically to adjust the output direction of light source 1511 without changing the position of light source 1511. In one embodiment, the rotation or pivoting of the camera 1502 and/or the light source 1511 can be preprogrammed into the camera 1502 and/or the light source 1511 such that the camera 1502 and/or the light source 1511 rotate or pivot according to a predetermined sequence. In another embodiment, the rotation or pivoting of the camera 1502 and/or the light source 1511 may be in response to an instruction provided to the camera 1502 and/or the light source 1511 by a user, the computing device 1504, or other device.
Fig. 2 illustrates an embodiment of a camera 1502 that may be used with the tracking system 1500. The camera 1502 shown in fig. 2 may include logic 1530, referred to herein as "camera logic," which may be implemented in software, firmware, hardware, or any combination thereof. In fig. 2, camera logic 1530 is implemented in software and stored in memory 1532. However, in other embodiments, other configurations of camera logic 1530 are possible. When implemented in software, the camera logic 1530 may be stored and transmitted on any computer-readable medium for use by or in connection with an instruction execution device that can fetch and execute the instructions.
The embodiment of the camera 1502 shown in fig. 2 may include at least one conventional processing element 1534, which may include processing hardware for executing instructions stored in the memory 1532. As an example, the processing element 1534 may include a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a Graphics Processing Unit (GPU), and/or a Quantum Processing Unit (QPU). The processing element 1534 may communicate with and drive other elements within the camera 1502 via a local interface 1536, which may include at least one bus. The camera 1502 may have a clock 1538 that may be used to track time and to operate in synchronization with the other cameras 1502.
The camera 1502 may have a communications module 1540. The communications module 1540 may include a Radio Frequency (RF) radio or other device for wirelessly communicating with the computing device 1504 or other components of the system 1500. Power supply 1542 has an interface that allows power supply 1542 to plug into or otherwise connect with an external component (e.g., a wall outlet or a battery) and receive power from the external component.
As shown in fig. 2, the camera 1502 may also include an image sensor 1550, a depth sensor 1546, an audio sensor 1548, and a light sensor 1544. The image sensor 1550 may be used to record, capture, or obtain images or video of an area around or near the camera 1502. In one embodiment, the image sensor 1550 is configured to capture two-dimensional (2-D) video images of the playing area, including images of the object being dribbled or passed, the person dribbling, and any other players in the athletic playing area. The depth sensor 1546 may be used to determine the relative distance (relative to the depth sensor 1546) of objects in the field of view of the camera 1502. The audio sensor 1548, or microphone, may be used to record sounds or noise that occur in an area around or near the camera 1502. The light sensor 1544 may be configured to sense ambient light in the area surrounding the camera 1502.
The image sensor 1550 may include one or more CCDs (charge-coupled devices) and/or one or more active pixel sensors or CMOS (complementary metal-oxide-semiconductor) sensors. The images or video from the image sensor 1550 may be stored as image data 1552 in the memory 1532. In one embodiment, the image data 1552 may define frames of a captured image. The image data 1552 may be stored in any suitable file format, including but not limited to PNG (Portable Network Graphics), JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), MPEG (Moving Picture Experts Group), WMV (Windows Media Video), QuickTime, and GIF (Graphics Interchange Format). The sound recording from the audio sensor 1548 can be incorporated into a video file from the image sensor 1550 and stored in the image data 1552. If the sound recording from the audio sensor 1548 is not part of a video file, the sound recording can be stored in any suitable file format, including but not limited to WAV (Waveform Audio), MP3 (MPEG Layer III Audio), WMA (Windows Media Audio), and MPEG, and saved in the image data 1552 or elsewhere in the memory 1532.
In one embodiment, for each frame of image data 1552, depth sensor 1546 may provide a depth map indicating a respective depth for each pixel of the image frame. The depth map provided by depth sensor 1546 may be stored as depth data 1554 in memory 1532. Note that depth sensor 1546 may be oriented such that the distance measured by depth sensor 1546 is in a direction substantially orthogonal to the plane of the 2D coordinate system used by image sensor 1550, although other orientations of depth sensor 1546 are possible in other embodiments.
At times, the camera logic 1530 may be configured to send the image data 1552 and the depth data 1554 to the computing device 1504. The image data 1552 and the depth data 1554 may be analyzed by the computing device 1504 to track a dribbling action and determine one or more dribbling characteristics from the dribbling action, to track a passing action and determine one or more passing characteristics from the passing action, or to track a dribble-to-pass transition and determine one or more transition characteristics. The image data 1552 and the depth data 1554 may be time-stamped based on the time indicated by the clock 1538 to indicate when the image data 1552 and the depth data 1554 were obtained. Thus, upon receiving image data from multiple cameras 1502, the computing device 1504 may determine, based on the timestamps, which image frames from the multiple cameras 1502 were captured substantially simultaneously in order to facilitate tracking the movement of the ball. At times, the cameras 1502 may communicate with each other and/or with the computing device 1504 to synchronize their clocks such that a comparison of the timestamps of image frames from one camera 1502 to the timestamps of image frames from another camera 1502 accurately indicates the time difference between when the two image frames were captured. The image data 1552 and the depth data 1554 may be presented to a user for analysis or viewing.
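A hedged sketch of this timestamp-based pairing is shown below; the frame objects and the matching tolerance are assumptions, and a deployed system might also interpolate between frames.

```python
def match_frames(frames_a, frames_b, tolerance_s: float = 0.01):
    """frames_a, frames_b: lists of frame objects with a .timestamp attribute
    (seconds), sorted by time. Returns pairs captured within tolerance_s of each other."""
    if not frames_b:
        return []
    pairs, j = [], 0
    for fa in frames_a:
        # advance j while the next frame in frames_b is at least as close in time
        while j + 1 < len(frames_b) and \
                abs(frames_b[j + 1].timestamp - fa.timestamp) <= abs(frames_b[j].timestamp - fa.timestamp):
            j += 1
        if abs(frames_b[j].timestamp - fa.timestamp) <= tolerance_s:
            pairs.append((fa, frames_b[j]))
    return pairs
```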
Various types of image sensors 1550 and depth sensors 1546 may be used in the camera 1502. In one embodiment, the camera 1502 may be a camera system available from Microsoft Corporation. In such a camera, the image sensor 1550 and the depth sensor 1546 are integrated in the same housing. The image sensor 1550 is configured to capture a video stream including frames of video data, where each frame is defined by a plurality of pixels. Each pixel is associated with two coordinates (an x-coordinate and a y-coordinate) representing a position in 2D space. For each frame, each pixel is assigned a color value (which may include a red component (R) value, a blue component (B) value, and a green component (G) value) that indicates the color of light received by the image sensor 1550 from the location in 2D space that corresponds to the coordinates of the pixel. Further, for each pixel, the depth sensor 1546 measures the distance from the depth sensor 1546 to the real-world object at the corresponding location of the pixel in 2D space. This distance (which, as noted above, may be in a direction substantially orthogonal to the plane of the 2D coordinate system used by the image sensor 1550) may be referred to as the "depth" of the corresponding pixel. Using the image data 1552 from the image sensor 1550 and the depth data 1554 from the depth sensor 1546, the location of an object captured by the image sensor 1550 can be determined in 3D space. That is, for a point on the object, its x- and y-coordinates from the image data 1552 provided by the image sensor 1550 indicate its position along two axes (e.g., the x- and y-axes), and the depth value of the point from the depth sensor 1546 (which may be referred to as the "z-coordinate") indicates its position along a third axis (e.g., the z-axis). It is noted that the coordinate system defined by these three axes is not necessarily related to gravity. That is, gravity may be in any direction relative to the axes of the coordinate system, depending on the orientation of the camera 1502. Thus, unless a calibration process is performed, the direction of gravity with respect to the coordinate system may be unknown. An example of a calibration process for determining the direction of gravity with respect to a coordinate system is described in U.S. Patent No. 9,734,405, entitled "Systems and Methods for Monitoring Objects in Athletic Playing Spaces," issued on August 15, 2017, which is hereby incorporated by reference.
In one embodiment, depth sensor 1546 has a wave emitter (e.g., an infrared laser projector or other type of emitter) and a wave sensor for sensing reflections of the energy emitted by the wave emitter. The wave emitter emits infrared radiation into free space (in other embodiments, radiation at wavelengths outside the infrared spectrum, such as visible light, may be emitted), and the wave sensor senses the reflected energy to capture a video stream having frames of video data. Each frame of depth data 1554 from depth sensor 1546 corresponds to a respective frame of image data 1552 from image sensor 1550. Further, each pixel of a frame of depth data 1554 corresponds to (e.g., has the same x- and y-coordinates as) at least one pixel in image data 1552 from image sensor 1550 and indicates the depth of that corresponding pixel. In another embodiment, depth sensor 1546 may capture depth data 1554 using stereo cameras.
In this regard, for a frame of video data captured by depth sensor 1546, depth sensor 1546 converts the frame into a depth map by assigning each pixel a new color value (referred to herein as a "depth value") representing the depth of the pixel. Thus, when displaying a depth map, objects that are displayed as the same color within the image should be approximately the same distance from depth sensor 1546, noting that it is not generally necessary to actually display a depth map during operation.
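The assignment of per-pixel "depth values" can be illustrated with a short sketch. This is a minimal example under assumptions not stated in the document: raw depth is assumed to arrive as a NumPy array of distances in millimeters, and the normalization range and function name are purely illustrative.

```python
import numpy as np

def depth_frame_to_depth_map(depth_mm: np.ndarray, max_range_mm: float = 8000.0) -> np.ndarray:
    """Assign each pixel a 'depth value' (0-255) proportional to its measured distance.

    Pixels at the same distance from the depth sensor receive the same value, so
    objects at equal depth would appear as the same shade if the map were displayed.
    """
    clipped = np.clip(depth_mm, 0, max_range_mm)
    depth_values = (clipped / max_range_mm * 255.0).astype(np.uint8)
    return depth_values
```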
As described above, a given pixel of image data 1552 from image sensor 1550 is associated with an x-coordinate and a y-coordinate that indicate the location of the pixel in 2D space, and the pixel is associated with a depth value from a corresponding pixel in depth data 1554 provided by depth sensor 1546, which indicates the z-coordinate of the pixel. The combination of the x-, y-, and z-coordinates defines the location of the pixel in 3D space relative to the coordinate system of image sensor 1550. That is, the x-, y-, and z-coordinates define the location, in 3D space, of the point on the object from which the measured light is reflected toward the image sensor.
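As a concrete illustration of how an (x, y) pixel coordinate and a depth value combine into a 3D position, the following sketch back-projects a pixel through a pinhole camera model. The focal lengths and principal point are hypothetical intrinsics for a camera such as camera 1502, not values given in the document.

```python
def pixel_to_3d(px: float, py: float, depth: float,
                fx: float, fy: float, cx: float, cy: float):
    """Back-project pixel (px, py) with measured depth into the camera's 3D coordinate system.

    fx, fy: focal lengths in pixels; cx, cy: principal point (image center).
    Returns (X, Y, Z), where Z is the depth along the camera's optical axis.
    """
    X = (px - cx) * depth / fx
    Y = (py - cy) * depth / fy
    Z = depth
    return X, Y, Z

# Example: a pixel near the image center measured 3.2 m from the sensor
point = pixel_to_3d(px=320, py=240, depth=3.2, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```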
Fig. 3 illustrates an embodiment of a computing device 1504. The computing device 1504 may be implemented as one or more general-purpose or special-purpose computers, such as a laptop computer, a handheld computer (e.g., a smart phone), a desktop computer, or a mainframe computer. The computing device may include logic 1560, referred to herein as "device logic," generally for controlling the operation of computing device 1504, including communicating with other components of system 1500. The computing device 1504 also includes logic 1562, referred to herein as an "object tracker," for determining the location and motion of objects, people touching objects, and any other people in the athletic match area, and includes lighting system control logic 1563 to control the lighting system 1510 and the light sources 1511. The computing device 1504 also includes logic 1564, referred to herein as "computer vision logic," for processing and analyzing image data 1552 and depth data 1554 from the camera 1502. The device logic 1560, computer vision logic 1564, lighting system control logic 1563, and object tracker 1562 may be implemented in software, hardware, firmware, or any combination thereof. In the computing device 1504 shown in fig. 3, the device logic 1560, computer vision logic 1564, lighting system control logic 1563, and object tracker 1562 are implemented in software and stored in the memory 1566 of the computing device 1504. Note that the device logic 1560, computer vision logic 1564, lighting system control logic 1563, and object tracker 1562, when implemented in software, may be stored on and transmitted over any non-transitory computer-readable medium for use by or in connection with an instruction execution device that can fetch and execute the instructions.
The computing device 1504 may include at least one conventional processing element 1568 having processing hardware for executing instructions stored in memory 1566. As an example, processing element 1568 may include a Central Processing Unit (CPU), Digital Signal Processor (DSP), Graphics Processing Unit (GPU), and/or Quantum Processing Unit (QPU). Processing element 1568 communicates with and drives other elements within computing device 1504 via a local interface 1570, which local interface 1570 may include at least one bus. In addition, an input interface 1572, such as a keypad, keyboard, or mouse, may be used to input data from a user of computing device 1504, and an output interface 1574, such as a printer, monitor, Liquid Crystal Display (LCD), or other display device, may be used to output data to the user. In one embodiment, input interface 1572 and output interface 1574 may correspond to input device 1506 and output device 1508, respectively. Additionally, communication interface 1576 may be used to exchange data between components of system 1500 or with network 1518, as shown in FIG. 1.
As shown in fig. 3, the sensor data 1580, the evaluation data 1582, and the camera data 1578 can be stored in memory 1566 at the computing device 1504. Camera data 1578 may include image data 1552 and depth data 1554 from camera 1502. The sensor data 1580 can include data and measurements from sensors 1514 (e.g., accelerometers or other sensors) and/or any sensors incorporated in the playing area 1516. The camera data 1578, sensor data 1580, and evaluation data 1582 may be used and/or analyzed by the device logic 1560, computer vision logic 1564, and/or object tracker 1562 to track a dribbling action for an object and determine one or more characteristics of the dribbling action, to track a dribbling-to-pass conversion and determine one or more parameters related to the conversion characteristics, to track a shooting action for an object and determine one or more characteristics of the shooting action, to track a conversion from a dribbling action to a shooting action and determine one or more parameters related to the conversion, or to track a pass action for an object and determine one or more parameters related to characteristics of the pass action.
The assessment data 1582 may include: information associated with one or more parameters associated with a dribbling characteristic; information associated with one or more parameters associated with a shooting characteristic; information associated with one or more parameters associated with a conversion characteristic; and/or information associated with one or more parameters associated with a pass characteristic, such as, for example, a motion associated with a particular dribbling type or a motion associated with a particular pass type. The assessment data 1582 may also include training information, such as may be displayed on the output device 1508 to provide training instructions to the user regarding the "correct" shooting motion and/or technique, the "correct" dribbling motion and/or technique, and/or the "correct" passing motion and/or technique. The evaluation data 1582 may include one or more test programs based on a "correct" shot style, dribbling style, and/or pass style that may be used to evaluate a shot, dribbling, and/or pass action (including a transition from a dribbling action to a pass action) associated with a user. In one embodiment, the test programs may be displayed to the user on the output device 1508, and the object tracker 1562 may evaluate the user's performance with respect to the test programs in the evaluation data 1582 based on the user's shooting actions, dribbling actions, and/or passing actions captured in the camera data 1578.
The object tracker 1562 may receive camera data 1578, sensor data 1580, information from computer vision logic 1564, and/or other information related to the ball and the person with the ball to track a dribbling action and determine one or more characteristics of the dribbling action. Once the characteristics of the dribbling motion are determined, the object tracker 1562 may compare the determined dribbling characteristic(s) to corresponding "correct" dribbling characteristic information in the assessment data 1582 to score or otherwise assess the determined dribbling characteristic(s). The "correct" dribbling characteristics stored in the assessment data 1582 may be preselected parameters or techniques associated with a preferred dribbling action. In one embodiment, each determined dribbling characteristic may have a corresponding "correct" dribbling characteristic stored in the assessment data 1582. The correct dribbling characteristic may be a predetermined number (e.g., a predetermined speed, a predetermined number of dribbles per minute, or a predetermined number of errors). The correct dribbling characteristics may also be defined relative to the body of the person dribbling (e.g., the dribbling height should not exceed the waist of the dribbler). Further, the correct dribbling characteristics may be defined with respect to the actions of the person with the ball (e.g., there may be one set of correct dribbling characteristics when the person is running, and a different set of correct dribbling characteristics when the person is walking or at rest). However, in other embodiments, some dribbling characteristics may not have corresponding "correct" dribbling characteristics. In still other embodiments, the "correct" dribbling characteristic may be defined as a range (e.g., greater than a predetermined minimum value, less than a predetermined maximum value, or between a predetermined minimum value and a predetermined maximum value).
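One way such a range-based comparison could be realized is sketched below. The characteristic names, the range structure, and the example waist-height value are illustrative assumptions, not contents of the actual assessment data 1582.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CorrectRange:
    minimum: Optional[float] = None  # None means no lower bound
    maximum: Optional[float] = None  # None means no upper bound

def evaluate_characteristic(measured: float, correct: CorrectRange) -> bool:
    """Return True if the measured dribbling characteristic falls within the 'correct' range."""
    if correct.minimum is not None and measured < correct.minimum:
        return False
    if correct.maximum is not None and measured > correct.maximum:
        return False
    return True

# Example: dribble height (meters) should not exceed the dribbler's waist height
waist_height_m = 1.05  # hypothetical waist height for the tracked player
print(evaluate_characteristic(measured=0.92, correct=CorrectRange(maximum=waist_height_m)))
```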
The object tracker 1562 may also receive camera data 1578, sensor data 1580, information from computer vision logic 1564, and/or other information relating to the ball and pass, to track the conversion from a dribbling action to a pass action (if applicable) and/or pass action, and to determine one or more characteristics of the conversion and/or pass action. Once the characteristics of the conversion from dribbling to pass motions and/or pass motions are determined, the object tracker 1562 may compare the determined conversion characteristics and/or pass characteristics to corresponding "correct" conversion and/or pass characteristics information in the evaluation data 1582 to score or otherwise evaluate the determined conversion and/or pass characteristics.
The "correct" transition characteristics stored in the assessment data 1582 may be preselected parameters or techniques associated with a preferred transition from a dribbling action to a passing action. In one embodiment, each determined conversion characteristic may have a corresponding "correct" conversion characteristic stored in the evaluation data 1582. The correct switching characteristic may be a predetermined number, such as a predetermined time, a predetermined ball position, or a predetermined number of laps. The correct transfer characteristics may also be defined relative to the body of the person performing the transfer (e.g. the ball should not be positioned over the passer's chest). Further, the correct switching characteristics may be defined with respect to the actions of the person performing the switching (e.g., there may be a set of correct switching characteristics when the person is taking a ball while running, and a different set of correct switching characteristics when the person is taking a ball while walking or at rest). However, in other embodiments, some conversion characteristics may not have corresponding "correct" conversion characteristics. In still other embodiments, the "correct" transition characteristic may be defined as a range (e.g., greater than a predetermined minimum value, less than a predetermined maximum value, or between a predetermined minimum value and a predetermined maximum value).
The "correct" pass characteristics stored in the assessment data 1582 may be pre-selected parameters or techniques associated with a preferred pass action. In one embodiment, each determined pass characteristic may have a corresponding "correct" pass characteristic stored in the evaluation data 1582. The correct pass characteristic may be a predetermined number, such as a predetermined speed, a predetermined pass height, or a predetermined number of turns. The correct pass characteristics may also be defined relative to the body of the person performing the pass (e.g. the pass height should not exceed the passer's chest). In addition, the correct pass characteristics may be defined relative to the passer's actions (e.g., a person may have a set of correct pass characteristics while running, and a different set of correct pass characteristics while walking or at rest). However, in other embodiments, some pass characteristics may not have corresponding "correct" pass characteristics. In other embodiments, a "correct" pass characteristic may be defined as a range (e.g., greater than a predetermined minimum value, less than a predetermined maximum value, or between a predetermined minimum value and a predetermined maximum value).
The "correct" shooting parameters associated with the shooting characteristics stored in the evaluation data 1582 may be preselected parameters or techniques associated with a preferred shooting action. In one embodiment, each determined shooting characteristic may have a corresponding "correct" shooting characteristic stored in the evaluation data 1582. The correct shooting characteristic may be a predetermined number, such as a predetermined speed, a predetermined angle, or a predetermined linear dimension. The correct shooting characteristics may also be defined relative to the body of the person making the shot (e.g., the person's foot should be pointed at the rim). Further, the correct shooting characteristics may be defined relative to the action of the shooter (e.g., a person may have one set of correct shooting characteristics while moving, and a different set of correct shooting characteristics while the person is stationary). However, in other embodiments, some shooting characteristics may not have corresponding "correct" shooting characteristics. In other embodiments, a "correct" shooting characteristic may be defined as a range (e.g., greater than a predetermined minimum value, less than a predetermined maximum value, or between a predetermined minimum value and a predetermined maximum value).
Computer vision logic 1564 may be used to analyze and process image data 1552 and depth data 1554 from camera 1502 stored in camera data 1578. Computer vision logic 1564 may use models, theories, and other techniques to extract information from image data 1552 and depth data 1554 in camera data 1578 to identify or recognize an object to be tracked and one or more participants (including the participants' torsos, arms, legs, hands, feet, etc.) participating in a sporting event associated with the object. Computer vision logic 1564 may use a variety of techniques to identify or recognize objects and people, such as content-based image retrieval, pose estimation, optical character recognition, 2D code reading, shape recognition, face recognition, object recognition, pattern recognition, and any other appropriate recognition or identification technique. An exemplary technique for identifying and tracking athletes is disclosed in U.S. patent application Ser. No. 15/438,289, entitled "Systems and Methods for Monitoring Objects at Sports Events," filed on February 21, 2017, which is incorporated herein by reference.
In one embodiment, computer vision logic 1564 may perform one or more of the following techniques and/or processes on image data 1552 and depth data 1554 from camera data 1578: pre-processing; feature extraction; detection/segmentation; high-level processing; and decision making. Pre-processing of the camera data 1578 may include processing the data to confirm that the data is in the correct form for subsequent steps. Some examples of pre-processing actions may include noise reduction and contrast enhancement. After pre-processing the camera data 1578, the camera data 1578 may be examined or analyzed to extract features of varying complexity (e.g., lines, edges, corners, points, textures, and/or shapes). Next, in the detection/segmentation step, decisions can be made regarding which features and/or regions are relevant and require additional processing. High-level processing of the reduced set of camera data 1578 (resulting from the detection/segmentation step) involves estimation of certain parameters (e.g., object size) and classification of detected objects into different categories. Finally, the decision-making step makes a determination of the identity of the detected object or person or indicates that the detected object or person is unknown.
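The pre-processing, feature extraction, and detection/segmentation steps could be prototyped with a general-purpose vision library such as OpenCV. The sketch below is an assumed pipeline, not the patent's implementation; the blur kernel, Canny thresholds, and contour-area cutoff are placeholder values.

```python
import cv2
import numpy as np

def preprocess_and_detect(frame_bgr: np.ndarray):
    """Illustrative pipeline: noise reduction, contrast enhancement, edge/contour extraction."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    denoised = cv2.GaussianBlur(gray, (5, 5), 0)          # pre-processing: noise reduction
    equalized = cv2.equalizeHist(denoised)                # pre-processing: contrast enhancement
    edges = cv2.Canny(equalized, 50, 150)                 # feature extraction: edges
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # detection/segmentation: keep regions large enough to warrant high-level processing
    candidates = [c for c in contours if cv2.contourArea(c) > 500]
    return candidates
```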
The computer vision logic 1564 may identify objects and people present in the camera data 1578 by processing individual images and videos received from the cameras 1502 and/or images and videos based on any combination or grouping of camera data 1578 from multiple cameras 1502. The computer vision logic 1564 may identify objects or people using labels carried by the objects or people, facial recognition techniques (if people are being identified), contour analysis techniques (using the contours of objects or people), or any other appropriate identification technique.
In one embodiment, the object or person may have a label that is attached or affixed to the object or person and can be recorded by the camera 1502. If the person carries a tag, the label may (but need not) be incorporated into the tag carried by the person. Computer vision logic 1564 may identify labels attached to objects or persons and then identify the objects or persons based on information stored in memory 1566 that associates each label with an object or person. In another embodiment, computer vision logic 1564 may identify a person using facial recognition, or may identify an object or person using a distinguishable or identifiable outline or characteristic of the object or person. For example, the identification of a circle or sphere may indicate the presence of a ball in the frame. Similar to the process of identifying objects or people using labels, computer vision logic 1564 may identify facial characteristics and/or other contours or characteristics of objects or people in camera data 1578 and then compare the identified facial characteristics and/or other contours or characteristics to information stored in memory 1566 that relates such characteristics and/or contours to particular objects or people.
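Detecting a ball from its circular outline, as mentioned above, could be done with a circle-detection transform. The sketch below uses OpenCV's Hough circle transform as one plausible approach; the parameter values and expected ball radius are assumptions for illustration.

```python
import cv2
import numpy as np

def find_ball_candidates(frame_bgr: np.ndarray):
    """Return (x, y, radius) candidates whose circular outline may indicate a ball."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                               param1=100, param2=30, minRadius=10, maxRadius=80)
    if circles is None:
        return []
    return [(int(x), int(y), int(r)) for x, y, r in np.round(circles[0]).astype(int)]
```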
Computer vision logic 1564 may send camera data 1578 and/or information about objects or people identified by analyzing camera data 1578 to object tracker 1562. The object tracker 1562 may use information from the computer vision logic 1564 regarding the identified object and/or person to determine a dribbling action for the object and one or more dribbling parameters associated with the dribbling action, a transition from the dribbling action to a pass action, and one or more parameters associated with the transition, a shooting action for the object and one or more shooting parameters associated with the shooting action, a transition from the dribbling action to the shooting action, and one or more parameters associated with the transition, or a pass action for the object and determine one or more parameters associated with the pass action. In one embodiment, the object tracker 1562 may use the synchronized and calibrated camera data 1578 to determine a dribbling action and corresponding dribbling characteristics, dribbling to pass conversion and corresponding conversion parameters, a shot action and corresponding shot parameters, a dribbling to shot conversion and corresponding conversion parameters, or a pass action and corresponding pass parameters. Synchronization and calibration of the camera data 1578 may be accomplished by the computer vision logic 1564 or the object tracker 1562.
Synchronization of the camera data 1578 involves ensuring that corresponding "frames" of camera data 1578 of a given sample processed by the computer vision logic 1564 or the object tracker 1562 were captured substantially simultaneously. In this regard, a sample generally refers to data from measurements taken substantially simultaneously. For example, at a given moment, an image of a ball may be captured by multiple cameras 1502. Further, the position of the ball may be calculated from each image. In such an example, the measured positions are part of the same sample, since the position data from the multiple cameras is based on image data captured at substantially the same time. To determine which frames are captured substantially simultaneously, a global time system may be defined. As an example, the computing device 1504 may maintain a global time system and adjust the timestamps from each camera 1502 according to the global time system to synchronize the timestamps. That is, simultaneously captured images should have the same adjusted time stamp. Alternatively, the computing device 1504 (or other device maintaining a global time system) may send timing information to the cameras 1502 from time to time. The cameras 1502 may then use such information to adjust their respective clocks so that images from the cameras 1502 with the same time stamp are captured substantially simultaneously. Alternatively, the computing device 1504 may analyze the unsynchronized timestamps from the cameras 1502 and determine which frames were captured at substantially the same time. In such embodiments, the computing device 1504 may communicate with the cameras 1502 in a controlled calibration process to evaluate timing differences between the cameras 1502. As an example, each camera 1502 may report a current timestamp to the computing device 1504 during a handshaking process, such that the computing device 1504 may determine the camera's time relative to a global time system maintained by the computing device 1504 or otherwise. In other embodiments, other techniques for synchronizing camera data are possible.
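A minimal sketch of the timestamp-adjustment idea follows: each camera's clock offset relative to the global time system is assumed to have been measured (e.g., during a handshake), the offsets are applied, and frames whose adjusted timestamps fall within a small tolerance are treated as one sample. The offsets, tolerance value, and data layout are illustrative assumptions.

```python
def group_synchronized_frames(frames_by_camera, clock_offsets_ms, tolerance_ms=8.0):
    """frames_by_camera: {camera_id: [(timestamp_ms, frame), ...]}
    clock_offsets_ms:    {camera_id: offset added so timestamps align with the global time system}
    Returns a list of samples, each a dict of camera_id -> frame captured substantially simultaneously.
    """
    adjusted = []
    for cam_id, frames in frames_by_camera.items():
        for ts, frame in frames:
            adjusted.append((ts + clock_offsets_ms.get(cam_id, 0.0), cam_id, frame))
    adjusted.sort(key=lambda item: item[0])

    samples, current, current_start = [], {}, None
    for ts, cam_id, frame in adjusted:
        # start a new sample if the time gap is too large or this camera already contributed
        if current_start is None or ts - current_start > tolerance_ms or cam_id in current:
            if current:
                samples.append(current)
            current, current_start = {}, ts
        current[cam_id] = frame
    if current:
        samples.append(current)
    return samples
```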
Calibration of the camera data 1578 involves correlating pixels in image frames with a global coordinate system so that the computing device 1504 knows which pixels in different frames from different cameras 1502 represent the same physical location in space. This may be accomplished, for example, by ensuring that pixels in "frames" from different cameras 1502 that represent the same physical location are assigned the same global coordinates. By calibrating the camera data 1578, objects and people carrying objects may be tracked over multiple image frames from different cameras 1502 because the position of objects and people carrying objects defined in the global coordinate system may be the same in each image frame regardless of the field of view of the camera 1502 capturing the image frame. Once the camera data 1578 is calibrated, the object tracker 1562 may track objects through multiple image frames as they move in and out of the field of view of the respective image frames. If one or more cameras 1502 become misaligned, the calibration process may be repeated to calibrate the misaligned camera(s) 1502.
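The calibration step can be thought of as a rigid transform from each camera's coordinate system into the shared global coordinate system. Below is a minimal sketch assuming each camera's rotation matrix and translation vector have already been estimated (e.g., from landmarks on the court); those extrinsics are placeholders, not values from the document.

```python
import numpy as np

def camera_point_to_global(point_cam: np.ndarray, rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Map a 3D point expressed in one camera's coordinate system into global court coordinates.

    rotation:    3x3 rotation matrix for the camera (extrinsic calibration)
    translation: 3-vector giving the camera's position in global coordinates
    """
    return rotation @ point_cam + translation

# After calibration, the same physical location seen by two different cameras
# should map to (approximately) the same global coordinates.
```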
In one embodiment, the object tracker 1562 may determine the dribbling action by analyzing successive frames of camera data 1578 to determine changes in the position and/or depth of the identified object and/or changes in the position of the person performing the dribbling action. The object tracker 1562 may determine the dribbling action by detecting a downward trajectory (movement away from the person) of the identified object, then detecting a change in direction of the identified object (e.g., as may be caused by the object contacting the athletic playing surface), and detecting an upward trajectory (movement toward the person) of the identified object. Some exemplary techniques for calculating the trajectory of a sphere that may be used by the Object tracker 1562 may be found in U.S. Pat. No.8,908,922 entitled "True Space Tracking for Flight Using Axisymmetric objects with Diameter measurements" and U.S. Pat. No.8,948,457 entitled "True Space Tracking for Flight Using Axisymmetric objects with Diameter measurements", both of which are hereby incorporated by reference herein. By identifying the changes associated with the upward and downward trajectory of the object or person carrying the object, the object tracker 1562 may determine characteristics associated with the dribbling action. In one embodiment, some of the dribbling characteristics may be determined using conventional mathematical and physical principles and equations based on trajectory information extracted from the camera data 1578. The determined dribbling characteristics may then be stored in memory 1566 and/or scored based on the "correct" dribbling characteristics stored in the assessment data 1582.
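Detecting a dribble from the ball's trajectory, as described above, amounts to finding a downward segment followed by a reversal and an upward segment in the ball's height over time. The sketch below operates on a hypothetical series of (timestamp, height) samples; the minimum-drop threshold and data layout are illustrative assumptions rather than parameters disclosed in the document.

```python
def count_dribbles(height_series, min_drop_m=0.15):
    """Count dribbles as downward-then-upward reversals in ball height.

    height_series: list of (timestamp_s, height_m) samples for the tracked ball.
    A reversal only counts if the ball dropped at least min_drop_m beforehand,
    which filters out small jitters in the measured trajectory.
    """
    dribbles = 0
    last_peak = None
    descending = False
    for i in range(1, len(height_series)):
        prev_h, curr_h = height_series[i - 1][1], height_series[i][1]
        if curr_h < prev_h:
            if not descending:
                last_peak = prev_h
                descending = True
        elif curr_h > prev_h and descending:
            if last_peak is not None and last_peak - prev_h >= min_drop_m:
                dribbles += 1  # direction change near the floor -> one dribble
            descending = False
    return dribbles
```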
As an example, the object tracker 1562 may analyze the trajectory of a ball and identify multiple dribbles. For one or more dribbles, the object tracker 1562 may determine parameters indicative of dribbling characteristics, such as ball speed, dribbling height, repetition rate, dribbling type, etc., and store such parameters for analysis. In some cases, the object tracker 1562 may correlate a given parameter with information that may be used to characterize dribbling performance. For example, if a given dribble is performed with the left hand, the parameters determined for that dribble may be correlated in memory with a left-hand identifier. Based on parameters associated with such identifiers, the object tracker 1562 may calculate one or more scores or other statistics indicative of the athlete's performance with his left hand. As an example, the average repetition rate, ball speed, or dribbling height for the player's left hand may be calculated. If a dribbling type is identified for a particular dribble, the parameters determined for that dribble may be correlated in memory with a type identifier indicating the dribbling type, as will be described in more detail below. Based on parameters associated with such identifiers, the object tracker 1562 may calculate one or more scores or other statistics indicative of the player's dribbling performance for the identified dribbling type. If a particular defender can be identified as defending the player, the parameters determined for the dribble can be correlated in memory with an identifier identifying the defender, as will be described in more detail below. Based on parameters associated with such identifiers, the object tracker 1562 may calculate one or more scores or other statistics that indicate the player's dribbling performance against that defender. In other embodiments, the data may be grouped in other ways to provide further insight into the player's dribbling performance with respect to certain conditions. The system may report any of the parameters, scores, or other statistics described herein to indicate one or more dribbling characteristics of the tracked player. Any such parameters, scores, or other statistics may be used to calculate an overall or combined assessment of the player's dribbling performance, which may be reported.
Note that the object tracker 1562 may use techniques other than, or in addition to, identifying objects and/or people through the computer vision logic 1564 to determine the dribbling action and one or more characteristics associated with the dribbling action. In one embodiment, the sensor data 1580 may be analyzed by an object tracker 1562 to determine the location and movement of objects and/or people. The sensor data 1580 can then be used to determine a dribbling action and one or more characteristics associated with the dribbling action.
In another embodiment, the object tracker 1562 may determine a conversion from a dribbling action to a pass action and/or a pass action by analyzing successive frames of camera data 1578 to determine a change in the position and/or depth of the identified object, a change in the position of the person initiating the conversion and/or the pass action, and/or a change in the person possessing the identified object. The object tracker 1562 may determine the conversion from a dribbling action to a pass action by detecting the end of an up/down trajectory of the identified object followed by a subsequently initiated pass action. The object tracker 1562 may determine a pass action by detecting a horizontal movement or trajectory of the identified object (e.g., movement away from the position of a person) from a first location on the athletic playing surface, followed by a change in the person possessing the identified object at a location different from the first location. By identifying changes associated with the trajectory of the object or with the person initiating the pass action, the object tracker 1562 may determine characteristics associated with the conversion and/or pass action. In one embodiment, some conversion characteristics and/or pass characteristics may be determined using conventional mathematical and physics principles and equations based on trajectory information extracted from the camera data 1578. The determined conversion characteristics and/or pass characteristics may then be stored in memory 1566 and/or scored based on the "correct" conversion characteristics and/or pass characteristics stored in the assessment data 1582.
As an example, the object tracker 1562 may analyze the trajectory of a ball and determine a pass or shot motion. For each pass, the object tracker 1562 may determine parameters indicative of pass characteristics (e.g., ball speed, pass height at launch, pass height at receive, pass type, etc.) and store these parameters for analysis. For each shot, the object tracker 1562 may determine parameters indicative of the characteristics of the shot (e.g., ball speed, angle of entry, type of shot, etc.) and store these parameters for analysis. In some cases, the object tracker 1562 may associate a given parameter with information that may be used to characterize a pass or shot performance. For example, if a given pass or shot is initiated with the left hand, the parameters determined for that pass or shot may be associated in memory with a left hand identifier. Based on the parameters associated with the identifier, object tracker 1562 may calculate one or more scores or other statistics indicative of the athlete's performance with the left hand.
As an example, the average pass height at launch, ball speed or pass height at reception of the player's left hand may be calculated. If a pass type is determined for a particular pass action, as will be described in more detail below, the parameters determined for the pass may be associated in memory with a type identifier indicating the pass type. Based on the parameters associated with the identifier, the object tracker 1562 may calculate one or more scores or other statistics indicative of the athlete's pass performance for the determined pass type.
If a particular defender can be identified, as will be described in more detail below, the parameters determined for a pass or shot may be associated in memory with an identifier identifying the defender. Based on the parameters associated with the identifier, the object tracker 1562 may calculate one or more scores or other statistics indicative of the player's pass and/or shot performance for the defender. In other embodiments, the data may be grouped in other ways to provide further insight into the performance of a player in passing and/or shooting relative to certain conditions. Any of the parameters, scores, or other statistics described herein may be reported by the system to indicate one or more pass and/or shot performance of the tracked player. Any such parameters, scores, or other statistics may be used to calculate an overall or composite assessment of the reportable player's pass and/or shot performance.
Note that the object tracker 1562 may use other or additional techniques to determine the shot making action and/or pass action and one or more characteristics associated with the shot making action and/or pass action in addition to identifying objects and/or persons through the computer vision logic 1564. In one embodiment, the sensor data 1580 may be analyzed by an object tracker 1562 to determine the location and motion of objects and/or people. The sensor data 1580 can then be used to determine a shot and/or pass and one or more characteristics associated with the shot and/or pass.
Fig. 4 illustrates an embodiment of a process for evaluating a user's shooting, dribbling, and/or passing actions with the tracking system 1500 during a training sequence (training mode) or a competition sequence (competition mode). The process begins with a user initiating a training or competition sequence while the tracking system 1500 is active (step 1602). If the tracking system 1500 is being used in a training mode, the user may select a desired training sequence from the computing device 1504 using the input device 1506. The selected training sequence may then be displayed to the user using output device 1508. The computing device 1504 may store one or more training sequences in the evaluation data 1582. The training sequence may provide guidance to the user on how to perform a particular shot, dribble, and/or pass. The training sequence may demonstrate a shooting, dribbling, and/or passing action on the output device 1508 for the user to imitate. The user may then attempt to repeat the demonstrated shooting, dribbling, and/or passing motions during the training sequence, where the user's motions are captured by the camera 1502. During a competition sequence, the camera 1502 may capture the user's shooting, dribbling, and/or passing motions during the game. Camera data 1578 from camera 1502 may be provided to computing device 1504 and processed by computer vision logic 1564 to identify objects (e.g., balls) that are shot, dribbled, or passed, the person who shot, dribbled, or passed the ball, and, in the case of a pass, the person who received the pass. Information from the computer vision logic 1564 may then be provided to the object tracker 1562 to identify the user's shooting, dribbling, and/or passing actions (step 1604).
The object tracker 1562 may identify a shooting, dribbling, and/or passing action with respect to the ball based on the identification information from the computer vision logic 1564 (which identifies the ball) and the trajectory of the identified ball (including any corresponding changes in the depth or position of the ball). Once the object tracker 1562 identifies a shooting, dribbling, and/or passing action, the object tracker 1562 may identify one or more characteristics of the shooting, dribbling, and/or passing action (step 1606). The object tracker 1562 may identify parameters associated with characteristics of a shot, dribble, and/or pass by analyzing the trajectory of the ball and the camera data 1578 associated with the ball and with the person performing the dribbling action and/or initiating the shot or pass.
The object tracker 1562 may determine a person's hand (e.g., right or left hand) used for making a shot, dribbling, and/or passing by identifying the person's face (based on facial recognition data from the computer vision logic 1564) and then determining the side of the person associated with the detected shot, dribble, and/or pass. Alternatively, the object tracker 1562 may identify the left and right hands of a person based on the body contour of the person within the captured image. In one embodiment, the object tracker 1562 may also determine whether a two-handed (e.g., both hands on the object) shooting action or pass action has been performed. In one embodiment, the hands may initially be at the torso (or center) of the person rather than at the sides of the person. Once the object tracker 1562 has determined the person's hand performing the dribbling action (the "dribbling hand"), the object tracker 1562 may then determine dribbling parameters associated with the dribbling characteristics of the dribbling action performed by each of the person's hands. Similarly, once the object tracker 1562 has determined the hand(s) of the person performing a pass (the "passing hand(s)"), the object tracker 1562 may then determine pass parameters associated with the pass characteristics of a pass performed by each hand or by both hands of the person. Further, once the object tracker 1562 has determined the hand of the person performing the shooting action (the "shooting hand"), the object tracker 1562 may then determine shooting parameters associated with the shooting characteristics of the shooting action performed by each hand or by both hands of the person.
The object tracker 1562 may use information about the dribbling hand to determine several types of dribbling actions (e.g., a crossover dribble, a behind-the-back dribble, or a between-the-legs dribble). The object tracker 1562 may examine information from the computer vision logic 1564 together with the dribbling hand information to determine whether a particular dribbling action has been performed. In one embodiment, the object tracker 1562 may determine a crossover dribble by detecting a change in the person's dribbling hand while the ball remains in front of the person. The object tracker 1562 may determine a between-the-legs dribble by detecting a change in the person's dribbling hand while the ball travels under the person's torso (e.g., between the person's legs) from the front of the person to the back of the person. The object tracker 1562 may determine a behind-the-back dribble by detecting a change in the person's dribbling hand while the ball travels behind the person.
In another embodiment, the object tracker 1562 may determine one or more dribbling types based on a corresponding set of parameters determined by the object tracker 1562. Each dribbling type (e.g., "dribbling from back to front, right to left between legs") may be defined as a sequence or set of dribbling characteristics that include start and/or end dribbling height, dribbling velocity, dribbling direction, start and/or end acceleration or deceleration, ball spin, or ball spin rate. The object tracker 1562 may determine the specific dribbling characteristics that occur during the dribbling action and then identify the type of dribbling from the dribbling characteristics. Other techniques for detecting the type of action taken with the ball may be used in other embodiments.
The object tracker 1562 may also determine other characteristics of the dribbling action, such as ball speed, dribbling height, repetition rate (e.g., dribbles per second), dribbling power, or other similar characteristics, by analyzing the trajectory of the ball (i.e., the change in the detected ball position over subsequent frames of camera data and the corresponding time elapsed between subsequent frames). In one embodiment, the object tracker 1562 may determine the repetition rate by counting the number of times the ball has corresponding downward (e.g., away from the dribbling hand) and upward (e.g., toward the dribbling hand) trajectories associated with the user's dribbling hand over a predetermined period of time. The object tracker 1562 may determine the dribbling height by measuring the distance between the beginning and the end of a downward or upward trajectory using the global coordinate system. The object tracker 1562 may determine the ball speed by dividing the dribbling height by the time elapsed for the ball to complete an upward or downward trajectory. The object tracker 1562 may determine the dribbling direction by defining a vertical axis relative to a horizontal plane passing through the point at which the ball changes direction and then measuring the angle, relative to the defined axis, at which the ball begins or ends an upward or downward trajectory. The object tracker 1562 may determine the dribbling power based on the ball speed of the downward trajectory and the movement of the dribbling hand toward the ball before the downward trajectory begins. The object tracker 1562 may determine that a turnover (error) has occurred when the trajectory of the ball shows that it lands in a certain area (e.g., an out-of-bounds area) or passes directly from the hand of the player with the ball to an opposing player, noting that the opposing player may be identified by jersey color or other identification techniques. In another embodiment, the dribbling characteristics of the dribbling action may be determined based on information relating to the direction of gravity and information relating to the position of the sports playing surface. In still other embodiments, still other techniques may be used to calculate the dribbling characteristics of the dribbling action.
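The repetition-rate, dribble-height, and ball-speed calculations described above reduce to simple arithmetic on the detected trajectory segments. The sketch below assumes each dribble has already been segmented into a downward trajectory with recorded start height and start/floor times; the field names and data layout are assumptions for illustration.

```python
def dribble_metrics(bounces, window_s):
    """bounces: list of dicts like
         {"start_height_m": 1.0, "start_time_s": 12.10, "floor_time_s": 12.40}
       window_s: length of the observation window in seconds.
    Returns repetition rate (dribbles per second), mean dribble height, and mean ball speed.
    """
    if not bounces or window_s <= 0:
        return 0.0, 0.0, 0.0
    repetition_rate = len(bounces) / window_s
    heights = [b["start_height_m"] for b in bounces]
    speeds = [b["start_height_m"] / (b["floor_time_s"] - b["start_time_s"])
              for b in bounces if b["floor_time_s"] > b["start_time_s"]]
    mean_height = sum(heights) / len(heights)
    mean_speed = sum(speeds) / len(speeds) if speeds else 0.0
    return repetition_rate, mean_height, mean_speed
```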
The object tracker 1562 may correlate each measured dribbling characteristic to the player's left or right hand. Thus, statistics based on the user's left and right hands may be determined. As an example, the computing device 1504 may determine a dribbling speed, dribbling height, failure rate, or other characteristic for the user's left or right hand. Thus, if desired, the player can see how well he dribbles with his left hand compared to his right hand.
In addition, the object tracker 1562 may use the passing hand information to determine several pass types (e.g., a chest pass, a bounce pass, an overhead pass, a behind-the-back pass, or a baseball pass). The object tracker 1562 may review information from the computer vision logic 1564 together with the passing hand information to determine whether a particular pass type has been made. In one embodiment, the object tracker 1562 may determine a chest pass by detecting two passing hands at the person's torso as the ball travels away from the person without contacting the playing surface. The object tracker 1562 may determine a bounce pass by detecting two passing hands at the person's torso as the ball travels away from the person and contacts the playing surface. The object tracker 1562 may determine an overhead pass by detecting two passing hands above the person's head as the ball travels away from the person without contacting the playing surface. The object tracker 1562 may determine a baseball pass by detecting a single passing hand at or above the person's shoulder height as the ball travels away from the person. The object tracker 1562 may determine a behind-the-back pass by detecting a single passing hand as the ball travels from behind the person and away from the person. In another embodiment, for the baseball pass and behind-the-back pass types, the object tracker 1562 may also determine whether the pass is a bounce pass by determining whether the ball contacts the playing surface while traveling away from the person.
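The rule-style pass-type determination described above could be expressed as a small decision function. The feature names (number of passing hands, release height relative to the shoulder and head, whether the ball contacted the floor, whether it was released from behind the body) are assumptions chosen to mirror the rules in the text, not an actual interface of the system.

```python
def classify_pass(num_hands, release_height_m, shoulder_height_m, head_height_m,
                  touched_floor, released_behind_body):
    """Coarse, rule-based pass typing mirroring the rules described in the text."""
    if num_hands == 2:
        if release_height_m > head_height_m and not touched_floor:
            return "overhead pass"
        return "bounce pass" if touched_floor else "chest pass"
    # one-handed passes
    if released_behind_body:
        return "behind-the-back bounce pass" if touched_floor else "behind-the-back pass"
    if release_height_m >= shoulder_height_m:
        return "baseball bounce pass" if touched_floor else "baseball pass"
    return "unclassified pass"
```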
The object tracker 1562 may also determine other characteristics of the pass action (e.g., speed of the ball, height at release, height at reception, spin of the ball, angle and direction of departure at release, or other similar characteristics) by analyzing the trajectory of the ball (i.e., changes in the detected position of the ball over subsequent frames of camera data and the corresponding time elapsed between subsequent frames). In one embodiment, the object tracker 1562 may determine the angle and direction of the ball's departure by defining a horizontal axis and a vertical axis relative to corresponding planes passing through the point where the ball leaves the passer's hand and then measuring the angle of the ball's starting trajectory relative to the vertical axis and its direction relative to the horizontal axis. In one embodiment, an origin of the horizontal and vertical axes may be defined, and the origin may be located at a point corresponding to the front of the person's torso. The object tracker 1562 may determine the pass height at release or at reception by using the global coordinate system to measure the distance between the start of the pass trajectory (for a release) or the end of the pass trajectory (for a reception) and the playing surface. The object tracker 1562 may determine the ball speed by dividing the trajectory distance (i.e., the distance between the passer and the receiver) by the time elapsed for the ball to complete the trajectory. The object tracker 1562 may determine that a turnover (error) has occurred when the trajectory of the ball shows that the ball lands in a certain area (e.g., an out-of-bounds area) or passes directly from the passer to an opposing player, noting that opposing players may be identified by jersey color or other identification techniques. In another embodiment, the pass characteristics of the pass motion may be determined based on information relating to the direction of gravity and information relating to the position of the athletic playing surface. In other embodiments, other techniques may also be used to calculate pass characteristics for a pass.
The object tracker 1562 may associate each measured shooting and/or passing characteristic with the player's left hand, right hand, or both hands. Thus, statistics based on the user's left hand, right hand, and both hands may be determined. As an example, the computing device 1504 may determine pass speed, pass height, spin rate, or other characteristics for the user's left hand, right hand, and/or both hands. Thus, if desired, the player can see a comparison of his performance when passing with one hand to his performance when passing with both hands.
Once the object tracker 1562 has determined parameters associated with the characteristics of the shot, dribbling, and/or pass action (including the conversion of dribbling to passing), the object tracker 1562 may compare the determined characteristics to preferred or "correct" characteristics stored in the assessment data 1582 (step 1608) for the training sequence. Based on the comparison between the determined characteristics and the correct characteristics, the object tracker 1562 may then calculate a score for the user (step 1610). The user's score may be based on how quickly and/or accurately the user may reproduce the displayed training sequence or how the user's shot characteristics, dribbling characteristics, and/or pass characteristics in the game sequence compare to the correct shot characteristics, dribbling characteristics, and/or pass characteristics. The scoring of the user performance may be based on several accuracy factors, such as how close the user is to reproduce the correct sequence, the correct manner of shooting action, the correct height of the dribbling and/or passing action, and/or the correct ball placement or trajectory. In addition, the scoring of user performance in the training mode may also be based on how quickly the user is able to repeat the motion from the training sequence. In contrast, in the play mode, additional factors, parameters, or statistics (e.g., error rate, amount of time held, whether the user is defended or unattended, etc.) may be used to score the performance of the user. Once the user's score is calculated, the score may be displayed on the output device 1508 (step 1612). In one embodiment, the scores may be displayed on the output device 1508 concurrently with the display of the training sequence to inform the user how well the user performed with respect to the training sequence.
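One simple way to turn the comparison and scoring in steps 1608-1610 into a numeric score is to average per-characteristic accuracy terms, each expressing how close a measured value came to its "correct" counterpart. The characteristic names, tolerances, and the 0-100 scale below are assumptions for illustration rather than the system's actual scoring formula.

```python
def score_performance(measured, correct, tolerances):
    """measured, correct, tolerances: dicts keyed by characteristic name (e.g. "dribble_height_m").

    Each characteristic contributes an accuracy between 0 and 1 based on how far the
    measured value deviates from the correct value relative to its tolerance; the
    result is scaled to 0-100.
    """
    accuracies = []
    for name, correct_value in correct.items():
        if name not in measured:
            continue
        tol = tolerances.get(name, 1.0)
        deviation = abs(measured[name] - correct_value) / tol
        accuracies.append(max(0.0, 1.0 - deviation))
    return 100.0 * sum(accuracies) / len(accuracies) if accuracies else 0.0

# Example (hypothetical values): compare a user's dribble height and repetition rate
# against a training sequence's "correct" values.
score = score_performance(
    measured={"dribble_height_m": 0.95, "repetition_rate_hz": 1.8},
    correct={"dribble_height_m": 0.90, "repetition_rate_hz": 2.0},
    tolerances={"dribble_height_m": 0.30, "repetition_rate_hz": 1.0},
)
```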
After the scores are displayed to the user, a determination is then made by the computing device 1504 as to whether the training or competition sequence has ended (step 1614). If the training sequence or the competition sequence has ended, the process ends. However, if the training sequence or the play sequence has not ended, the process returns to step 1604 to identify further shots, dribbling and/or passing actions from the user that may be evaluated and scored. This repetition of the process may continue until the training sequence or the competition sequence is over.
Once the training sequence or competition sequence has ended, the computing device 1504 may recommend one or more additional training sequences for the user based on the user's performance (e.g., scoring) on the completed training sequence or competition sequence. A higher level training sequence may be recommended if the user performs well on the completed training sequence or competition sequence. Alternatively, if there are certain aspects of a completed training sequence or race sequence in which the user does not perform well, one or more remedial training sequences may be recommended.
The computing device 1504 may also provide the user with the option to view a completed training or competition sequence. The computing device 1504 may play back the completed training or racing sequence on the output device 1508 along with video of the user's actions during the training or racing sequence and concurrent score calculations based on the user's movements. The user is then able to see some parts of the training or competition sequence where the user may encounter problems while in progress.
In one embodiment, if the tracking system 1500 is used in a game situation, the object tracker 1562 may be used to obtain different types of information associated with shooting, dribbling, and/or passing actions that occur during the game. The object tracker 1562 may provide information about each player's shooting, dribbling, and/or passing actions when defended by the defender (e.g., the defender is within a predetermined distance of the ball-holding person and follows the movement of the ball-holding person) or when not defended (e.g., no defender is within a predetermined distance of the ball-holding person). In one embodiment, the object tracker 1562 may also provide information about when a person who shoots, takes a ball, and/or conducts a pass is guarded by more than one defender. In another embodiment, the object tracker 1562 may also determine whether a person shooting, dribbling, and/or passing a ball is tightly guarded or loosely guarded. The object tracker 1562 may determine tight defense if a defensive player is within a first predetermined distance range of a person shooting, dribbling, and/or passing a ball. The object tracker 1562 may determine loose defense if the defender is outside the first predetermined distance range (but within a second predetermined distance range that is greater than the first predetermined distance range).
The object tracker 1562 may provide information about each player's shooting, dribbling, and/or passing actions while being defended by a particular defender. The object tracker 1562 may use information from the computer vision logic 1564, such as facial recognition data, shape data, or pattern data, to identify a defender guarding the person who is shooting, dribbling, and/or passing. In one embodiment, the object tracker 1562 may identify a defender by initially determining whether a player has a different color or type of uniform than the person who shot, dribbled, and/or passed the ball (an affirmative determination indicating that the player may be considered a defender). Once the object tracker 1562 determines that the athlete is a defender, the object tracker 1562 may distinguish the individual defender from other defenders by identifying particular characteristics associated with the defender, such as by identifying the defender's uniform number through pattern recognition or identifying the defender's face through facial recognition.
Additionally, once the object tracker 1562 identifies a defensive player, the object tracker 1562 may examine information from the computer vision logic 1564 to determine whether the defensive player has performed a particular defensive action. In one embodiment, the object tracker 1562 may determine that a defensive action has occurred by examining a change in the position of a defender guarding the person dribbling and/or passing relative to the movement of the ball itself. For example, the object tracker 1562 may determine whether a steal attempt is occurring (or has occurred) by examining the movement of the defender's hand toward the position of the ball at about the same time that the defender's body is moving toward the person who is shooting, dribbling, and/or passing.
In another embodiment, the object tracker 1562 may determine one or more defensive actions based on a corresponding set of parameters determined by the object tracker 1562. Each defensive action (e.g., "reaching forward from a low stance to steal the ball with both hands") may be defined as a series or set of defensive characteristics that may include various heights, speeds, directions, orientations, accelerations or decelerations, and hand, arm, shoulder, and leg movements with various rotations and/or various speeds. The object tracker 1562 may determine the particular defensive characteristics associated with a defensive action and then identify the type of defensive action from those defensive characteristics. Other techniques for detecting defensive actions may be used in other embodiments.
The object tracker 1562 may use information about the identified defenders to correlate the shooting, dribbling, and/or passing statistics of the person who shot, dribbled, and/or passed the ball with each of the defenders who defended that person. The object tracker 1562 may provide information about: the number of times the defender defended the person, the cumulative amount of time the defender defended the person, the amount of time or percentage that the person used each hand (or both hands) to shoot, dribble, and/or pass against the defender, the types of shots, dribbles, and/or passes used against the defender, the number and/or percentage of errors (e.g., the number of times the person who shot, dribbled, and/or passed lost control of the ball while being defended by the defender), and the person's dribbling attributes against the defender. With respect to the information provided about the shooting, dribbling, and/or passing attributes, the object tracker 1562 may provide information about the ball speed, dribbling height, repetition rate, and power of the person's dribbling action against each defensive player. The information provided about the dribbling attributes may include an average value, a range of values extending from a minimum value to a maximum value, and/or the value occurring for the longest period of time. In another embodiment, the object tracker 1562 may provide similar information for the case when the person with the ball is unguarded. Further, with respect to the provided pass attribute information, the object tracker 1562 may provide information on the ball speed, ball height at the time of passing, ball height at the time of catching, spin of the ball, and angle and direction of the ball's departure against each defender, and the like. The information provided about the pass attributes may include an average value, a range of values extending from a minimum value to a maximum value, and/or the value occurring for the longest period of time. In another embodiment, the object tracker 1562 may provide similar information for passes made when the passer is unguarded.
With respect to the provided shot attribute information, the object tracker 1562 may provide information regarding the angle of entry of the shot, the shot position, the spin rate of the ball, the spin axis of the ball, the release height of the shot, or the release speed of the shot against each defender, or any other parameter that may be indicative of a shot. The information provided regarding the shot attributes may include an average value, a range of values extending from a minimum value to a maximum value, and/or the value occurring for the longest period of time. In another embodiment, the object tracker 1562 may provide similar information for shots taken when the shooter is unguarded.
Various techniques may be used to track the performance of an athlete relative to a particular defender. By way of example, by analyzing images captured by the cameras 1502, the object tracker 1562 may identify the athlete and each defensive player using the athlete identification techniques described in more detail above. When the athlete possesses the ball, as evidenced by images showing the ball in the athlete's hand or the ball following a trajectory indicating that the athlete is dribbling it (e.g., leaving the athlete's hand and returning to the athlete's hand after bouncing off the floor of the playing area), the object tracker 1562 may analyze the images to determine the athlete's distance from each identified defensive player. The opposing player closest to the athlete may be identified as the defender of the ball-holding athlete if he is within a predetermined distance of the ball-holding athlete. If desired, the defensive player may be required to be within the predetermined distance for at least a certain period of time before a defensive determination is made, to prevent a defender who briefly passes by the athlete while defending another player from being mistakenly identified as defending the ball-holding athlete. In other embodiments, other techniques for determining whether a particular defender is defending the athlete holding the ball are possible. As an example, the body orientation of a defender can be a factor in determining whether he or she is defending the ball-holding athlete. In this regard, a defender facing the ball holder for an extended period of time may be defending him. The object tracker 1562 may be configured to determine that a defensive player is defending the athlete when the defensive player faces the athlete within a predetermined distance of the athlete for at least a predefined amount of time. In another embodiment, a user may manually enter data indicative of the defender defending the ball-holding athlete (e.g., a jersey number or other identifier of the defender) into the computing device 1504 or another device of the system. In still other embodiments, still other techniques are possible.
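The defender-assignment rule described above (nearest opposing player within a predetermined distance, sustained for a minimum amount of time) can be sketched as follows. The distance and duration thresholds, the per-frame data layout, and the frame rate are illustrative assumptions, not values disclosed in the document.

```python
import math

def assign_defender(frames, max_distance_m=2.0, min_duration_s=1.0, frame_dt_s=1.0 / 30):
    """frames: list of dicts, one per video frame, like
         {"ball_handler": (x, y), "defenders": {"player_7": (x, y), "player_12": (x, y)}}
    A defender is credited only if they remain the closest opponent within max_distance_m
    for at least min_duration_s of consecutive frames.
    """
    current, streak = None, 0
    for frame in frames:
        hx, hy = frame["ball_handler"]
        nearest, nearest_d = None, float("inf")
        for player_id, (dx, dy) in frame["defenders"].items():
            d = math.hypot(dx - hx, dy - hy)
            if d < nearest_d:
                nearest, nearest_d = player_id, d
        if nearest is not None and nearest_d <= max_distance_m and nearest == current:
            streak += 1
        else:
            current = nearest if nearest_d <= max_distance_m else None
            streak = 1
        if current is not None and streak * frame_dt_s >= min_duration_s:
            return current
    return None
```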
When it is determined that an identified defender is defending the athlete, the athlete's shooting, dribbling, and/or passing characteristics may be associated with an identifier of the defender performing the defense. Accordingly, characteristics indicative of the athlete's shooting, dribbling, and/or passing performance while defended by the identified defender may be determined from the data captured by the system, and the object tracker 1562 may be configured to calculate various scores and statistics indicative of such performance. As the athlete is defended by different defenders over time, his or her shooting, dribbling, and/or passing performance against one defender may be compared to his or her performance against another defender. Note that this information may be used to help train the person who shot, dribbled, and/or passed the ball, or for other purposes, such as deciding which defender is most effective at defending that person.
The object tracker 1562 may determine the effect of a particular shooting, dribbling, and/or passing characteristic or motion on the defender who is defending the person shooting, dribbling, and/or passing. For example, the object tracker 1562 may determine whether a particular shooting, dribbling, and/or passing characteristic was successful or unsuccessful against a defender, or whether the person who shot, dribbled, and/or passed the ball was able to perform the particular shooting, dribbling, and/or passing characteristic against the defender. The shooting, dribbling, and/or passing characteristic may be considered successful if the shooter is able to get a shot off past the defender, if the ball handler is able to advance past the defender, or if the passer is able to successfully deliver the ball to a teammate. In contrast, the shooting, dribbling, and/or passing characteristic may be considered unsuccessful if the shooter is blocked by the defender, if the ball handler cannot advance past the defender, if the passer cannot successfully deliver the ball to a teammate, or if the person commits a turnover in the course of shooting, dribbling, and/or passing (e.g., losing the ball out of bounds or to the defender (or another defender) who is defending the person).
Fig. 5 illustrates an embodiment of an object tracker 1562 that may be used by the computing device 1504. The object tracker 1562 may include: ball path logic 1591 for generally determining the path or movement of the ball and of the person who shoots, dribbles, and/or passes the ball, even when the ball and/or the person is hidden from the camera 1502; identification logic 1592 for identifying the ball and/or the offensive and defensive players and determining their positions on the athletic playing surface; and scoring logic 1595 for evaluating the shooting, dribbling, and/or passing performance of a person, or the performance of a defender defending that person, and providing a "score" associated with the performance. Scoring logic 1595 may evaluate the person's performance based on information from measurement logic 1597. Measurement logic 1597 may be used to measure the abilities of the person who shoots, dribbles, and/or passes the ball and/or the abilities of the person(s) defending him or her. Improvement logic 1594 may use information from measurement logic 1597 and scoring logic 1595 to determine aspects of a person's performance that he or she may improve. The object tracker 1562 may also include defensive player action logic 1593 for generally determining the movements and/or actions of a defender, even when the defender is hidden from the camera 1502, and balance logic 1599 for assessing the balance and/or fluidity of the person(s) shooting, dribbling, and/or passing and/or of the person(s) defending them. The object tracker 1562 may further include performance evaluation logic 1603 for generally determining when a person's performance has deviated from the person's expected performance level.
Historical data 1596, fatigue data 1601, and body motion data 1598 used by the object tracker 1562 may be stored in the memory 1566 of the computing device 1504. The historical data 1596 may include information related to previous movements and actions of the person who shot, dribbled, and/or passed the ball during training sequences and/or live game sequences. Historical data 1596 may also include data and information regarding the movements and actions of the defender(s) defending that person. Fatigue data 1601 may include data and information about the physical state of the person who shot, dribbled, and/or passed the ball. The physical state of the person may be correlated to the athlete's fatigue level at a particular time based on biological information from the biological sensor 1514. For example, the fatigue level of the person may be based on the person's heart rate and oxygen level. In other embodiments, the fatigue data 1601 may be based on measured characteristics of the person's play, such as the amount of time the person has been in the game, the total distance the person has run during game play, the amount of time the person has spent sprinting (e.g., running at a speed that exceeds a defined threshold) during the game or during a time window (e.g., the last fifteen minutes or some other period), or other parameters that may affect the person's degree of fatigue. The body motion data 1598 may include information related to the position and movement of a person (both the person who shoots, dribbles, and/or passes and the defender(s)) and his or her associated body parts (e.g., head, shoulders, elbows, hands, fingers, chest, waist, back, thighs, knees, calves, hips, ankles, and feet) during shooting, dribbling, passing, and/or defending against shooting, dribbling, and/or passing. The body motion data 1598 may also include left-side and right-side information and front-side and back-side information associated with the athlete's body parts, where applicable.
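One way such biometric readings and game-load measures could be combined into a single fatigue estimate is sketched below in Python. The normalization constants, weights, and the fatigue_level name are illustrative assumptions only; an actual system could calibrate them per athlete, for example from historical data 1596.

```python
def fatigue_level(heart_rate_bpm, blood_oxygen_pct, minutes_played,
                  distance_run_m, sprint_seconds_last_15_min):
    """Combine biometric readings and game-load measures into a 0..1 fatigue score.

    The normalization constants and weights below are illustrative assumptions.
    """
    hr_load = min(heart_rate_bpm / 200.0, 1.0)                      # share of an assumed max HR
    o2_deficit = min(max(0.0, (98.0 - blood_oxygen_pct) / 10.0), 1.0)  # drop below a nominal 98%
    time_load = min(minutes_played / 40.0, 1.0)                     # share of a full game
    run_load = min(distance_run_m / 5000.0, 1.0)                    # share of a nominal 5 km
    sprint_load = min(sprint_seconds_last_15_min / 120.0, 1.0)      # recent high-intensity work

    weights = (0.30, 0.15, 0.20, 0.20, 0.15)
    components = (hr_load, o2_deficit, time_load, run_load, sprint_load)
    return min(1.0, sum(w * c for w, c in zip(weights, components)))
```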
As previously discussed, the object tracker 1562 may receive camera data 1578, sensor data 1580, information from the computer vision logic 1564, and/or other information related to the ball and the players or persons on the athletic playing surface. The ball path logic 1591 may be used to determine (or approximate) the path of the ball and of the person shooting, dribbling, and/or passing during the shooting, dribbling, and/or passing action, even when the ball or the person cannot be identified by the identification logic 1592 from the camera data 1578. For example, the identification logic 1592 (or computer vision logic 1564) may not recognize the ball or the person because the ball is not present in the camera data 1578. The ball may be absent from the camera data 1578 because it is hidden from the field of view of the camera 1502 by the person shooting, dribbling, and/or passing the ball, by a defender, and/or by one or more other persons on the athletic playing surface. See, for example, fig. 6. Furthermore, even if the ball is present in the camera data 1578, the identification logic 1592 may be unable to recognize it because the ball is obscured in the camera data 1578 by poor lighting conditions, partial occlusion, and/or blurring from rapid movement of the ball.
In one embodiment, the identification logic 1592 may determine whether a foul or other violation has occurred during the game by determining that a whistle has been blown and that play has then stopped. The identification logic 1592 may also determine the occurrence of a foul or other violation by recognizing a stoppage of play followed by one or more actions of the referee (e.g., the referee moving toward the scorer's table and making one or more gestures). The identification logic 1592 may determine which player committed the foul or violation, and what type of foul or violation occurred, based on the referee's hand and arm movements. For example, a referee may indicate a holding foul by moving his or her hands one or more times to his or her hips. The identification logic 1592 may analyze the gestures to identify the type of foul (e.g., interpreting the referee moving his or her hands to his or her hips within a certain period of time after the whistle as indicating a holding foul). The referee may also use gestures to indicate the number of the offending player (e.g., raising a number of fingers to indicate the number), and the identification logic 1592 may interpret such gestures to identify the offending player. The identification logic 1592 may also be able to determine which player committed the infraction or violation, and the type of infraction or violation that occurred, by processing audio captured from a referee who verbally announces the player and the type of infraction or violation.
The ball path logic 1591 may use information from the identification logic 1592 to determine the path or trajectory of the ball. When the identification logic 1592 is unable to recognize the ball from the camera data 1578, the ball path logic 1591 may determine the expected trajectory or motion of the ball based on the last known position of the ball from the identification logic 1592 and other information stored in the memory 1566. The ball path logic 1591 may analyze the body position of the person who shot, dribbled, and/or passed the ball based on the body motion data 1598, and approximate the expected trajectory of the ball and the time to complete that trajectory based on how the person is positioned. Once the identification logic 1592 is again able to recognize the ball from the camera data 1578, the ball path logic 1591 may confirm (or reject) the approximated trajectory of the ball.
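As a hedged sketch of the confirm-or-reject step described above, the following Python fragment extrapolates a simple ballistic trajectory from the last observed ball state and checks the reacquired position against it. The tolerance value and function names are illustrative assumptions; a real system would also branch on the inferred action (dribble, pass, or shot) and on the player's body position.

```python
G = 9.81  # m/s^2, gravitational acceleration (z axis points up)

def predict_position(last_pos, last_vel, dt):
    """Ballistic extrapolation of the ball from its last observed position/velocity."""
    x, y, z = last_pos
    vx, vy, vz = last_vel
    return (x + vx * dt, y + vy * dt, z + vz * dt - 0.5 * G * dt * dt)

def confirm_trajectory(last_pos, last_vel, reacquired_pos, dt, tolerance_m=0.5):
    """Return True if the reacquired ball position matches the approximated trajectory
    within a margin of error; otherwise the ball path logic would re-approximate."""
    px, py, pz = predict_position(last_pos, last_vel, dt)
    rx, ry, rz = reacquired_pos
    error = ((px - rx) ** 2 + (py - ry) ** 2 + (pz - rz) ** 2) ** 0.5
    return error <= tolerance_m
```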
For example, if a person dribbles the ball behind his back or attempts a behind-the-back pass during a game, the ball may not be visible to the identification logic 1592 because it is obstructed by the person's body and by other players or persons on the athletic playing surface. However, the identification logic 1592 may be capable of detecting the movement of the person's shoulders, arms, and hands and providing this information to the ball path logic 1591. The ball path logic 1591 may then use the information from the identification logic 1592 and the body motion data 1598 to approximate the motion, trajectory, direction, rotation, and speed of the ball while it is not visible or detectable to the identification logic 1592, and to predict the time and position at which the ball will arrive on the other side of the person (for a dribble) or at another person (for a pass), when the ball again becomes visible or detectable by the identification logic 1592 from the camera data 1578.
If the ball path logic 1591 receives information from the identification logic 1592 indicating that the ball is in the position the ball path logic 1591 expected (possibly within a margin of error), the ball path logic 1591 may determine that the actual trajectory of the ball followed the approximated trajectory. However, if the ball path logic 1591 receives information from the identification logic 1592 indicating that the ball is in a position different than expected, the ball path logic 1591 may determine that the movement of the ball did not follow the approximated trajectory and may approximate a new trajectory for the ball based on the ball's start and end positions. Additionally, the ball path logic 1591 may store information about the start and end positions of the ball, the revised approximated trajectory, and the person in possession of the ball in the memory 1566 (e.g., as historical data 1596). Then, when the ball is occluded in a similar situation in the future, the ball path logic 1591 may use the stored information about the start and end positions of the ball and the revised approximated trajectory in formulating an approximated trajectory for the ball.
In another embodiment, the ball path logic 1591 may be able to determine the trajectory or motion of the ball even when some (or all) of the ball or of the person who shot, dribbled, and/or passed the ball is occluded in the camera data 1578. By way of example, the ball may be obscured from view, but the person's elbow may be visible. The movement of the person's arm near the elbow may indicate when the ball has reached or left the person's hand. In this regard, a change in the motion of the person's arm may indicate that the ball has reached the person's hand and is being pushed down to dribble or pushed out to pass. Further, the ball path logic 1591 may calculate the position of the ball, when it is determined to arrive at or depart from the person's hand, based on the position and orientation of the person's elbow. In this regard, the person's arm length may be predetermined and used by the logic 1591 to determine the distance of the ball from the person's elbow, and the angle of the person's forearm may indicate the direction of the ball relative to the elbow. By determining positions of the ball at different times while the ball is occluded, the ball path logic 1591 may estimate the trajectory of the ball between those points.
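The elbow-based estimate described above is essentially a polar-to-Cartesian offset from the elbow along the forearm direction. The following Python sketch illustrates that geometry under stated assumptions; the angle convention, the default forearm-plus-hand length, and the function name are illustrative rather than taken from this disclosure.

```python
import math

def ball_position_from_elbow(elbow_xyz, forearm_azimuth_deg, forearm_elevation_deg,
                             forearm_plus_hand_m=0.45):
    """Estimate where the ball meets or leaves the hand from the elbow pose alone.

    elbow_xyz: 3-D elbow position in meters. The azimuth/elevation angles describe the
    direction of the forearm from elbow toward hand; the forearm-plus-hand length is a
    per-athlete constant that, as the text notes, can be predetermined.
    """
    az = math.radians(forearm_azimuth_deg)
    el = math.radians(forearm_elevation_deg)
    dx = forearm_plus_hand_m * math.cos(el) * math.cos(az)
    dy = forearm_plus_hand_m * math.cos(el) * math.sin(az)
    dz = forearm_plus_hand_m * math.sin(el)
    ex, ey, ez = elbow_xyz
    return (ex + dx, ey + dy, ez + dz)
```

Evaluating this estimate at several instants while the ball is occluded yields the point set from which the trajectory between those points can be interpolated.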
If desired, the ball path logic 1591 may use machine learning and/or artificial intelligence to establish the most likely path that the ball will travel based on any other currently available data (e.g., data extracted from the camera data 1578, or data from the sensor data 1580, such as depth sensor, motion sensor/accelerometer, or sound information), or based on historical data 1596 that includes information about what the person is most likely to do in a particular situation or environment. In this regard, by analyzing the person's movement over time, the ball path logic 1591 may learn how the person tends to respond to certain conditions (e.g., when the person is double-teamed, when the person is driving toward the lane, when a defender attempts to steal the ball, or when the person is fatigued), and then predict the ball's movement and trajectory based on such learned tendencies when the ball is obscured from view under similar conditions.
The ball path logic 1591 may analyze the current data and make a determination regarding the expected movement of the ball based on the current conditions associated with the person who shoots, dribbles, and/or passes. For example, if the person who shoots, dribbles, and/or passes is trapped by two defenders (a double-team scenario), the ball path logic 1591 may determine that the person will be less likely to use a behind-the-back dribble (or other type of dribble) or a behind-the-back pass (or other type of pass) in the direction of one of the defenders, determine the available directions in which the ball may be shot, dribbled, and/or passed, and thereby approximate the likely movement of the ball. The ball path logic 1591 may then evaluate the approximated movement of the ball, as described above.
If the ball path logic 1591 is unable to approximate the movement of the ball from the currently available data, the ball path logic 1591 may be able to approximate the movement of the ball based on historical data 1596 associated with the person who shot, dribbled, and/or passed the ball. In other words, the ball path logic 1591 may determine the approximate movement of the ball based on the person's previous movements under similar circumstances. For example, in a double-team scenario, the ball path logic 1591 may determine, based on the historical data 1596, that the person who shoots, dribbles, and/or passes will typically attempt to shoot, dribble, and/or pass between the defenders while facing both of them. Using this determination, the ball path logic 1591 may approximate a trajectory or motion that carries the ball between the defenders. The ball path logic 1591 may then evaluate the approximated movement of the ball as described above.
In one embodiment, the ball path logic 1591 may determine the current location and situation of the person dribbling and/or passing the ball and determine the possible moves that may be made from that location and situation. The ball path logic 1591 may then determine the probability of the person shooting, dribbling, and/or passing making each possible move and use the probability determination in determining an approximate movement of the ball. For example, in a double-team scenario, the ball handler may have several possible moves or sequences, such as: a pull-back dribble; a left-to-right crossover dribble; a right-to-left crossover dribble; a left-to-right between-the-legs dribble from front to back; a right-to-left between-the-legs dribble from front to back; a right-to-left between-the-legs dribble from back to front; a left-to-right between-the-legs dribble from back to front; a left-to-right behind-the-back dribble; and a right-to-left behind-the-back dribble. In another example using a double-team scenario, the passer (possibly after completing a dribbling action) may have several possible moves or sequences, such as: a bounce pass between the defenders; a bounce pass to the left of the defenders; a bounce pass to the right of the defenders; a chest pass between the defenders; a chest pass to the right of the defenders; a chest pass to the left of the defenders; an overhead pass over the defenders; an overhead baseball pass over the defenders; a behind-the-back pass to the left of the defenders; and a behind-the-back pass to the right of the defenders. However, based on his or her historical data 1596, a person who shoots, dribbles, and/or passes may only be able to make a few of the possible shooting, dribbling, and/or passing sequences, and may not have the necessary skill level for, and/or may never have used, the other possible sequences in the past. The ball path logic 1591 may assign a higher probability to the shooting, dribbling, and/or passing sequences previously made by the person and a lower probability to the other sequences. Conversely, a more skilled player may be able to make most or all of the possible shooting, dribbling, and/or passing sequences, and the ball path logic 1591 will assign different probabilities to the possible sequences. The ball path logic 1591 may then use the assigned probabilities to determine an approximate movement of the ball, and may then evaluate the accuracy of the approximated movement as described above.
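A minimal sketch of the probability assignment described above, under the assumption that historical frequency is the only evidence used, is shown below in Python. The add-one style smoothing prior and the function name are illustrative choices, not requirements of the specification.

```python
from collections import Counter

def sequence_probabilities(possible_sequences, historical_sequences, prior=1.0):
    """Assign a probability to each candidate shot/dribble/pass sequence.

    Sequences the athlete has performed before (historical_sequences, e.g. drawn from
    historical data 1596) receive proportionally higher probabilities; sequences never
    observed keep only a small smoothing prior, so a limited player concentrates
    probability on a few moves while a skilled player spreads it across many.
    """
    counts = Counter(historical_sequences)
    weights = {seq: counts.get(seq, 0) + prior for seq in possible_sequences}
    total = sum(weights.values())
    return {seq: w / total for seq, w in weights.items()}

# Example (illustrative data): a player who mostly uses crossover dribbles.
probs = sequence_probabilities(
    ["crossover_l2r", "crossover_r2l", "behind_back_l2r", "pull_back"],
    ["crossover_l2r"] * 8 + ["crossover_r2l"] * 5 + ["pull_back"] * 1,
)
```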
In one embodiment, the ball path logic 1591 may process video/audio/depth-sensing/motion-sensing sequences that include tagged descriptors, provided by a reviewer of the camera data 1578, that describe in a quantitative or qualitative manner the dribbling pattern, the level of dribbling ability, the transition pattern, the pass type, the passing pattern, and/or the level of passing ability. The ball path logic 1591 may use the tagged descriptors to build a knowledge base for machine learning and/or artificial intelligence. The degree of labeling provided in the video/audio/depth-sensing/motion-sensing data may vary between no labeling, light labeling, or heavy labeling. As the knowledge base of the ball path logic 1591 grows, the machine learning and/or artificial intelligence of the ball path logic 1591 may be used to "track" the ball and the motion of the person shooting, dribbling, and/or passing the ball for longer periods of time when the ball and the person are largely occluded from the perspective of the camera 1502 or sensor 1514.
In another embodiment, the ball path logic 1591 may be able to complete its determination of the ball's motion over the entire athletic playing surface using data from only a single sensor (e.g., the camera 1502, an audio detector, a depth sensor, or a motion sensor), even when aspects of the ball and/or the person who shot, dribbled, and/or passed the ball are occluded much of the time. The ball path logic 1591 may determine the movement of the ball using one or more of the techniques described above, with the identification logic 1592 only occasionally detecting the ball in order to locate or relocate it between applications of those techniques.
In yet another embodiment, the ball path logic 1591 may use machine learning and/or artificial intelligence to analyze historical data 1596 to reveal pattern and trend information. The ball path logic 1591 may then use this pattern and trend information when determining the probabilities associated with the position and movement of the ball.
Defensive player action logic 1593 may be operable to identify a particular defender of the person shooting, dribbling, and/or passing, and to determine or approximate the movements and actions of the identified defender. The defensive player action logic 1593 may determine the movements and actions of one or more defenders (once identified) of the person shooting, dribbling, and/or passing the ball even when the defender(s) cannot be continuously identified from the camera data 1578 by the identification logic 1592. For example, the identification logic 1592 (or computer vision logic 1564) may be unable to identify a defender because the defender is not present in the camera data 1578. The defender may be absent from some portion of the camera data 1578 when the defender is concealed from the field of view of the camera 1502 by the person with the ball and/or by one or more other persons on the athletic playing surface. Furthermore, even if a defender is present in the camera data 1578, the identification logic 1592 may be unable to identify the defender because the defender is obscured in the camera data 1578 by poor lighting conditions and/or partial occlusion (particularly of the features used to identify the defender).
Prior to determining the movements of a defender, defensive player action logic 1593 may determine whether the defender is defending the person who shoots, dribbles, and/or passes the ball. Defensive player action logic 1593 may determine whether one or more defenders are defending the person shooting, dribbling, and/or passing based on the distance between the defender and that person and on the position and/or orientation of the defender relative to that person. For example, a defender within 5 feet of the person who shoots, dribbles, and/or passes and facing that person may be considered to be defending him or her. Once defensive player action logic 1593 determines that a defender is defending the person who shoots, dribbles, and/or passes, defensive player action logic 1593 may identify the particular defender using information from the identification logic 1592 regarding the identity of the player. The defensive player action logic 1593 may identify the defender specifically using identification information directly from the identification logic 1592 or the computer vision logic 1564. In another embodiment, defensive player action logic 1593 may identify a particular defender based on information derived from the identification logic 1592; for example, the defensive player action logic may use the body motion data 1598 to identify a particular defender, since each athlete may have a unique body motion profile. Defensive player action logic 1593 may then record and store the specific movements and actions of the particular defender in response to the actions of the person who shot, dribbled, and/or passed the ball. The measurement logic 1597 may use the information stored by the defensive player action logic 1593 to evaluate the performance of the defender.
In one embodiment, defensive player action logic 1593 may identify the locations of a defender's fingers, hands, elbows, shoulders, chest, head, waist, back, thighs, knees, calves, hips, ankles, feet, and/or other body parts in 3-D space. Further, once the individual body parts are identified, defensive player action logic 1593 may determine the positions of the identified body parts relative to one another. The defensive player action logic 1593 may provide this information about the defender's body to the body motion data 1598 for use by the object tracker 1562. For example, the balance logic 1599 may use the body motion data 1598 to measure or infer the defender's balance and, correspondingly, the defender's ability to react. In one embodiment, the defender's balance may be evaluated relative to the balance of typical persons from a selected group, or the historical data 1596 may be used to establish a "normal" balance for the particular defender. In another embodiment, as the players on the athletic playing surface alternate between offense and defense, the defensive player action logic 1593 may specifically identify each player and store the corresponding information for each player.
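One simple heuristic for inferring balance from such 3-D key points is to check how far the hip midpoint (a rough proxy for the center of mass) lies from the base of support defined by the feet. The following Python sketch is an illustrative assumption of one such measure, not the method prescribed by this disclosure; the key-point format and the normalization by stance width are likewise assumptions.

```python
def balance_score(left_hip, right_hip, left_foot, right_foot):
    """Rough balance measure: horizontal distance of the hip midpoint from the segment
    between the feet (the base of support), normalized by stance width.

    Inputs are (x, y, z) key points in meters; only x and y (floor plane) are used.
    A score near 0 suggests the weight is over the base of support; larger values
    suggest a loss of balance.
    """
    hx = (left_hip[0] + right_hip[0]) / 2.0
    hy = (left_hip[1] + right_hip[1]) / 2.0
    ax, ay = left_foot[0], left_foot[1]
    bx, by = right_foot[0], right_foot[1]
    abx, aby = bx - ax, by - ay
    stance = (abx ** 2 + aby ** 2) ** 0.5
    if stance < 1e-6:
        # Feet together: distance from the single support point.
        return ((hx - ax) ** 2 + (hy - ay) ** 2) ** 0.5
    # Project the hip midpoint onto the foot-to-foot segment and clamp to the segment.
    t = max(0.0, min(1.0, ((hx - ax) * abx + (hy - ay) * aby) / (stance ** 2)))
    px, py = ax + t * abx, ay + t * aby
    dist = ((hx - px) ** 2 + (hy - py) ** 2) ** 0.5
    return dist / stance
```

Comparing this score against a population baseline, or against the defender's own historical values, corresponds to the "normal" balance comparison mentioned above.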
The defensive player action logic 1593 may use information from the identification logic 1592 to determine the movements and/or actions of a defender. Additionally, defensive player action logic 1593 may assign parameters to the defender's movements and/or actions and classify the outcome of a particular defensive movement and/or action. Examples of some categories that may be used are: a defensive foul, which may include information about the type of foul and other parameters associated with the foul; a steal by the defender from the person shooting, dribbling, and/or passing the ball, which may include information about the action causing the steal and other parameters associated with the steal; the defender maintaining a defensive position with respect to the person who shoots, dribbles, and/or passes the ball (e.g., a position in which the defender faces that person and is between the person and the basketball rim); the defender failing to maintain a defensive position with respect to that person; or other action outcome descriptors.
When the identification logic 1592 is unable to provide specific information about the location of a defender, perhaps due to occlusion or lighting, the defensive player action logic 1593 may determine the expected movement of the defender based on the defender's last known location from the identification logic 1592 and other information stored in the memory 1566. The defensive player action logic 1593 may use computer learning to establish the most likely movement and/or action that the defender will make based on any other currently available data (e.g., data extracted from the camera data 1578, or data from the sensor data 1580, such as depth sensor, motion sensor/accelerometer, or sound information), or based on historical data 1596 that includes information about what the person is most likely to do in a particular situation or environment.
The defensive player action logic 1593 may analyze the current data and make a determination regarding the defensive player's expected movement and/or action based on the current conditions associated with the defensive player. For example, if the defensive player defends a person with a ball to the right of the defensive player, and the defensive player has previously moved to the right, the defensive player action logic 1593 may determine that the defensive player may continue to move to the right and the defensive player is unlikely to slide to the left. Thus, defensive player action logic 1593 may use computer learning and/or artificial intelligence to determine the likely direction in which a defensive player may be moving, view the results of the defensive sequence based on the location of the defensive player from identification logic 1592, and determine which defensive action was actually used.
If the defensive player action logic 1593 is not able to accurately approximate the defensive player's motion and/or action from the currently available data, the defensive player action logic 1593 may be able to approximate the defensive player's motion and/or action based on historical data 1596 associated with the defensive player. In other words, the defensive player action logic 1593 may determine the approximate movement and/or action of the defensive player based on the previous movement of the person under similar circumstances. For example, when a person with a ball moves to the left of the defensive player, the defensive player action logic 1593 may determine that the defensive player will likely move back one step based on the historical data 1596 and then move to the left. Using this determination, the defensive player action logic 1593 may approximate the movement and/or action of the defensive player to be backward and then to the left. Once the defender's information becomes available from the identification logic 1592, the defender action logic 1593 can evaluate the defender's approximate movements.
In one embodiment, the defensive player action logic 1593 may determine the current position and situation of a defender with respect to the person who shoots, dribbles, and/or passes the ball and determine the possible movements and/or actions that may be made from that position and situation. Defensive player action logic 1593 may then determine the probability of the defender making each possible movement and/or action and use the probability determination in predicting or otherwise estimating an approximate movement or action of the defender. For example, if a defender is defending a person performing a right-to-left crossover dribble (from the defender's perspective), there are a number of movements and/or actions the defender can make, such as: lunging forward in a low stance to attempt a two-handed steal; lunging forward in a low stance to reach in with the right hand; lunging forward in a low stance to reach toward the ball side with the right hand; sliding to the left while keeping distance from the ball handler; stepping backward to allow more space and prevent the ball handler from advancing toward the basket; jumping forward to obstruct the person's view and prevent a pass or shot; jumping left, right, or upward to prevent a pass; or stumbling because the dribbling action is so effective that the defender loses defensive position and/or balance (a so-called "ankle-breaker"). However, based on his or her historical data 1596, the defender may only be able to make some of the possible movements or actions, and may not have the necessary skill level for, and/or may never have used, the other possible movements or actions in the past. Defensive player action logic 1593 may assign a higher probability to movements and/or actions previously performed by the person and a lower probability to the other movements and/or actions. Conversely, a more skilled athlete may be able to perform most or all of the possible movements and/or actions, and the defensive player action logic 1593 will assign different probabilities to the possibilities. Defensive player action logic 1593 may then use the assigned probabilities to predict the approximate movement and/or action of the defender. Then, once information about the defender becomes available from the identification logic 1592, the defensive player action logic 1593 may evaluate the approximated movement of the defender to determine whether the prediction was accurate.
In one embodiment, defensive player action logic 1593 may process various sequences (e.g., video sequences, audio sequences, depth sensor sequences, or motion sensor sequences) about a defender that include tags (or tagged descriptors) carrying information, in a quantitative or qualitative manner, about the defender's patterns and/or capabilities. The tags provide information and/or descriptions about the content of the sequence (e.g., the actions of the defender) and may be associated with the sequence (or file) in a manner similar to metadata. A sequence may have a single tag describing one action of the defender or multiple tags describing different actions of the defender. A tag may correspond to an action or category of action (e.g., a steal or a block) identified by the defensive player action logic 1593. A user may review a sequence (which may be obtained from the camera data 1578) and apply the appropriate tag(s) to the actions of the defender in the sequence. When applying tags, the user may select from a predetermined list of tags and/or may create his or her own tags. The degree of labeling provided in the sequence data can vary between no labeling, light labeling, or heavy labeling. The defensive player action logic 1593 can use the tagged descriptors to build a knowledge base for machine learning and/or artificial intelligence. As the knowledge base of the defensive player action logic 1593 grows, the machine learning and/or artificial intelligence of the defensive player action logic 1593 may be used to "track" the movement of a defender for longer periods of time when the defender is largely occluded from the field of view of the camera 1502 or sensors 1514.
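A hedged sketch of how such tagged sequences could be collected into a training knowledge base is shown below in Python; the TaggedSequence structure, field names, and tag strings are illustrative assumptions rather than a format defined by this disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TaggedSequence:
    """A captured sequence (video/audio/depth/motion) with reviewer-applied tags."""
    sequence_id: str
    source: str                                   # e.g. "camera", "depth_sensor"
    tags: List[str] = field(default_factory=list)  # e.g. ["steal", "block"]

def build_training_examples(sequences, allowed_tags) -> List[Tuple[str, str]]:
    """Collect (sequence_id, tag) pairs for every tag drawn from a predetermined list,
    forming the knowledge base on which a classifier could later be trained.

    Sequences with no tags ("no labeling") contribute nothing; sequences with several
    tags ("heavy labeling") contribute one example per tag.
    """
    examples = []
    for seq in sequences:
        for tag in seq.tags:
            if tag in allowed_tags:
                examples.append((seq.sequence_id, tag))
    return examples
```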
In another embodiment, the defensive player action logic 1593 may be able to use data from only a single sensor (e.g., the camera 1502, audio detector, depth sensor, or motion sensor) to complete defensive player motion and/or action determination over the entire athletic playing surface even though the defensive player may be occluded many times. The defensive player action logic 1593 may determine the movement of the defensive player using one or more of the techniques described above, detecting the defensive player only occasionally through the identification logic 1592 to assess/re-assess the position of the defensive player between analysis techniques.
The measurement logic 1597 may be used to analyze data regarding the person who shot, dribbled, and/or passed the ball and the defenders of that person. The measurement logic 1597 may use information from the identification logic 1592, the ball path logic 1591, the defensive player action logic 1593, the balance logic 1599, the historical data 1596, the body motion data 1598, and/or the evaluation data 1582 to analyze the performance and abilities of the person shooting, dribbling, and/or passing and of the defender(s) of that person.
Measurement logic 1597 may determine the proficiency of the ball handler with respect to a number of different dribbling characteristics. For example, some of the dribbling characteristics that may be evaluated by measurement logic 1597 include: a very low dribble, a very fast dribble, a rapid change in dribbling speed (i.e., acceleration or deceleration), a rapid change in dribbling direction, multiple rapid changes in dribbling direction, a very rapid stop of forward or sideways movement while maintaining the dribble, a rapid transition from a dribble to a shot, a rapid transition from a dribble to a pass (for a wide variety of pass types and situations), and/or any other desired dribbling characteristic. Each of these dribbling characteristics can be described by one or more quantitative parameters. For example, a very low dribble may be characterized by keeping the dribble height (actual or average) below a predetermined value; a very fast dribble may be characterized by maintaining the number of dribbles per second above a predetermined value; a rapid change in dribbling speed may be characterized by completing a change in the number of dribbles per second within a predetermined time period; a rapid change in dribbling direction may be characterized by completing the change of direction within a predetermined time period; multiple rapid changes in dribbling direction may be characterized by completing several changes of direction within a predetermined time period; a very rapid stop of forward or sideways movement while maintaining the dribble may be characterized by ending the movement within a predetermined time period and/or a predetermined distance (while maintaining the dribbling action); a rapid transition from a dribble to a shot may be characterized by the time from the dribbling action to the shooting action being within a predetermined time period; and a rapid transition from a dribble to a pass may be characterized by the time from the dribbling action to the passing action being within a predetermined time period. Each dribbling characteristic may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency with the dribbling characteristic. In one embodiment, the measurement logic 1597 may provide the person's dribbling characteristics relative to an individual defender.
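Two of the quantitative parameters named above, dribble height and dribbles per second, can be derived directly from the tracked ball positions. The Python sketch below is a minimal, hedged illustration: the floor-contact threshold, the sample format, and the function name are assumptions for illustration only.

```python
def dribble_metrics(samples, floor_contact_z=0.15):
    """Compute dribble height and repetition rate from tracked ball positions.

    samples: chronologically ordered (t_seconds, x, y, z_meters) tuples. A bounce is
    detected when the ball drops to or below floor_contact_z after being above it; the
    dribble height is the peak height reached between consecutive bounces.
    """
    bounce_times, peak_heights = [], []
    below = False
    current_peak = 0.0
    for t, x, y, z in samples:
        if z <= floor_contact_z and not below:
            below = True
            bounce_times.append(t)
            if current_peak > 0.0:
                peak_heights.append(current_peak)
            current_peak = 0.0
        elif z > floor_contact_z:
            below = False
            current_peak = max(current_peak, z)
    if len(bounce_times) >= 2:
        dribbles_per_second = (len(bounce_times) - 1) / (bounce_times[-1] - bounce_times[0])
    else:
        dribbles_per_second = 0.0
    avg_height = sum(peak_heights) / len(peak_heights) if peak_heights else 0.0
    return {
        "dribbles_per_second": dribbles_per_second,
        "average_dribble_height_m": avg_height,
        "max_dribble_height_m": max(peak_heights) if peak_heights else 0.0,
        "min_dribble_height_m": min(peak_heights) if peak_heights else 0.0,
    }
```

Comparing the returned values against the predetermined thresholds discussed above (e.g., "very low dribble" or "very fast dribble") yields the corresponding qualitative determinations.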
The measurement logic 1597 may also determine the proficiency of the ball handler with respect to the ability to reproduce the same dribbling pattern each time. Measurement logic 1597 may evaluate the person's ability to complete a training sequence that may require a specified dribbling speed, a specified dribbling height, a specified change in speed, a specified change in dribbling position, a specified change in head/shoulder/elbow/hand/finger/chest/waist/thigh/knee/ankle/foot position, and/or a specified maintenance of balance. The measurement logic 1597 may also determine the proficiency of the ball handler by evaluating whether the person is able to repeat the same dribbling action in an efficient manner in a game situation. Each of these cases may be described by a quantitative parameter or a set of parameters. For example, to assess a person's proficiency in completing the training sequence, measurement logic 1597 may individually assess the person's completion of each of the various tasks (which may correspond to one or more parameters) in the training sequence. Each of these parameters may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency in the dribbling action.
The measurement logic 1597 may evaluate the performance of the person with the ball based on the number and type of different actions performed by the person with the ball in response to the same or similar circumstances. In other words, measurement logic 1597 may determine the proficiency of a person not repeating the same dribbling pattern each time. The ability of a person to change the dribbling action used in response to a particular situation may be used to limit the effectiveness of a defensive player in identifying and responding to repetitive patterns in the person's dribbling action. A measure of a person's ability to not repeat the same dribbling pattern can be described by one or more quantitative parameters. Each of these parameters may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve the non-repeat ability.
The measurement logic 1597 may determine the passer's proficiency with respect to a number of different pass characteristics. For example, some pass characteristics of a passer that may be evaluated by measurement logic 1597 may include a very quick pass, delivery of the ball at a predetermined location relative to the receiver, a quick transition from a dribble to a pass, and/or any other desired pass characteristic. Each of these pass characteristics may be described by one or more quantitative parameters. For example, a very quick pass may be characterized by passing the ball at a speed greater than a predetermined value; delivering the ball at a predetermined position relative to the person receiving the pass may be characterized by delivering the ball within a predetermined distance of a point (e.g., the center of the chest) on the person receiving the pass; and a quick transition from a dribble to a pass may be characterized by the time to transition from the dribbling action to the passing action being within a predetermined period. Each of the pass characteristics may be further characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency in the pass characteristic. In one embodiment, measurement logic 1597 may provide the person's pass characteristics relative to an individual defender.
The measurement logic 1597 may also determine the passer's proficiency with respect to the person's ability to reproduce the same passing pattern each time. Measurement logic 1597 may evaluate the person's ability to complete a training sequence that may require a specified transition from a dribble to a specified pass type, a specified pass speed, a specified pass location, a specified head, shoulder, elbow, hand, finger, chest, waist, thigh, knee, ankle, and/or foot position, and/or a specified maintenance of balance. The measurement logic 1597 may also determine the passer's proficiency by evaluating whether the passer can repeat the same pass in a highly efficient manner in a game situation. Each of these cases may be described by a quantitative parameter or a set of parameters. For example, to assess the person's proficiency in completing the training sequence, measurement logic 1597 may individually assess how well the person completes each individual task (which may correspond to one or more parameters) in the training sequence. Each of these parameters may be further characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency in a pass.
The measurement logic 1597 may evaluate the passer's performance based on the number and type of different actions made by the passer in response to the same or similar conditions. In other words, the measurement logic 1597 may determine the passer's proficiency at not repeating the same passing pattern each time. The passer's ability to vary the passing action used in response to a particular situation may be used to limit the effectiveness of a defender in identifying and responding to repetitive patterns in the person's passing action. For example, the person's ability to perform different types of passes (e.g., bounce passes, chest passes, etc.) after performing different dribbling moves (e.g., behind-the-back dribbles, crossover dribbles, between-the-legs dribbles, etc.) may limit the effectiveness of defenders. A measure of the person's ability not to repeat the same passing pattern can be described by one or more quantitative parameters. Each of these parameters may be further characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve this non-repeating capability.
The measurement logic 1597 may determine the shooter's skill with respect to a number of different shot parameters associated with shooting characteristics. For example, some of the shot parameters of the shooter that may be evaluated by measurement logic 1597 may include any desired parameters associated with the type of shot, the entry angle of the shot, the landing point of the shot (e.g., the depth of the shot and its left-right position), the spin rate of the shot, the spin axis of the shot, the release height of the shot, or the release speed of the shot, and/or any other desired parameters associated with shot characteristics. Each of these shot parameters may be described by one or more quantitative parameters. For example, the entry angle of a shot may be characterized by whether the entry angle is greater than or less than a predetermined value or otherwise within a predetermined range; the landing point of the shot may be characterized by the ball landing within a predetermined distance (or distances) of a location associated with the rim; and the spin rate of the ball may be characterized by a value greater than or less than a predetermined value or otherwise within a predetermined range. Each shot characteristic may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency with that shot characteristic. In one embodiment, measurement logic 1597 may provide the person's shooting characteristics relative to an individual defender.
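The entry angle and landing point can be estimated from two tracked ball positions that straddle the rim plane. The Python sketch below is an illustrative approximation under stated assumptions (linear interpolation over a short interval, regulation rim height, illustrative function names); it is not presented as the method required by this disclosure.

```python
import math

RIM_HEIGHT_M = 3.048  # regulation rim height (10 ft)

def entry_angle_and_offset(p_before, p_after, rim_center_xy):
    """Entry angle and rim-plane offset of a shot from two tracked positions.

    p_before / p_after: (t, x, y, z) samples with z just above and just below the rim
    height; rim_center_xy: (x, y) of the rim center. Linear interpolation between the
    two samples approximates where, and how steeply, the ball crosses the rim plane.
    Returns (entry_angle_deg, offset_m), where the offset is the horizontal distance
    of the crossing point from the rim center (a combined depth/left-right measure).
    """
    t0, x0, y0, z0 = p_before
    t1, x1, y1, z1 = p_after
    dt = t1 - t0
    vx, vy, vz = (x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt
    horizontal_speed = math.hypot(vx, vy)
    entry_angle_deg = math.degrees(math.atan2(-vz, horizontal_speed))  # vz < 0 on the way down
    # Interpolate the crossing point at rim height.
    frac = (z0 - RIM_HEIGHT_M) / (z0 - z1) if z0 != z1 else 0.0
    cx, cy = x0 + frac * (x1 - x0), y0 + frac * (y1 - y0)
    offset_m = math.hypot(cx - rim_center_xy[0], cy - rim_center_xy[1])
    return entry_angle_deg, offset_m
```

The returned values can then be compared against the predetermined ranges described above (e.g., whether the entry angle falls within an acceptable range or the landing point falls within a predetermined distance of the rim location).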
Measurement logic 1597 may determine the proficiency of the person in transitioning from a dribble to a pass with respect to a number of different characteristics. For example, measurement logic 1597 may evaluate one or more dribbling characteristics of the person during the time period associated with the end of the dribbling action (as described above) and one or more pass characteristics of the person during the beginning (and possibly the completion) of the passing action (as described above). In addition, the measurement logic 1597 may evaluate the transition characteristics of the person when switching from the dribbling action to the passing action. Each of these dribbling, passing, and/or transition characteristics may be described by one or more quantitative parameters. For example, a quick transition from a dribble to a pass may be characterized by the time from the dribbling action to the pass being within a predetermined time period. Each of the dribbling, passing, and/or transition characteristics may be further characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency in the respective characteristic.
The measurement logic 1597 may also determine the proficiency of a person in transitioning from a dribble to a pass relative to the person's ability to reproduce the same transition pattern each time. Measurement logic 1597 may evaluate the person's ability to complete a training sequence that may require a specified transition from a dribble type, a specified transition to a pass type, a specified change in position of the head, shoulder, elbow, hand, finger, chest, waist, thigh, knee, ankle, and/or foot, and/or a specified maintenance of balance. The measurement logic 1597 may also determine the person's proficiency in transitioning from a dribble to a pass by evaluating whether the person is able to repeat the same dribbling and passing actions in a highly efficient manner in a game situation. Each case can be described by a quantitative parameter or a set of parameters. For example, to assess the person's proficiency in completing the training sequence, measurement logic 1597 may individually assess the person's completion of each individual task (which may correspond to one or more parameters) in the training sequence. Each of these parameters may be further characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency in the transition from a dribbling action to a passing action.
The measurement logic 1597 may evaluate the performance of a person's transition from a dribble to a pass based on the number and type of different actions the person takes when ending a dribbling action and starting a pass in the same or similar situations. In other words, the measurement logic 1597 may determine the person's proficiency at not repeating the same dribble-to-pass transition pattern each time. The person's ability to vary the dribbling and passing actions used in a particular situation may be used to limit the effectiveness of a defender in identifying and responding to the person's repetitive patterns. For example, the person's ability to perform different types of passes (e.g., bounce passes, chest passes, etc.) after performing different dribbling actions (e.g., behind-the-back dribbles, crossover dribbles, between-the-legs dribbles, etc.) may limit the effectiveness of defenders. A measure of the person's ability not to repeat the same dribble-to-pass transition pattern can be described by one or more quantitative parameters. Each of these parameters may be further characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve this non-repeating capability. In one embodiment, the measurement logic 1597 may use machine learning and/or artificial intelligence to directly or indirectly measure and/or evaluate a player's shooting, dribbling, and/or passing performance.
In one embodiment, measurement logic 1597 may determine the number of assists by the person dribbling and/or passing the ball. An assist may be defined as the last pass to a teammate that directly leads to a made shot. Additionally, for an assist, the receiver of the pass must move directly toward the basket in a "scoring action," which may include dribbling. Measurement logic 1597 may be used to determine when the person dribbling and/or passing the ball makes a pass to a teammate and when the teammate receiving the pass attempts (and makes) a shot at the basket. Measurement logic 1597 may track the movements and actions of the teammate receiving the pass and determine whether that teammate performed a "scoring action." Measurement logic 1597 may determine a scoring action based on the movements and actions of the teammate who scored and on many other factors, such as the amount of time between the reception of the pass and the made shot, the teammate's movement toward the basket, and the location from which the shot is taken relative to the location at which the pass is received. The measurement logic 1597 may also track the number of passes made by the passer to each of his or her teammates. In another embodiment, the measurement logic 1597 may determine whether the person dribbling the ball has taken a shot at the basket (and whether the shot was made).
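A hedged sketch of such an assist determination is shown below in Python. The event dictionaries, the time window, the dribble limit, and the moved_toward_basket flag are illustrative proxies for the factors listed above (time between catch and score, movement toward the basket, and shot location relative to the catch location), not definitions taken from this disclosure.

```python
def is_assist(pass_event, shot_event, max_seconds=4.0, max_dribbles=2):
    """Decide whether a completed pass counts as an assist.

    pass_event: {"passer": id, "receiver": id, "time": s, "receive_xy": (x, y)}
    shot_event: {"shooter": id, "time": s, "made": bool, "shot_xy": (x, y),
                 "dribbles_after_catch": int, "moved_toward_basket": bool}
    """
    if not shot_event["made"]:
        return False                      # only made shots can yield an assist
    if shot_event["shooter"] != pass_event["receiver"]:
        return False                      # the scorer must be the receiver of the pass
    elapsed = shot_event["time"] - pass_event["time"]
    if elapsed < 0 or elapsed > max_seconds:
        return False                      # the shot must follow the pass promptly
    if shot_event["dribbles_after_catch"] > max_dribbles:
        return False                      # too much independent creation by the receiver
    return shot_event["moved_toward_basket"]  # proxy for the "scoring action"
```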
The measurement logic 1597 may also evaluate the effectiveness of the ball handler relative to the defender(s) defending him or her. Measurement logic 1597 may use information from balance logic 1599 to determine changes in a defender's body orientation, position, and balance that result from a dribbling action. For example, a player may cause a defender to stumble and/or fall after performing a particular dribbling action (e.g., a crossover dribble), which enables the player to "beat" the defender and advance to an open spot on the court or toward the basket. A measure of the ball handler's ability to adversely affect the balance, position, and orientation of the defender, so that the ball handler can advance toward the basket, may be described by one or more quantitative parameters. Each of these parameters may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of ability to disrupt the defender's body orientation, position, and balance.
The measurement logic 1597 may also determine the ability of the person shooting, dribbling, and/or passing to accomplish one or more related goals (e.g., a high number of assists and/or a low number of turnovers). These goals may be calculated based on the person's overall performance or with respect to individual defenders. The person's ability to achieve a related goal, and a determination of how much of a higher-order goal is achieved due to shooting, dribbling, and/or passing skill, may be described by a quantitative parameter or a set of parameters. For example, measurement logic 1597 may determine the effectiveness of a dribbling action in creating an open passing lane (which leads to a score by the person who received the ball) when evaluating the player's passing performance and/or number of assists. Each of these parameters may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve the higher-order goal.
Measurement logic 1597 may determine a defender's proficiency with respect to many different defensive characteristics. For example, some defensive characteristics of a defender that may be evaluated by measurement logic 1597 may include: a very fast forward speed, a very fast forward acceleration, a very fast forward lunge acceleration, a very low forward lunge, a very fast lateral defensive speed, a very fast lateral defensive acceleration, a very low lateral defensive stance, a very fast change in the direction of a lateral slide, a very fast stop of a lateral slide, a very fast turning speed, a very fast transition from a dribble-defending position to a pass-intercepting position, a very fast transition from a dribble-defending position to a shot-defending position, and/or any other desired defensive characteristic. Each of these defensive characteristics may be described by one or more quantitative parameters. For example, a very fast forward speed may be characterized by keeping the forward speed (actual or average) above a predetermined value; a very fast forward acceleration may be characterized by an acceleration rate above a predetermined value; a very fast forward lunge acceleration may be characterized by a lunge acceleration rate above a predetermined value; a very low forward lunge may be characterized by keeping the defender's forward lunge position below a predetermined height; a very fast lateral defensive speed may be characterized by keeping the lateral speed (e.g., the speed of movement to one side) above a predetermined value; a very fast lateral defensive acceleration may be characterized by a lateral acceleration rate above a predetermined value; a very low lateral defensive stance may be characterized by keeping the lateral defensive stance below a predetermined height; a very fast change in the direction of a lateral slide may be characterized by switching from a slide in one direction to a slide in the opposite direction within a predetermined time period; a very fast stop of a lateral slide may be characterized by stopping the movement within a predetermined time or a predetermined distance; a very fast turning speed may be characterized by keeping the turning speed (actual or average) above a predetermined value; a very fast transition from a dribble-defending position to a pass-intercepting position may be characterized by completing the transition within a predetermined time period; and a very fast transition from a dribble-defending position to a shot-defending position may be characterized by completing the transition within a predetermined time period. Each defensive characteristic may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency in the defensive characteristic. In one embodiment, the measurement logic 1597 may provide a defender's defensive characteristics relative to an individual offensive player (e.g., a particular ball handler).
The measurement logic 1597 may also determine the defensive proficiency of the defensive person with respect to the ability of the defensive person to implement one or more modes of defensive action. Measurement logic 1597 may evaluate the defender's ability to complete a training sequence that may require a specified forward speed, a specified defense height, a specified speed change, a specified defending position change, a specified body position change, and/or a specified balance maintenance. The measurement logic 1597 may also determine the level of proficiency of the defensive player by evaluating whether the person is able to repeat the same defensive movement or action in an efficient manner in a competition situation. Each of these cases may be described by a quantitative parameter or a set of parameters. For example, to assess the proficiency of a defender in completing a training sequence, measurement logic 1597 may separately assess the completion of each separate task (which may correspond to one or more parameters) in the training sequence by the defender. Each of these parameters may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve proficiency in defensive motion and/or actions.
The measurement logic 1597 may evaluate the defender's performance based on the number and type of different movements the defender takes in response to the same or similar circumstances. In other words, the measurement logic 1597 may determine the proficiency of a defensive player not repeating the same defensive motion and/or action for a given situation. The ability of a defensive player to alter defensive movements and/or actions used in response to a particular situation may be used to limit the effectiveness of a player with a ball in identifying and responding to repetitive patterns in the defensive action of the defensive player. A measure of a person's ability not to repeat the same defensive mode of operation may be described by one or more quantitative parameters. Each of these parameters may also be characterized by some type of qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve the non-repeat ability.
The measurement logic 1597 may also determine the ability of a defender to accomplish one or more related goals, such as, for example, a large number of steals, a large number of blocked shots, a large number of deflected passes, a large number of forced turnovers, and/or a large number of other defensive stops. The goals may be calculated based on the defender's overall performance or relative to an individual offensive athlete. A measure of the relative goals achieved by the defender, and of how much their achievement is attributable to the defender's defensive skill, can be described by a quantitative parameter or set of parameters. For example, the measurement logic 1597 may determine the effectiveness of a defensive action in positioning the defender to deflect the ball away from the person with the ball or to steal the ball from the person with the ball. Each of these parameters may also be characterized by a qualitative or quantitative score from scoring logic 1595 that indicates the level of skill required to achieve the related goal.
In one embodiment, measurement logic 1597 may determine how offensive players use a "screen" and how a defender responds to the screen. A screen is a known basketball term and generally refers to a play or situation in which an offensive player without the ball (hereinafter the "offensive screener") establishes a fixed position to block the path of a defender who is guarding another offensive player (hereinafter the "screen target") heading toward the screen being set. The screen target may have the ball or may be attempting to receive a pass from the person with the ball. Measurement logic 1597 may detect the start of a screen by determining that an offensive screener in the vicinity of the defender guarding the screen target has established a fixed position such that the defender's path intersects the offensive screener's fixed position.
Note that various factors may be used to determine whether a screen has occurred. As an example, if the defender contacts the offensive screener within a predetermined time after the screener establishes a fixed position, the probability that a screen has occurred may be increased. Additionally, the orientation of the offensive screener toward the defender of the screen target may indicate whether a screen is occurring. In this regard, offensive screeners often face the defender when setting a screen to help increase the width of the screen and thereby help increase the screen's effectiveness at disrupting the defender's path. Additionally, the proximity of the screen target to the offensive screener may indicate whether a screen is occurring. In this regard, the screen target typically passes within a close distance of the offensive screener, or even makes contact with the screener, as it goes by. Thus, detecting that the screen target has passed within a predetermined distance of the offensive screener may indicate the occurrence of a screen. Measurement logic 1597 may detect any of the events described above as indicating a screen and may detect the occurrence of a screen based on any combination of these factors. As an example, in evaluating whether an offensive screener's fixed position constitutes a screen, measurement logic 1597 may increase a screen score by some amount for each detected event indicative of a screen while the offensive screener is in the fixed position. If the screen score exceeds a predetermined threshold, the measurement logic 1597 may detect the occurrence of a screen. In other embodiments, other techniques for detecting the occurrence of a screen are possible.
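A minimal sketch of this scoring approach follows; the indicator names, weights, and threshold are assumptions made for illustration and are not values stated in the patent.

```python
# Illustrative screen-score accumulation: each detected indicator adds a weighted
# amount to a running score, and a screen is declared when the score exceeds a
# threshold. Indicator names, weights, and the threshold are assumptions.

SCREEN_INDICATOR_WEIGHTS = {
    "contact_within_time_window": 0.4,        # defender contacts screener soon after the screener sets
    "screener_facing_defender": 0.3,          # screener oriented toward the defender being screened
    "target_passes_close_to_screener": 0.3,   # screen target passes within a small distance
}

SCREEN_SCORE_THRESHOLD = 0.6


def screen_detected(indicators: dict) -> bool:
    """indicators maps indicator name -> bool (was the event observed?)."""
    score = sum(
        weight for name, weight in SCREEN_INDICATOR_WEIGHTS.items() if indicators.get(name, False)
    )
    return score >= SCREEN_SCORE_THRESHOLD


if __name__ == "__main__":
    print(screen_detected({"contact_within_time_window": True, "target_passes_close_to_screener": True}))  # True
    print(screen_detected({"screener_facing_defender": True}))  # False
```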
When a screen is detected, measurement logic 1597 may evaluate how each athlete performs during the screen and track the results over time to determine a score, referred to herein as a "screen score," that indicates each athlete's skill level in setting or defending against screens. As an example, the measurement logic 1597 may determine how the defender responds to the screen. Measurement logic 1597 may determine whether the defender goes "over" or "under" the screen, stops moving, or switches the defensive assignment with another defender so that the defender is no longer guarding the person with the ball.
In this regard, as is commonly understood in basketball, it is often desirable for a defender to defend against a screen by going "over" the screen. Going "over" the screen generally means that the defender passes the offensive screener on the same side as the screen target. This is often a more challenging course of action for the defender, as it is often difficult to fight through the screen and remain on the same side as the screen target. However, going "over" the screen typically allows the defender to maintain a good guarding position relative to the screen target by staying close to the screen target through the screen. In contrast, going "under" the screen generally refers to the defender passing the offensive screener on the opposite side of the screener from the screen target. This is generally easier for the defender than going "over" the screen, but it can result in separation between the defender and the screen target, which is undesirable because it typically gives the screen target an opportunity to perform an action, such as taking an uncontested shot or driving toward the basket.
Measurement logic 1597 may determine whether the defender goes over or under the screen by determining the defender's position relative to the offensive screener and the screen target. For example, based on images captured by a camera or otherwise, the measurement logic 1597 may determine whether the defender and the screen target pass on the same side of the offensive screener. If so, measurement logic 1597 determines that the defender has gone "over" the screen. If measurement logic 1597 determines that the defender and the screen target pass on opposite sides of the offensive screener, measurement logic 1597 determines that the defender has gone "under" the screen.
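One way this same-side test could be expressed geometrically is sketched below in Python; the coordinate convention, function names, and sample positions are assumptions introduced for illustration and are not part of the patented method.

```python
# Illustrative geometry for deciding whether a defender went "over" (same side as
# the screen target) or "under" (opposite side) a screen, using 2D court
# coordinates. The side is judged by the sign of the cross product of the screen
# target's travel direction with each player's offset from the screener.

from typing import Tuple

Point = Tuple[float, float]


def side_of_path(travel_dir: Point, offset: Point) -> float:
    """The sign tells which side of the travel direction the offset lies on."""
    return travel_dir[0] * offset[1] - travel_dir[1] * offset[0]


def went_over_screen(screener: Point, target_dir: Point, defender: Point, target: Point) -> bool:
    """True if the defender and the screen target pass the screener on the same side ("over")."""
    defender_side = side_of_path(target_dir, (defender[0] - screener[0], defender[1] - screener[1]))
    target_side = side_of_path(target_dir, (target[0] - screener[0], target[1] - screener[1]))
    return defender_side * target_side > 0


if __name__ == "__main__":
    screener = (5.0, 5.0)
    target_dir = (1.0, 0.0)  # screen target moving along +x
    print(went_over_screen(screener, target_dir, defender=(5.2, 5.4), target=(4.8, 5.3)))  # True: same side
    print(went_over_screen(screener, target_dir, defender=(5.2, 4.6), target=(4.8, 5.3)))  # False: opposite sides
```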
Measurement logic 1597 may track how the defender responds to being screened over time and may also track how the defender responds to screens from various offensive players. As an example, measurement logic 1597 may track the number of times the defender goes "over" screens during a given period and provide a parameter indicating that number (e.g., the percentage of screens for which the defender goes "over"). Measurement logic 1597 may similarly track other results, such as the number of times the defender goes "under" screens or otherwise defends screens. The measurement logic 1597 may also determine various parameters indicative of the effectiveness of the defender's responses to screens. As an example, for each screen, the measurement logic 1597 may determine whether the defender is able to maintain a defensive position relative to the screen target (e.g., within a particular distance of the screen target and/or between the screen target and the basket), or whether the screen target is able to perform a particular action off the screen (e.g., within a predetermined time period after passing the screen), such as taking an uncontested shot or driving toward the basket without being defended. Measurement logic 1597 may track the number of times one or more outcomes occur over a given time period and provide a parameter indicating that number (e.g., the percentage of screens for which a certain outcome occurs). The parameters tracked by measurement logic 1597 may be correlated with the offensive players so that the defender's performance against particular offensive players may be determined and evaluated. As an example, the data provided by measurement logic 1597 may be used to determine how many times a defender goes "over" (or takes some other action against) screens set by a particular screener relative to the number of times the defender goes "under" such screens. Thus, the defender's performance against screens set by a particular screener can be evaluated.
Measurement logic 1597 may similarly track offensive player movements and actions related to screens. In this regard, the same or similar actions and events tracked by measurement logic 1597 for evaluating a defender's play may be used to evaluate an offensive player's play. As an example, measurement logic 1597 may track the number of times a screen target, during a screen, causes its defender to go "under" the screen or perform some other action. Measurement logic 1597 may also track the number of times a screen target is able to shoot, drive toward the basket, pass the ball to another offensive player (which may lead to an assist), or perform some other action as a result of the screen.
Measurement logic 1597 may similarly evaluate the performance of the offensive screener. As an example, measurement logic 1597 may determine the proximity of the offensive screener to the defender of the screen target. Measurement logic 1597 may determine how quickly the offensive screener is able to set the screen, how quickly the screener obtains a fixed position, and the timing of the screen, i.e., the time between when the screener sets the screen and when the defender reaches or contacts the screener. The measurement logic 1597 may evaluate the general effectiveness of an offensive screener by tracking the responses of the screen target and/or the defender, and may also evaluate the effectiveness of the screener for individual teammates and/or individual defenders. In particular, similar to the tracking of defenders described above, the measurement logic 1597 may correlate the parameters tracked for a given offensive player with the defenders being screened in order to track the offensive player's performance against certain defenders. As an example, the data provided by measurement logic 1597 may be used to determine how many times a particular screener, screen target, or screener and screen-target pair has caused a particular defender to go "under" the screen (or perform some other action).
In one embodiment, measurement logic 1597 may evaluate the effectiveness of an offensive screener by determining when the screener sets an illegal screen. Measurement logic 1597 may determine an illegal screen based on whether a foul, sometimes referred to as a "moving screen," is called on the person. Measurement logic 1597 may also determine illegal screens by evaluating the screener's movement, regardless of whether a foul is called. Measurement logic 1597 may determine an illegal screen by determining the extent to which the screener's hips or torso move during the screen. The measurement logic 1597 may also detect an illegal screen if the screener moves (e.g., "sticks out") the hips, knees, legs, elbows, etc. while in a fixed position to impede the defender's progress in an impermissible manner.
In one embodiment, the measurement logic 1597 may use entropy modeling to determine when unpredictability in dribbling, transitions, and/or passing, unpredictability in driving, and/or unpredictability in defending is beneficial or detrimental to the athlete and/or team. For example, the measurement logic 1597 may determine that an athlete's unpredictability in dribbling, transitions, and/or passing is beneficial because the unpredictability may make it more difficult for a defender to anticipate the athlete's movements. However, if the athlete does not have good control of the ball and produces a large number of turnovers, or low point or assist totals, the measurement logic 1597 may determine that the athlete's unpredictability in dribbling, transitions, and/or passing is detrimental.
The improvement logic 1594 may be used to analyze data regarding the person who shot, took and/or passed the ball and the defenders of the person who shot, took and/or passed the ball and recommend methods for improving the shot, taken and/or passed ability or defending ability and predicting the amount of improvement in shot, taken and/or passed ability or defending ability. The improvement logic 1594 may use information from the identification logic 1592, ball path logic 1591, defensive player action logic 1593, balance logic 1599, measurement logic 1597, historical data 1596, body action data 1598, and/or evaluation data 1582 to identify opportunities to improve the performance and abilities of the person who shot, took and/or passed the basketball and the defensive player(s) who shot, took and/or passed the basketball.
The improvement logic 1594 may recommend specific exercise routines, game-situation drills, and technique modifications based on the specific performance aspects that need to be improved. For example, if the measurement logic 1597 indicates that a person dribbles or passes the ball too high such that the ball is often stolen or deflected by a defender, the improvement logic 1594 may recommend one or more training or exercise routines that require the person to dribble at a lower dribble height or pass at a lower pass height. In another example, if the measurement logic 1597 indicates that a defender often allows a ball carrier to move past them easily on the way to the basket, the improvement logic 1594 may recommend one or more training or exercise routines to improve lateral defensive speed.
The improvement logic 1594 may map particular training or exercise routines to performance aspects. The improvement logic 1594 may also map skill level designations (e.g., requiring significant improvement) to training or exercise routines. Then, when the improvement logic 1594 identifies a performance aspect that needs improvement, the improvement logic 1594 may select training or exercise routines that have been mapped to the performance aspect that needs improvement. The improvement logic 1594 may also narrow the selection of training or exercise routines from the mapping based on the evaluation of performance aspects by the measurement logic 1597 so that the selected training or exercise routine better matches the actual skill level of the person.
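The mapping described in the preceding paragraph could be expressed as a simple lookup keyed by performance aspect and skill-level designation, as sketched below. The aspect names, routine names, and designations are hypothetical examples, not routines named in the patent.

```python
# Minimal sketch of mapping performance aspects and skill-level designations to
# candidate training routines, then narrowing by the measured skill level.

ROUTINE_MAP = {
    ("dribble_height_too_high", "needs_major_improvement"): ["low-dribble ladder drill", "two-ball low-dribble drill"],
    ("dribble_height_too_high", "needs_minor_improvement"): ["cone weave with low dribble"],
    ("slow_lateral_defense", "needs_major_improvement"): ["lateral slide ladder", "reaction shuffle drill"],
    ("slow_lateral_defense", "needs_minor_improvement"): ["mirror slide drill"],
}


def recommend_routines(aspect: str, skill_designation: str) -> list:
    """Return routines mapped to the identified aspect, narrowed by the skill designation."""
    return ROUTINE_MAP.get((aspect, skill_designation), [])


if __name__ == "__main__":
    print(recommend_routines("dribble_height_too_high", "needs_major_improvement"))
    # ['low-dribble ladder drill', 'two-ball low-dribble drill']
```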
In another embodiment, the improvement logic 1594 may not be able to recommend a specific exercise routine, game-situation drill, or technique modification, because there may be multiple performance aspects that need improvement and/or multiple exercise routines, drills, or technique modifications that could address a particular aspect needing improvement. For example, if an athlete often has their jump shot off the dribble blocked or grazed by a defender's fingertips, the problem may be that they do not create sufficient separation from the defender before shooting, or that their transition from the dribble to the shooting position is slow, or that their shooting hands are held low (determined by measurement logic 1597 based on the trajectory of the shot and the position of the athlete's body parts, such as the hands and/or elbows), or a combination of these issues. In another example, if a defender is often unable to disrupt a shot taken off the dribble, such as by blocking the shot or getting a hand to the ball, the problem may be an inability to limit the ball carrier's separation before the shot is taken, or a slow transition from the dribble-defense position to the shot-defense position, or off-center hand placement, or a combination of these issues.
In the case where a single improvement cannot be readily identified, or where multiple possible improvements exist, the improvement logic 1594 may select other athletes (e.g., other ball carriers or defenders) who previously exhibited similar shooting, dribbling, and/or passing or defensive characteristics, as determined by the measurement logic 1597, and who subsequently demonstrated improvement (e.g., improvement above a threshold amount). The improvement logic 1594 may store information about each person's completed training or exercise routines in the historical data 1596. The improvement logic 1594 may also store information about the person's performance level after completing the training or exercise routine and correlate changes in the athlete's performance level with the training or exercise routine. The improvement logic 1594 may review the practice techniques and improvement progress of the selected athletes from the historical data 1596 to determine an optimal set of practice techniques for the shooter, ball carrier, passer, or defender being analyzed by the improvement logic 1594.
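One plausible way to select "similar players who later improved" is a nearest-neighbour comparison over measured characteristic vectors, sketched below. The feature names, sample records, and the similar_improved_players helper are all fabricated for illustration; the patent does not specify this particular technique.

```python
# Illustrative nearest-neighbour selection of previously improved players whose
# measured characteristics most resemble those of the player being analysed.

import math

HISTORICAL_PLAYERS = [
    # (player_id, [dribble_height_m, transition_time_s, release_height_m], improved?)
    ("p1", [0.95, 0.55, 2.10], True),
    ("p2", [0.70, 0.35, 2.35], True),
    ("p3", [0.92, 0.60, 2.05], False),
]


def similar_improved_players(current: list, k: int = 2) -> list:
    """Return up to k previously improved players closest to the current feature vector."""
    improved = [(pid, feats) for pid, feats, did_improve in HISTORICAL_PLAYERS if did_improve]
    improved.sort(key=lambda item: math.dist(item[1], current))
    return [pid for pid, _ in improved[:k]]


if __name__ == "__main__":
    print(similar_improved_players([0.93, 0.58, 2.08]))  # ['p1', 'p2']
```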
In another embodiment, improvement logic 1594 may use information from scoring logic 1595 to determine the performance aspects that require the most improvement. The improvement logic 1594 may consult historical data 1596 of other athletes (e.g., other ball carriers, passers, or defenders) with similar performance aspects in need of improvement and their corresponding practice techniques to determine the optimal set of practice techniques and predicted improvements for the aspects in need of improvement.
In one embodiment, the historical data 1596 may include a large database of many athletes having many parameter types that have experienced many practice and play dribbling scenarios that may all be quantitatively measured. The improvement logic 1594 may implement a method to maximize the improvement process in the most efficient manner. For example, the improvement logic 1594 may identify patterns across multiple quantitative dimensions in order to describe a particular problem and then specify the best approach to improve.
The balance logic 1599 may be used to measure and/or classify the effectiveness of the athlete's balance on the athlete's performance. For example, if a person with a ball has good balance, the person can move left, right, back, forward, up, down more efficiently and at different speeds, accelerations, heights, and angles using different ball-carrying techniques. In one embodiment, the balance logic 1599 may use machine learning and/or artificial intelligence to measure and/or classify the athlete's balance, directly or indirectly.
In one embodiment, the balance logic 1599 may directly assess the athlete's balance by determining and analyzing the athlete's center of gravity relative to the athlete's body. The balance logic 1599 may determine that the athlete has good balance if the athlete's center of gravity does not change position quickly in response to the athlete's movements. Balance logic 1599 may also make indirect determinations regarding balance based on factors such as fluency, rapid acceleration, foot placement, and/or slowness. For example, if balance logic 1599 determines that the athlete has fluent movements, balance logic 1599 may determine that the athlete has better balance than an athlete whose movements are less fluent. Similarly, if the balance logic 1599 determines that the athlete has rapid acceleration in one or more movements, the balance logic 1599 may determine that the athlete has better balance. The balance logic 1599 may also make determinations regarding the athlete's balance based on the athlete's foot placement in response to various conditions.
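As a rough illustration of the direct center-of-gravity assessment described above, the sketch below estimates how much a player's center of gravity wanders relative to the midpoint of the feet across a short window of tracked frames; lower variance suggests better balance. The joint names, weights, and the balance_score function are assumptions, not the patented computation.

```python
# Illustrative balance metric: variance of the horizontal centre-of-gravity offset
# relative to the midpoint of the feet over several tracked frames.

import statistics


def centre_of_gravity(joints: dict) -> tuple:
    """Very rough CoG as a weighted average of a few tracked joint positions (x, y)."""
    weights = {"hips": 0.5, "torso": 0.3, "head": 0.2}
    x = sum(weights[j] * joints[j][0] for j in weights)
    y = sum(weights[j] * joints[j][1] for j in weights)
    return x, y


def balance_score(frames: list) -> float:
    """Lower horizontal CoG variance relative to the feet midpoint suggests better balance."""
    offsets = []
    for joints in frames:
        cog_x, _ = centre_of_gravity(joints)
        feet_mid_x = (joints["left_foot"][0] + joints["right_foot"][0]) / 2.0
        offsets.append(cog_x - feet_mid_x)
    return statistics.pvariance(offsets)


if __name__ == "__main__":
    frames = [
        {"hips": (0.02, 1.0), "torso": (0.03, 1.3), "head": (0.04, 1.7),
         "left_foot": (-0.2, 0.0), "right_foot": (0.2, 0.0)},
        {"hips": (0.01, 1.0), "torso": (0.02, 1.3), "head": (0.03, 1.7),
         "left_foot": (-0.2, 0.0), "right_foot": (0.2, 0.0)},
    ]
    print(balance_score(frames))  # small value -> stable centre of gravity
```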
Further, the balance logic 1599 may also be used to determine a defender's ability to respond to a particular situation. A defender's ability to respond to a situation depends on the action of the person with the ball. For example, if the person with the ball is attempting a shot off the dribble (as determined by ball path logic 1591), balance logic 1599 may determine whether the defender is in a low or extended position and thereby determine the defender's ability to respond. For example, if balance logic 1599 determines that the defender is already in the extended position, balance logic 1599 may determine that the defender has no anticipatory muscle contraction available to respond properly to the upward movement made by the person with the ball. Further, balance logic 1599 may also determine whether the defender's ability to respond is limited by the physical location of other defenders or the physical location of the person shooting, dribbling, and/or passing the ball.
In one embodiment, the historical data 1596 may include data obtained during training sequences in a restricted training space (e.g., bounded dribbling area 1516). An example of a restricted training space is described in U.S. Patent No. 9,734,405, entitled "Systems and Methods for Monitoring Objects in Athletic Playing Spaces," issued August 15, 2017, which is incorporated herein by reference. When the historical data 1596 is obtained during a training sequence, information about the movement of the ball relative to the person's movement may be more easily obtained because the camera(s) 1502 and sensor 1514 may be placed in appropriate locations to reduce, and possibly eliminate, any occlusion of the ball or person. The complete tracking of the ball and person in the historical data 1596 may allow the ball path logic 1591 to more accurately determine the probable movement when the ball or person is occluded. The ball path logic 1591 may determine the expected movement of the occluded ball based on information in the historical data 1596 for similar locations and positions of a ball carrier or passer.
Performance evaluation logic 1603 may be used to evaluate an athlete's performance and determine whether the athlete's performance level is less than the athlete's expected performance level. Some reasons an athlete's performance level may be lower than the expected performance level include: the athlete has an unreported injury, the athlete is under the influence of a substance, or the athlete is intentionally attempting to change the natural outcome of a sporting event or a particular portion of a sporting event (e.g., the athlete intentionally attempts to lose the sporting event, misses shots during the sporting event, or performs another type of action, such as affecting whether a point spread is covered, in order to intentionally change the outcome of a wager related to the sporting event). The performance evaluation logic 1603 may use information and/or data from the measurement logic 1597, historical data 1596, fatigue data 1601, body motion data 1598, balance logic 1599, scoring logic 1595, and/or identification logic 1592 to determine whether the athlete's performance is below the athlete's expected performance level.
The performance evaluation logic 1603 may classify the action taken by the athlete (e.g., a shot, pass, or dribbling) based on information from the measurement logic 1597 and obtain a performance characteristic (which may be represented by a representative parameter) associated with the action. In one embodiment, the actions may be categorized based on the identity of the athlete engaging in the action and the type of action that has been taken. In other embodiments, the performance may be further categorized based on other factors, such as the status of the sporting event (e.g., a particular time and/or score of the sporting event), the level of fatigue of the athlete, the defensive player guarding the athlete, the level of defense applied to the athlete, the athlete's hands used for performing the performance, etc. Further, performance characteristics of the action may include characteristics related to ball flight or ball motion in the action and characteristics related to body positioning of the player performing the action (e.g., balance, head position, hand position, leg position, body position, etc.).
Once an action is classified, the performance evaluation logic 1603 may evaluate performance parameters related to the characteristics of the action against historical performance parameters of the same classified action (i.e., actions having the same factors) stored in the historical data 1596. As the athlete engages in training sequences and/or participates in a sporting event, information and/or data regarding the athlete's actions (e.g., passing, shooting, and/or dribbling) is obtained, categorized, and stored in historical data 1596 to build a profile of the athlete's performance. For each categorized action, the profile of the athlete's performance may include an expectation of the performance characteristics for that action. The expectation for a particular performance parameter may include an expected value, a range of expected values, or other suitable parameters. The anticipation of a performance parameter may be based on the athlete's previous actions (and corresponding performance parameters). Performance parameters of previously recorded actions may be numerically and/or statistically processed to generate expectations. Further, in one embodiment, each performance parameter of an action may have a respective expectation, and each performance parameter may be evaluated against the expected value. In other embodiments, all performance parameters (or a portion thereof) may be evaluated as a group against expectations associated with the group. For example, if the action is a jump shot, some performance parameters that may be evaluated relative to expected values may include parameters associated with an entry angle, a release height of a shot, a shot position, and any other performance characteristic suitable for a jump shot.
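The comparison of a classified action's measured parameters against stored expectations might look like the sketch below. The parameter names, expected ranges, and the parameter_deviations helper are assumptions introduced for illustration; the patent only requires that each parameter (or group of parameters) be evaluated against an expectation.

```python
# Illustrative comparison of a classified action's measured parameters against the
# stored expected ranges for the same classification (player + action type).

EXPECTED_RANGES = {
    ("player_42", "jump_shot"): {
        "entry_angle_deg": (42.0, 48.0),
        "release_height_m": (2.3, 2.6),
        "shot_depth_m": (-0.1, 0.25),  # landing position relative to the rim centre
    }
}


def parameter_deviations(player_id: str, action_type: str, measured: dict) -> dict:
    """For each parameter, return how far the measured value falls outside its expected range (0 if inside)."""
    expectations = EXPECTED_RANGES.get((player_id, action_type), {})
    deviations = {}
    for name, (low, high) in expectations.items():
        value = measured.get(name)
        if value is None:
            continue
        if value < low:
            deviations[name] = low - value
        elif value > high:
            deviations[name] = value - high
        else:
            deviations[name] = 0.0
    return deviations


if __name__ == "__main__":
    print(parameter_deviations("player_42", "jump_shot",
                               {"entry_angle_deg": 39.5, "release_height_m": 2.45, "shot_depth_m": 0.4}))
    # entry angle 2.5 degrees below range, release height inside range, shot depth ~0.15 m long
```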
The performance evaluation logic 1603 may determine a probability that the athlete's motion is an atypical (i.e., unexpected) motion for the athlete based on an evaluation of the performance parameters for the motion and the expected performance parameters for the motion. Alternatively, the performance evaluation logic 1603 may determine a probability that the athlete's actions are typical (i.e., expected) for the athlete. The determined probability may then be compared to a threshold to determine a likelihood that the player's action is atypical of the player based on the player's previous performance. If the determined probability is within the threshold, the performance evaluation logic 1603 may derive that the athlete's actions are typical for the athlete. However, if the determined probability is outside of the threshold, the performance evaluation logic 1603 may determine that the athlete's actions are atypical for the athlete and take additional steps to determine whether the athlete's performance degradation is intentional (or caused by an injury).
In one embodiment, the performance evaluation logic 1603 may take the additional step of notifying the user that the athlete's performance is atypical (i.e., below the athlete's expected performance level) so that the user can evaluate the athlete's performance. In another embodiment, performance evaluation logic 1603 may store information about the athlete's atypical actions and use the stored information to determine a probability that the athlete's performance does not reflect the athlete's full abilities. If the athlete has only a small number of atypical actions (e.g., 1 or 2) in the sporting event (or portion thereof), performance evaluation logic 1603 may generate a lower probability that the athlete's performance is reduced. If the athlete has more atypical actions (e.g., 4 or 5) or a predetermined number of consecutive atypical actions (e.g., 3-4) during the sporting event (or portion thereof), performance evaluation logic 1603 may generate a higher probability that the athlete is performing at a reduced level.
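A small sketch of such a tally is shown below; the counts and probability values are illustrative assumptions chosen to mirror the examples in the preceding paragraph, not values prescribed by the patent.

```python
# Illustrative tally of atypical actions over a sporting event: a higher count, or a
# run of consecutive atypical actions, raises the estimated probability that the
# athlete is performing at a reduced level.

def reduced_performance_probability(action_flags: list) -> float:
    """action_flags: chronological list of booleans, True = action judged atypical."""
    total_atypical = sum(action_flags)
    longest_run = run = 0
    for flag in action_flags:
        run = run + 1 if flag else 0
        longest_run = max(longest_run, run)

    if total_atypical >= 4 or longest_run >= 3:
        return 0.9   # many or consecutive atypical actions -> high probability
    if total_atypical >= 2:
        return 0.5
    return 0.1       # one or two isolated atypical actions -> low probability


if __name__ == "__main__":
    print(reduced_performance_probability([False, True, False, False, True]))  # 0.5
    print(reduced_performance_probability([True, True, True, False, True]))    # 0.9
```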
Fig. 8 illustrates one embodiment of a process for evaluating whether an athlete is playing at a reduced performance level during a sporting event (or game sequence). The process begins with the computing device 1504 identifying an action (e.g., a shot, a dribbling, or a pass) taken during a sporting event (step 102). The computing device 1504 may then identify the individual athlete taking the action using any suitable technique (e.g., facial recognition) (step 104). The action may be classified and a parameter associated with a characteristic of the action may be determined or obtained (step 106). In one embodiment, actions may be classified based on the identity of the person making the action and the type of action being performed. However, in other embodiments, in addition to the identity of the person and the type of action, other factors (e.g., sporting event conditions, the level of fatigue of the athlete, a defensive player guarding the athlete, the level of defense applied to the athlete, the hand of the athlete for performing the action, etc.) may be used to classify the action. For example, the identified action may be a shooting action classified as a jump shot having measured parameters associated with characteristics associated with the shooting action such as an entry angle, a release height of the shot, and a shot position.
Once the relevant parameters for the identified performance have been determined, the determined parameters may be compared to the expected parameters for the same classified performance for the identified athlete (step 108). The expected parameters for each classified action of the identified athlete may be stored in the historical data 1596 and based on previous actions of the identified athlete. The comparison of the determined parameter to the expected characteristic may be used to determine whether there is a deviation between the determined parameter and the expected parameter (step 110). In one embodiment, the parameters may be compared individually, but in other embodiments, the parameters may be compared collectively. Further, the comparison may involve comparing the determined parameter to an expected value or range of values to determine a deviation from the expected parameter.
The deviation, if any, between the determined parameters and the expected parameters may be used to determine a probability that the identified action is an atypical action for the athlete (step 112). An atypical action is an action whose parameters do not correspond to similar actions previously taken by the athlete. Once the probability of an atypical action is determined, the determined probability may be compared to a threshold probability (step 114). If the determined probability is less than the threshold probability, the identified action is deemed not to indicate a decrease in the athlete's performance level, and the process ends. However, if the determined probability is greater than the threshold probability, additional steps are taken to determine whether the identified athlete is performing at a reduced performance level (step 116), and the process then ends. In one embodiment, the additional steps may include providing a notification to the user via the output device 1508 to enable the user to review the athlete's performance and determine whether the athlete is intentionally performing at a reduced performance level. In another embodiment, the determination that the athlete is performing at a reduced level may be based on the number and/or frequency of atypical actions by the athlete during the sporting event. For example, 3 atypical actions occurring within 10 minutes of game play, or 3 consecutive atypical actions, may indicate that the athlete is intentionally performing at a reduced performance level. If it is determined that the athlete is performing at a reduced performance level, a notification that the athlete is deemed to be performing at a reduced performance level may be provided to the user via the output device 1508.
In one embodiment, the process of FIG. 8 may be dynamically performed in real-time as the sporting event occurs. A user (e.g., a coach) may use dynamically generated information about player performance to make decisions about team play and player replacement. In another embodiment, the process of FIG. 8 may be performed on a video recording of a sporting event to accumulate data regarding the performance characteristics of the athlete and/or to verify the integrity of the results of the sporting event (i.e., to confirm that the results of the sporting event were not affected by one or more athletes intentionally performing below their expected performance levels).
In one embodiment, the performance evaluation logic 1603 may use machine learning to determine whether the athlete is performing at a reduced performance level. As is known in the art, machine learning typically involves training a computer by using artificial intelligence to identify data patterns that may lead to certain outputs or results by analyzing a sample data set. Such machine learning methods may be used by the computing device 1504 to identify certain characteristics associated with actions taken by an athlete during a sporting event. These characteristics may include characteristics associated with the action itself or the athlete performing the action.
In one embodiment, the machine learning system may be trained to learn that parameters associated with a set of characteristics correspond to how an athlete typically performs a particular action during a sporting event. Then, when the athlete takes an action during the sporting event, the parameters associated with the action may be compared to the set of parameters associated with how the athlete typically takes the action. If there is a significant difference between the parameters, the machine learning system may identify that the action is an atypical action for the person, and the athlete may perform at a reduced performance level.
For further explanation, assume that historical data 1596 includes data indicative of the athlete's shooting characteristics for various categories of shots, such as jump shots. Such data may have been built by tracking a large number (e.g., thousands, such as more than 10,000, 100,000, or 1,000,000) of jump shots for that athlete, or in some embodiments by tracking previous jump shots of other athletes. Such data may include a number of measurements for that type of shot, such as the angle of entry into the rim, the shot position relative to the rim, the release height, the shot height, or any other measurable attribute that affects the performance of a shot (including any of the shot attributes described above). The data may also include information indicative of the movement of the athlete's body parts, such as the elbows, feet, hands, head, torso, etc., while attempting a jump shot. Such data may have been analyzed to determine, for each measured attribute, an expected range for that measurement (i.e., a range indicating that the athlete is likely attempting a normal shot (relative to the athlete's past performance) when the measurement falls within the range). These predefined ranges may be included in the historical data 1596 stored in the memory 1566.
During a basketball game, the system 1500 may track players, and as a player attempts a shot, the object tracker 1562 may track and analyze the shot in order to classify it. As an example, the player's body posture, jump height, and hand or arm movement may indicate the type of shot the player is attempting. For purposes of illustration, assume that the object tracker 1562 determines that the athlete is attempting a jump shot. In this case, the object tracker 1562 determines various shot characteristics for the jump shot and compares them to the shot characteristics stored in the historical data 1596 for the same type of shot (i.e., a jump shot in this example) by the same player. As an example, the object tracker 1562 may determine an entry angle for the shot and compare the entry angle to the entry angle range indicated by the historical data 1596 to determine whether the entry angle is within the player's normal range based on the player's past performance over a large number of shots. Such a comparison may indicate a normal shot when the entry angle is within the predetermined range, or an atypical shot when the entry angle is outside the predetermined range. Similar comparisons may be made for other attribute measurements to provide an overall score indicating whether the shot is considered normal or atypical. For example, the score may be defined such that the score is higher (or, alternatively, lower in another embodiment) when more parameters indicate a normal shot. In such an example, the object tracker 1562 may consider the shot normal when the score is above a predefined threshold and atypical when the score is below the threshold. In some embodiments, the score may be determined by a machine learning system that provides a higher (or, alternatively, lower in another embodiment) score when the parameters associated with the attributes indicate that a normal shot is more likely.
Note that some characteristics may be given more weight in the shot estimation relative to other characteristics. As an example, it may be determined that the value of the angle of entry may be a particularly important characteristic indicating the quality of a shot or whether the player is attempting to make a shot, and that characteristic may be given greater weight relative to one or more other characteristics determined to be less important in indicating the quality of a shot or the player's intent.
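The per-attribute range checks and the weighting just described could be combined into a single weighted score, as in the sketch below. The ranges, weights, and the 0.7 threshold are assumptions made purely for illustration, not values from the patent.

```python
# Illustrative weighted shot score: each attribute contributes its weight when it
# falls inside the player's normal range; entry angle is weighted most heavily here.

NORMAL_RANGES = {
    "entry_angle_deg": (42.0, 48.0),
    "release_height_m": (2.3, 2.6),
    "left_right_position_m": (-0.15, 0.15),
}

ATTRIBUTE_WEIGHTS = {
    "entry_angle_deg": 0.5,
    "release_height_m": 0.3,
    "left_right_position_m": 0.2,
}

NORMAL_SCORE_THRESHOLD = 0.7


def shot_is_normal(measured: dict) -> bool:
    """Sum the weights of the attributes inside their normal ranges and compare to a threshold."""
    score = 0.0
    for name, (low, high) in NORMAL_RANGES.items():
        value = measured.get(name)
        if value is not None and low <= value <= high:
            score += ATTRIBUTE_WEIGHTS[name]
    return score >= NORMAL_SCORE_THRESHOLD


if __name__ == "__main__":
    print(shot_is_normal({"entry_angle_deg": 45.0, "release_height_m": 2.5, "left_right_position_m": 0.3}))  # True (0.8)
    print(shot_is_normal({"entry_angle_deg": 38.0, "release_height_m": 2.5, "left_right_position_m": 0.1}))  # False (0.5)
```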
The object tracker 1562 may track the above-described shooting assessment during all or part of the basketball game to determine a score indicative of the player's overall performance over the time period. In some embodiments, the score may indicate how well the athlete's performance is consistent with his/her past performance as indicated by the historical data 1596 (e.g., whether the athlete's overall performance over the time period is normal or atypical). As an example, the score may indicate a ratio of shots determined to be normal to shots determined to be atypical. In some embodiments, the score may be determined by the algorithm to indicate a probability that the athlete intentionally played at a reduced performance, that the performance was reduced for one or more reasons of interest (e.g., injury or intent to affect the outcome of a wager associated with the game), or that at least some of the shots were intentionally missed.
Note that the object tracker 1562 may consider various factors in determining a score that indicates whether a particular shot or overall performance is atypical. As an example, when it is determined that the player has a high degree of fatigue, the score may be adjusted or otherwise controlled such that poor shots have less of an impact on the score. That is, it is reasonable that the shooting characteristics of the player change with the fatigue of the player. When the athlete is deemed to be highly tired, the score of the athlete may be adjusted so that it is less affected by a parameter that is slightly outside of an expected range. Alternatively, the range compared to the shooting parameters may be adjusted according to the fatigue level of the player, so that when he or she is highly fatigued, a larger deviation from normal is required to trigger the atypical assessment. It should be noted that when the athlete is deemed to be similarly fatigued, the change to the range may be based on the athlete's past performance, as shown in historical data 1596.
The object tracker 1562 may similarly take into account other factors that may affect the performance of a shot. As an example, as described above, the object tracker 1562 may be configured to determine a value that indicates the degree of guard or closeness of a defender to a player while the player is performing an action such as dribbling or shooting. If the value indicates that the player is closely guarded when shooting, the player's shooting performance score may be adjusted or otherwise controlled to account for defensive guarding such that poor shooting has less of an impact on the score. That is, it is reasonable that the shooting characteristics of the player are changed according to whether the player is defended or not or how tight the player is. When an athlete is deemed to be under tight defense, the score of the athlete may be adjusted to be less affected by a parameter being slightly outside of an expected range. Alternatively, the range compared to the shooting parameters may be adjusted based on the determined defensive guard for shooting so that when he or she is closely guarded, a greater deviation from normal is required to trigger the atypical assessment. It should be noted that when the athlete is deemed to be similarly defending, the change to range may be based on the athlete's past performance, as shown in historical data 1596.
In evaluating the overall score of an athlete's performance, some shots may be weighted more heavily than others. As an example, based on certain game situations, the object tracker may identify one or more shots as more important than other shots, or otherwise weight them to a greater degree, such as shots near the end of the game or shots taken when the game score is close. The object tracker 1562 may receive user input indicating that a situation in the game is deemed critical or more important than other situations, but the object tracker 1562 may also automatically recognize such a situation. As an example, the object tracker 1562 may compare one team's score with the other team's score and determine that shots taken when the difference between the two scores is within a certain range are to be weighted more heavily. As an example, a shot taken when the difference in team scores is less than 5 points (or some other threshold) may be weighted higher than a shot taken when the difference is greater than 5 points. In other embodiments, the difference in team scores may be compared to a threshold, such as the point spread of a wager associated with the game. As an example, if the player attempts to manipulate the outcome of the wager by intentionally missing shots or otherwise underperforming when the difference in game scores is close to the spread (or in certain other game situations), the object tracker 1562 is more likely to identify the performance as atypical because shots taken in this situation are weighted more heavily than shots in other situations.
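A compact sketch of this situation-based weighting follows; the specific weights, the 5-point margin, and the 120-second cutoff are illustrative assumptions that echo the examples above rather than values fixed by the patent.

```python
# Illustrative situation weighting: shots taken in close or late-game situations
# count more toward the overall atypicality assessment.

def shot_weight(score_margin: int, seconds_remaining: int, close_margin: int = 5) -> float:
    weight = 1.0
    if abs(score_margin) <= close_margin:
        weight *= 2.0   # close game: shot matters more
    if seconds_remaining <= 120:
        weight *= 1.5   # late-game shot: weighted further
    return weight


def weighted_atypical_fraction(shots: list) -> float:
    """shots: list of (is_atypical, score_margin, seconds_remaining)."""
    total = sum(shot_weight(m, s) for _, m, s in shots)
    atypical = sum(shot_weight(m, s) for a, m, s in shots if a)
    return atypical / total if total else 0.0


if __name__ == "__main__":
    shots = [(False, 12, 1500), (True, 3, 90), (False, 4, 600)]
    print(round(weighted_atypical_fraction(shots), 2))  # 0.5
```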
In at least some of the embodiments described above, information regarding a player's current shot or other action is described as being compared or otherwise evaluated against historical data of previous shots made by the same player. However, it should be emphasized that the historical data need not be obtained from the same athlete. For example, the following are possible: the player's current shot or other action is compared or otherwise evaluated in the same or similar manner relative to historical data from the actions of one or more other players. Regardless of the source of the historical data, using such historical data, a large number of samples may be included for the type of action being analyzed to provide statistical accuracy and significance, allowing the system to discern intent based on even minor deviations in typical behavior.
In another embodiment, the object tracker 1562 may implement a machine learning system to assess whether negative actions (i.e., actions with negative consequences) from the athlete (e.g., missed shots, bad passes thrown, inadvertent dribbling, etc.) or reduced performance are intentional. The machine learning system may receive as input camera data 1578, sensor data 1580, and/or parameters generated by object tracker 1562, and generate an output indicating whether the system believes the athlete is likely to intentionally take a negative action, such as intentionally making one or more actions (e.g., shooting a basket) that reduce performance. The output of the machine learning system can then be used to make a determination that the athlete is intentionally taking negative action for illegal reasons (e.g., controlling the outcome of a particular wagering bet). In one embodiment, the output of the machine learning system may be a probability value such that the higher (or lower) the value from the machine learning system, the greater the probability that the athlete intentionally takes a negative action (e.g., missed shot) or is otherwise intentionally underperforming.
The machine learning system may evaluate a plurality of parameters associated with the athlete's movements to generate an output. The plurality of parameters evaluated by the machine learning system may correspond to parameters provided by the object tracker 1562 (e.g., parameters indicative of a shot trajectory), but the plurality of parameters may also include "self-generated" parameters from the machine learning system. The self-generated parameters may be determined by nodes of the neural network that implement a deep learning process to improve output. The self-generated parameters may be based on information or data from one or more of the camera data 1578, sensor data 1580, or input parameters from the object tracker 1562.
The machine learning system may be trained prior to evaluating an athlete's performance using machine learning. Training of a machine learning system may involve providing many inputs (e.g., thousands of inputs or more) to the machine learning system to train parameters whose learning indicates an athlete's intent. As an example, any type of sensor (e.g., a camera) described herein may be used to capture historical data associated with a large number of shots by an athlete (and/or other athlete), which may include raw sensor data and/or processed sensor data, such as parameters measured from the sensor data (e.g., trajectory parameters or body motion parameters). The object tracker 1562 implementing a machine learning system may analyze these data to learn parameters indicative of intent. In the context of a neural network, the learned parameters may be defined by values stored in nodes of the neural network to translate inputs into desired outputs. In this manner, the machine learning system may learn performance characteristics that may indicate intentional negative actions (e.g., missed shots) and evaluate parameters indicative of these characteristics to evaluate when an athlete intentionally made a negative action (e.g., missed shots).
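As a toy stand-in for the training idea described above, the sketch below fits a simple logistic-regression model on fabricated shot-trajectory features and outputs a probability that a missed shot was intentional. The features, the fabricated training data, and the model choice are assumptions for illustration; a deployed system would use far richer inputs (full sensor data, body-motion parameters) and far more samples, and the patent does not limit the machine learning system to this form.

```python
# Toy logistic-regression sketch: train on labelled historical shot parameters
# (entry angle and left/right deviation) and output a probability that a missed
# shot was intentional. All data here is fabricated for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Fabricated training data: normal misses cluster near typical trajectories (label 0),
# "intentional" misses deviate more (label 1).
normal = np.column_stack([rng.normal(45, 1.5, 200), rng.normal(0.0, 0.05, 200)])
intentional = np.column_stack([rng.normal(38, 3.0, 200), rng.normal(0.25, 0.10, 200)])
X = np.vstack([normal, intentional])
y = np.concatenate([np.zeros(200), np.ones(200)])

# Standardise features, then fit logistic regression by gradient descent.
mean, std = X.mean(axis=0), X.std(axis=0)
Xs = (X - mean) / std
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
    w -= 0.5 * (Xs.T @ (p - y) / len(y))
    b -= 0.5 * (p - y).mean()


def intentional_miss_probability(entry_angle_deg: float, lr_deviation_m: float) -> float:
    """Probability (0..1) that a miss with these trajectory features is atypical/intentional."""
    x = (np.array([entry_angle_deg, lr_deviation_m]) - mean) / std
    return float(1.0 / (1.0 + np.exp(-(x @ w + b))))


print(intentional_miss_probability(44.5, 0.02))  # low probability -> looks like a normal miss
print(intentional_miss_probability(36.0, 0.30))  # high probability -> atypical, possibly intentional
```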
Such machine learning may be used to implement the concepts described above or non-machine learning embodiments similar to the concepts described above. As an example, as described above, certain trajectory parameters, when within certain ranges, may indicate an intent to miss a shot. When the object tracker 1562 implements a machine learning system, it may learn the necessary parameters so that when the trajectory parameters are within a range indicating an intent to make a missed shot, the output of the machine learning system indicates that the missed shot may be intentional.
In some embodiments, the machine learning system implemented by the object tracker 1562 may be trained using shot data from a large number of shots (or other types of actions) taken by multiple users. During training, the machine learning system may be configured to learn parameters indicative of performance characteristics that may indicate an intent to miss a shot (or other type of negative action). Such parameters may be based on the trajectory of the object launched by the athlete or the athlete's body movements when launching the object (or performing another type of action).
When a player performs an action (e.g., takes a basketball shot) that is to be evaluated as to whether the performance is atypical (e.g., whether the player intentionally missed the shot), the object tracker 1562 may provide sensor data from the shot (as determined by one or more sensors (e.g., cameras)) as input to the machine learning system. The object tracker 1562 may also calculate various parameters indicative of athlete performance based on the sensor data and provide these parameters as input to the machine learning system. As an example, the object tracker 1562 may identify, in the sensor data, the object launched by the athlete toward the target and calculate one or more trajectory parameters indicative of the object's trajectory. Based on the identification of the athlete taking the action, the object tracker 1562 may also provide as input historical data 1596 associated with the identified athlete and indicating the athlete's historical performance when performing a plurality of previous actions of the same type (e.g., the same shot type). In some embodiments, the object tracker 1562 may algorithmically determine the shot type of the current shot and, based on that determination, search for and provide to the machine learning system the historical data 1596 indicative of that shot type. Alternatively, the object tracker 1562 may provide historical data 1596 associated with multiple shot types for use by the machine learning system in evaluating the athlete's current shot.
In any case, if the current shot is determined to be a miss (i.e., the ball does not pass through the basket), the machine learning system of the object tracker 1562 may evaluate the inputs for the current shot against the learned parameters to determine a value indicating whether the miss was intentional. In making such an assessment, the machine learning system may determine at least one parameter that characterizes the athlete's performance in the current action (e.g., the basketball shot) relative to the same type of action performed by the athlete or other athletes in previous sporting events, as indicated by historical data 1596. That is, if the learned parameters, when applied to the inputs of the machine learning system (which may include inputs for the current shot and inputs based on historical data of previous shots taken by the athlete or other athletes), indicate that the athlete's performance in the current action is atypical relative to the athlete's performance in past actions indicated by the historical data 1596, the machine learning system may provide an output indicating that the missed shot may have been intentional. In some embodiments, the output may include a value indicating the probability that the missed shot was intentional. It should be noted that any of the factors described above may be used by the machine learning system to assess the athlete's performance and intent.
Figure 6 illustrates an offensive and defensive basketball player on a sporting playing surface. As can be seen in the embodiment of fig. 6, the cameras 1502 may be located at each end of the athletic playing surface 1650 and may capture the entire athletic playing surface 1650 within the field of view 1652 of the cameras 1502. The camera 1502 may also capture an offensive player 1654, a ball 1656, and a defensive player 1658. The camera 1502 may be capable of capturing a large amount of information or very little information about the offensive player 1654, ball 1656, and defensive player 1658, depending on their location on the athletic playing surface 1650 and relative to the camera 1502. As described above, the ball 1656 may be obscured from the camera's field of view 1652 based on the position of the offensive and defensive players 1654, 1658, as shown in fig. 6. The camera 1502 may also capture information about other athletes (not shown) on the athletic playing surface 1650. Images captured by the camera 1502 of the offensive player 1654, the ball 1656 and the defensive player 1658 may be processed by the object tracker 1562 as described above.
In one embodiment, the computing device 1504 may use an augmented reality system to provide a training sequence that simulates the play of a basketball shoot, a tee shot, and/or a pass or a defender. As shown in fig. 7, the augmented reality system 1710 may include augmented reality logic 1712. Augmented reality logic 1712 may be implemented in software, hardware, firmware, or any combination thereof. Augmented reality logic 1712 may be part of object tracker 1562 or a separate system in memory 1566. The augmented reality logic 1712 may use the communication interface 1576 to send images associated with the simulated game situation to the user interface 1714 used by the user (e.g., head-mounted). In one embodiment, the user interface 1714 may have a closed configuration (e.g., a full-sized helmet) that prevents the user from seeing any surrounding physical environment, or in another embodiment, the user interface 1714 may have an open configuration (e.g., a pair of glasses) that allows the user to view the surrounding environment while the game projection is in progress. In one embodiment, the user interface 1714 may be an output device 1508. The system 1500 may capture the user's response to the simulated game situation with the camera 1502 and/or sensor 1514 and process the captured information with the object tracker 1562 as described above. The object tracker 1562 may provide information about the user's response to the augmented reality logic 1712, which may update the simulation provided to the user.
Augmented reality logic 1712 may match the skills of the simulated defensive player to the specific skills to be developed in the training sequence. For example, the skill level of the simulated defensive player may be set lower than the skill level of the person in the training sequence to allow the person who shot, took and/or passed the ball to develop new actions that are effective for the less skilled defensive player. In contrast, the skill level of the simulated defensive player may be set higher than the person who shot, took the ball and/or passed the ball to allow the person to increase their skills or learn a protective ball sequence. Augmented reality logic 1712 may also be used to simulate multiple defenders during a training sequence. In one embodiment, simulating the skill level of the defensive player may be based on the skill of the actual defensive player as represented by the defensive characteristics of the defensive player collected and stored in the historical data 1596. In another embodiment, the skill level of the simulated defensive player may be based on a combination of the skills of several different defensive players or may be generated by augmented reality logic 1712 to provide a desired set of skills in the simulated defensive player.
Augmented reality logic 1712 may also match the skills of a simulated ball carrier or passer to the specific skills that the defender is to develop in the training sequence. For example, the skill level of the simulated ball carrier or passer may be set lower than the skill level of the defender in the training sequence to allow the defender to develop new actions that are effective against less skilled ball carriers or passers. Conversely, the skill level of the simulated ball carrier or passer may be set higher than the defender's to allow the defender to improve their skills or learn a defensive sequence. Augmented reality logic 1712 may also be used to simulate multiple offensive athletes during a training sequence. In one embodiment, the skill level of the simulated ball carrier or passer may be based on the skills of an actual ball carrier or passer, as represented by the carrier's dribbling characteristics or the passer's passing characteristics collected and stored in the historical data 1596. In another embodiment, the skill level of the simulated ball carrier or passer may be based on a combination of the skills of several different ball carriers or passers, or may be generated by augmented reality logic 1712 to provide a desired set of skills in the simulated ball carrier or passer.
In another embodiment, the augmented reality logic 1712 may also be used to add simulated court lines or simulated basketball hoops to an environment where a court does not actually exist during a training sequence. By providing simulated boundary lines and/or basketball hoops through the user interface 1714 when such features are not physically present, the augmented reality logic 1712 may provide a more realistic training environment and increase the benefit of the user's training sequence.
The augmented reality logic 1712 may develop the skill levels used in training sequences that simulate defensive players or persons who shoot, dribble, and/or pass the ball by using machine learning and/or artificial intelligence in conjunction with the historical data 1596. For example, factors such as the athlete's balance, footwork, foot placement, and/or acceleration (e.g., body part placement and/or orientation) and/or ball motion may be used, in addition to the athlete's movements, to establish a skill level for the simulated athlete(s).
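As a rough illustration of how such factors could feed a learned skill estimate, the sketch below fits a simple regression from per-possession features (balance, foot placement, acceleration, dribble rate) to an observed outcome. The feature names, the use of scikit-learn, and the training target are assumptions made for illustration and are not the method specified in this disclosure.

    # Hypothetical sketch: learn a scalar skill rating from tracked motion
    # features stored in historical data. Features and model are illustrative.
    import numpy as np
    from sklearn.linear_model import Ridge

    # Each row: [balance_score, foot_placement_error, peak_acceleration, dribble_rate]
    # drawn from historical data; y is an observed performance outcome
    # (e.g., points created per possession) used as a proxy for skill.
    X = np.array([
        [0.82, 0.12, 3.1, 2.4],
        [0.65, 0.30, 2.2, 1.8],
        [0.91, 0.08, 3.6, 2.9],
    ])
    y = np.array([1.10, 0.72, 1.35])

    model = Ridge(alpha=1.0).fit(X, y)

    def estimated_skill(features):
        """Map a new athlete's tracked features to a scalar skill level."""
        return float(model.predict(np.asarray(features).reshape(1, -1))[0])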
In yet another embodiment, the augmented reality system 1710 may be used to allow one or more athletes to practice in an environment that presents simulations of ten or more athletes. In another embodiment, the augmented reality system 1710 may be used by two separate people in different locations to simulate a one-on-one matchup between them. One of the persons may be an offensive player and the other may be a defensive player. The cameras 1502 may capture information about each athlete at their respective locations and provide the information to their corresponding object trackers 1562, which may process the information. An object tracker 1562 may then provide information about one athlete (e.g., location on the court, posture, motion, etc.) to the augmented reality logic 1712 used by the other athlete, and that augmented reality logic 1712 may use the information from the object tracker 1562 to simulate the first athlete's location and motion in the simulation provided to the other athlete. For example, a defender may be stationary at the free throw line while a dribbler is located at the top of the key and moving toward the defender. The object tracker 1562 in the defender's system 1500 may capture the defender's position on the court as well as the defender's defensive stance and posture. The information about the defender may then be provided to the dribbler's augmented reality system 1710, which may generate a simulation of the defender stationary at the free throw line relative to the dribbler. Similarly, the object tracker 1562 in the dribbler's system 1500 may capture the dribbler's position on the court as well as the dribbler's stance and motion. The information about the dribbler may then be provided to the defender's augmented reality system 1710, which may generate a simulation of the dribbler at the top of the key moving toward the defender. Similar techniques may be used with other numbers of athletes in any number of locations. As an example, a five-on-five game may be simulated, with each player at a different court or other physical location and viewing simulations of the other nine players.
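A minimal sketch of the state exchange between two remote systems might look like the following. The message format, the use of a plain JSON-over-socket channel, and the field names are assumptions chosen for illustration; this disclosure only specifies that tracked information is shared between the systems.

    # Hypothetical sketch: each site sends its locally tracked player state to
    # the remote site, which injects it into the local simulation.
    import json
    import socket

    def send_player_state(conn: socket.socket, state: dict) -> None:
        """state: e.g. {"court_xy": [6.1, 4.3], "pose": "defensive_stance",
        "velocity": [0.0, 0.0]} as produced by the local object tracker."""
        conn.sendall((json.dumps(state) + "\n").encode("utf-8"))

    def receive_player_state(conn_file) -> dict:
        """conn_file: a file-like wrapper (conn.makefile()) around the socket."""
        line = conn_file.readline()
        return json.loads(line)

    # At each site, once per captured frame (names are illustrative):
    #   local_state = object_tracker.process(frame, readings)   # local athlete
    #   send_player_state(conn, local_state)
    #   remote_state = receive_player_state(conn_file)          # other athlete
    #   ar_logic.simulation.place_remote_player(remote_state)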
In one embodiment, as the capture rate of information from the cameras 1502 and/or sensors 1514 increases, fewer cameras 1502 and/or sensors 1514 are needed to obtain the same amount of information and/or data for processing. For example, a camera 1502 capturing 1000 frames per second will provide more data to the computing device 1504 than a camera 1502 capturing 30 frames per second.
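A back-of-the-envelope comparison makes the trade-off concrete; the resolution and bit depth below are assumed values for illustration only and are not specified in this disclosure.

    # Illustrative comparison of per-camera data rates at two frame rates.
    width, height, bytes_per_pixel = 1280, 720, 3   # assumed camera format
    frame_bytes = width * height * bytes_per_pixel

    for fps in (30, 1000):
        print(f"{fps:>5} fps -> {fps * frame_bytes / 1e6:,.0f} MB/s per camera")
    # 30 fps -> ~83 MB/s; 1000 fps -> ~2,765 MB/s. A single high-rate camera
    # supplies far more samples per second, so fewer cameras may be needed.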
In another embodiment, an audio sensor may be used to determine one or more dribbling characteristics. For example, the audio sensor may detect changes in the sound associated with a dribbling action, and the object tracker 1562 may use the detected sound changes to determine a corresponding change in a dribbling characteristic, such as dribbling rate or speed. In addition, the audio sensor may be used to help determine the trajectory of the ball when the ball is occluded from the camera's field of view. For example, the audio sensor may detect the sound emitted when the ball hits the floor and provide this information to the recognition logic 1592 and the ball path logic 1591 to help determine the trajectory of the ball. The sound of the ball hitting the floor may also be used together with other detected information, such as a detection that the ball has left the person's hand, to determine the speed of the ball based on the time difference between the detected release and the detected sound of the ball hitting the floor. Further, detection of the ball hitting the floor in a repetitive pattern may indicate that the person is dribbling the ball even when the ball is occluded from the field of view of the camera 1502.
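As a minimal sketch of the timing-based estimates described above, the following shows how a ball speed and a dribble rate could be derived from release and bounce timestamps; the example values (release height, event times) are assumptions for illustration and are not given in this disclosure.

    # Hypothetical sketch: ball speed from the interval between a detected
    # release and the detected floor-bounce sound, and dribble rate from
    # repeated bounce sounds. Values and event sources are illustrative.

    def average_ball_speed(release_time_s, bounce_time_s, travel_distance_m):
        """Average speed over the release-to-bounce interval; the travel
        distance would come from the tracked release height/position."""
        dt = bounce_time_s - release_time_s
        if dt <= 0:
            raise ValueError("bounce must occur after release")
        return travel_distance_m / dt   # meters per second

    def dribble_rate(bounce_times_s):
        """Dribbles per second estimated from successive floor-bounce sounds."""
        if len(bounce_times_s) < 2:
            return 0.0
        intervals = [b - a for a, b in zip(bounce_times_s, bounce_times_s[1:])]
        return 1.0 / (sum(intervals) / len(intervals))

    # Example: release detected at t=0.00 s, bounce heard at t=0.42 s, ball
    # travelled ~1.1 m from hand to floor -> ~2.6 m/s average speed.
    print(average_ball_speed(0.00, 0.42, 1.1))
    print(dribble_rate([0.0, 0.55, 1.08, 1.62]))  # ~1.85 dribbles per second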
A variety of different wired and wireless communication protocols may be used to transmit the information communicated between the different components in the system. For example, for wired communications, USB-compatible and IEEE 1394 (FireWire)-compatible hardware communication interfaces and communication protocols may be used. For wireless communications, hardware and software compatible with standards such as Bluetooth, IEEE 802.11a, IEEE 802.11b, other IEEE 802.11 standards (e.g., IEEE 802.11c, IEEE 802.11d, IEEE 802.11e, etc.), IrDA, WiFi, and HomeRF may be used.
Although the foregoing invention has been described in some detail by way of illustration and example for purposes of clarity and understanding, it will be recognized that the above described invention may be embodied in many other specific variations and embodiments without departing from the spirit or essential characteristics thereof. Certain changes and modifications may be practiced, and it is understood that the invention is not to be limited by the foregoing details, but rather is to be defined by the scope of the appended claims.

Claims (15)

1. A system for evaluating athlete performance during a sporting event, the system comprising:
at least one sensor configured to capture sensor data relating to an athlete performing a plurality of actions of a certain type during a time period of a sporting event;
at least one processor configured to receive the sensor data and analyze the sensor data for each of the actions to determine at least one parameter characterizing performance of the athlete while performing the respective action relative to historical data associated with one or more athletes performing a plurality of actions of the type during a previous sporting event, the at least one processor configured to determine, for each of the plurality of actions, a score indicative of the overall performance of the athlete for the action performed by the athlete during the time period based on the at least one parameter, the at least one processor further configured to determine a probability that the athlete is intentionally underperforming during the time period based on the score; and
an output interface configured to provide an output indicative of the probability, determined by the at least one processor, that the athlete is intentionally underperforming with respect to the actions.
2. The system of claim 1, wherein the plurality of actions comprises a plurality of shots of at least one object at the sporting event in which the athlete launches the at least one object toward at least one goal.
3. A system for evaluating athlete performance during a sporting event, the system comprising:
at least one sensor configured to capture sensor data relating to an athlete launching an object toward a goal during a projection of the object at a sporting event;
at least one processor configured to receive the sensor data and identify the object within the sensor data, the at least one processor configured to determine a trajectory of the object for the shot based on the sensor data and determine at least one parameter based on the trajectory, the at least one processor further configured to determine a probability that the athlete intentionally missed the shot based on the at least one parameter; and
an output interface configured to provide an output indicative of the probability.
4. The system of claim 3, wherein the object is a basketball, and wherein the goal is a basketball goal having a hoop.
5. The system of claim 4, wherein the at least one parameter indicates an entry angle of the basketball into the hoop.
6. The system of claim 3, wherein the at least one parameter indicates a direction of the object.
7. The system of claim 3, wherein the at least one parameter is indicative of a velocity of the object.
8. The system of claim 3, wherein the probability is based on a comparison of the at least one parameter to historical data indicative of past performance of the athlete launching objects toward goals in previous sporting events.
9. The system of claim 3, wherein the at least one parameter characterizes performance of the shot of the object toward the goal relative to historical data relating to the athlete's projection of the object toward the goal during a previous sporting event.
10. A method for evaluating an athlete's performance during a sporting event, the method comprising:
capturing, with at least one sensor, sensor data relating to a player projecting an object toward a goal at a sporting event;
receiving, with at least one processor, the sensor data;
determining, with the at least one processor, at least one parameter based on the sensor data, the at least one parameter characterizing performance of the player in projecting the object toward the goal;
determining, with the at least one processor, based on the at least one parameter, a probability that the player intentionally missed the goal when projecting the object toward the goal; and
providing, with an output interface, an output indicating the probability that the player intentionally missed the goal when projecting the object toward the goal.
11. The method of claim 10, wherein the object is a basketball, and wherein the goal is a basketball goal having a hoop.
12. The method of claim 11, wherein the at least one parameter includes an entry angle of the basketball into the hoop.
13. The method of claim 11, wherein the at least one parameter indicates a direction of the object.
14. The method of claim 11, wherein the at least one parameter is indicative of a velocity of the object.
15. The method of claim 11, further comprising storing, in a memory prior to the capturing, historical data indicative of a plurality of shots by the athlete, wherein each of the shots comprises the athlete shooting an object toward a goal, wherein determining the probability is based on the historical data.
CN201980057186.5A 2018-07-02 2019-07-02 System and method for determining reduced athlete performance in a sporting event Active CN112969513B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211237654.7A CN115487484A (en) 2018-07-02 2019-07-02 System and method for determining reduced athlete performance in a sporting event

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862693436P 2018-07-02 2018-07-02
US62/693,436 2018-07-02
PCT/US2019/040228 WO2020010040A1 (en) 2018-07-02 2019-07-02 Systems and methods for determining reduced player performance in sporting events

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211237654.7A Division CN115487484A (en) 2018-07-02 2019-07-02 System and method for determining reduced athlete performance in a sporting event

Publications (2)

Publication Number Publication Date
CN112969513A CN112969513A (en) 2021-06-15
CN112969513B true CN112969513B (en) 2022-10-18

Family

ID=69059670

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201980057186.5A Active CN112969513B (en) 2018-07-02 2019-07-02 System and method for determining reduced athlete performance in a sporting event
CN202211237654.7A Pending CN115487484A (en) 2018-07-02 2019-07-02 System and method for determining reduced athlete performance in a sporting event

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202211237654.7A Pending CN115487484A (en) 2018-07-02 2019-07-02 System and method for determining reduced athlete performance in a sporting event

Country Status (2)

Country Link
CN (2) CN112969513B (en)
WO (1) WO2020010040A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019144142A1 (en) 2018-01-21 2019-07-25 Stats Llc System and method for predicting fine-grained adversarial multi-agent motion
US11660521B2 (en) 2018-01-21 2023-05-30 Stats Llc Method and system for interactive, interpretable, and improved match and player performance predictions in team sports
EP3912090A4 (en) 2019-03-01 2022-11-09 Stats Llc Personalizing prediction of performance using data and body-pose for analysis of sporting performance
US11554292B2 (en) 2019-05-08 2023-01-17 Stats Llc System and method for content and style predictions in sports
WO2021247371A1 (en) 2020-06-05 2021-12-09 Stats Llc System and method for predicting formation in sports
CN116324668A (en) 2020-10-01 2023-06-23 斯塔特斯公司 Predicting NBA zenithal and quality from non-professional tracking data
US20220253679A1 (en) * 2021-02-05 2022-08-11 Stats Llc System and Method for Evaluating Defensive Performance using Graph Convolutional Network

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8409024B2 (en) * 2001-09-12 2013-04-02 Pillar Vision, Inc. Trajectory detection and feedback system for golf
US11196811B2 (en) * 2006-12-01 2021-12-07 Fitistics, Llc Data communications between an exercise device and a personal content device
US8172722B2 (en) * 2008-12-05 2012-05-08 Nike, Inc. Athletic performance monitoring systems and methods in a team sports environment
US9599632B2 (en) * 2012-06-22 2017-03-21 Fitbit, Inc. Fitness monitoring device with altimeter
US20170272703A1 (en) * 2014-04-30 2017-09-21 Gchd Partners, Llc Athletic performance data acquisition systems, apparatus, and methods

Also Published As

Publication number Publication date
CN115487484A (en) 2022-12-20
WO2020010040A1 (en) 2020-01-09
CN112969513A (en) 2021-06-15

Similar Documents

Publication Publication Date Title
US20240071140A1 (en) Systems and methods for evaluating player performance in sporting events
US10010778B2 (en) Systems and methods for tracking dribbling and passing performance in sporting environments
US9886624B1 (en) Systems and methods for tracking dribbling in sporting environments
CN112969513B (en) System and method for determining reduced athlete performance in a sporting event
US11450106B2 (en) Systems and methods for monitoring objects at sporting events
US10607349B2 (en) Multi-sensor event system
US11117035B2 (en) Video analytics for human performance
CN109862949B (en) System for evaluating basketball projection performance
US9025021B2 (en) System and methods for translating sports tracking data into statistics and performance measurements
US20100030350A1 (en) System and Method for Analyzing Data From Athletic Events
US11452929B2 (en) Smart soccer goal
CN113599788B (en) System and method for monitoring athlete performance during a sporting event
US20180326284A1 (en) Systems and methods for cricket/baseball game scoring and umpiring
US10441848B1 (en) System for automatic evaluation of martial arts moves
EP4325448A1 (en) Data processing apparatus and method
EP4325443A1 (en) Data processing apparatus and method
WO2017143814A1 (en) Method, device and system for ball game data statistics, smart basketball and wrist band
CA3189146A1 (en) System and method for object tracking and metric generation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant