NZ758097B2 - Systems and methods for analyzing event data - Google Patents
Systems and methods for analyzing event data
- Publication number
- NZ758097B2, NZ758097A, NZ75809711A
- Authority
- NZ
- New Zealand
- Prior art keywords
- play
- camera
- objects
- periodically
- target situation
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0021—Tracking a path or terminating locations
- A63B2024/0025—Tracking the path or location of one or more users, e.g. players of a game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/10—Positions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/30—Speed
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/80—Special sensors, transducers or devices therefor
- A63B2220/806—Video cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2225/00—Miscellaneous features of sport apparatus, devices or equipment
- A63B2225/20—Miscellaneous features of sport apparatus, devices or equipment with means for remote communication, e.g. internet or the like
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2225/00—Miscellaneous features of sport apparatus, devices or equipment
- A63B2225/50—Wireless data transmission, e.g. by radio transmitters or telemetry
- A63B2225/54—Transponders, e.g. RFID
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2243/00—Specific ball sports not provided for in A63B2102/00 - A63B2102/38
- A63B2243/0066—Rugby; American football
- A63B2243/007—American football
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0021—Tracking a path or terminating locations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/87—Combinations of radar systems, e.g. primary radar and secondary radar
- G01S13/878—Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
-
- G06K9/00724—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
Abstract
Disclosed are methods and systems for controlling a camera at a sporting event. Real-time geographic positions of a plurality of objects in the sporting event are periodically received and processed to identify at least one condition that precedes a target situation involving the plurality of objects. A robotic camera is then controlled in response to the at least one condition being identified so as to capture at least one of the plurality of objects prior to, and during at least part of, the target situation.
Description
SYSTEMS AND METHODS FOR ANALYZING EVENT DATA
CROSS REFERENCE TO RELATED APPLICATIONS
This application claims priority to U.S. Patent Application Serial No.
61/292,386, filed January 5, 2010, the disclosure of which is incorporated herein by
reference.
The present application has been divided out of New Zealand patent
application 749991 (NZ 749991). NZ 749991 was divided out of New Zealand patent
application 716581 (NZ 716581). NZ 716581 was divided out of New Zealand patent
application 701372 (NZ 701372). NZ 701372 was divided out of New Zealand patent
application 600699. In the description in this present specification reference may be made
to subject matter which is not within the scope of the appended claims. Any subject matter
described in the present application and that is not within the scope of the presently
appended claims should be readily identifiable by a person skilled in the art, and may assist
in putting into practice the invention as defined in the appended claims.
NZ 600699 is the national phase entry in New Zealand of an international
application, which published as WO 2011/085008 on 14 July 2011. WO
2011/085008 is incorporated herein by reference.
FIELD
This disclosure relates to a computer-implemented method that
predicts success of a static player formation. The disclosure also relates to methods that
predict behaviour of an opposing team in a static formation and that predict the success of
a dynamic-play execution.
DEFINITION
In the specification the term “comprising” shall be understood to have
a broad meaning similar to the term “including” and will be understood to imply the inclusion
of a stated integer or step or group of integers or steps but not the exclusion of any other
integer or step or group of integers or steps. This definition also applies to variations on the
term “comprising” such as “comprise” and “comprises”.
BACKGROUND
Systems that track objects in an event, such as participants in an
athletic event, are known. For example, U.S. Patent Application Publication No.
2008/0129825 to DeAngelis et al., which is incorporated herein by reference, discloses
systems and methods to facilitate autonomous image capture and picture production. A
location unit is attached to each tracked object (e.g., participants in an athletic event). An
object tracking device receives location information from each location unit. A camera
control device controls, based upon the location information, at least one motorized camera
to capture image data of at least one tracked object.
It is also known to manually create video and still images of an event.
For example, a video feed of an event (e.g., an American football game) is typically
generated by highly trained camera persons and highly trained production staff who select
camera shots and combine graphics into the video feed. Video images and/or still picture
production can be partially or fully automated using systems and methods disclosed in U.S.
Patent Application Publication No. 2008/0129825.
In many American football games, two 'standard' views are manually
filmed using two digital video cameras; one on the sideline, and one in an end zone. These
views are then manually 'broken down' by humans watching the videos, clipping them into
plays, and identifying interesting attributes of each play. One of the most obvious attributes
is simply who was on the field for each team at a given time. This is also one of the most
difficult things to determine from the video since the resolution is not sufficient to clearly
determine each of the players' numbers, thus making it difficult to identify all of the players.
The reference to prior art in the background above is not and should
not be taken as an acknowledgment or any form of suggestion that the referenced prior art
forms part of the common general knowledge in New Zealand or in any other country.
According to a first aspect of the disclosure there is provided a
computer-implemented method that predicts success of a static player formation,
comprising comparing the static formation to a plurality of standard formations to find a
matching standard formation, determining one or more deviation types in the static
formation relative to the matching standard formation, retrieving from a database standard
previous plays executed with the matching standard formation and none of the deviation
types, and deviant previous plays executed with the matching standard formation and any
of the deviation types, calculating a standard success percentage based on outcomes of
the standard previous plays and, for each of the deviation types, a deviant success
percentage based on outcomes of the deviant previous plays executed with said each of
the deviation types, and reporting the standard success percentage and each deviant
success percentage.
The method may further comprise determining the static formation
from measured positions of a plurality of players forming the static formation.
The method may further comprise measuring the positions of the
players.
The method may further comprise logging the one or more deviations
in the database.
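By way of non-limiting illustration only, the calculation and reporting of the standard and deviant success percentages in the first aspect may be sketched as follows. The function names, play outcomes, and deviation-type labels below are hypothetical and form no part of the appended claims:

```python
# Illustrative sketch only: computes a standard success percentage and,
# for each deviation type, a deviant success percentage, from outcomes
# (True = successful play) of previously retrieved plays.

def success_percentages(standard_outcomes, deviant_outcomes_by_type):
    def pct(outcomes):
        return 100.0 * sum(outcomes) / len(outcomes) if outcomes else 0.0

    report = {"standard": pct(standard_outcomes)}
    for deviation_type, outcomes in deviant_outcomes_by_type.items():
        report[deviation_type] = pct(outcomes)
    return report

# Hypothetical outcomes of plays executed with a matching standard formation.
report = success_percentages(
    standard_outcomes=[True, True, False, True],        # 75% success
    deviant_outcomes_by_type={
        "guard_split_wide": [True, False],              # 50% success
        "tackle_shifted_back": [False, False, True],    # one success in three
    },
)
```

Reporting both percentages side by side lets a user see whether a given deviation type has historically helped or hurt the formation's success rate.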
According to another aspect of the disclosure there is provided a
computer-implemented method that predicts behavior of an opposing team in a static
formation, comprising comparing the static formation to a plurality of standard formations to
find a matching standard formation, determining one or more deviation types in the static
formation relative to the matching standard formation, retrieving from a database previous
plays executed by the opposing team in a type of game situation with formations that match
the matching standard formation, tabulating from the retrieved previous plays standard-formation
percentages that each of a plurality of play-types was executed, sorting the
retrieved previous plays into groups based on the determined deviation types, tabulating,
for each of the groups, deviation-formation percentages that each of the play-types was
executed with the one determined deviation type of said each of the groups, and reporting
the standard-formation percentages and the deviation-formation percentages.
The method may further comprise determining the static formation
from measured positions of a plurality of players of the opposing team forming the static
formation.
The method in this aspect may include any one or more of the
features of the method defined in the first aspect of the disclosure above.
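By way of non-limiting illustration only, the play-type tabulations of this aspect may be sketched as follows. Each retrieved play is represented as a hypothetical (play_type, deviation_types) pair; the labels are invented for illustration and form no part of the claims:

```python
from collections import Counter

def play_type_percentages(plays):
    """Tabulate the percentage that each play-type was executed."""
    counts = Counter(play_type for play_type, _ in plays)
    total = sum(counts.values())
    return {pt: 100.0 * n / total for pt, n in counts.items()}

def percentages_by_deviation(plays):
    """Sort plays into groups by deviation type, then tabulate the
    deviation-formation percentages within each group."""
    groups = {}
    for play_type, deviations in plays:
        for d in deviations:
            groups.setdefault(d, []).append((play_type, deviations))
    return {d: play_type_percentages(group) for d, group in groups.items()}

# Hypothetical previous plays by the opposing team in one game situation.
plays = [
    ("run", ["te_tight"]),
    ("pass", ["te_tight"]),
    ("pass", []),
    ("pass", ["rb_offset"]),
]
standard_pct = play_type_percentages(plays)      # run 25%, pass 75%
deviation_pct = percentages_by_deviation(plays)  # per-deviation breakdowns
```

A play both sorted tables are reported together, so that a tendency visible only under a particular deviation (e.g., always passing from one alignment) stands out against the overall distribution.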
According to a further aspect of the disclosure there is provided a
computer-implemented method that predicts success of a dynamic-play execution,
comprising, comparing the dynamic-play execution to a plurality of standard-play executions
to find a matching standard-play execution, determining one or more deviation types in the
dynamic-play execution relative to the matching standard-play execution, retrieving from a
database standard previous plays executed with the matching standard-play execution and
none of the deviation types, and deviant previous plays executed with the matching
standard-play execution and any of the deviation types, calculating a standard success
percentage based on outcomes of the standard previous plays and, for each of the deviation
types, a deviant success percentage based on outcomes of the deviant previous plays
executed with said each of the deviation types, and reporting the standard success
percentage and each deviant success percentage.
The method may further comprise determining the dynamic-play
execution from measured paths of a plurality of players executing the dynamic-play
execution.
Determining the one or more deviation types in the dynamic-play
execution may include determining the one or more deviation types from the measured
paths of the players.
Determining the one or more deviation types from the measured
paths may include comparing each of the measured paths to a corresponding predefined
path of the matching standard-play execution.
The method may further comprise measuring the paths of the
players.
The method in this aspect may include any one or more of the
features of the method defined in the first aspect of the disclosure above.
According to yet another aspect of the disclosure there is provided a
computer-implemented method that predicts behavior of an opposing team executing a
dynamic-play execution, comprising, comparing the dynamic-play execution to a plurality of
standard-play executions to find a matching standard-play execution, determining one or
more deviation types in the dynamic-play execution relative to the matching standard-play
execution, retrieving from a database previous plays executed by the opposing team in a
type of game situation with dynamic-play executions that match the matching dynamic-play
execution, tabulating from the retrieved previous plays standard-play percentages that each
of a plurality of play-types was executed, sorting the retrieved previous plays into groups
based on the determined deviation types, tabulating, for each of the groups, deviation-play
percentages that each of the play-types was executed with the one determined deviation
type of said each of the groups, reporting the standard-play percentages and the deviation-play
percentages.
The method in this aspect may include any one or more of the
features of the method defined in the previous aspect of the disclosure above.
The present disclosure further provides a computer-implemented
method for video capture of a target situation in an athletic event, comprising: analyzing
positions of participants in the athletic event to identify occurrence of a pre-target condition
that precedes the target situation; and upon identification of the pre-target condition,
directing a robotic camera to position within its field of view at least one of (a) one or more
participants of interest for the target situation and (b) an area associated with the target
situation.
The term ‘comprising’ as used in this specification and claims means
‘consisting at least in part of’. When interpreting statements in this specification and claims
which include the term ‘comprising’, other features besides the features prefaced by this
term in each statement can also be present. Related terms such as ‘comprise’ and
‘comprised’ are to be interpreted in similar manner.
The present disclosure further provides a computer-implemented
system for video capture of a target situation in an athletic event, comprising: a position
tracker for determining positions of participants in the athletic event; a robotic camera for
capturing video of the athletic event; a processor; a memory storing: definition of a pre-target
condition that precedes the target situation, and machine-readable instructions that,
when executed by the processor, (a) compare the positions with the definition of the pre-target
condition to identify occurrence of the pre-target condition, and (b) upon identification
of said occurrence of the pre-target condition, direct the robotic camera to position, within
its field of view, at least one of (i) one or more of the participants of interest for the target
situation and (ii) an area associated with the target situation.
A computer-implemented method is disclosed for determining a
target situation in an athletic event. In one embodiment, positional information indicating the
relative positions of a group of selected participants is initially received from a participant
tracking system. Aggregate motion of the selected participants is detected in real-time using
the positional information. The target situation is determined to have occurred when a
change in the aggregate motion occurs in accordance with a predetermined characteristic
during an initial time interval.
In another embodiment, a video feed of an event is annotated by
receiving positional information indicating the position of a selected participant in the event
from a tracking system. The path of travel of the participant is determined from the positional
information, and graphical information indicating the path of travel, and information
identifying the participant, is overlaid onto the video feed.
BRIEF DESCRIPTION OF DRAWINGS
The present disclosure will now be described, by way of non-limiting
example only, with reference to the accompanying drawings, in which:
Figure 1 shows an exemplary system for automatically generating
event characterization information from event data;
Figure 2 shows the system of Figure 1 in greater detail;
Figure 3 shows an example in which aggregate motion of event
participants is used to determine the occurrence of a particular target situation in an event;
Figure 4A shows an exemplary method that is used with certain
embodiments to detect a target situation in an event by analyzing the aggregate motion of
participants in the event;
Figure 4B is a flowchart showing exemplary steps performed in
detecting a play in a sporting event;
Figure 5A is a flowchart showing exemplary steps performed in using
positional information to determine certain target situations;
Figure 5B is a flowchart showing exemplary steps performed in
generating solutions and recommendations to increase a team's future success percentage;
Figure 5C is an exemplary diagram showing a standard line formation
and a deviant of the standard formation;
Figure 5D is a flowchart showing exemplary steps performed in
predicting an opponent's behavior based on detecting static formation deviations which are
correlated with historical behavior;
Figure 5E is a flowchart showing exemplary steps performed in
generating solutions and recommendations to improve the performance of a team based
on detecting deviations in dynamic play execution;
Figure 5F is an exemplary diagram showing a standard route and a
deviant route in two similar patterns (paths) run by a wide receiver;
Figure 5G is a flowchart showing exemplary steps performed in
predicting an opponent's behavior based on detecting deviations in dynamic play execution
and correlating these deviations to previous behavior in specific situations;
Figure 5H is an exemplary diagram showing a standard route and a
deviant route in two similar patterns (paths) run by a slot receiver;
Figure 6A is a flowchart showing exemplary steps performed in using
positional information to provide automatic annotation of a video feed of an event;
Figure 6B is an exemplary diagram showing player identification
graphics on a video display;
Figure 6C is an exemplary diagram showing player identification
graphics indicating the off-screen location of an object;
Figure 6D is an exemplary diagram showing a window containing a
highlighting shape overlaid on a video stream;
Figure 7 is a flowchart showing exemplary steps performed in
evaluating a participant's performance;
Figure 8 is a flowchart showing exemplary steps performed in
automating the video filming of selected parts of an entire game; and
Figure 9 is a diagram showing the use of a wireless device with the
present system.
DETAILED DESCRIPTION
The present disclosure may be understood by reference to the
following detailed description taken in conjunction with the drawings described below. It is
noted that, for purposes of illustrative clarity, certain elements in the drawings may not be
drawn to scale.
Systems and methods disclosed herein analyze event data, such as
respective real-time locations of participants in an event, to advantageously automatically
generate information characterizing one or more aspects of the event (event
characterization information). In the case of a sporting event, examples of possible event
characterization information include (1) identification of formations and/or plays, (2)
beginning and/or end of a play, (3) players' paths of travel, (4) lines of scrimmage, (5)
identification of players, and (6) position and orientation of coaches and/or officials.
Such event characterization information, for example, is provided to
interested recipients (e.g., event spectators, coaches, and/or event officials) automatically
or upon demand. In some embodiments, event characterization information is
advantageously used to enhance a video feed 109 of an event, such as by overlaying
graphical information onto the video feed. Some embodiments of the systems and methods
disclosed herein may partially or fully automate imaging of an event (e.g., generating a video
feed of the event) and/or control delivery of event images to recipients.
The present system and methods provide the basic functionality for
implementing, among other things, a real-time feed showing who is on a playing field at all
times. The feed can be automatically added as a data track to the original digital video.
Figure 1 shows an exemplary system 100 for automatically
generating event characterization information from event data. System 100 receives a video
feed (live video stream) 109 and other event data 102, which includes information such as
participant position information and system instructions 107, and automatically generates
event characterization information 104. Exemplary embodiments of system 100 are
operable to automatically generate event characterization information 104 in real-time,
where the term "real-time" in the context of this disclosure and appended claims means
information 104 is generated as the event occurs. For example, identification of a play in a
sporting event in real-time means that the play is identified as it occurs, as opposed to the
play being identified at a later time (e.g., upon post-analysis of the event).
Figure 2 shows exemplary system 100 in more detail. System 100
includes an input/output (I/O) subsystem 106 operable to receive event data 102 and user
input 118, and to output event characterization information 104. I/O subsystem 106
includes, for example, a USB (Universal Serial Bus) and/or Ethernet interface for connecting
to one or more external systems. It is to be noted that input 118 is received from a system
user to initiate and/or provide certain information specific to each of the system functions
described below.
In one embodiment, I/O subsystem 106 is communicatively coupled
to a video feed 109 from one or more video cameras 117, and a tracking system 108,
information from which is transmitted via link 107, which also provides data including
information pertaining to event participants and instructions for system 100 including
requests to be processed. Video cameras 117 may be manually or robotically controlled.
Tracking system 108 determines positions and/or velocities of event participants from
location units such as active RFID tags affixed to event participants by triangulating the
position of the location units to determine respective positions and/or velocities of the
participants. System 100 may alternatively receive event data 102 from Internet 110 via I/O
subsystem 106.
In the present embodiment, system 100 further includes a processor
112, a data store 114, and a database 115. Processor 112 is a computing device, which
includes, for example, a general purpose microprocessor, and processes event data 102 to
generate event characterization information 104 in response to instructions 116, in the form
of software and/or firmware 116, stored in data store 114. Examples of methods executed
by processor 112 to generate event characterization information 104 are discussed below.
Data store 114 typically includes volatile memory (e.g., dynamic
random access memory) and one or more hard drives. Although the components of system
100 are shown grouped together, such components could be spread over a number of
systems, such as in a distributed computing environment and/or in a distributed storage
environment.
In one embodiment, event characterization information 104 is
transmitted to an annotation system 124, which annotates a video feed 120 (e.g., a live
video feed) of the event to produce an annotated video feed 122. In certain embodiments,
annotation system 124 overlays graphical information onto video feed 120, as is known in
the art, and the graphical information includes event characterization information 104.
The present system advantageously uses information derived from
the aggregate motion of event participants. The aggregate motion of multiple participants in
an event can indicate the occurrence of a particular incident, target situation, or
circumstance of interest (hereinafter collectively, "target situation") in the event (e.g., the
beginning of a play in a sporting event). An aggregate motion value represents collective
motion of two or more participants. An aggregate motion value for selected event
participants at a given point in time can be determined, for example, by summing the
velocities of the participants at that time or determining an average velocity of the
participants at that time. A particular target situation can be detected by recognizing
changes in aggregate motion values and/or sequences of aggregate motion values that are
known to coincide with the target situation, thereby indicating that the target situation
occurred.
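The aggregate motion value described above, determined by summing the velocities of the participants or by determining their average velocity, may be sketched as follows. This is a non-limiting illustration; the velocity samples are hypothetical:

```python
import math

def summed_motion(velocities):
    """Aggregate motion as the sum of participant speeds
    (magnitudes of the 2-D velocity vectors)."""
    return sum(math.hypot(vx, vy) for vx, vy in velocities)

def average_motion(velocities):
    """Aggregate motion as the average participant speed."""
    return summed_motion(velocities) / len(velocities)

# Hypothetical velocities (m/s) of four tracked participants at one instant.
v = [(3.0, 4.0), (0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
total = summed_motion(v)    # 5 + 0 + 1 + 2 = 8.0
mean = average_motion(v)    # 2.0
```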
Figure 3 shows an example in which aggregate motion of event
participants and knowledge of the type of event can be used to determine the occurrence
of a particular target situation in the event. More specifically, Figure 3 is a graph 300 of
normalized aggregate motion versus time for a 24 second interval of an American football
game. Certain target situations in a football game can be recognized by comparing actual
aggregate motion values of graph 300 to aggregate motion values known to occur with a
given target situation. For example, segments A, B, D, and E are respectively characterized
by a moderate value of aggregate motion, a small value of aggregate motion, a large
increase in aggregate motion over a defined time duration, and a sustained large value of
aggregate motion. Such sequences of aggregate motion values are known to occur with the
preparation for and execution of an offensive play, and the beginning of an offensive play
can thus be inferred from the sequence.
Specifically, segment A represents the offensive squad breaking a
huddle and heading toward a line of scrimmage, segment B represents an offensive squad
assembling at a line of scrimmage and entering a pre-play condition, segment D represents
a beginning of a play, and segment E represents the play in progress. Accordingly, an
offensive play can be detected by recognizing the sequence of aggregate motion values
associated with segments A, B, D, and E. Point C, which is characterized by a brief spike
in aggregate motion, represents a player going into motion before the ball was snapped.
A sequence of aggregate motion values in graph 300 can also be
recognized to determine an end of the play. In particular, segments E, F, G, and H are
respectively characterized by a sustained large value of aggregate motion, a substantial
decrease in aggregate motion over a defined time duration, a moderate value of aggregate
motion, and a moderate but smaller value of aggregate motion. Such a sequence of aggregate
motion values is known to occur with the ending of a play. In particular, segment E
represents the play in progress (as previously noted), segment F represents the end of the
play, segment G represents the players moving from the post-play positions back to the
next huddle, and segment H represents players beginning to assemble for the next huddle.
Accordingly, certain embodiments of system 100 at least partially
determine event characterization information 104 from aggregate motion of participants of
the event, such as by processor 112 executing instructions 116 to perform an analysis
similar to that discussed above with respect to Figure 3.
Figure 4A shows an exemplary method 400 that can be used with
certain embodiments of system 100 to detect a target situation in an event by analyzing the
aggregate motion of participants in the event in real- time. Method 400 is performed, for
e, by processor 112 of system 100 executing instructions 116. As shown in Figure
4A, in step 402, aggregate motion of a number of participants of the event is detected, such
as by processor 112 calculating an average velocity of selected participants in the event
from velocity data calculated from event data 102. Processor 112 may calculate respective
participant velocities from changes in participant positions and then determine an e
velocity from the respective participant velocities. Calculated aggregate motion values may
be stored in database 115 for subsequent use.
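The aggregate motion computation of step 402 may be sketched as follows; this is a minimal illustration, and the function names and the use of two-dimensional position samples are assumptions rather than details of system 100:

```python
# Illustrative sketch of step 402: aggregate motion taken as the average
# speed of selected participants, derived from successive (x, y) position
# samples. Names and data layout are hypothetical.

def participant_speed(p0, p1, dt):
    """Speed of one participant given positions dt seconds apart."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    return (dx * dx + dy * dy) ** 0.5 / dt

def aggregate_motion(prev_positions, curr_positions, dt):
    """Average speed over all tracked participants for one sample interval."""
    speeds = [participant_speed(prev_positions[pid], curr_positions[pid], dt)
              for pid in curr_positions]
    return sum(speeds) / len(speeds)

prev = {"WR1": (0.0, 0.0), "RB1": (5.0, 0.0)}
curr = {"WR1": (3.0, 4.0), "RB1": (5.0, 2.0)}
print(aggregate_motion(prev, curr, dt=1.0))  # prints 3.5
```

Each returned value corresponds to one sample of the aggregate motion curve of graph 300.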
In step 404, a change in aggregate motion is determined. For
example, processor 112 may determine a difference between two sequentially determined
aggregate motion values stored in database 115. In step 406, a target situation is detected
if the change in aggregate motion determined in step 404 has a predetermined characteristic
(block 407), and/or if a specific sequence of aggregate motion values is detected (block 408),
and/or if specific positions or orientations of event participants are detected (block 409).
For example, with respect to block 407, processor 112 may detect
the beginning of a sporting event play if the aggregate motion increases by at least a
predetermined threshold value within a given time period as specified by instructions 116,
such as similar to the increase shown in segment D of graph 300 (Figure
3). As another example of step 406, processor 112 may detect an end of the sporting event
play if aggregate motion decreases by at least a threshold value within a given time period
as specified by instructions 116, such as a value similar to the decrease that occurred in
segment F of graph 300.
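The threshold tests of block 407 reduce to a comparison over a short sliding window of aggregate motion samples. The sketch below is illustrative only; the threshold and window values are arbitrary examples, not values specified by instructions 116:

```python
# Hypothetical sketch of block 407: flag a play start when aggregate motion
# rises by at least `threshold` over the last `window` samples, and a play
# end on a comparable fall. Values are illustrative only.

def detect_change(samples, threshold, window):
    """Return 'start', 'end', or None for the latest aggregate motion sample."""
    if len(samples) < window + 1:
        return None            # not enough history yet
    delta = samples[-1] - samples[-1 - window]
    if delta >= threshold:
        return "start"         # rapid rise, as in segment D of graph 300
    if delta <= -threshold:
        return "end"           # rapid fall, as in segment F of graph 300
    return None
```

For example, `detect_change([0.2, 0.3, 4.5], threshold=3.0, window=2)` yields `'start'`.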
In block 408, a specific sequence of aggregate motion values must
occur before a target situation is determined to be detected. For example, detection of a
play beginning may require a minimum aggregate motion value to precede a rapid increase
in aggregate motion values and/or a maximum sustained aggregate motion value to follow
the rapid increase, similar to sequences B, D and D, E of Figure 3, respectively. As another
example, detection of a play ending may require a maximum sustained aggregate motion
value to precede a rapid decrease in aggregate motion values and/or a moderate aggregate
motion value to follow the rapid decrease, similar to sequences E, F, and F, G of Figure
3, respectively.
In block 409, event data 102 must have certain characteristics in
addition to known aggregate motion characteristics to detect a target situation. Examples
of such additional characteristics include positions and/or orientation of participants relative
to each other or relative to a playing field. In the case of a football game, the beginning of a
play may be detected if a certain number of players are in certain predetermined positions
indicating formation of a starting line prior to a sufficiently rapid increase (e.g., 6 feet/second
minimum aggregate speed in a 0.3 second period) in aggregate motion values.
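The combined positional and motion test of block 409 might be sketched as follows; the 6 feet/second rise comes from the example above, while the band width, minimum player count, and scrimmage-line representation are assumptions made for illustration:

```python
# Illustrative block 409 check: a rapid rise in aggregate motion counts as
# a play start only if enough players have first formed a line near the
# line of scrimmage. The 6 ft/s rise figure comes from the text; the band
# width and minimum player count are hypothetical.

def line_formed(player_ys, scrimmage_y, band=1.0, min_players=7):
    """True if at least min_players sit within `band` yards of the line."""
    return sum(1 for y in player_ys if abs(y - scrimmage_y) <= band) >= min_players

def play_start(player_ys, scrimmage_y, motion_rise):
    """motion_rise: gain in aggregate speed (ft/s) over the 0.3 s window."""
    return line_formed(player_ys, scrimmage_y) and motion_rise >= 6.0
```

Requiring the positional precondition suppresses false starts from motion spikes that occur away from the line, such as a player going in motion.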
The choice of the specific participants to be considered when
determining the aggregate motion value in step 402 depends on the specific intended
application of method 400. For example, in the case of American football, only the players
on one team might be considered when determining an aggregate motion value. As another
example, only players considered likely to be involved in a high level of motion during a
particular target situation, such as running backs, receivers, and quarterbacks, may be
considered when determining an aggregate motion value. The specific participants
considered when determining an aggregate motion value may vary depending on the target
situation to be detected or determined. For example, different players may be considered
in aggregate motion determinations when detecting the beginning of an offensive play
than when detecting the kicking of a field goal.
Figure 4B is a flowchart showing exemplary steps performed in
detecting a play in a sporting event. As shown in Figure 4B, initially, at step 410, specific
players in a sporting event are selected for inclusion in an aggregate motion tabulation.
Combining the motion of multiple players minimizes the impact of the random movement of
individual players and accentuates the differential movement associated with specific target
situations. Certain s or players at certain positions inherently exhibit higher levels of
differential motion than . Selecting players with typically high levels of differential
movement for the aggregate tabulation, and ignoring the remaining players, minimizes the
effect of random motion while maximizing ential motion levels at various stages of a
target situation.
In an American football game, certain 'skill' positions have a relatively
high level of differential motion associated with the beginning or end of a play, thus their
inclusion in an aggregate motion tabulation increases the differential levels of aggregate
motion. Skill positions include wide receivers, running backs, and defensive backs. Linemen
motion. Skill positions include wide ers, running backs, and ive backs. Linemen
typically have low differential motion during play start/stop and so their inclusion in the
tabulation reduces the differential levels of aggregate motion.
At step 412, a pre-play target set of conditions is identified. Situations
of interest are generally preceded by a definable (sport-specific) set of conditions, players,
positions, relative movements, and the like. The occurrence of this target set of conditions
is an indication that a target situation will occur in the near future and is used as a precondition
to a refined set of player position and alignment criteria.
The pre-play target conditions in American football are met when
there are exactly 11 players from each team on the playing field and both teams are on their
own side of the line of scrimmage. This condition occurs toward the end of segment A in the
graph shown in Figure 3.
At step 414, a system 'arm' condition is identified. In addition to
pre-play conditions, a target situation is often immediately preceded by a definable
(sport-specific) set of conditions, players, positions, relative movements, and the like. A system
arm condition is an indication that the target situation is imminent and is used as a precondition
to more specific, motion-based criteria, described below.
In American football one arm condition is known as a 'line set'. This
condition is defined by a certain number of linemen being stationary for a defined period
(typically <800 ms) and the offensive and defensive linemen being positioned within a
defined distance of each other (typically <2 meters). This situation occurs toward the end of
segment B in the graph shown in Figure 3.
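The 'line set' arm condition may be sketched as follows; the 800 ms and 2 meter figures come from the description above, while the data layout and stillness tolerance are assumptions:

```python
# Hypothetical 'line set' test: every lineman speed sample spanning the
# last ~800 ms is near zero and the offensive and defensive lines are
# within 2 meters of each other. speed_eps is an assumed tolerance.

def line_set(recent_lineman_speeds, off_line_y, def_line_y,
             speed_eps=0.1, max_gap_m=2.0):
    """recent_lineman_speeds: speed samples (m/s) covering the 800 ms window."""
    stationary = all(s <= speed_eps for s in recent_lineman_speeds)
    return stationary and abs(off_line_y - def_line_y) <= max_gap_m
```

When this predicate holds, the system is armed and a subsequent rapid rise in aggregate motion is interpreted as the snap.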
At step 416, a start-of-play condition is identified. The beginning of a
target situation (e.g., start of play) is characterized by a specific aggregate motion profile.
In most cases this will be a rapid increase in aggregate motion, but depending on the sport
other aggregate motion profiles may exist. If the real time aggregate motion profile matches
the aggregate motion start profile then the start of a situation has been detected.
In American football, immediately prior to the snap of the ball, all
offensive players (with minor exceptions) are required to be motionless. This condition
results in a very low aggregate motion baseline, which was established during the arm
condition. As soon as the ball is snapped, all players begin moving nearly simultaneously,
with the position players often moving rapidly. This results in the aggregate motion radically
increasing over a very short period of time. This situation matches the profile for start-of-play
and occurs toward the end of segment D in the graph shown in Figure 3.
At step 418, an aggregate motion baseline of play is established.
Following a start event the target situation will typically reach and maintain some level of
sustained aggregate motion. This establishes an aggregate motion baseline value for the
play.
Following the start of a play in American football, the players are
typically moving at a reasonably stable level of aggregate motion. The magnitude of this
level will vary depending on the type of play. In the case of a long pass play, the level of
aggregate motion will be relatively high, and on a running play it will be relatively low.
Regardless of the type of play, a sustained aggregate motion of some level will generally
be established. This condition exists as segment E in the graph shown in Figure 3.
At step 420, an end-of-play condition is identified. The end of this
target situation (i.e., end-of-play) is characterized by a specific aggregate motion profile. In
most cases this profile exhibits a gradual, yet constant, decrease in aggregate motion, with
an initially faster decrease (e.g., a decrease in aggregate speed of 40% in 0.5 seconds) in
the motion. If a particular real-time aggregate motion profile matches the aggregate motion
stop profile, then the end of a play has been detected.
In American football, when the referee blows his whistle, indicating
that a play has ended, the players will begin to slow down. While the aggregate motion will
immediately begin to decline, since the players do not all stop instantaneously, or at the
same instant, the decline will be more gradual than the play start. However, the end-of-play
profile is identified by aggregate motion consistently decreasing over a predefined, relatively
short period of time, for example, 800 milliseconds. In practice, this duration is dictated by
the specific sport and the specific situation of interest in that sport. This condition exists as
segment F in the graph shown in Figure 3.
Once a target situation has ended, system 100 begins looking for the
next target situation to enter its pre-play condition. In an American football game, once a
play is over the system monitors the players, positions, etc., as described above, seeking
to identify the next pre-play condition. This condition exists in segment G and carries
through into segment H in the graph shown in Figure 3. If a game is still in progress at this
point, the procedure described above resumes at step 412.
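The cycle of steps 412 through 420 can be viewed as a small state machine driven by aggregate motion samples. The sketch below is a simplified illustration; the state names, numeric thresholds, and the external `armed` flag are assumptions, not elements of the described system:

```python
# Minimal state machine over the step 412-420 cycle: pre-play -> armed
# (line set) -> in-play -> back to pre-play at end of play. Thresholds
# are illustrative only.

class PlayDetector:
    def __init__(self, start_rise=3.0, end_fall=3.0):
        self.state = "pre_play"
        self.prev = None
        self.start_rise = start_rise   # rise marking start-of-play (step 416)
        self.end_fall = end_fall       # fall marking end-of-play (step 420)

    def feed(self, motion, armed=False):
        """Consume one aggregate motion sample; `armed` reports whether the
        sport-specific arm condition (step 414) currently holds."""
        if self.state == "pre_play" and armed:
            self.state = "armed"
        elif (self.state == "armed" and self.prev is not None
                and motion - self.prev >= self.start_rise):
            self.state = "in_play"
        elif (self.state == "in_play" and self.prev is not None
                and self.prev - motion >= self.end_fall):
            self.state = "pre_play"    # play ended; resume seeking step 412
        self.prev = motion
        return self.state
```

Feeding the detector a sequence resembling segments A through F of graph 300 walks it through one full play-detection cycle.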
As noted above, the present system can determine the occurrence
of a target situation at least partially from positional information. Examples of target
situations determined from analysis of positional information include players breaking from
a huddle, reaching a set position in a line formation, and the beginning of a play. For
example, in an American football game, players' positions relative to each other, or relative
to a particular yard marker, may indicate that the players are lined up at the line of scrimmage
immediately prior to beginning a play.
Figure 5A is a flowchart showing exemplary steps performed in using
positional information to determine certain target situations in real time. As shown in Figure
5A, at step 505, tracking information, which includes the relative positions of event
participants of interest, is received from tracking system 108. At step 510, the relative
positions of selected participants (e.g., players on a particular team) are determined from
analysis of the tracking information. At step 515, if the positions of the selected participants
meet certain predefined criteria, then a corresponding target situation is detected, at step
520.
In one embodiment, the predefined criteria include relative positions
of participants determined by analyzing the tracking information to detect the participants'
positions relative to certain position indicators, such as yard line markers in a football game.
The criteria may also include the orientation of participants, e.g., the direction in which the
participants are facing.
Examples of target situations that can be determined from positional
information include a team in a huddle, players in a particular formation, and players' position
relative to the line of scrimmage. Relative positions of coaches and officials can enable
detection of a target situation such as a coach signaling a 'time out' or an official signaling
a penalty. Positional information may also be used to analyze actions of officials and
assist with their training.
In American football, there are a finite number of basic offensive and
defensive formations in common use. On top of these there are many standard variations
of these formations. Some of these variations are common to nearly all teams, while others
are specific to individual teams.
Beyond these standard and variant formations there are an infinite
number of subtle formation variations, both intentional and unintentional.
Currently, defensive coaching staffs routinely study an upcoming
opposition's offensive formations and subsequent play selection/execution. In conjunction
with various game situations (e.g., 3rd down and long) they calculate the percentage of time
an opponent runs certain plays in specific situations. For example, in a 'third and long'
situation, when in a particular formation, the offense passes the ball 75 percent of the time
and when passing from this formation the ball is passed to a wide receiver 43 percent of
the time.
The objective of compiling these statistics is to improve the accuracy
with which the defense can predict which play the opposing offense will run in a given
situation and, in turn, select the defensive formation with the highest likelihood of success.
The identification of subtle variations in player formations allows the systematic prediction
of which play the offense is most likely to run. An offense may intentionally employ a subtle
formation variation because they believe there is an advantage to be gained from this variation
based on the play that is about to be run. For instance, their pass blocking may be more effective
when employing a very subtle increase in lineman spacing.
In analyzing video data from a team's past performances, this
variation can be systematically identified. This analysis may lead to learning that, in a third
and long situation, when in a particular formation, and where the offensive line assumes a
slightly wider space than normal, a particular offense passes the ball a certain percentage
(e.g., 95.8 percent) of the time.
The present system compares formations, on a play-by-play basis,
against a catalog of historical plays of the same class and systematically identifies subtle
formation variations within each specific play. In the methods shown in Figures 5B - 5H
(described below), a computer program 116 is used to systematically determine statistically
significant correlations between subtle formation variations and plays run when these
specific subtle variations were present. Each of the examples in Figures 5B - 5H is set forth
in the context of American football; nevertheless, the methods described in accordance with
these figures are applicable to other sports as well. This process systematically distills an
infinitely variable number of subtle variations down to a finite number of meaningful
predictors, which increases play prediction accuracy, improves the ability to choose the
most appropriate formation and thus may systematically improve a team's success
percentage.
Figure 5B is a flowchart showing exemplary steps performed in
generating solutions and recommendations to increase a team's future success percentage
based on detecting deviations in static formations, correlating these deviations to specific
outcomes (play results) and comparing these correlations to the outcomes of previous
situations. Figure 5C is an exemplary diagram showing a standard line formation 553 and
a deviant variation 555 of the standard formation, wherein 'X's indicate the players on one team.
Operation of the present system is best understood by viewing Figures 5B and 5C in
conjunction with one another.
Using player location data for a group of players (such as an
offensive football squad in the present example), at a particular point in a game (i.e. just
before a situation of interest, such as the snap of the ball), the relative positions of the
players is established, at step 530, in Figure 5B. Player location data can be acquired from
tracking system 108 via feed 107. The relative positions of these players define a static
formation 555 for that group of players, which formation is associated with the subsequent
play.
The static formation 555 established in step 530 is compared against
a library (in database 115) of well-known classes of standard formations and accepted
variants of those standard formations to identify a best case match with a standard
formation, at step 532. In the example shown in Figure 5C, the standard formation thus
identified is shown in box 553. In this particular standard formation 553, the line spacing
(distance between the players at the left and right tackle positions, as indicated by marker
550) is 7 yards, and wide receiver X1 (circled) is lined up 5 yards away from the right tackle,
as indicated by marker 552.
Once a best case match has been made, deviations between the
determined static formation 555 and the standard library version of that formation 553 are
identified, at step 534. These deviations can be as subtle as, for example, where the
average line spacing is slightly wider (8 yards, as indicated by marker 551) than in the
standard library formation (7 yards in the present example). These deviations may be
significantly larger, as where a wide receiver lines up 10 yards away from the respective
tackle (as indicated by marker 553), as opposed to 5 yards (as indicated by marker 552),
per the standard library formation shown in Figure 5C.
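Steps 532 and 534 amount to nearest-match selection against the formation library followed by per-player differencing. One possible sketch, in which the formation names, player keys, and the 0.5 yard significance threshold are all hypothetical:

```python
# Illustrative steps 532-534: pick the library formation with the smallest
# summed player displacement, then report per-player offsets beyond eps.

def match_formation(captured, library):
    """Name of the library formation closest to the captured positions."""
    def cost(standard):
        return sum(((captured[p][0] - standard[p][0]) ** 2 +
                    (captured[p][1] - standard[p][1]) ** 2) ** 0.5
                   for p in standard)
    return min(library, key=lambda name: cost(library[name]))

def deviations(captured, standard, eps=0.5):
    """Per-player (dx, dy) offsets larger than eps yards from the standard."""
    return {p: (captured[p][0] - standard[p][0],
                captured[p][1] - standard[p][1])
            for p in standard
            if abs(captured[p][0] - standard[p][0]) > eps
            or abs(captured[p][1] - standard[p][1]) > eps}
```

For instance, a captured right tackle one yard wide of its standard spot would be reported as `{"RT": (1.0, 0.0)}`, which can then be logged with the attributes described in step 536.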
Having identified a deviation between the previously captured static
formation 555 and the standard library formation 553, at step 536 this deviation is logged to
database 115 along with a number of associated attributes such as deviation type (e.g.,
wide offensive line spacing), matched library formation (class & variant), play results
(success or failure), and opposing formation (which type of defense was set up against the
deviant offense in the static formation). Although yardage gained or lost is one measure of
success, there may be other, more appropriate, measures of success depending on the
circumstances. For example, if an offense is facing third down and 10 (yards to a first down)
and they gain 9.8 yards, then with respect to gain vs. loss, the play might be judged, in the
abstract, to be a success, but in this particular situation it was actually a failure. The above
example is specific to football and the parameters of success/failure will vary with specific
situations.
The above examples represent only two deviations which might be
identified. In practice there may be 'intentional' deviations and many subtle, 'unintentional'
deviations from the standard formation. Although the majority of these deviations may be
tentatively deemed irrelevant to the play outcome, all deviations are nevertheless logged
into database 115, as they may become relevant in the future as additional data is collected.
Once a best case match has been made, deviations between the
static formation 555 and the standard library version 553 of that play are systematically
evaluated. At step 538, system 100 accesses play deviation information in database 115 to
identify deviations for which there are multiple instances and correlates these to play
outcomes (both positive and negative).
Having identified these correlations, at step 540 these play outcomes
are then compared to play outcomes when a particular deviation was not present, i.e., the
deviant formation outcomes are compared against play outcomes resulting from
corresponding 'standard' formations. Previous formations, with associated deviations, are
repetitively compared against standard formations to get a best-case match for each, which
information is then logged in database 115 along with attributes indicating such things as
the success/failure of the formation (e.g., the number of yards gained/lost using a particular
deviant offensive formation against a specific defensive formation).
At step 542, the system uses the correlations thus established to
generate a report for the coaching staff proposing solutions and/or recommendations such
as those indicated in the example below:
Positive Outcome Variation Detected
Squad: Offense
Formation Class: passing
Formation Variant: split wide receiver
Deviation Type: increased line spacing
Standard Success: 52.6%
Deviation Success: 63.1%
Recommendation(s):
Increase line spacing in split receiver formations.
Investigate line spacing increases in passing class formations.
Figure 5D is a flowchart showing exemplary steps performed in
predicting an opponent's behavior based on detecting deviations in static formations and
correlating these deviations to historical behavior in given situations. Operation of the
present system is best understood by viewing Figures 5D and 5C (described above) in
conjunction with one another.
Using player location data for an offensive football squad, in the
present example, at a particular point in a game (i.e. just before a situation of interest, such
as the snap of the ball), the relative positions of those players is established, at step 580, in
Figure 5D. The relative positions of these players define a static formation 555 (shown in
Figure 5C) for that group of players, which formation is associated with the subsequent play.
The static formation 555 established in step 580 is compared against
a library (in database 115) of classes of standard formations and accepted variants of
those standard formations, for a specific team of interest,
to identify a best case match with a standard formation used by that team, at step 582. In
the example shown in Figure 5C, the standard formation thus identified is shown in box 553.
In this particular standard formation 553 (which is the same formation as presented in the
example of Figure 5B), the line spacing is 7 yards, and wide receiver X1 is lined up 5 yards
away from the right tackle.
Once a best case match has been made, potentially significant
deviations between the defined static formation 555 and the standard library version 553 of
that formation are identified, at step 584. Having identified a deviation between the static
formation 555 and the standard library formation 553 for the team of interest, at step 585
this particular deviation is logged to database 115 along with a number of associated
attributes such as deviation type (e.g., wide offensive line spacing), matched library
formation (class & variant), situation (e.g., which down and the number of yards to go), and
subsequent type of play run. This type of information may be used by a defensive squad to
analyze an offensive squad for specific 'down and distance' situations to determine, on a
statistical basis, what type of play this offensive squad runs when faced with a particular
situation, for example, third down and between 7 and 10 yards to a first down.
At step 586, system 100 accesses historical play data in database
115 to selectively retrieve previous plays for specific situational categories, for example,
first down and ten yards to go, from between the opponent's 10 and 20 yard lines, for a
team of interest. At step 587, the results are then sorted into groups based on standard
formations for the team of interest and a tabulation is made of the percentage of times
specific plays were run from this standard formation given a specific type of game situation.
The results are then further sorted based on common, identifiable, and sometimes subtle,
deviations from the standard formation 553. After identifying correlations between formation
deviations and their outcomes, at step 588 these outcomes are then compared to play
outcomes when a particular deviation was not present, i.e., the deviant formation outcomes
are compared against play outcomes resulting from corresponding 'standard' formations.
At step 589, a report is generated in which these tabulations are
cataloged based on situations of interest for the coaching staff. The report is used in
preparing a team to more accurately predict what the team of interest will do in a given
situation, from a specific formation, and how specific deviations in that formation refine the
likelihood of a particular behavior. A typical report may include information such as that
presented in the example below:
Behavior Prediction based on situation, formation and variant
Squad: offense
Down: third
Yardage: 7 < x < 10
Formation Class: passing
Pass 80%
Run 20%
Formation variant: split wide receiver
Deviation type: increased line spacing
Pass: 85%
Run: 15%
Formation Variant: split wide receiver
Deviation Type: increased wide receiver spacing
Pass: 93%
Run: 7%
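The tabulation behind such a report (steps 586 through 588) can be sketched as a filtered frequency count; the record field names below are assumptions for illustration, not the schema of database 115:

```python
# Hypothetical steps 586-588: percentage of each play type run from a
# formation, optionally restricted to plays where a given deviation was
# logged. Record field names are assumed.

from collections import Counter

def tabulate(plays, formation, deviation=None):
    rows = [p for p in plays if p["formation"] == formation
            and (deviation is None or deviation in p["deviations"])]
    counts = Counter(p["play_type"] for p in rows)
    total = sum(counts.values())
    return {k: 100.0 * v / total for k, v in counts.items()} if total else {}
```

Comparing `tabulate(plays, f)` with `tabulate(plays, f, "increased line spacing")` gives the with-deviation and without-deviation percentages of the kind shown in the report above.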
Play recognition is a type of target situation that may be detected by
the use of information such as the path of travel of an event participant, as determined from
positional, velocity, and path information. This information can be compared to a database
of known plays to recognize a particular type of play. In the embodiments described below
with respect to Figures 5E - 5H, database 115 is populated with information indicating
previous formations and plays run by a particular team in given game situations.
Figure 5E is a flowchart showing exemplary steps performed in
generating solutions and recommendations to improve the performance of a team based
on detecting deviations in dynamic play execution, correlating these deviations to specific
outcomes and comparing the correlations to the outcomes of previous situations. Figure 5F
is an exemplary diagram showing a standard route 573 and a deviant route 574 in similar
patterns (paths) run by a wide receiver. Operation of the present system is best understood
by viewing Figures 5E and 5F in conjunction with one another.
Using a player location data set for a selected group of participants
(such as an offensive football squad) captured for the full duration of a situation of interest
(e.g., an entire play), the path of each individual participant is determined, at step 590. The
collection of these individual paths defines a dynamic play execution. In step 592, the
dynamic play execution established in step 590 is compared against a library of well-known
classes of standard play executions (and accepted variants of those standard executions)
stored in database 115, to establish a best case match with a standard type of play.
This comparison is considered from the perspective of individual
paths, which are compared to predefined paths, with the paths treated as a collection of
individual data points. Although there may be multiple paths, each player has a predefined
path, so the paths can be processed individually. While the paths are actually two
dimensional, they are treated simply as collections of discrete data points, which can be
evaluated for deviation from a standard path. What might be considered a significant
deviation will vary by sport, situation of interest, and by player position. When considering,
for example, a wide receiver in an offensive football play, a deviation of more than 1.5 yards
from the predefined path may be considered significant.
In finding matches between deviations so that they can be grouped
together, each standard play execution is considered as a collection of individual,
predefined paths. Each individual path comprises a collection of specific segments
consisting of legs and inflection points. As an example, a wide receiver route might be
described as follows:
Wide Receiver Path Segments
Start of play
Segment 1 - straight for 5 yards
Segment 2 - 90 degree turn toward center of field
Segment 3 - 10 yards straight
Segment 4 - 45 degree turn in opposite direction of segment 2 turn
Segment 5 - straight until end of play
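Treating each path as a sequence of sampled points, the deviation test described above reduces to a point-wise distance check. A minimal sketch, in which equal sampling of the two paths is an assumption:

```python
# Illustrative path comparison: largest point-wise distance between a run
# path and the standard path, flagged when it exceeds 1.5 yards (threshold
# from the text; equal-length point sampling is assumed).

def max_path_deviation(run_path, standard_path):
    return max(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
               for a, b in zip(run_path, standard_path))

def is_significant(run_path, standard_path, threshold=1.5):
    return max_path_deviation(run_path, standard_path) > threshold
```

Significant per-path deviations found this way are the ones selected for classification and logging in the steps that follow.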
Once a path within a dynamic play execution has been identified,
then the context in which the deviation occurred is identified. Deviations in individual paths
are selected for further evaluation and, once selected, these paths are further classified
such that they can be grouped with, and compared to, previously recorded deviations.
Once a best case match has been made between the dynamic play
execution established in step 590 and a standard type of play, deviations between each
path of interest within the dynamic play execution set and the paths defined in the standard
library version of that play execution are evaluated, at step 594. In Figure 5F, two paths for
offensive player X1 (e.g., a wide receiver) in formation 591 are shown - path 574 is the path
derived from the dynamic play execution established in step 590, and path 573 is the path
with the best case match selected from the standard y of plays. The deviations
determined by the evaluation made in step 594 may be as subtle as a wide receiver making
a jog (at arrow 575) in his pattern where the receiver changes his 'cut point', as shown in
Figure 5F.
The present example represents one possible path deviation which
might be identified. In practice there may be a large number of deviations present in a single
play, and possibly even multiple deviations in a single player's path. Having identified a
deviation between a path within a dynamic play execution and the standard library path for
that play execution, at step 595 this deviation is logged to database 115 along with a number
of associated attributes such as deviation type (e.g., wide receiver path), deviation specifics
(e.g., additional course changes), matched library formation (class & variant), play outcome
(success or failure), and opposing formation (which type of defense). Although the majority
of the deviations may be tentatively deemed irrelevant to the play outcome, all deviations
are nevertheless logged in database 115 as they may become relevant in the future as
additional data is collected.
At step 596, deviation information in database 115 is accessed to
identify significant correlations between specific path deviations and play outcomes (both
positive and negative). Having identified these correlations, at step 597 the outcomes are
then compared to corresponding standard play outcomes, that is, the results of a standard
play that had been executed as intended (e.g., as the play was initially drawn on a
whiteboard) when a particular deviation was not present. At step 598, these correlations are
then used to generate a report for the coaching staff including relative success of deviant
and standard paths, and optionally proposing solutions and recommendations. A typical
report may include information such as the following:
Positive Outcome Variation Detected - Dynamic Play Execution
Squad: Offense
Execution Class: passing
Execution Variant: Split wide receiver
Deviation Type: Receiver Path
Deviation Specific: Additional course changes
Standard s: 52.6%
Deviation Success: 61.6%
Recommendation(s):
Incorporate additional course changes in wide receiver path.
Investigate additional course changes in all receiver routes.
Figure 5G is a flowchart showing exemplary steps performed in
predicting an opponent's behavior based on detecting deviations in dynamic play execution
and correlating these deviations to previous behavior in specific situations. Figure 5H is an
exemplary diagram showing a standard route and a deviant route in similar patterns (paths)
run by a slot receiver. Operation of the present system is best understood by viewing
Figures 5G and 5H in conjunction with one another.
Using a player location data set for a selected group of participants
(such as an offensive football squad) of a team of interest, captured for the full duration of
a particular situation (e.g., an entire play), the path of each individual participant is
determined, at step 5105. The collection of these individual paths defines a dynamic play
execution.
In step 5110, the dynamic play execution established in step 5105 is
compared against a library (stored in database 115) of well-known classes of standard play
executions and accepted variants of those standard executions, for a specific team of
interest, to establish a best case match for a selected standard type of play. Once a best
case match has been made, deviations between each path of interest within the dynamic
play execution set and the paths defined in the standard library version of that play
execution are evaluated, at step 5115.
In Figure 5H, two paths for offensive player X1 (e.g., a slot receiver)
in formation 5101 are shown - path 5103 is the path derived from the dynamic play
execution established in step 5105, and path 5102 is the path with the best case match
selected from the standard library of plays. Note that 'standard' path 5102 and deviant path
5103 have respective 'cut point' distances 5111 and 5112. The deviations determined by
the evaluation made in step 5115 may be as subtle as a slot receiver cutting his 'in' path
short (at arrow 5107) relative to where he would normally change direction at the
standard cut point (at arrow 5108), as shown in Figure 5H.
Having identified a deviation between a path within a dynamic play
execution and the standard library path for that play execution and team of interest, at step
5120 this deviation is logged to database 115 along with a number of associated attributes
such as deviation type (e.g., slot receiver path), deviation specifics (e.g., motion duration),
matched library formation (class & variant), situation (e.g., down number and yards to first
down), and subsequent type of play run.
At step 5125, information in database 115 indicating previous
performances for the team of interest is accessed to retrieve selected plays for specific
situational categories. At step 5130, the plays are then sorted into groups based on
standard play executions for the team of interest, and the corresponding frequency with
which specific behaviors (e.g., which player ran the ball) occurred are tabulated. The sorted
results are refined based on common, identifiable, and often subtle, deviations from the
standard play execution. The percentages of times specific behaviors occurred (e.g., who
the ball was thrown to in a specific situation) are tabulated for instances when a play
execution deviation was present.
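The tabulation of steps 5125-5130 amounts to grouping historical plays by situational category and, optionally, by a detected deviation, and then computing outcome percentages. The record layout below ('situation', 'deviations', 'outcome' keys) is an illustrative assumption:

```python
from collections import Counter


def tabulate_behaviors(plays, situation, deviation=None):
    """Percentage of each outcome among plays matching a situation,
    optionally restricted to plays where a given deviation was present.

    Each play record is a dict with 'situation', 'deviations' (a set of
    strings) and 'outcome' keys; this layout is illustrative only.
    """
    matched = [p for p in plays
               if p["situation"] == situation
               and (deviation is None or deviation in p["deviations"])]
    counts = Counter(p["outcome"] for p in matched)
    total = sum(counts.values())
    return {outcome: 100.0 * n / total for outcome, n in counts.items()}
```

Comparing the table with `deviation=None` against the table restricted to a specific deviation is what exposes shifts such as the slot-receiver percentages in the sample report below.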
At step 5135, the system accesses information in database 115 to
identify deviations for which there are multiple instances and compares the behavior (the
specific type of play executed) in specific play executions when a particular deviation is
present, to behavior when the deviation is not present.
A defensive squad may want to analyze an offensive squad for
specific 'down and distance' situations on a statistical basis to determine what an offensive
squad typically does when faced with a third down and between 7 and 10 yards to first
down. Dynamic play deviation information can be used to refine a team's prediction ability
and improve their success percentage.
A report is thus generated, at step 5140, to catalog the predicted
behavior of a team of interest as a function of deviant play execution and situations of
interest, as determined above. A coaching staff may use this report in preparing their team
to more accurately predict what the team of interest will do in a given situation during a
specific play execution, and how specific deviations in that execution modulate the likelihood
of a particular behavior (e.g., who the ball is thrown to). A typical report may include the
following information:
Behavior Prediction based on situation, dynamic play execution and deviation
Squad: offense
Down: third Yardage: 7 < x < 10
Formation class: passing
Pass to slot: 25%
Pass to other: 55%
Run: 20%
Formation Variant: slot receiver motion
Deviation Type: shortened motion duration
Pass to slot: 80%
Pass to other: 15%
Run: 5%
In one embodiment, player movements can be traced in real time
onto live feed 109, virtually positioned on the field surface as the camera moves, from
detected start of play until detected end of play. In another embodiment, player paths are
automatically shown in real time on a graphic screen. Data collected (e.g., via feed 107, or
from database 115) by system 100 is associated with the corresponding video footage;
therefore, if a video is selected for replay, the associated data may be used to generate
graphics and statistics for combining with, or overlaying onto, video feed 109.
A generated graphic of the field and players can be a perspective
view which allows fading between live action footage and graphic views. If the graphics are
generated to have the same aspect ratio and viewing angle as the camera view, player
traces and marked paths remain constant when fading from generated graphic to camera
view. This avoids switching from a side perspective view of a camera to a generated
plan view to show a play. Once transitioned to the generated perspective graphic view, the
graphic can be rotated to provide the most appropriate viewing angle for showing the play.
Figure 6A is a flowchart showing exemplary steps performed in using
positional information to provide real-time automatic annotation of a video feed 120 of an
event. Figure 6B is an exemplary diagram showing player identification graphics and the
traced path of a player on a video display 130. A graphic showing the path of travel of one
or more selected players 660, 661 can be displayed either in real time, or after the end of a
play. As shown in Figure 6A, at step 605, tracking information from tracking system 108 is
received for event participants of interest. At step 610, the path of travel of one or more of
the participants is calculated, using positional information generated from tracking system
108 data. At step 615, a graphic 652, indicating the path of travel of the selected
participant(s), for example, the path for player 661, is overlaid onto the video feed 120, as
indicated in Figure 6B.
System 100 can also show, via output 104, the identity and locations
of multiple players on the field, and their associated teams (e.g., players of team A in red,
players of team B in blue). This information can be displayed on a graphic representing the
actual playing field, or overlaid on the live video feed 109, as indicated in step 620.
In one embodiment, the present system keeps continuous track of
selected off-screen objects so that the off-screen location of the objects is indicated, and
the objects are highlighted immediately upon entering the field of view. A 'camera view'
coordinate system is used, wherein the center of the screen is assigned the coordinate
(0,0), the upper left has the coordinate (-1, -1), and the lower right is (1,1). Note that the X
and Y scales are not the same, since video displays (including television screens) have an
aspect ratio by which the screen width is greater than the screen height. Thus the point
represented by the coordinate (0.5,0) is located further to the right of center-screen than the
point represented by coordinate (0,0.5) is located down from the center. It should be noted
that the coordinate system employed by the present system may be different than that
described herein and still provide the same function.
Using the coordinate system described above, it is relatively simple
to determine if an object is on screen, as both the X and Y coordinates will be >= -1 and <=
1. When reporting the location of an object, its coordinates can be < -1 or > 1, meaning it is
off-screen. At high zoom levels, object coordinates can be much larger than 1 or much
smaller than -1.
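The camera-view convention above reduces the on-screen test to a bounds check. A minimal sketch, assuming pixel coordinates with the origin at the upper-left of the frame:

```python
def to_camera_view(px, py, width, height):
    """Map pixel coordinates to the camera-view system described above:
    screen center -> (0, 0), upper-left -> (-1, -1), lower-right -> (1, 1).
    Points outside the frame map to coordinates beyond +/-1."""
    return (2.0 * px / width - 1.0, 2.0 * py / height - 1.0)


def is_on_screen(x, y):
    """An object is on screen when both camera-view coordinates lie in [-1, 1]."""
    return -1.0 <= x <= 1.0 and -1.0 <= y <= 1.0
```

Because the mapping normalizes each axis independently, the unequal X and Y scales noted above are inherent: equal camera-view offsets correspond to unequal physical distances on a widescreen display.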
Figure 6C is an exemplary diagram showing player identification
graphics indicating the off-screen location of selected objects. By calculating the direction
of an off-screen object relative to a border of a display screen 130, the present system can
determine which location along the appropriate screen border is closest to the object. A
highlighting indicator or marker 675 is placed at this location proximate the border of the
screen 130 to indicate that the off-screen object (e.g., player 676 or 677) is in a particular
direction relative to the images displayed on the screen. Dotted box 670 represents the
potential field of view of a camera (e.g., video camera 117) which is providing the video feed
109 displayed on screen 130. When a previously off-screen object again becomes visible
'on-screen', the marker may change its appearance and continue tracking the object, as
shown in Figure 6B.
One example of off-screen tracking is a close-up of the quarterback
and the linemen around him (indicated by arrow 671), where two wide receivers 676, 677
are not in view on screen 130, as shown in Figure 6C. Each wide receiver's general location
is indicated with a marker 675(1), 675(2) positioned next to the appropriate edge of the
screen 130, thus allowing a viewer to tell which wide receiver the quarterback is looking
toward at a given point in time. Marker 675 may include identifying text, or may simply be
color-coded to represent one or more players of a specific type.
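In the camera-view coordinate system, placing a marker 675 at the border point nearest an off-screen object is a clamping operation. The margin value below is an illustrative assumption that keeps the marker just inside the border:

```python
def marker_position(x, y, margin=0.02):
    """Place a border marker for an off-screen object at the on-screen
    point nearest to it, in the camera-view coordinate system where the
    visible screen spans [-1, 1] on both axes. `margin` keeps the
    marker just inside the border; its value is an assumption."""
    limit = 1.0 - margin
    clamp = lambda v: max(-limit, min(limit, v))
    return (clamp(x), clamp(y))
```

An object far off the right edge, e.g. at camera-view (2.5, 0.0), thus yields a marker at mid-height on the right border, matching the behavior described for receivers 676 and 677.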
Player identities can be indicated via output 104 in real time, for
example, via a player identification graphic 657 overlaid onto the video feed such that it is
close to the player's head or body. Graphic 657 shows, for example, the player's number
and name, but may, alternatively or additionally, provide other information such as the
number of yards gained or lost on a particular play, as indicated by graphic 658. In other
embodiments, all of, or certain parts of, selected players 655 may be highlighted, as
indicated by an optionally blinking 'dot' 656, or other part of the body, such as the player's
head or helmet 655. One or more players to be highlighted can be user- selected (via user
input 118, such as a handheld device described below) or selected by the system. For
example, the system may automatically identify a quarterback, or all eligible receivers may
be automatically identified after a quarterback throws the ball.
In another embodiment, certain players can be highlighted as a result
of detection of a target situation, such as when two players are within a predetermined
distance of each other, e.g., when a receiver is within a predetermined distance of a
defensive back.
System 100 can also draw the line of scrimmage and yard markers
and overlay them onto video feed 109. In the case of American football, the approximate
line of scrimmage can be determined from the players' (e.g., linemen's) positions and the
distance to a first down can be automatically calculated and added as an annotation.
Participant parameters, such as distance traveled, velocity, and/or acceleration, can also
be displayed on a graphic 658 via output 104.
The graphics generated by the present system may be partially
transparent or opaque, depending on the particular graphic being displayed and whether
the graphic is an overlay or not. Graphics may fade between an image of an event (e.g.,
live action footage) in the video feed and a particular graphic. Graphics may include images
that represent actual players, as commonly done in video games.
Graphics may have the same aspect ratio and viewing angle as an image of
an event, such that player path traces and marked paths remain constant when fading
between the graphic and the image, thereby providing a smooth transition during the fading
process. Alternately, a graphic may have a different aspect ratio and/or viewing angle than
the corresponding image to present a view of the event that is different than the video image
of the event.
Figure 6D is an exemplary diagram showing a window 685 containing
a highlighting shape 680 overlaid onto a video feed 690. In one embodiment, rather than
modifying the incoming video feed 109 frame-by-frame, the present system instead uses a
standard video player to overlay, on top of a video stream 690 (e.g., video feed 109), a
window 685 which includes a ratiometrically correct highlighted image of each player being
highlighted. This overlay window 685 is transparent, except for semi-transparent areas filled
with any color except black. To create a highlight, a white, semi-transparent oval (or a
highlighting indicator of other desired color/shape) 680 approximately the size of the player
to be highlighted (player 681 in Figure 6D) is drawn on the overlay window 685 at the
approximate position of the player. The position of the player is determined from location
information extracted from tracking system 108. The highlighting indicator 680 is overlaid
on the streamed video image 690 to create an image with highlight 682, while the rest of
the video image remains unchanged. With this method, rather than having to deal with the
higher bandwidth video data stream, the present system has a simpler and less time-constrained
task of creating overlay updates independent of more frequent video frame
updates, since the highlighting indicator 680 is redrawn only when the position of the
highlighted player changes, in the composite displayed image, by a predetermined
displacement.
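The redraw-on-displacement policy can be sketched as follows; the threshold value and the coordinate units are illustrative assumptions, and the repaint itself is represented by a counter:

```python
class HighlightOverlay:
    """Redraw the highlight oval only when the highlighted player's
    on-screen position moves more than `threshold` units; positions and
    the threshold are in camera-view coordinates (illustrative values)."""

    def __init__(self, threshold=0.05):
        self.threshold = threshold
        self.last_drawn = None
        self.redraw_count = 0

    def update(self, x, y):
        """Return True when the overlay is repainted for position (x, y)."""
        if self.last_drawn is not None:
            dx = x - self.last_drawn[0]
            dy = y - self.last_drawn[1]
            if (dx * dx + dy * dy) ** 0.5 < self.threshold:
                return False  # small move: keep the existing overlay
        self.last_drawn = (x, y)
        self.redraw_count += 1  # a real system would repaint the oval here
        return True
```

Because small jitters in the tracked position fall below the threshold, overlay updates arrive far less often than video frames, which is the bandwidth advantage described above.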
When a single player is being tracked by the camera, the system
constantly modifies the zoom level in an effort to maintain the displayed player at a relatively
constant size in the video frame regardless of how near or far away the player is from the
camera. In the case where only one player is tracked, the sizing of the highlight is relatively
constant except at the minimum and maximum zoom levels.
When other players that are not being tracked appear in the video
feed, the highlight size becomes increasingly dynamic. The size of the player in the video
frame, and therefore the required size of the highlight, is generally based on how much
closer or further away from the camera these other players are in comparison to the tracked
player. In either case (both camera-tracked, and non-camera-tracked players), the system
continuously calculates a size-in-the-video-frame metric each time new location information
arrives for a player. This metric is used to determine the size of the highlighting shape, and
is based on information including the video camera position, the location of the player(s),
the pan & tilt settings of the camera, and the current camera zoom level.
The translation of this information into a size-in-the-video-frame
metric involves a series of calculations/transforms including determining a camera's field of
view based on pan, tilt and zoom of a plane parallel to the lens, and correcting that field-of-view
measurement based on the degree to which the field is not parallel to the lens (i.e.,
correcting for camera angle, relative to field). Once the field-of-view of the camera (e.g.,
camera 117) is calculated, then the position and size within that field of view is calculated
for each of the location units (on players of interest) within the view. This calculation also
corrects for the camera angle. Rather than use the raw noisy location data, both the field-of-view
and the size-in-the-video-frame calculations are based on filtered location data. The
filtering may be identical to that used in controlling the camera motion.
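A much-simplified version of the size-in-the-video-frame calculation is sketched below. The 60-degree base field of view and the 1 m nominal player width are illustrative assumptions, and the pan/tilt and field-angle corrections described above are deliberately omitted:

```python
import math


def size_in_frame(player_pos, camera_pos, zoom, fov_at_zoom_1=math.radians(60)):
    """Approximate fraction of the frame width occupied by a player of
    ~1 m width, from camera and player field positions (metres) and a
    zoom factor. A simplified sketch: the real system also corrects for
    pan/tilt and for the field not being parallel to the lens."""
    dx = player_pos[0] - camera_pos[0]
    dy = player_pos[1] - camera_pos[1]
    distance = math.hypot(dx, dy)
    fov = fov_at_zoom_1 / zoom                       # zooming narrows the field of view
    frame_width_at_distance = 2.0 * distance * math.tan(fov / 2.0)
    return 1.0 / frame_width_at_distance             # 1 m player / visible width
```

Even this sketch exhibits the two behaviors described above: the metric grows as the zoom increases and shrinks as the player moves away from the camera, which is what drives the dynamic sizing of highlights for non-camera-tracked players.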
In one embodiment of the present system 100, the path of travel of a
participant is automatically analyzed and displayed to evaluate the performance of a
participant. Figure 7 is a flowchart showing exemplary steps performed in evaluating a
participant's performance. As shown in Figure 7, at step 705, the path of travel of one or
more selected participants is determined. The distance traveled by the participant, and/or
the participant's velocity may also be determined. At step 710, paths of travel for multiple
players are compared to determine how well a particular player was able to perform during
a given play (e.g., in blocking players from an opposing team, or in 'covering' another
player). In the case of officials, their paths show where the officials traveled during a
particular play. This information may be helpful in evaluating an official's effectiveness.
At step 715, one or more players whose path meets predetermined
criteria is automatically highlighted on a graphic. For example, 'open' players (i.e., offensive
players who are separated from all defensive players by a certain distance) or blocked
players (i.e., those whose velocity during a certain time period is less than a minimum
threshold and who are positioned in sufficiently close proximity to a player on the opposite
team) may be highlighted by changing the color of these players as displayed on a graphic,
which may also show the players' path of travel.
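The 'open' and 'blocked' criteria of step 715 reduce to distance and velocity thresholds over tracked positions. The threshold values and units below are assumptions for illustration:

```python
def is_open(receiver, defenders, min_separation=3.0):
    """An offensive player is 'open' when every defender is at least
    `min_separation` away (units and threshold are illustrative)."""
    rx, ry = receiver
    for dx, dy in defenders:
        if ((rx - dx) ** 2 + (ry - dy) ** 2) ** 0.5 < min_separation:
            return False
    return True


def is_blocked(player_speed, nearest_opponent_dist, max_speed=0.5, max_dist=1.5):
    """'Blocked': nearly stationary and close to an opposing player;
    both thresholds are assumptions for illustration."""
    return player_speed < max_speed and nearest_opponent_dist < max_dist
```

Players for whom either predicate holds would then be recolored on the graphic, as described above.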
A graphic showing a path of travel may also show orientation of the
participant(s), for example, the direction in which a quarterback or referee was facing. A
graphic may automatically change configuration in response to a target situation, for
example, a dashed line may be displayed during play and a solid line displayed at the end
of play.
In one embodiment, system 100 may control the imaging of an event
at least partially in response to event characterization information 104. The system may
automatically direct a robotic camera 117 to capture or 'cover' a target situation such as the
beginning of a play, or when certain players are positioned within a predetermined distance
of each other. For example, a camera may be automatically directed to cover an area of
interest such as the line of scrimmage, a huddle, or a particular participant or participants
in response to, or in anticipation of, a target situation, e.g., camera 117 may be directed to
cover a quarterback upon detection of the beginning of a play. This procedure may help
ensure that play is not missed due to other action on the field.
In any game there are a number of situations which have the potential
to evolve into a target situation. Some examples include:
- Two hockey players who have had confrontations in the past are likely to get
into a fight at some point during a game. Every time they are near each other, they can be
targeted in high zoom with a robotic camera 117 in anticipation of a target situation.
- Two football players have a historically notorious match-up. Every time they
are near each other, they can be targeted in high zoom with a robotic camera 117 in
anticipation of a target situation.
- A particular basketball player is a good three point shooter. Every time he is
near the three point line, he can be targeted in high zoom with a robotic camera 117 in
anticipation of a target situation.
In one embodiment, system 100 has access to the positions of all
players on the field of play and a predefined, prioritized list of conditions to watch for.
When system 100 identifies the conditions which precede a target situation, the
system directs a robotic camera 117 to zoom in on and track the appropriate subject(s).
A simple example is a hockey game in which there are two players
who fought in the last game. The odds that they will fight again are high and thus any time
they are in close proximity, a situation of interest is determined to exist. Should a target
situation subsequently occur, high zoom video footage becomes available before
occurrence of the event defined by the target situation. In the case of a hockey fight there
is often an extended period of close proximity during which glances, gestures and stares
are exchanged in a period preceding the actual confrontation.
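The prioritized watch list described above can be sketched as a proximity scan over tracked positions. The player identifiers, the pair structure, and the 5-unit threshold are hypothetical stand-ins for the system 100 condition list:

```python
def check_watch_list(positions, watch_list, proximity=5.0):
    """Return the highest-priority watched pair currently within
    `proximity` of each other, or None. `positions` maps player id to
    (x, y); `watch_list` is ordered highest priority first. The names
    and the threshold are illustrative assumptions."""
    for pair in watch_list:
        a, b = pair
        if a in positions and b in positions:
            ax, ay = positions[a]
            bx, by = positions[b]
            if ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 <= proximity:
                return pair  # direct a robotic camera 117 to zoom in on both
    return None
```

Running such a check each time new position data arrives gives the system the lead time described above: the camera is already in high zoom before the confrontation begins.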
System 100 can cause video to be collected only during time
intervals where the system has detected a play in process. Video buffers may capture
leaders or trailers to ensure that an entire play is captured. Alternatively, the entirety of an
event, or a significant portion of it, may be recorded, in which case the system may
automatically post-edit the video recording to remove footage that does not include plays in
progress.
In one embodiment, an entire game video is completely automated.
This automation emulates what is typically done with a manually operated camera. Figure
8 is a flowchart showing exemplary steps performed in automating the video filming of
predetermined types of segments of a game. The exemplary process shown in Figure 8
may be performed for a game of American football with two cameras, for example, with one
camera 117 at an end zone and another camera 117 in a press box.
In one embodiment, a camera 117 in the press box (or other vantage
point) automatically captures all (22) players on the field during the entire game, at step
805. The press box camera may either record the entire field of play or, alternatively, zoom
in to more closely capture a smaller area on the field in which all of the players are located.
As indicated in Figure 8, at step 810, a camera 117 in the end zone is zoomed in on a
scoreboard and records for a predetermined duration (e.g., 10 seconds). At step 815, upon
detection of a predetermined type of situation (e.g., players moving to a line of scrimmage),
the end zone camera moves and/or zooms to capture all players on the field. At step 820,
upon detection of a line set condition, both press box and end zone cameras begin
recording. At step 825, upon end-of-play detection, both cameras continue recording for a
predetermined time, e.g., 5 seconds, and then stop recording. If it is not yet the end of the
game (step 830), then steps 810 - 825 are repeated until the game ends.
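The sequence of Figure 8 can be sketched as an event-driven loop translating detected game events into camera commands. The event names and command strings are hypothetical stand-ins for the system 100 interfaces:

```python
def automate_game_filming(events):
    """Translate a stream of detected game events into camera commands,
    following the sequence of Figure 8 (steps 810-830). Event and
    command names are illustrative assumptions."""
    commands = ["end_zone:zoom_scoreboard"]              # step 810
    for event in events:
        if event == "pre_snap":                          # step 815
            commands.append("end_zone:frame_all_players")
        elif event == "line_set":                        # step 820
            commands.append("both:start_recording")
        elif event == "end_of_play":                     # step 825
            commands.append("both:stop_after_trailer")
            commands.append("end_zone:zoom_scoreboard")  # repeat from step 810
        elif event == "end_of_game":                     # step 830
            break
    return commands
```

The press box camera's continuous full-field capture (step 805) runs in parallel and needs no per-play commands, so it does not appear in the loop.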
In certain embodiments, system 100 automatically transmits event
characterization information to a recipient's wireless device, such as a mobile phone, net-book
computer, or other portable wireless device, using UDP protocol, for example. Figure
9 is a diagram showing the use of a wireless, typically handheld, device 128 with the present
system. Users may select which event characterization information and/or video/still images
of an event are displayed on their mobile device 128. For example, a coach or a spectator
may elect to view selected athlete performance parameters, or a spectator may select one
of a number of video feeds, such as one that is covering the spectator's favorite athlete.
In one embodiment, user-configurable video feeds from one or more
cameras 117 at an event facility may be broadcast throughout the facility. Users with
handheld devices 128 may access the specific video feed of their choice via a wireless
broadcast from system 100 or via a wireless communication system connected thereto.
Coaches, referees, spectators, and commentators may also use handheld devices 128 to
choose their own particular video feed from system 100.
Coaches and/or officials may also direct event characterization
information and/or images to be displayed for training and/or reviewing purposes on a large
display device 130, such as a stadium scoreboard. A coach may control video playback
from the field using a handheld device 128 (such as a net-book or other portable computing
device) and may select video or graphic displays for viewing during training sessions.
Replays can be displayed on the handheld device, or on a larger display unit such as the
stadium scoreboard 130.
In American football, a minimum of one referee is assigned to count
the players on each team just prior to the snap of the ball. Not only is this a difficult task to
perform correctly, given the time constraints, but it also deters this referee from doing other
things immediately prior to the snap.
In one embodiment, system 100 continuously monitors the number
of players on each team and notifies referees via handheld devices 128 (via a tone, vibrating
mechanism, etc.) when either team has too many players on the field at the snap of the ball.
The present method also provides coaches with real-time access to players on the field as
well as with specific statistical data regarding their performance. Event spectators using
their own handheld devices, capable cell phones, etc., are provided access to a menu of
data options that display information such as who is on the field, statistics, replays,
and so forth.
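The referee-notification check above is a per-team head count over tracked on-field positions. The data layout and the notification hook below are assumptions; the 11-player limit matches American football:

```python
def too_many_players(on_field, limit=11):
    """Return the teams exceeding `limit` players on the field.
    `on_field` maps team name to a collection of player ids; the
    layout is an illustrative assumption."""
    return [team for team, players in on_field.items() if len(players) > limit]


def notify_referees(on_field, send):
    """Call `send(message)` (e.g., a tone or vibration on handheld
    device 128) for each offending team; `send` is a hypothetical hook."""
    for team in too_many_players(on_field):
        send(f"{team} has too many players on the field")
```

Running this check continuously as location updates arrive relieves the referee of the manual count at the snap.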
Changes may be made in the above methods and systems without
departing from the scope thereof. It should thus be noted that the matter contained in the
above description and shown in the accompanying drawings should be interpreted as
illustrative and not in a limiting sense. The following claims are intended to cover generic
and specific features described herein, as well as all statements of the scope of the present
method and system, which, as a matter of language, might be said to fall therebetween.
Claims (20)
1. A camera control method, comprising: periodically receiving real-time geographic positions of a plurality of objects in a sporting event; processing the periodically-received positions to identify at least one condition that precedes a target situation involving the plurality of objects; and controlling a robotic camera, in response to the at least one condition being identified, to capture at least one of the plurality of objects prior to, and during at least part of, the target situation.
2. The camera control method of claim 1, the plurality of objects including a plurality of participants of the sporting event.
3. The camera control method of claim 1 or claim 2, the plurality of objects including at least one piece of sports equipment used for the sporting event.
4. The camera control method of any one of claims 1 to 3, wherein said controlling the robotic camera includes controlling the robotic camera to track the at least one of the plurality of objects, based on the periodically-received positions, while the at least one of the plurality of objects moves during the target situation.
5. The camera control method of any one of claims 1 to 4, the sporting event being a football game, wherein: the at least one condition is players breaking from a huddle prior to a beginning of a play of the football game; and said controlling the robotic camera includes controlling the robotic camera to capture the players prior to, and during the beginning of, the play.
6. The camera control method of any one of claims 1 to 4, the sporting event being a football game, wherein: the at least one condition is an occurrence of first and second formations on opposite sides of a line of scrimmage prior to a beginning of a play of the football game; and said controlling the robotic camera includes controlling the robotic camera to capture the players in the first and second formations prior to, and during the beginning of, the play.
7. The camera control method of any one of claims 1 to 4, the at least one condition being identified from an aggregate motion determined from the periodically-received positions.
8. The camera control method of any one of claims 1 to 4 or claim 7, the sporting event being a football game, wherein: the at least one condition is a beginning of a play of the football game; and said controlling the robotic camera includes controlling the robotic camera to capture a quarterback prior to, and during at least part of, a pass.
9. The camera control method of any one of claims 1 to 4 or claim 7, the sporting event being a football game, wherein: the at least one condition is a beginning of a play of the football game; and said controlling the robotic camera includes controlling the robotic camera to capture a football prior to, and during at least part of, a pass.
10. The camera control method of any one of claims 1 to 9, further comprising detecting, with a tracking system, the real-time geographic positions of the plurality of objects from at least one location unit attached to each of the plurality of objects.
11. A camera control method, comprising: periodically receiving real-time geographic positions of a plurality of objects in a sporting event; processing the periodically-received positions to identify at least one condition that precedes a target situation involving the plurality of objects; and controlling a robotic camera, in response to the at least one condition being identified, to capture an area of interest prior to, and during at least part of, the target situation.
12. The camera control method of claim 11, the area of interest being a scoreboard.
13. The camera control method of claim 11, the sporting event being a football game, wherein: the target situation is a play of the football game; and the area of interest is a line of scrimmage.
14. A camera control system, comprising: a processor; a memory communicably coupled with the processor and storing machine-readable instructions that, when executed by the processor, direct the system to: periodically receive real-time geographic positions of a plurality of objects in a sporting event; process the periodically-received positions to identify at least one condition that precedes a target situation involving the plurality of objects; and control a robotic camera, in response to the at least one condition being identified, to capture at least one of the plurality of objects prior to, and during at least part of, the target situation.
15. The camera control system of claim 14, wherein the machine-readable instructions that direct the system to control the robotic camera include machine-readable instructions that direct the system to control the robotic camera to track the at least one of the plurality of objects, based on the periodically-received positions, while the at least one of the plurality of objects moves during the target situation.
16. The camera control system of claim 14 or claim 15, further comprising a tracking system configured to detect the real-time geographic positions of the participants.
17. The camera control system of any one of claims 14 to 16, wherein the machine-readable instructions that direct the system to process the periodically-received positions include machine-readable instructions that direct the system to identify the target situation from an aggregate motion determined from the periodically-received positions.
18. A camera control system, comprising: a processor; a memory communicably coupled with the processor and storing machine-readable instructions that, when executed by the processor, direct the system to: periodically receive real-time geographic positions of a plurality of objects in a sporting event; process the periodically-received positions to identify at least one condition that precedes a target situation involving the plurality of objects; and control a robotic camera, in response to the at least one condition being identified, to capture an area of interest prior to, and during at least part of, the target situation.
19. The camera control system of claim 18, further comprising a tracking system configured to detect the real-time geographic positions of the participants.
20. The camera control system of claim 18 or claim 19, wherein the machine-readable instructions that direct the system to process the periodically-received positions include machine-readable instructions that direct the system to identify the target situation from an aggregate motion determined from the periodically-received positions.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US29238610P | 2010-01-05 | 2010-01-05 | |
US61/292,386 | 2010-01-05 | ||
NZ749991A NZ749991A (en) | 2010-01-05 | 2011-01-05 | Systems and methods for analyzing event data |
Publications (2)
Publication Number | Publication Date |
---|---|
NZ758097A NZ758097A (en) | 2020-10-30 |
NZ758097B2 true NZ758097B2 (en) | 2021-02-02 |