AU2021204421A1 - Data extraction and prediction generation method and system for sporting events - Google Patents


Info

Publication number
AU2021204421A1
Authority
AU
Australia
Prior art keywords
video
sporting event
participant
processing system
moving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2021204421A
Inventor
Sayed Hasham Davarpanah
Adel Ghazikhani
Iggy Jovanovic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Edworkz Pty Ltd
Original Assignee
Edworkz Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2021901238A external-priority patent/AU2021901238A0/en
Application filed by Edworkz Pty Ltd filed Critical Edworkz Pty Ltd
Publication of AU2021204421A1 publication Critical patent/AU2021204421A1/en
Legal status: Abandoned (current)

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021 Tracking a path or terminating locations
    • A63B2024/0025 Tracking the path or location of one or more users, e.g. players of a game
    • A63B2024/0028 Tracking the path of an object, e.g. a ball inside a soccer pitch
    • A63B2024/0056 Tracking a path or terminating locations for statistical or strategic analysis
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/38 Training appliances or apparatus for special sports for tennis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G06T7/20 Analysis of motion
    • G06T7/254 Analysis of motion involving subtraction of images
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30221 Sports video; Sports image
    • G06T2207/30228 Playing field
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V20/42 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

DATA EXTRACTION AND PREDICTION GENERATION METHOD AND SYSTEM FOR SPORTING EVENTS ABSTRACT A method, processing system, system and computer readable medium are disclosed for extracting analytic data from a video of a sporting event and determining one or more predictions in relation to the sporting event. In one aspect, the method includes the steps of: receiving a video of the sporting event; extracting a background image from the video; generating, using the background image, a plurality of moving masks for a plurality of frames of the video; extracting, from the plurality of moving masks, analytic data related to at least a portion of the sporting event; and using the analytic data to generate a likelihood for one or more outcomes for the sporting event, or a portion of the sporting event.

Description

[Sheet 4/16, Fig. 3: flowchart of method 300. Receive a video of the sporting event (step 310); extract a background image from the video (320); generate, using the background image, moving masks for a plurality of frames of the video (330); extract, from the moving masks, analytic data related to at least a portion of the sporting event (340); use the analytic data to generate a likelihood for one or more outcomes for the sporting event, or a portion thereof (350); end (400).]
DATA EXTRACTION AND PREDICTION GENERATION METHOD AND SYSTEM FOR SPORTING EVENTS
Related Applications
[0001] The present application claims priority from Australian Provisional Patent Application No. 2021901238, filed 27 April 2021, the contents of which are herein incorporated by reference in their entirety.
Technical Field
[0001a] The present invention relates generally to sports analytics and predictive analysis.
Background
[0002] Sport analytics is an important research field in the machine learning community. Such analytics play a key role in enhancing the quality of teamwork, coaching, management, planning and integrity.
[0003] Currently, statistics gathered from a sporting event can be collected from sensors worn by the participants, such as GPS/location sensors. However, certain actions during a sporting event cannot be sensed. For example, in tennis it is not easily possible to use a sensor to determine the number of forehand shots played by a tennis player in a match. In such instances, data is manually recorded by a person viewing the sporting event. Manual recording of statistics during a sporting event can be a time-consuming task. Coaching staff may require near real-time analytics during a sporting event to assist with certain decisions, so manual recording of statistics may simply not be possible, or may be impractical. Furthermore, whilst predictive analysis for a sporting event can be performed based on simplistic statistics, it would be preferable if the predictive analysis were based on more sophisticated and fine-grained statistics captured in near real time without the use of sensors, which in certain instances is simply not possible.
Summary
[0004] It is an object of the present invention to substantially overcome or at least ameliorate one or more disadvantages of existing arrangements.
[0005] In a first aspect there is provided a method of extracting analytic data from a video of a sporting event and determining one or more predictions in relation to the sporting event, wherein the method includes steps of: receiving a video of the sporting event; extracting a background image from the video; generating, using the background image, a plurality of moving masks for a plurality of frames of the video; extracting, from the plurality of moving masks, analytic data related to at least a portion of the sporting event; and using the analytic data to generate a likelihood for one or more outcomes for the sporting event, or a portion of the sporting event.
[0006] In certain forms, the method includes determining the background image repeatedly throughout the video of the sporting event.
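By way of illustration only (the specification does not prescribe a particular algorithm), a background image can be estimated as the per-pixel median over a sample of frames, and re-estimated periodically as conditions change. The following minimal sketch models greyscale frames as 2D lists of pixel values:

```python
# Illustrative sketch, not the claimed implementation: estimate a
# background image as the per-pixel median over a sample of frames.
# Transient moving objects (players, balls) are rejected by the median.
from statistics import median

def estimate_background(frames):
    """Return a background image as the per-pixel median of the frames."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[median(f[r][c] for f in frames) for c in range(cols)]
            for r in range(rows)]

# A moving object (value 255) passes through a mostly static scene (value 10):
frames = [
    [[10, 255], [10, 10]],
    [[10, 10], [255, 10]],
    [[10, 10], [10, 10]],
]
print(estimate_background(frames))  # [[10, 10], [10, 10]]
```

Repeating this estimation throughout the video, as described above, allows the background to track gradual changes such as lighting or shadows.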
[0007] In particular embodiments, each moving mask is generated by comparing each pixel value of a frame of the video with a corresponding pixel of one or more immediately prior frames of the video, wherein based on the comparison, the moving mask for the frame includes each pixel having a pixel value discrepancy which exceeds a pixel value threshold.
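The per-pixel comparison described above can be sketched as follows, using a single immediately prior frame for simplicity (the embodiment permits one or more prior frames); the threshold value is illustrative only:

```python
# Hedged sketch of moving-mask generation: mark each pixel whose value
# differs from the corresponding pixel of the prior frame by more than
# a pixel value threshold. Frames are 2D lists of greyscale values.
def moving_mask(prev_frame, frame, threshold=30):
    """Binary mask: 1 where the pixel discrepancy exceeds `threshold`, else 0."""
    return [[1 if abs(p - q) > threshold else 0
             for p, q in zip(prev_row, row)]
            for prev_row, row in zip(prev_frame, frame)]

prev_frame = [[10, 10], [10, 10]]
frame      = [[10, 200], [12, 10]]   # one pixel changed substantially
print(moving_mask(prev_frame, frame))  # [[0, 1], [0, 0]]
```

Small discrepancies (e.g. sensor noise of 2 greyscale levels) fall below the threshold and are excluded from the mask.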
[0008] In particular forms, the method includes converting colour frames of the video to greyscale.
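The specification does not fix a conversion formula; one common choice is the ITU-R BT.601 luma weighting of the R, G and B channels, sketched here:

```python
# Illustrative greyscale conversion using ITU-R BT.601 luma weights
# (an assumption; the patent does not specify a formula).
def to_greyscale(rgb_frame):
    """Convert a frame of (R, G, B) tuples to integer greyscale values."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b)
             for (r, g, b) in row]
            for row in rgb_frame]

print(to_greyscale([[(255, 255, 255), (255, 0, 0)]]))  # [[255, 76]]
```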
[0009] In certain embodiments, the method includes: partitioning a playing area of the sporting event, which is captured within the video, into a plurality of partitions; and identifying actions performed during the sporting event using the moving masks; wherein the analytic data extracted from the video includes each identified action which is assigned to have been performed in a partition of the playing area.
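As a minimal sketch of assigning an identified action to a partition, the playing area may be divided into a uniform grid; the grid dimensions and court size below are assumed for the example (a tennis court is roughly 23.77 m by 10.97 m):

```python
# Hedged sketch: map an action's (x, y) location in the playing area to
# the (col, row) index of a uniform grid partition. Grid size is an
# illustrative assumption, not taken from the patent.
def partition_index(x, y, area_w, area_h, cols, rows):
    """Return (col, row) of the grid partition containing point (x, y)."""
    col = min(int(x / area_w * cols), cols - 1)
    row = min(int(y / area_h * rows), rows - 1)
    return col, row

# A forehand detected at (20.0, 9.0) on a 4x2 grid over a tennis court:
print(partition_index(20.0, 9.0, 23.77, 10.97, 4, 2))  # (3, 1)
```

Counting identified actions per partition in this way directly yields location-resolved statistics such as the heat maps shown in Figs. 6 and 7.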
[0010] In certain forms, the method includes at least one of: calculating participant parameters based on participant detection using the moving masks, wherein the analytic data includes the participant parameters; and calculating non-participant parameters based on moving non-participant object detection using the moving masks, wherein the analytic data includes the non-participant parameters.
[0011] In particular embodiments, the participant parameters include at least one of a position of the participant in a playing area of the sporting event, a speed of the participant, and action type.
[0012] In certain forms, the non-participant parameters include at least one of a position of the non-participant object and a speed of a non-participant object.
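A speed parameter, for a participant or a non-participant object, can be derived from successive detected positions as the Euclidean distance travelled between frames divided by the frame interval. Positions in metres and a 25 fps frame rate are assumptions for this sketch:

```python
# Illustrative sketch: speed from two successive detected positions.
# Units (metres) and frame rate (25 fps) are assumed for the example.
import math

def speed(pos_prev, pos_curr, fps=25.0):
    """Speed in m/s between two (x, y) positions one frame apart."""
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    return math.hypot(dx, dy) * fps

print(round(speed((3.0, 4.0), (3.3, 4.4)), 3))  # 12.5
```

In practice such estimates would be smoothed over several frames, since per-frame detection jitter is amplified by the frame rate.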
[0013] In certain embodiments, the method includes running a computerised model using the analytic data to determine the likelihood for the one or more outcomes of the sporting event, or a portion thereof.
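One simple form such a computerised model could take (an illustrative assumption; the specification elsewhere describes a finite state machine, see Figs. 8A and 8B) is a weighted combination of analytic factors mapped to a probability by a logistic function:

```python
# Hedged sketch of a computerised model producing an outcome likelihood.
# Factor names and weights are illustrative assumptions, not taken from
# the patent; the logistic function maps the score into [0, 1].
import math

def outcome_likelihood(factors, weights, bias=0.0):
    """Map weighted analytic factors to a likelihood via the logistic function."""
    z = bias + sum(weights[name] * value for name, value in factors.items())
    return 1.0 / (1.0 + math.exp(-z))

factors = {"speed_factor": 0.8, "fatigue_factor": 0.3, "rally_factor": 0.6}
weights = {"speed_factor": 1.5, "fatigue_factor": -2.0, "rally_factor": 1.0}
likelihood = outcome_likelihood(factors, weights)
print(f"{likelihood:.3f}")  # likelihood that the participant wins the point
```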
[0014] In a second aspect there is provided a processing system for extracting analytic data from a video of a sporting event and determining one or more predictions in relation to the sporting event, wherein the processing system is configured to: receive a video of the sporting event; extract a background from the video; determine, using the background, a plurality of moving masks for a plurality of frames of the video; extract, from the plurality of moving masks, analytic data related to at least a portion of the sporting event; and use the analytic data to generate a likelihood for one or more outcomes for the sporting event, or a portion of the sporting event.
[0015] In one form, the processing system is configured to determine the background image repeatedly throughout the video of the sporting event.
[0016] In another form, each moving mask is generated by comparing each pixel value of a frame of the video with a corresponding pixel of one or more immediately prior frames of the video, wherein based on the comparison, the moving mask for the frame includes each pixel having a pixel value discrepancy which exceeds a pixel value threshold.
[0017] In one embodiment, the processing system is configured to convert colour frames of the video to greyscale prior to determining the moving masks.
[0018] In another embodiment, the processing system is configured to: partition a playing area of the sporting event, which is captured within the video, into a plurality of partitions; and identify actions performed during the sporting event using the moving masks; wherein the analytic data extracted from the video includes each identified action which is assigned to have been performed in a partition of the playing area.
[0019] In particular forms, the processing system is configured to: calculate participant parameters based on participant detection using the moving masks, wherein the analytic data includes the participant parameters; and/or calculate non-participant parameters based on moving non-participant object detection using the moving masks, wherein the analytic data includes the non-participant parameters.
[0020] In particular embodiments, the participant parameters include at least one of a position of the participant in a playing area of the sporting event, a speed of the participant, and action type.
[0021] In certain forms, the non-participant parameters include at least one of a position of the non-participant object and a speed of a non-participant object.
[0022] In certain embodiments, the processing system is configured to run a computerised model using the analytic data to determine the likelihood for the one or more outcomes of the sporting event, or a portion thereof.
[0023] In a third aspect there is provided a system including: a processing system configured according to the second aspect; and a camera for capturing the video; wherein the processing system is configured to receive the video from the camera.
[0024] In a fourth aspect there is provided a non-transitory computer readable medium having executable instructions stored therein or thereon, wherein execution of the executable instructions by a processor of a processing system causes the processing system to: receive a video of the sporting event; extract a background from the video; determine, using the background, a plurality of moving masks for a plurality of frames of the video; extract, from the plurality of moving masks, analytic data related to at least a portion of the sporting event; and use the analytic data to generate a likelihood for one or more outcomes for the sporting event, or a portion of the sporting event.
[0025] In certain embodiments, the execution of the executable instructions configures the processor to determine the background image repeatedly throughout the video of the sporting event.
[0026] In certain embodiments, the execution of the executable instructions configures the processor to generate each moving mask by comparing each pixel value of a frame of the video with a corresponding pixel of one or more immediately prior frames of the video, wherein based on the comparison, the moving mask for the frame includes each pixel having a pixel value discrepancy which exceeds a pixel value threshold.
[0027] In certain embodiments, the execution of the executable instructions configures the processor to convert colour frames of the video to greyscale prior to determining the moving masks.
[0028] In certain embodiments, the execution of the executable instructions configures the processor to: partition a playing area of the sporting event, which is captured within the video, into a plurality of partitions; and identify actions performed during the sporting event using the moving masks; wherein the analytic data extracted from the video includes each identified action which is assigned to have been performed in a partition of the playing area.
[0029] In certain embodiments, the execution of the executable instructions configures the processor to: calculate participant parameters based on participant detection using the moving masks, wherein the analytic data includes the participant parameters; and/or calculate non-participant parameters based on moving non-participant object detection using the moving masks, wherein the analytic data includes the non-participant parameters.
[0030] In certain embodiments, the participant parameters include at least one of a position of the participant in a playing area of the sporting event, a speed of the participant, and action type.
[0031] In certain embodiments, the non-participant parameters include at least one of a position of the non-participant object and a speed of a non-participant object.
[0032] In certain embodiments, the execution of the executable instructions configures the processor to run a computerised model using the analytic data to determine the likelihood for the one or more outcomes of the sporting event, or a portion thereof.
[0033] Other aspects and embodiments will be appreciated throughout the detailed description.
Brief Description of the Drawings
[0034] Some aspects of at least one embodiment of the present invention will now be described with reference to the drawings and appendices, in which:
[0035] Figs. 1A and 1B form a schematic block diagram of a general purpose computer system upon which arrangements described can be practiced.
[0036] Fig. 2 is a system diagram of an example of a sports analytics and prediction system.
[0037] Fig. 3 is a flow diagram representing an example method for generating statistical data from a video stream and predicting likely outcomes of a sporting event using the generated statistical data.
[0038] Fig. 4 is an example interface presented by an output device of the processing system.
[0039] Fig. 5 is an example partitioning of a sporting event area.
[0040] Fig. 6 is an example interface showing a generated heat map for a portion of a sporting event.
[0041] Fig. 7 is an example interface showing a generated heat map for a current sporting event.
[0042] Fig. 8A is a graphical model representing an example of a finite state machine used for predicting a likelihood of outcomes of a sporting event and/or a portion of the sporting event using the generated extracted data.
[0043] Fig. 8B is a schematic representing an example of a prediction module using the finite state machine of Figure 8A for predicting a likelihood of outcomes of a sporting event and/or a portion of the sporting event.
[0044] Fig. 9 is a graphical representation of a function used to calculate a participant speed factor of a participant in the sporting event.
[0045] Fig. 10 is a graphical representation of a function used to calculate a distance covered factor of a participant in the sporting event.
[0046] Fig. 11 is a graphical representation of a function used to calculate an object speed factor of an object used by participants during the sporting event.
[0047] Fig. 12 is a graphical representation of functions used to calculate a rally length factor for a participant of a sporting event.
[0048] Fig. 13 is a graphical representation of a function used to calculate a participant fatigue factor of a participant of the sporting event.
[0049] Fig. 14A is a schematic showing the detection of the lower horizontal baseline and the corners thereof.
[0050] Fig. 14B is a schematic showing an initial estimation process for determining the gradient of each sideline extending upward from the lower horizontal baseline.
[0051] Fig. 14C is a schematic showing a further estimation process to improve accuracy for the determined gradient of the left sideline extending upward from the lower horizontal baseline.
[0052] Fig. 15 is a schematic showing a scanning process for detecting a moving object (e.g., moving tennis ball) in a binary image.
[0053] Fig. 16 is a flowchart representing an alternate example of a method for detecting a moving object (e.g. moving tennis ball) in an image.
[0054] Fig. 17 is an example of a captured image including a plurality of regions.
Detailed Description including Best Mode
[0055] Where reference is made in any one or more of the accompanying drawings to steps and/or features, which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears.
[0056] Figs. 1A and 1B depict a general-purpose computer system 100, upon which the various arrangements described can be practiced.
[0057] As seen in Fig. 1A, the computer system 100 includes: a computer module 101; input devices such as a keyboard 102, a mouse pointer device 103, a scanner 126, a camera 127, and a microphone 180; and output devices including a printer 115, a display device 114 and loudspeakers 117. An external Modulator-Demodulator (Modem) transceiver device 116 may be used by the computer module 101 for communicating to and from a communications network 120 via a connection 121. The communications network 120 may be a wide-area network (WAN), such as the Internet, a cellular telecommunications network, or a private WAN. Where the connection 121 is a telephone line, the modem 116 may be a traditional "dial-up" modem. Alternatively, where the connection 121 is a high capacity (e.g., cable) connection, the modem 116 may be a broadband modem. A wireless modem may also be used for wireless connection to the communications network 120.
[0058] The computer module 101 typically includes at least one processor unit 105, and a memory unit 106. For example, the memory unit 106 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM). The computer module 101 also includes a number of input/output (I/O) interfaces including: an audio-video interface 107 that couples to the video display 114, loudspeakers 117 and microphone 180; an I/O interface 113 that couples to the keyboard 102, mouse 103, scanner 126, camera 127 and optionally a joystick or other human interface device (not illustrated); and an interface 108 for the external modem 116 and printer 115. In some implementations, the modem 116 may be incorporated within the computer module 101, for example within the interface 108. The computer module 101 also has a local network interface 111, which permits coupling of the computer system 100 via a connection 123 to a local-area communications network 122, known as a Local Area Network (LAN). As illustrated in Fig. 1A, the local communications network 122 may also couple to the wide area network 120 via a connection 124, which would typically include a so-called "firewall" device or device of similar functionality. The local network interface 111 may comprise an Ethernet circuit card, a Bluetooth™ wireless arrangement or an IEEE 802.11 wireless arrangement; however, numerous other types of interfaces may be practiced for the interface 111. Remote data stores such as cloud data storage 190 can be accessed via the wide area network 120.
[0059] The I/O interfaces 108 and 113 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 109 are provided and typically include a hard disk drive (HDD) 110. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 112 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 100.
[0060] The components 105 to 113 of the computer module 101 typically communicate via an interconnected bus 104 and in a manner that results in a conventional mode of operation of the computer system 100 known to those in the relevant art. For example, the processor 105 is coupled to the system bus 104 using a connection 118. Likewise, the memory 106 and optical disk drive 112 are coupled to the system bus 104 by connections 119. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™ or like computer systems.
[0061] The methods described herein may be implemented using the computer system 100, wherein the methods may be implemented as one or more software application programs 133 executable within the computer system 100. In particular, the steps of the method are effected by instructions 131 (see Fig. 1B) in the software 133 that are carried out within the computer system 100. The software instructions 131 may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the methods described herein and a second part and the corresponding code modules manage a user interface between the first part and the user.
[0062] The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer system 100 from the computer readable medium, and then executed by the computer system 100. A computer readable medium having such software or computer program recorded on the computer readable medium is a computer program product.
[0063] The software 133 is typically stored in the HDD 110 or the memory 106. The software is loaded into the computer system 100 from a computer readable medium, and executed by the computer system 100. Thus, for example, the software 133 may be stored on an optically readable disk storage medium (e.g., CD-ROM) 125 that is read by the optical disk drive 112. A computer readable medium having such software or computer program recorded on it is a computer program product.
[0064] In some instances, the application programs 133 may be supplied to the user encoded on one or more CD-ROMs 125 and read via the corresponding drive 112, or alternatively may be read by the user from the networks 120 or 122. Still further, the software can also be loaded into the computer system 100 from other computer readable media. Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computer system 100 for execution and/or processing. Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray™ Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 101. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computer module 101 include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
[0065] The second part of the application programs 133 and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 114. Through manipulation of typically the keyboard 102 and the mouse 103, a user of the computer system 100 and the application may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s). Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via the loudspeakers 117 and user voice commands input via the microphone 180. In one form, the microphone can be part of
[0066] Fig. 1B is a detailed schematic block diagram of the processor 105 and a "memory" 134. The memory 134 represents a logical aggregation of all the memory modules (including the HDD 109 and semiconductor memory 106) that can be accessed by the computer module 101 in Fig. 1A.
[0067] When the computer module 101 is initially powered up, a power-on self-test (POST) program 150 executes. The POST program 150 is typically stored in a ROM 149 of the semiconductor memory 106 of Fig. 1A. A hardware device such as the ROM 149 storing software is sometimes referred to as firmware. The POST program 150 examines hardware within the computer module 101 to ensure proper functioning and typically checks the processor 105, the memory 134 (109, 106), and a basic input-output systems software (BIOS) module 151, also typically stored in the ROM 149, for correct operation. Once the POST program 150 has run successfully, the BIOS 151 activates the hard disk drive 110 of Fig. 1A. Activation of the hard disk drive 110 causes a bootstrap loader program 152 that is resident on the hard disk drive 110 to execute via the processor 105. This loads an operating system 153 into the RAM memory 106, upon which the operating system 153 commences operation. The operating system 153 is a system level application, executable by the processor 105, to fulfil various high level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface.
[0068] The operating system 153 manages the memory 134 (109, 106) to ensure that each process or application running on the computer module 101 has sufficient memory in which to execute without colliding with memory allocated to another process. Furthermore, the different types of memory available in the system 100 of Fig. 1A must be used properly so that each process can run effectively. Accordingly, the aggregated memory 134 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 100 and how such is used.
[0069] As shown in Fig. 1B, the processor 105 includes a number of functional modules including a control unit 139, an arithmetic logic unit (ALU) 140, and a local or internal memory 148, sometimes called a cache memory. The cache memory 148 typically includes a number of storage registers 144 - 146 in a register section. One or more internal busses 141 functionally interconnect these functional modules. The processor 105 typically also has one or more interfaces 142 for communicating with external devices via the system bus 104, using a connection 118. The memory 134 is coupled to the bus 104 using a connection 119.
[0070] The application program 133 includes a sequence of instructions 131 that may include conditional branch and loop instructions. The program 133 may also include data 132 which is used in execution of the program 133. The instructions 131 and the data 132 are stored in memory locations 128, 129, 130 and 135, 136, 137, respectively.
Depending upon the relative size of the instructions 131 and the memory locations 128-130, a particular instruction may be stored in a single memory location as depicted by the instruction shown in the memory location 130. Alternately, an instruction may be segmented into a number of parts each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 128 and 129.
[0071] In general, the processor 105 is given a set of instructions which are executed therein. The processor 105 waits for a subsequent input, to which the processor 105 reacts by executing another set of instructions. Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 102, 103, data received from an external source across one of the networks 120, 102, data retrieved from one of the storage devices 106, 109 or data retrieved from a storage medium 125 inserted into the corresponding reader 112, all depicted in Fig. 1A. The execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 134.
[0072] The disclosed arrangements use input variables 154, which are stored in the memory 134 in corresponding memory locations 155, 156, 157. The arrangements produce output variables 161, which are stored in the memory 134 in corresponding memory locations 162, 163, 164. Intermediate variables 158 may be stored in memory locations 159, 160, 166 and 167.
[0073] Referring to the processor 105 of Fig. 1B, the registers 144, 145, 146, the arithmetic logic unit (ALU) 140, and the control unit 139 work together to perform sequences of micro-operations needed to perform "fetch, decode, and execute" cycles for every instruction in the instruction set making up the program 133. Each fetch, decode, and execute cycle comprises:
• a fetch operation, which fetches or reads an instruction 131 from a memory location 128, 129, 130;
• a decode operation in which the control unit 139 determines which instruction has been fetched; and
• an execute operation in which the control unit 139 and/or the ALU 140 execute the instruction.
[0074] Thereafter, a further fetch, decode, and execute cycle for the next instruction may be executed. Similarly, a store cycle may be performed by which the control unit 139 stores or writes a value to a memory location 132.
[0075] Each step or sub-process in the processes described herein is associated with one or more segments of the program 133 and is performed by the register section 144, 145, 147, the ALU 140, and the control unit 139 in the processor 105 working together to perform the fetch, decode, and execute cycles for every instruction in the instruction set for the noted segments of the program 133.
[0076] The methods described herein may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub-functions of the method. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.
[0077] Referring to Figure 2 there is shown a system diagram of a data extraction and prediction generation system. The system 200 includes a processing system 100 which can be provided in the form described in relation to Figures 1A and 1B. The system 200 further includes a camera 210 which is in direct or indirect communication with the processing system 100. The camera is located at a convenient position near the sporting event 299 to capture digital video of the sporting event 299. In particular, the camera can transfer the captured video to the processing system 100 via a computer network 220, such as a WAN like the Internet, or a LAN. Alternatively, the camera 210 may be directly in communication with the processing system 100 via a wired or wireless medium. In one preferred form, the camera 210 can be provided in the form of a GoPro Hero 7 which includes a microphone for receiving commands for recording video footage.
[0078] Referring to Figure 3 there is shown a flowchart representing a method 300 of extracting analytic data from a video of a sporting event 299 and determining one or more predictions in relation to the sporting event 299. The method 300 is performed by the processing system 100 described as part of the system 200 of Figure 2. The processing system 100 can perform the method 300 in response to execution of a computer program represented in the form of a plurality of executable instructions recorded on a computer readable medium such as a computer readable memory or the like.
[0079] At step 310, the method 300 includes receiving a video of the sporting event 299. The video may be received via a broadcast feed or a live stream.
[0080] At step 320, the method 300 includes extracting a background image from the video.
[0081] At step 330, the method 300 includes generating, using the background image, a plurality of moving masks for a plurality of frames of the video.
[0082] At step 340, the method 300 includes extracting, from the plurality of moving masks, analytic data related to at least a portion of the sporting event 299.
[0083] At step 350, the method 300 includes using the analytic data to generate a likelihood for one or more outcomes for the sporting event 299, or a portion of the sporting event 299.
[0084] The method 300 can include determining the background image repeatedly throughout the video of the sporting event 299. For example, after generating the moving mask for a predetermined number of frames of the video, the processing system 100 can repeat the determination of the background image such as to take into account varying environmental factors which may change over time.
[0085] Each moving mask can be generated by comparing each pixel value of a frame of the video with a corresponding pixel of one or more immediately prior frames of the video. The comparison results in a pixel value discrepancy. Based on the comparison for each pixel over the selection of immediately prior frames, the processing system 100 can generate the moving mask to include each pixel having a pixel value discrepancy exceeding a pixel value threshold. Thus, a pixel which has substantially no or little pixel discrepancy over the selection of frames (i.e. is static) is disregarded, and a pixel which has a pixel value discrepancy which exceeds a predetermined threshold stored in memory is indicative of movement in the frame and is therefore recorded as a pixel of the moving mask image for the frame.
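By way of illustration only, the per-pixel discrepancy comparison described above can be sketched as follows (a minimal sketch assuming greyscale frames represented as NumPy arrays; the threshold value of 25 is illustrative and not taken from the specification):

```python
import numpy as np

def discrepancy_mask(frame, prior_frames, threshold=25):
    """Mark a pixel as moving when its largest absolute value
    discrepancy against the corresponding pixel of the immediately
    prior frames exceeds the pixel value threshold."""
    current = frame.astype(np.int16)
    discrepancy = np.zeros(current.shape, dtype=np.int16)
    for prior in prior_frames:
        discrepancy = np.maximum(discrepancy,
                                 np.abs(current - prior.astype(np.int16)))
    return (discrepancy > threshold).astype(np.uint8)

# A single bright "player" pixel moves between two 4x4 greyscale frames.
prev = np.zeros((4, 4), dtype=np.uint8)
prev[1, 1] = 200
curr = np.zeros((4, 4), dtype=np.uint8)
curr[2, 2] = 200
mask = discrepancy_mask(curr, [prev])
```

Both the vacated position and the new position register as moving pixels, since each differs from the corresponding pixel of the prior frame.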
[0086] The method 300 can also include partitioning a playing area of the sporting event 299, which is captured within the video, into a plurality of partitions. This is best shown in Figure 5 by the partitioned layout 500. Once the playing area is partitioned, the method 300 can include identifying actions performed during the sporting event 299 using the moving masks. The analytic data extracted from the video can include each identified action being assigned to a particular partition of the playing area.
[0087] The analytic data can additionally or alternatively include parameters that are calculated or derived from the image processing of the video frames. In one form, the method 300 includes calculating participant parameters based on participant detection using the moving masks. Participants can include players of the sporting event 299. Such participant parameters include at least one of a position of the participant in a playing area of the sporting event 299, a speed of the participant, and an action type of the participant.
[0088] In an additional or alternate form, the method 300 includes calculating non-participant parameters based on moving non-participant object detection using the moving masks, wherein the analytic data includes the non-participant parameters. A moving non-participant of the sporting event 299 can include an object, such as a ball or the like, which moves during the sporting event 299. In particular embodiments, the non-participant parameters can include at least one of a position of the non-participant object and a speed of the non-participant object.
[0089] The method 300 includes the processing system 100 running a computerised model using the analytic data to determine the likelihood for the one or more outcomes of the sporting event 299, or a portion thereof. For example, the computerised model can be run based on historical data of the sporting event 299 in progress such as to determine the outcome to the sporting event 299 or a portion of the sporting event 299. For example, for a tennis match, the processing system 100 can be configured to run the computerised model using input data of the extracted analytic data to determine the winner of the tennis match, and/or potentially the winner of the next set, game, or point of the tennis match. Additionally, or alternatively, the processing system 100 can determine a next action that may be performed by a participant in the sporting event 299. For example, the processing system 100 is configured to determine whether a next shot played by a tennis player in a match is a forehand shot or a backhand shot based on the historical analytic data that has been extracted from the analysed video of the sporting event 299.
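The specification does not fix the form of the computerised model; purely as an illustration, analytic features could be mapped to an outcome likelihood with a logistic combination of weighted features (all feature names, weights, and the bias below are hypothetical):

```python
import math

def outcome_likelihood(features, weights, bias=0.0):
    """Map weighted analytic features to a likelihood in (0, 1)
    using a logistic function (one possible computerised model)."""
    score = bias + sum(weights[name] * value
                       for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical analytic data extracted for one tennis player.
features = {"first_serve_ratio": 0.65, "winners_per_game": 1.2}
weights = {"first_serve_ratio": 2.0, "winners_per_game": 0.5}
likelihood = outcome_likelihood(features, weights, bias=-1.0)
```

In practice the weights would be fitted against the saved historical data rather than chosen by hand.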
[0090] A more specific example data extraction and prediction generation method and system will now be described with reference to a tennis match.
[0091] The processing system 100 can be configured to extract data from a single fixed camera established at an elevated location on or near the rear fencing/boundary, behind the player closest to the camera. The camera is preferably located in the centre of the court behind the nearest player to the camera. The camera is located to capture the entire playing area and should provide video so that the furthest player (or players) is detectable. However, it will be appreciated that the extracted data can be obtained from multiple cameras.
[0092] Players' information and court information can be entered using an input device of the processing system 100 by the user. The processing system 100 is configured to save historical data indicative of calculated, extracted, estimated and predicted information of a match for use in predicting one or more outcomes associated with further matches which involve at least one of the players. The saved historical data can also relate to tracking of an object used in the sporting event 299, such as the tracking of a ball, and actions identified as being performed by each player, such as shots performed by each player. The historical data can be saved in a data store such as a database, for example SQLite.
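As a minimal sketch of such an SQLite data store (the table layout and column names below are illustrative only, not taken from the specification), Python's standard sqlite3 module could record identified shots per player:

```python
import sqlite3

# Illustrative historical-data store for identified actions.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE shot_history (
    match_id INTEGER, player TEXT,
    shot_type TEXT, court_partition INTEGER)""")
conn.execute("INSERT INTO shot_history VALUES (1, 'Player A', 'forehand', 3)")
conn.execute("INSERT INTO shot_history VALUES (1, 'Player A', 'backhand', 5)")
conn.commit()

# Historical per-shot-type counts for a player, usable as model input.
rows = conn.execute(
    "SELECT shot_type, COUNT(*) FROM shot_history "
    "WHERE player = 'Player A' GROUP BY shot_type").fetchall()
```

A file-backed database path would replace `":memory:"` in a deployed system so the history persists between matches.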
[0093] Image processing is performed upon frames of the video captured by the camera. The video may be provided in the form of an MP4 file. A background of each frame is extracted and updated, court coordination is detected, and movements are estimated.
[0094] Frames are captured from both MP4 files and camera sources. After capturing the frames, due to the different sizes that frames have under various camera settings, the processing system 100 may be configured, if required, to resize the frames to a standard size so that the processing cost is reasonable and their presentation is suitable for the screen.
[0095] In some implementations, it is preferable for the frames to be converted from RGB to greyscale to enable faster processing and performance. Therefore, image processing upon the frames can include constructing a greyscale frame for each RGB frame.
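The RGB-to-greyscale construction can be sketched as follows (a minimal sketch using the common ITU-R BT.601 luma weights, which is one conventional choice; the specification does not name a particular conversion):

```python
import numpy as np

def to_greyscale(rgb_frame):
    """Convert an RGB frame (H x W x 3) to a greyscale frame (H x W)
    using the BT.601 luma weights, rounding to the nearest intensity."""
    weights = np.array([0.299, 0.587, 0.114])
    return np.rint(rgb_frame[..., :3] @ weights).astype(np.uint8)

frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[0, 0] = (255, 255, 255)    # a white pixel maps to full intensity
grey = to_greyscale(frame)
```

Operating on the single-channel result roughly reduces per-pixel work for the subsequent mask generation.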
[0096] The captured frames are preferably presented on an output device of the processing system 100. The output of this function is illustrated in Figure 4. The interface 400 includes a first window 410 showing frames of the received video. The interface 400 also includes two further windows 420, 430 located adjacent the first window 410 including a second window 420 presenting a binary image indicative of detected moving objects extracted from the received video, and a third window 430 presenting a portion of the playing area captured by the video. In particular, in the example where the sporting event 299 is a tennis match, the third window 430 may present the court and a predetermined percentage, such as 20%, of the area outside the outer boundaries of the court. Other areas captured by the video, such as areas associated with spectators are not presented in the third window. In one form, the first window 410 is a main window of the interface 400, and the second and third windows 420, 430 are smaller windows compared to the first window 410.
[0097] The background image which is extracted from the video is an image without moving objects. For example, the background image includes an image of a tennis court without any player, ball, spectators, and so on. In one form, the background image is extracted from a frame when no match is held on the court and is used for all matches held on the court. In one form, an operator of the processing system 100 may provide input via the input device to indicate a time when the background image can be extracted from the video.
[0098] In some instances, using the same background image throughout the image processing of the video is less than ideal. For example, varying weather conditions, light conditions, and added static objects such as a bench near the playing area, may need to form part of an updated background image. Therefore, in one form the background image is extracted from the video in a periodic manner. For example, the background image can be extracted every 1000 or 2000 frames.
[0099] In order to perform the background extraction, the processing system 100 is configured to compare each pixel of a frame to subsequent or prior frames to determine whether the respective pixel is a static value over the plurality of frames. In some instances, the value associated with a pixel may vary slightly from frame to frame, but the value is within a specific tolerance. Therefore, the value of a pixel over the plurality of frames is compared, wherein in the event that the discrepancy does not exceed a pixel value discrepancy threshold, the respective pixel is determined as being static. A loop is performed by the processing system 100 over all pixels.
[00100]To enhance the performance, for each pixel a set of clusters from 0 to 255 are considered. While the system is working and by capturing each frame, cluster values should be updated. In addition, after a certain number of frames, for example 1000 or 2000 frames, the background should be updated and stored in memory of the processing system 100.
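The per-pixel cluster update described above can be sketched as follows (a minimal illustration in which the "clusters" are per-pixel value counts over 0-255 and the background estimate for each pixel is its most frequently observed value; the specification does not detail the exact clustering scheme):

```python
import numpy as np

class BackgroundModel:
    """Per-pixel histogram over intensity values 0-255; the
    background estimate is each pixel's most frequent value."""
    def __init__(self, shape):
        self.counts = np.zeros(shape + (256,), dtype=np.uint32)

    def update(self, grey_frame):
        # Increment the count of each pixel's observed intensity.
        ys, xs = np.indices(grey_frame.shape)
        self.counts[ys, xs, grey_frame] += 1

    def background(self):
        return self.counts.argmax(axis=-1).astype(np.uint8)

model = BackgroundModel((2, 2))
for value in (10, 10, 10, 200):    # one pixel flickers to 200 once
    frame = np.full((2, 2), 10, dtype=np.uint8)
    frame[0, 0] = value
    model.update(frame)
bg = model.background()
```

The transient value is outvoted by the stable one, so the flickering pixel still resolves to the static background intensity.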
[00101]At initialisation, the processing system 100 is configured to determine court coordination. This is performed in order to access the inside area of the court to estimate the current situation of the play and the ball, and in order to determine the effective playing area plus an area near the playing area. The other parts of the frames, which mostly belong to the spectators and the like, are not utilised by further image processing.
[00102]The processing system 100 is configured to calculate particular positions of the playing area. For example, for a tennis match the processing system 100 is configured to:
1. Perform edge detection by applying a Sobel filter to determine the edges of the court. The edges of the near baseline of the court (i.e. the baseline closest to the camera) can be detected using edge detection. The near baseline is determined according to the extracted edge. The result of this detection is shown in Figure 14A which shows the detected baseline. Other edges may also be detected, but the edge which is detected closest to the bottom of the image is identified as the baseline edge.
2. Based on the edge detection performed in step 1, the corners of the baseline are identified. The corners of the baseline are determined according to the edge extracted using the edge detection in step 1.
3. For each detected corner where a non-horizontal edge (i.e. side lines which are captured as being angled due to the perspective nature of the camera) extends from the baseline, a sub-frame or window of pixels is selected by the processing system 100 about each detected corner as shown in Figure 14B. The window can be a predefined size having predefined dimensions stored in memory. The window may have a square profile. The processing system 100 is then configured to select two or more pixels along the detected angled edge extending from the detected corner which is not horizontal (i.e. not part of the baseline). The processing system 100 then calculates a gradient or slope formed by a line extending between the selected pixels along the sidelines extending from the baseline. The line slope for each non-horizontal line (i.e. sideline) is calculated by the processing system 100 using the coordinates or positions of the selected pixels as well as the detected corner coordinate forming a sloped line relative to the baseline. In one form, as shown in Figure 14B, the first of the pixels selected is generally located near or approximate to the middle of the window and the second of the pixels selected is generally located at the upper edge of the window.
Other pixels of the non-horizontal lines are calculated using the first two determined corners and the line slopes.
4. The plurality of points along each sloped sideline is used to calculate the gradient or slope from step 3. Preferably, the gradient can be more accurately calculated using further detected points along the detected edge from step 1. In particular, as shown in Figure 14C, a plurality of further points can be selected by the processing system 100 from the estimated sloped lines. The exact positions in the edge image are estimated. Non-relevant points based on thresholds are neglected. A median point from the plurality of points is selected as part of the sloped sideline. The processing system 100 then uses this median point for improving the accuracy of the slope of the sideline.
5. For the farther baseline relative to the camera, identify a longest top horizontal edge line near to a top portion of the frame. The processing system 100 extends the top horizontal edge line to detect an intersection with the non-horizontal edges, wherein the intersection point corresponds to the far corners of the court.
6. Calculate a surface colour of the inside area within the baselines and side lines of the court using a histogram. The court borders are separated based on adjusted assigned colour values (e.g. red, green, and blue) for inside and outside areas separately.
7. Set the inside area based on the detected borders plus 20 percent around the detected borders as the effective area for image processing.
8. Partition the court and its outside area into separate partitions. An example of the partitioning 500 determined by the processing system 100 for a tennis court is shown in Figure 5.
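Steps 3 and 4 above can be illustrated as follows (a minimal sketch; the pixel coordinates and candidate slopes are hypothetical values, not taken from the figures):

```python
import statistics

def line_slope(p1, p2):
    """Gradient of the line through two pixel coordinates (x, y)."""
    (x1, y1), (x2, y2) = p1, p2
    return (y2 - y1) / (x2 - x1)

# Step 3: slope from the detected baseline corner and a pixel
# selected along the sloped sideline within the corner window.
corner = (100, 400)
edge_pixel = (120, 300)
slope = line_slope(corner, edge_pixel)

# Step 4: refine the estimate by taking the median over several
# candidate slopes obtained from further points along the edge,
# so that non-relevant outlier points are neglected.
candidate_slopes = [slope, -5.1, -4.9, -5.0]
refined = statistics.median(candidate_slopes)
```

The median makes the sideline estimate robust to the occasional mis-detected edge point.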
[00103]After court coordination has been performed, the processing system 100 is configured to perform detection of moving objects. The detection is performed using a pixel by pixel analysis. After constructing and updating the background image, the processing system 100 is configured to determine a moving mask by subtracting a grey scale version of the current frame from the background image. The moving mask is a binary image. Pixels which differ more than a dynamic threshold value are marked as moving pixels and form part of the moving mask. The dynamic threshold value is calculated based on a histogram of the current frame pixel values.
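The background subtraction with a dynamic threshold can be sketched as follows (a minimal sketch; the specification derives the threshold from a histogram of the current frame's pixel values, whereas here mean plus two standard deviations of the difference is used as an illustrative stand-in):

```python
import numpy as np

def background_subtraction_mask(grey_frame, background):
    """Subtract the background image from the current greyscale
    frame and binarise the absolute difference using a threshold
    derived dynamically from the difference statistics."""
    diff = np.abs(grey_frame.astype(np.int16) - background.astype(np.int16))
    threshold = diff.mean() + 2 * diff.std()
    return (diff > threshold).astype(np.uint8)

background = np.full((4, 4), 50, dtype=np.uint8)
frame = background.copy()
frame[1, 2] = 250          # one pixel changed by a moving object
mask = background_subtraction_mask(frame, background)
```

Only the pixel that differs strongly from the background survives the dynamic threshold; the static court pixels are suppressed.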
[00105] Post-processing techniques such as morphological operations are applied to the moving mask to enhance the quality of the detected portions of the moving mask. For example, the processing system 100 can be configured to apply one or more morphological operations to a binary image, in this case the moving mask, and a structuring element as input and combine them using a set operator (intersection, union, inclusion, complement). The one or more morphological operations process objects in the input image based on characteristics of its shape, which are encoded in the structuring element. In addition, a Median filter can be applied to frames. The Median filter is a nonlinear digital filtering technique, often used to remove noise from an image or signal.
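One common morphological post-processing step, an opening (erosion followed by dilation with a 3x3 square structuring element), can be sketched as follows to show how isolated noise pixels are removed from the moving mask (a minimal sketch; the specification does not name the specific operations or structuring element used):

```python
import numpy as np

def binary_opening(mask):
    """Opening with a 3x3 square structuring element: erosion
    removes isolated noise pixels, dilation restores the shape
    of the surviving moving regions."""
    h, w = mask.shape
    padded = np.pad(mask, 1)
    # Erosion: a pixel survives only if its whole 3x3 neighbourhood is set.
    windows = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    eroded = windows.min(axis=0)
    padded = np.pad(eroded, 1)
    # Dilation: a pixel is set if any 3x3 neighbour of the eroded mask is set.
    windows = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    return windows.max(axis=0)

mask = np.zeros((5, 5), dtype=np.uint8)
mask[0, 4] = 1          # isolated noise pixel
mask[1:4, 1:4] = 1      # solid 3x3 moving object
cleaned = binary_opening(mask)
```

The solid 3x3 object survives the opening intact while the lone noise pixel is erased.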
[00106]After post-processing, the processing system 100 is configured to generate analytics using the binary image. In this phase, operations such as player detection and tracking, ball detection and tracking, heat-map creation for each player and ball, ball and player speed calculation, action recognition (e.g. for tennis: serve, forehand, backhand, and volley), player hand detection (left-handed or right-handed), and score estimation are performed.
[00107]In order to detect the players in each frame, the processing system 100 is configured to:
1. Determine an initial position for a player. If the position of the player in previous frames is known, the new initial position is calculated based on a last detected position of the player, a distance between the last detected position of the player and the ball, and an angle between the last detected position of the player and the ball. If the position of the player in previous frames is unknown, determine the new initial position based on the end of the court (i.e. near or behind the baseline) associated with the respective player and the side of the court (i.e. left side, right side) depending on the state of the game and the type of game (i.e. doubles/singles, etc).
2. Determine a number of search centres around the determined initial position. The considered area includes all pixels around the initial position within a specific distance. The distance ranges from zero to a predefined threshold value which varies based on the distance from the camera. Therefore, it will be appreciated that the closer an area is relative to the camera, the larger the surface area which is covered by the search centre.
3. For each search centre, construct a boundary box so that the search centre is located in the middle of the respective box. Count a number of moving pixels within the boundary box and calculate a respective ratio of the number of moving pixels within the boundary box compared to a total number of pixels located inside the box. Maintain a list of the boundary boxes and associated moving pixel ratios.
4. Identify the boundary box having the largest moving pixel ratio as a player in the sporting event 299 and set in memory of the processing system 100 the corresponding search centre as the current player position.
5. Update the track of the player and increase the age of the track. If there is a gap between the current detected position and the previous positions in the last frames, create a new track with an age equal to one.
6. If no position was found in the previous step and a number of frames without any success in determination of player position is low, then calculate current position based on the previous values. Otherwise mark a flag to show the player has been lost.
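Steps 2 to 4 above can be sketched as follows (a minimal sketch using a fixed boundary-box size for all centres; in the described arrangement the search distance varies with distance from the camera):

```python
import numpy as np

def best_player_position(mask, centres, box=3):
    """For each search centre, count moving pixels inside a boundary
    box centred on it and keep the centre whose box has the largest
    ratio of moving pixels to total pixels."""
    half = box // 2
    best_centre, best_ratio = None, -1.0
    for (y, x) in centres:
        window = mask[max(0, y - half):y + half + 1,
                      max(0, x - half):x + half + 1]
        ratio = window.sum() / window.size
        if ratio > best_ratio:
            best_centre, best_ratio = (y, x), ratio
    return best_centre, best_ratio

mask = np.zeros((10, 10), dtype=np.uint8)
mask[5:8, 5:8] = 1                        # moving pixels of a player
centre, ratio = best_player_position(mask, [(2, 2), (6, 6)])
```

The centre whose box fully covers the moving blob wins, and would then be stored as the current player position.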
[00108]For several reasons, including a gap in the game, a fault of the system, or a player moving out of the camera view beyond a threshold period of time (e.g. 10 frames), the position of the player may be lost and thus the searching algorithm to detect one or more players is restarted. For this purpose, the player detection algorithm restarts to attempt detecting one or both players from their most probable position. This most probable position should be outside of the horizontal lines (i.e. baselines). For this purpose, the processing system 100 considers the following factors: the current state of the game; the player who is next to serve; the side (left or right) from which the player is next due to serve at the baseline; and the baseline (i.e. top or bottom) from which the serve is to be performed. Based on these factors, an initial location for the player is estimated by the processing system 100 and the searching algorithm is executed to search for one or more of the players within a search sub-frame about this initial search location.
[00109]A similar process to that performed in relation to player detection can be performed by the processing system 100 in relation to ball detection and tracking. When an initial position for the ball is to be determined, and the ball has not been detected in the previous frames, the processing system 100 scans the entire court for the ball. If the processing system 100 cannot detect the ball, the initial position for the ball is set based on a monitored state of play for the sporting event 299 (e.g. a player is serving, or he/she is hitting the ball). When the processing system 100 is scanning for a moving object in the binary image, the biggest object with a small number of true-value pixels is selected as the detected ball. Two nested windows around the centre pixels are considered as shown in Figure 15. A considerable proportion of pixels in the internal window should be true (i.e. moving pixels) and a considerable proportion of pixels in the external window should have false values (as background pixels). More specifically, a ball in a moving image (a binary image with a value of zero for static objects and one for moving objects) is a small area of one values which is surrounded by a plurality of zero values. For this purpose, the processing system 100 is configured to draw two windows around the nominated object. The inner window is defined to detect the small object and the outer window is defined to detect the non-moving area surrounding the moving object. For example, in the image shown in Figure 15, the candidate can be considered as a ball because a majority of the pixels in the inner window define a moving pixel and the pixels contained in the outer window which are not located within the inner window contain a minority or small section of moving pixels.
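The nested-window test described above can be sketched as follows (a minimal sketch; the window sizes and the "considerable proportion" thresholds of 0.5 and 0.2 are illustrative choices, not values from the specification):

```python
import numpy as np

def looks_like_ball(mask, y, x, inner=1, outer=3):
    """A candidate is ball-like when most pixels inside the inner
    window are moving (value one) and most pixels of the surrounding
    outer ring are static (value zero)."""
    inner_win = mask[y - inner:y + inner + 1, x - inner:x + inner + 1]
    outer_win = mask[y - outer:y + outer + 1, x - outer:x + outer + 1]
    ring_sum = outer_win.sum() - inner_win.sum()
    ring_size = outer_win.size - inner_win.size
    return inner_win.mean() > 0.5 and (ring_sum / ring_size) < 0.2

mask = np.zeros((11, 11), dtype=np.uint8)
mask[4:7, 4:7] = 1       # small moving blob centred at (5, 5)
is_ball = looks_like_ball(mask, 5, 5)
not_ball = looks_like_ball(mask, 9, 9)   # empty region
```

A large moving blob such as a player would fail the test because its moving pixels spill into the outer ring.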
[00110]An alternate method for detecting a ball is herein described with reference to Figure 16. The method 1600 will be described with reference to Figure 17 which shows a plurality of regions of a current frame.
[00111]At step 1605, the method 1600 includes applying a dilation operation to a current frame. This step is advantageous to remove small regions near bigger regions thereby improving the detection of an object of interest.
[00112]At step 1610, the method 1600 includes selecting an inner region 1710 (see Figure 17) of the current frame.
[00113]At step 1615, the method 1600 includes obtaining pixels in the current region of the current frame.
[00114]At step 1620, the method 1600 includes determining moving pixels in the current region of the current frame. The moving pixels are determined with respect to the same current region of a previously captured frame of the video footage. Each pixel of the region of the current frame is compared with the corresponding pixel of the region of the previous frame to determine a portion of the pixels which have changed between the frames, which is indicative of movement.
[00115]At step 1625, the method 1600 includes determining, based on the detected moving pixels in the current region and one or more object features, whether an object has been detected in the current region of the current frame.
[00116] The one or more object features for a tennis ball can include at least one of the colour, the size, and the shape of a tennis ball. It will be appreciated that other features for detecting an object of interest could be used.
[00117]In relation to the colour of a tennis ball, the processor determines that one or more moving pixels may be considered to be a detected tennis ball if they have RGB values within a range of (R=220, G=253, B=80). The range may be +/-10% or any other defined tolerance range stored in memory.
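The colour feature can be sketched as follows, using the reference values and the +/-10% per-channel tolerance given above (a minimal sketch; applying the tolerance as a per-channel fraction is one reading of the stated range):

```python
def matches_ball_colour(pixel, reference=(220, 253, 80), tolerance=0.10):
    """True when each RGB channel of the pixel lies within the given
    fractional tolerance of the reference tennis-ball colour."""
    return all(abs(p - r) <= r * tolerance
               for p, r in zip(pixel, reference))

hit = matches_ball_colour((225, 250, 85))    # near the reference colour
miss = matches_ball_colour((30, 60, 200))    # clearly a different colour
```

In a deployed system the tolerance value would be read from memory rather than hard-coded.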
[00118]In relation to the feature of size, the processor can be configured to determine whether there are other neighbouring moving pixels of a particular moving pixel which indicate a size of the object of interest. The size of the detected object is then compared to a threshold area to determine if a portion of the region satisfies the size feature.
[00119]In relation to the shape, the processor can be configured to use a Hough transform to determine if there is a circular region around the pixel. If there is a circular shape around the candidate pixel, the respective portion of the region satisfies the shape feature. It should be appreciated that the search is done for an approximately circular region, because when the ball moves it does not possess an exact circular shape.
[00120]At 1630, the method 1600 includes determining whether the object of interest has been detected. In particular, the one or more features can be weighted or combined to determine whether an object of interest, such as a tennis ball, has been detected within the current region of the current frame. Other detection techniques can be used, using various types of classification models, neural networks, etc. In the event that the object of interest has been detected, the method proceeds to step 1635. Otherwise, the method proceeds to step 1640.

At step 1635, the method 1600 includes returning a detected object position. In one form the detected position may be highlighted in the software for presentation within the software interface.
[00121] At step 1640, the method includes the processor determining whether further regions need to be searched. In particular, as shown in Figure 17, the frame may include a plurality of concentric regions, wherein the method 1600 starts with the inner concentric region. In the event that the object of interest is not detected in the inner-most concentric region, the method moves to step 1645 which includes obtaining the next most inner concentric region of the current frame. Thus, in the event that the object of interest is not detected in region 1710, the method proceeds to region 1720, then region 1730 and finally 1740. In the event that the object of interest is not detected in the outer-most region of the captured image (e.g. the loop is completed four times for the example shown in Figure 17), the method proceeds to step 1650 to return an indication that no object has been detected. In this case, the software interface may not present a detected object.
[00122]It is noted that it is advantageous to perform the searching process described in relation to Figure 16 starting at the inner-most concentric region and then proceeding to the next inner-most concentric region. This process has substantial processing advantages leading to quicker detection of the object of interest for events such as tennis where the ball spends a significant portion of time within the central portion of the captured frame. As processing can be performed in real-time, such processing advantages can be highly beneficial.
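The inside-out search order described above can be sketched as follows (a minimal illustration; the region labels and the detector function are hypothetical stand-ins for the per-region detection of steps 1615 to 1630):

```python
def search_concentric_regions(regions, detect):
    """Search the inner-most region first and expand outwards,
    returning the first detected position, or None when no region
    yields a detection (mirroring regions 1710 -> 1740 of Figure 17)."""
    for region in regions:
        position = detect(region)
        if position is not None:
            return position          # stop early: step 1635
    return None                      # searched all regions: step 1650

regions = ["inner", "ring2", "ring3", "outer"]
# Hypothetical detector: the ball is only found in the third region.
found = search_concentric_regions(
    regions, lambda r: (40, 60) if r == "ring3" else None)
missed = search_concentric_regions(regions, lambda r: None)
```

Because the ball spends most of its time near the centre of the frame, the early return in the inner regions is what yields the stated processing advantage.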
[00123] In one form, the detection process can be enhanced to take into account a potential position of an object of interest in consecutive frames. For example, the ball position should not change radically between consecutive frames. Therefore, the selected region of the frame may be adjusted to take into account the previously detected position. Alternatively, the previously detected position may be used to validate detected positions between consecutive frames.
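The validation alternative can be sketched as a simple plausibility check between consecutive detections. The maximum-jump threshold of 50 pixels is an assumed tuning value, not one given in the text:

```python
def validate_detection(prev_pos, new_pos, max_jump=50):
    """Accept a new detection only if it is plausibly close to the
    previous one; otherwise keep the last good position."""
    if prev_pos is None:
        return new_pos                   # nothing to validate against
    dy = new_pos[0] - prev_pos[0]
    dx = new_pos[1] - prev_pos[1]
    if (dy * dy + dx * dx) ** 0.5 <= max_jump:
        return new_pos
    return prev_pos                      # reject the implausible jump
```

Rejected detections keep the tracker anchored at the previous position until a consistent detection arrives in a later frame.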
[00124]The generation of analytics by the processing system 100 can further include the generation of one or more heat-maps used to visualize the frequency of a player or ball being in a particular portion of a playing area, as shown in Figure 6. Each heat map 600 is generated using 2D data (excluding the height dimension). For this purpose, a heat-map for the players and the ball in a current game is illustrated in the interface by small, colourful circles 610 drawn in corresponding locations as shown in Figure 6. Generally, for all current matches, the processing system 100 is configured to divide the court based on the partitioning of the playing area discussed above. For each partition, a number of times the player or the ball was detected as being located therein throughout the sporting event 299 is displayed as shown in Figure 7 by window 700.
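The per-partition counting behind the heat-map window 700 can be sketched as follows; the grid dimensions and court size are illustrative parameters, and positions are 2-D (x, y) court coordinates as described above:

```python
from collections import Counter

def partition_counts(positions, court_w, court_h, cols, rows):
    """Count how many detections fall in each court partition,
    for display as a heat-map (height dimension excluded)."""
    counts = Counter()
    for x, y in positions:
        # Clamp to the last cell so points on the far boundary count.
        col = min(int(x * cols / court_w), cols - 1)
        row = min(int(y * rows / court_h), rows - 1)
        counts[(row, col)] += 1
    return counts
```

The resulting counts per partition drive both the coloured circles 610 and the numeric display in window 700.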
[00125] The speeds of the players and the ball are calculated based on the distance the players and ball travel over time. The distance is calculated from when a shot is detected as being struck by a player until the ball bounces on the opposite side of the court. The time is calculated based on the number of frames between the detection of the ball being hit and the ball bouncing.
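The frame-count-to-time conversion can be sketched as below; the frame rate of 30 fps is an assumed default, not a value given in the text:

```python
def ball_speed(distance_m, frame_count, fps=30.0):
    """Average ball speed between the hit and the bounce, derived
    from the frame count and the capture frame rate."""
    elapsed = frame_count / fps          # seconds between the two events
    return distance_m / elapsed          # metres per second
```

For example, a ball travelling 20 metres over 15 frames at 30 fps covers the distance in half a second, giving 40 m/s.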
[00126] Action detection, such as detection of a serve or volley, is performed by the processing system 100 based on a current state of the match and a most recent event that has been detected by the processing system 100. In order to detect a backhand shot or a forehand shot, the processing system 100 is configured to count the number of moving pixels on the left and right sides of the player within a specific height range. The detected number of moving pixels can be compared to shot threshold values or ranges to determine a racket position and shot type.
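The left/right moving-pixel comparison can be sketched as below. This is an assumed simplification: the mapping from racket side to forehand/backhand would in practice depend on the player's handedness, and the motion threshold is an illustrative value:

```python
def classify_shot(mask, player_x, band_top, band_bot, threshold=40):
    """Classify a shot by counting moving-mask pixels on each side of
    the player within a height band.  `mask` is a 2-D list of 0/1
    moving-pixel values; the forehand/backhand labels here assume a
    right-handed player."""
    left = right = 0
    for row in mask[band_top:band_bot]:
        for x, v in enumerate(row):
            if v:
                if x < player_x:
                    left += 1
                else:
                    right += 1
    if max(left, right) < threshold:
        return None                      # too little motion to decide
    return "forehand" if right > left else "backhand"
```

The threshold plays the role of the shot threshold values mentioned above: below it, the motion is too weak to commit to a racket position.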
[00127]A similar process to that used for action detection can be used for determining whether a player is left handed or right handed. The process is performed by the processing system 100 on multiple occasions (e.g. 5 times) for each detected player to increase reliability, wherein the player is categorised as left handed or right handed based on multiple detection processes.
[00128] The processing system 100 is also configured to perform score estimation. A score assignment starts when a complete action has been identified. A complete action is a sequence of frames which starts and ends with idle phases (e.g. a number of consecutive frames where no movement of the ball is perceived). The first step searches for a "serve" event, recognized as the first stroke after an idle period that happens near the side line. Then, the ball trajectory is analysed until the end of the action, when a point is assigned to one of the players. A finite state machine (FSM) 800, as shown in Figure 8A, which embeds the rules of the game, is utilised for performing score estimation using detected actions. The finite state machine changes state if the ball follows a valid trajectory with respect to the rules of the game. When the FSM cannot reach another valid state in response to an event, the action is considered completed and a point is assigned. Particular attention should be given to the repetition of a serve (first or second), which is allowed only when the served ball touches the net and bounces inside a valid area of the court. In that case, the particular service should not count and the service needs to be repeated without cancelling any previous fault. It should be noted that net events are important only in this context; otherwise, they can safely be ignored when correctly assigning a score. Finite state machines that can be used for the described system and method are discussed in Bevc, M. (2015), Predicting the Outcome of Tennis Matches From Point-by-Point Data, the contents of which are hereby incorporated by reference in their entirety. The states are extracted from the events in the previous step by analysing both the type of events and the corresponding ball positions.
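The serve-repetition and fault rules above can be sketched as a small state machine. The event names (`"in"`, `"fault"`, `"let"`) and return values are illustrative, not the patent's FSM 800; in particular, a let retains any previous fault, as described:

```python
def serve_fsm(events):
    """Process a sequence of serve events and return the outcome:
    'rally' (ball in play), 'point_receiver' (double fault), or
    'pending' (serve sequence not yet resolved)."""
    faults = 0
    for ev in events:
        if ev == "let":
            continue                     # serve repeated; prior faults retained
        if ev == "in":
            return "rally"               # ball in play; point decided later
        if ev == "fault":
            faults += 1
            if faults == 2:
                return "point_receiver"  # double fault
    return "pending"
```

Note how `["fault", "let", "fault"]` still yields a double fault: the let repeats the service without cancelling the earlier fault, exactly the corner case highlighted above.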
[00130] The processing system 100 can perform predictive analysis based on the generated analytics. This phase can include predicting a next shot suggestion and a predicted outcome for the sporting event 299, or a portion thereof. Predictive analytics uses many techniques from data mining, statistics, modelling, machine learning, and artificial intelligence to analyse current data and make predictions about the future. The ultimate goal of predictive analytics in tennis is determining the winner of a match. Interim goals of predictive analytics in tennis include:
• Who will win the next point?
• Who will win the next game?
• Who will win the next set?
[00131] Figure 8B shows a schematic of a processing system 100 for performing predictive analysis. It will be appreciated that the processing system 100 may be a plurality of processing systems 100, such as a distributed processing system. Additionally or alternatively, the processing system 100 may be a cloud processing system and/or a local processing system located near the sporting event.
[00132] A tennis match consists of sets, which consist of games, which in turn consist of points. To win the match a player therefore has to win the sequence of points which yields the required number of games and sets. This structure makes it possible to model a match as a hierarchical Markov model, which consists of all possible Markov chains for a particular event. A Markov chain is a construction of a sequence of random variables which represent possible states of the modelled event. The transitions in the chain are the probability p of a player winning a point, based on a plurality of different factors including player rankings, court surface type, and so on. These factors and their calculation methods are described herein.
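A minimal sketch of the game-level Markov chain follows, assuming only the standard tennis scoring rules; the closed-form deuce expression is the well-known result for a biased alternating race and is not taken from the text:

```python
def game_win_prob(p, a=0, b=0):
    """Probability that the server wins the current game from point
    score (a, b), given point-win probability p.  The deuce region is
    handled by its closed form: p^2 / (p^2 + (1-p)^2)."""
    if a >= 3 and b >= 3:                # deuce / advantage region
        q = 1.0 - p
        return p * p / (p * p + q * q)
    if a == 4:
        return 1.0                       # game won by the server
    if b == 4:
        return 0.0                       # game won by the receiver
    # One transition of the chain: win or lose the next point.
    return p * game_win_prob(p, a + 1, b) + (1 - p) * game_win_prob(p, a, b + 1)
```

As the text notes, the chain gives equal win probability at 30-30 (score (2, 2)) and at Deuce, which the sketch reproduces; the same construction is scaled up for sets and the match, with transitions replaced by game-win and set-win probabilities.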
[00133] A graph representation of a tennis game can be modelled as a Markov chain. In a tennis game, a player has an equal probability of winning the game at the score 30-30 and at Deuce, as well as at 40-30 (30-40) and at Advantage. To represent the whole match, this model is scaled up to represent a set of games and a match of sets in equal fashion, with the exception that transitions then represent the probability of a player winning a game and a set respectively.
[00134] Considering this model relies solely on p to make predictions, it becomes important to estimate it accurately, or at least to estimate p - q correctly, wherein q is defined as the probability of player 2 (or team 2 in doubles matches) winning a point on serve, in the same way as p is for player 1. This is done by gathering historical data on overall match statistics and computing a plurality of factors based on this historical information and their values in the current match. Values p and q are calculated for the two players or teams based on the plurality of resulting factors at the end of each point. The plurality of factors are normalized and applied to Equation 1 and Equation 2 shown below. Final p and q values are normalized again in order to sum to 1. The equations shown below are based on 12 identified features. However, it will be appreciated that a different number of features may be used.
p = Σ_{i=1}^{12} p_i    (Equation 1)

q = Σ_{i=1}^{12} q_i    (Equation 2)
[00135] In order to determine how likely a player or a team is to win the next point, a plurality of features are utilized by a prediction module 820 operated by the processing system 100. These features are discussed below.
[00136] The processing system 100 includes an analytics extraction module 810 which performs the processes described above in relation to Figures 3 to 7. The analytics extraction module 810 receives video data 801, which may be provided in the form of a broadcast or a video stream. The analytics extraction module 810 extracts various statistics which are then stored locally in memory 106 and/or remotely in data store 190, such as a cloud data repository accessed via a wide area network such as the Internet. Control is passed to the prediction module 820 to perform a prediction based on the analytics extracted from the video 801. The prediction module 820 includes the model 800 used for generating the prediction. The prediction module 820 can obtain extracted analytics from the memory 106, the cloud data store 190, and/or other third party data stores 890 such as a ranking database or the like. The model 800 generates the prediction based on a variety of features as discussed below.
[00137] In one form, the plurality of features includes player rankings. For each player, rankings are calculated independently in two states: doubles matches and singles matches. The rankings of players may be obtained from a ranking resource, such as a ranking website, or from a database accessible to the processing system 100. Before starting a match, the retrieved ranking values are presented to the operator of the processing system 100. The operator decides whether to retain the retrieved rankings or to update the values. This strategy can be used for some or all factors. When ranking values for either players or teams are not clear, the effect of this factor on the final prediction can be ignored by the operator of the processing system 100.
[00138]In singles matches, when the ranking value for only one player is defined, the effect of this factor on the final prediction is calculated by the prediction module 820 based on the retrieved ranking value and according to a first set of ranking ranges which are stored in memory of the processing system 100:
Ranking of 1 - 10: 100%
Ranking of 11 - 20: 80%
Ranking of 21 - 40: 65%
Ranking of 41 - 70: 45%
Ranking of 71 - 100: 35%
Ranking of 101 - 125: 20%
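The first set of ranking ranges above can be expressed as a simple lookup. The function name and the choice to return `None` outside the tabulated ranges (so the factor can be ignored, as described) are illustrative:

```python
def single_ranking_factor(rank):
    """Map one known player ranking to its prediction weight using
    the first set of ranking ranges."""
    for upper, pct in ((10, 1.00), (20, 0.80), (40, 0.65),
                       (70, 0.45), (100, 0.35), (125, 0.20)):
        if rank <= upper:
            return pct
    return None   # outside the tabulated ranges; factor ignored
```

For example, a player ranked 15 maps to 80%, while a ranking beyond 125 contributes nothing to the prediction.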
[00139] In singles matches, when ranking values for both players are retrieved, the effect of this factor on the final prediction is calculated based on the difference of the ranking values and according to a second set of ranking ranges:
Player ranking 1 - 5: 70%.
Player ranking 6 - 20: 70% (excluding matches against players ranked 1 - 5); 30% chance to win against a player ranked 1 - 5.
Player ranking 21 - 50: 10% when playing against a player ranked 1 - 20; 60% against other players.
Player ranking 51 - 100: 10% when playing against a player ranked 1 - 50; 60% against other players.
Player ranking 101 - 200: 5% against a player ranked 1 - 50; 15% against a player ranked 51 - 100; 55% against others.
All other rankings against each other: 50%.
[00140] In doubles matches where ranking values for all players in the two teams are defined, the ranking value for each pair is defined as the sum of the individual rankings, and the effect of this factor on the final prediction is calculated by the prediction module 820 based on the difference of the pair ranking values and according to the second set of ranking ranges.
[00141] In doubles matches where the ranking value for only one player of a team is defined, the effect of this factor on the final prediction is calculated by the prediction module 820 based on this ranking value and according to the first set of ranking ranges.
[00142] In doubles matches where the ranking value for only one player of each team is defined, the effect of this factor on the final prediction is calculated by the prediction module 820 based on these two ranking values and according to the second set of ranking ranges.
[00143] In doubles matches where ranking values for only two players of one team are defined, the pair ranking value for the corresponding team equals their sum, and the effect of this factor on the final prediction is calculated by the prediction module 820 based on the first set of ranking ranges.
[00144] In doubles matches where ranking values for three players are defined, the pair ranking value for the team with two clear ranking values equals their sum, and for the team with only one clear ranking value it equals the sum of the clear ranking value and that same value for the other player. The effect of this factor on the final prediction is calculated by the prediction module 820 based on the difference of these ranking values and according to the second set of ranking ranges.
[00145] A further factor that can be considered is the court surface type, which can include, for example, clay, hardcourt, grass, and indoor. For each player, the probability equals the number of matches won by the player on a respective court surface type divided by the total number of matches played by the player on that surface type, as shown by Equation 3 below.
p2 = (Number of matches won on the corresponding court surface type) / (Total number of matches played on the corresponding court surface type)    (Equation 3)
[00146] When values for either players or teams are not clear, the effect of this factor on the final prediction is ignored. In doubles matches, the team factor value equals the average of the factor values for the corresponding players in that team. In doubles matches where the factor value for only one player is clear, that value is considered as the factor value for that team.
[00147] Another factor taken into consideration by the prediction module 820 is the serve rate. For each player, the serve rate equals the number of points won by that player with respect to the total number of points played when the player was the server. This is represented by Equation 4 below.
p3 = (Number of serves won) / (Total number of serves by the player)    (Equation 4)
[00148] When values for either players or teams are not clear, the effect of this factor on the final prediction is ignored. In doubles matches, the team serve rate value equals the average of the rate values for the corresponding players in that team, as represented by Equation 5 below.
p3 = (p31 + p32) / 2    (Equation 5)
[00149]In doubles matches, when rate value for only one player in a team is retrievable from a data source, that value is considered as the rate value for that team.
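Equations 4 and 5, together with the fall-back rules for missing values, can be sketched as below; the use of `None` to mark an unknown rate is an illustrative convention:

```python
def serve_rate(points_won_on_serve, total_serve_points):
    """Equation 4: fraction of points a player won while serving."""
    return points_won_on_serve / total_serve_points

def team_serve_rate(rates):
    """Equation 5 sketch: a doubles team's rate is the average of the
    known per-player rates; with only one known rate, that value is
    used directly, and with none the factor is ignored."""
    known = [r for r in rates if r is not None]
    if not known:
        return None
    return sum(known) / len(known)
```

With both rates known the team value is the plain average; with one unknown, the known rate stands in for the team, as paragraph [00149] describes.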
[00150] A further factor that can be used by the prediction module 820 is shot type, such as forehand, backhand, serve, and volley. At the end of each point, the shot type ratio is updated by the processing system 100. This can be performed for the winning team and for all shot types. For this purpose, and for each shot type, the number of shots of that type played by the current winner of the point is calculated. Its ratio to the rally number in the point is calculated, and the ratio over the number of points is calculated, as shown by Equations 6 and 7 below.
SC_i = (Number of shots of type i in the current point) / (Total number of shots in the current point), for i ∈ {F, B, S, V}    (Equation 6)

NewHis_SC_i = (His_SC_i × Number of shots of type i in history + SC_i × Number of shots of type i in the current point) / (Number of shots of type i in history + Number of shots of type i in the current point), for i ∈ {F, B, S, V}    (Equation 7)
[00151] A prediction possibility is calculated by the prediction module 820 using a linear weighted combination of each shot type ratio and the number of shots of each type in the current match, as shown below in Equation 8.
p4 = Σ_i SC_i × NewHis_SC_i, for i ∈ {F, B, S, V}    (Equation 8)
[00152] In doubles matches, the team shot calculation value equals the average of the rate values for the corresponding players in that team, as represented by Equation 9 shown below.
p4 = (p41 + p42) / 2    (Equation 9)
[00153]In doubles matches, when the rate value for only one player in a team is retrievable, that value is considered as the rate value for that team.
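The count-weighted history update of Equations 6 to 8 can be sketched as below; the tuple layout `(ratio, count)` for the stored history is an illustrative data structure:

```python
def update_history(hist_ratio, hist_count, cur_ratio, cur_count):
    """Equation 7 sketch: fold the current point's shot-type ratio
    into the historical ratio, weighted by shot counts."""
    total = hist_count + cur_count
    return (hist_ratio * hist_count + cur_ratio * cur_count) / total

def shot_type_factor(shot_counts, histories):
    """Equations 6 and 8 sketch: per-type current ratios combined
    linearly with the updated historical ratios.  `shot_counts` maps
    type -> shots this point; `histories` maps type -> (ratio, count)."""
    total = sum(shot_counts.values())
    factor = 0.0
    for t, n in shot_counts.items():
        ratio = n / total                           # Equation 6
        hist_ratio, hist_count = histories[t]
        new_hist = update_history(hist_ratio, hist_count,
                                  ratio, n)         # Equation 7
        factor += ratio * new_hist                  # Equation 8
    return factor
```

The update is a running count-weighted mean, so a point with many shots of one type shifts that type's historical ratio more than a point with few.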
[00154] Another factor to be considered by the prediction module 820 is player speed. Player speed is calculated for each player and each set separately. As such, each player may have between 3 and 5 speed values for a tennis match. Player speed is updated when the player is trying to connect with the ball, regardless of whether the attempt is successful. Player speed is updated when the player's distance from the ball is more than a predetermined threshold value, for example 2 metres. Player speed is calculated based on all distances the player has run so far in the current match and before, and based on the sum of the time travelled. When the player speed for a player is defined, the effect of this factor on the final prediction is calculated by the prediction module 820 according to a function 900 graphically represented in Figure 9 and as represented by Equation 9 below.
p5 = 0 if PS < A_i; 1 if PS > B_i; (PS - A_i) / (B_i - A_i) otherwise, for i ∈ {1..5}    (Equation 9)
[00155]In doubles matches, the effect of player speed is ignored.
[00156] Another factor which can be used by the prediction module 820 is player distance, which is calculated for each player over all singles matches in which he/she has participated so far. This distance value is used for statistical purposes. Player distance is also calculated for each player over a current tournament; this distance value is used for prediction. For this purpose, all the metres the player has run over the prior matches in the current tournament are used for calculating an estimation of the distance travelled by the player in the current match based on metres run so far. When the player distance for a player is defined, the effect of this factor on the final prediction is calculated by the prediction module 820 according to the function 1000 graphically represented in Figure 10.
p6 = 0 if PD < A; 1 if PD > B; (PD - A) / (B - A) otherwise    (Equation 10)
[00157]In doubles matches, effect of player distance is ignored.
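Both the player speed and player distance factors use the same ramp shape shown in Figures 9 and 10: zero below a lower threshold, one above an upper threshold, and linear in between. A minimal sketch, with the threshold values taken as assumed inputs:

```python
def clamp_linear(value, lo, hi):
    """Piecewise-linear normalisation matching the Figure 9/10 ramps:
    0 below `lo`, 1 above `hi`, and linear interpolation between."""
    if value < lo:
        return 0.0
    if value > hi:
        return 1.0
    return (value - lo) / (hi - lo)
```

The same helper applies to any of the thresholded factors described in this section, with per-factor (and, for speed, per-set) thresholds supplied by the prediction module.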
[00158] Another factor which is used by the prediction module 820 is shot placement. At the end of each point, the shot placement ratio is updated. This is done for the winning player or team and for all court areas. For this purpose, and for each area, the number of shots landing in that area in the current point won is calculated according to Equation 11. Its ratio to the rally number in the point is calculated, and the ratio over the number of points is calculated according to Equation 12.
SP_i = (Number of shots in the current point in the ith area) / (Total number of shots in the current point), for i ∈ Court Areas    (Equation 11)

NewHis_SP_i = (His_SP_i × Number of shots in history in the ith area + SP_i × Number of shots in the current point in the ith area) / (Number of shots in history in the ith area + Number of shots in the current point in the ith area), for i ∈ Court Areas    (Equation 12)
[00159] A prediction possibility is calculated by the prediction module 820 using a linear weighted combination of each court area placement value and the number of shots in each area in the current match. In doubles matches, the team shot placement value equals the average of the rate values for the corresponding players in that team, as represented by Equation 13.
p7 = Σ_i SP_i × NewHis_SP_i, for i ∈ Court Areas    (Equation 13)
[00160]In doubles matches, when the rate value for only one player in a team is able to be determined, that value is considered as the rate value for that team.
[00161]Another factor that can be used by the prediction module 820 is a player standing position. This can be calculated by the prediction module 820 in the same manner as shot placement discussed above.
[00162] Another factor that can be used by the prediction module 820 is shot speed. At the end of each point, the shot speed ratio is updated. This is done for the player of that point. For this purpose, the speeds of shots in the current point won, for each specific type (B/F/S/V), are calculated. Their ratios to the rally number in the point are calculated, and the ratio over the number of points is calculated. The speed of each shot equals the distance from its start point to its bouncing point divided by the travel time therebetween, which is calculated based on the number of frames in its duration and the known frame rate. A prediction possibility is calculated by the prediction module 820 based on the maximum speed from all shot speed values in the current point, using the function 1100 graphically represented by Figure 11 and represented by Equation 14.
p8_j = 0 if SS < A_j; 1 if SS > B_j; (SS - A_j) / (B_j - A_j) otherwise, for j ∈ {B, F, S, V}    (Equation 14)

p9 = Σ_j p8_j × NewHis_SS_j, for j ∈ {B, F, S, V}
[00163] The lower and upper threshold values shown in the graphical representation of the function in Figure 11 are calculated automatically by the prediction module 820 of the processing system 100. Initially, the thresholds begin at zero. In the beginning, when the chart parameters are not clear, the effect of this factor on the final prediction is ignored. In doubles matches, the effect of shot speed is ignored.
[00164] Another factor that can be used by the prediction module 820 is a rally number. The rally number is indicative of the number of balls hit in a point. A prediction possibility is calculated by the prediction module 820 based on the maximum possibility from the two corresponding probability values at each rally number, based on the function 1200 represented in Figure 12. The curve 1202 shows the possibility of being the loser and the curve 1204 presents this value for being the winner of the next point. Values A, A', B, B', C, C', D, D' should be calculated for each player based on their history and rally numbers in lost and won points. In doubles matches, the effect of rally number is ignored.
[00165] Another factor that can be used by the prediction module 820 is point winning shots, indicative of the number of specific shots resulting in the winning of points over the course of a match. For each player, the point winning shots prediction possibility equals the number of points which were won by a winning shot with respect to the total number of points won by the player. In doubles matches, the effect of point winning shots is ignored.
[00166] Another factor that can be used by the prediction module 820 is fatigue and integrity analysis. Low fatigue has a negligible effect on prediction. When it reaches an injury condition, the player's performance drops and it has a considerable effect on the prediction. Fatigue analysis can be ignored in the initial minutes of a match; for example, in the first thirty minutes of a match, data is gathered but does not contribute toward the prediction. Drops in the player speed, ball speed, and shot placement prediction factors indicate the existence of fatigue in a player or a team, or injury or lack of effort. The minimum value among these factors is considered as the fatigue level. The function 1300 as represented in Figure 13 can be used by the prediction module 820 to determine the probability for the winner, which is also represented by Equation 15.
p12 = 0 if FA < A; 1 if FA > B; (FA - A) / (B - A) otherwise    (Equation 15)
[00167]In doubles matches, effect of Fatigue Analysis is ignored.
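The fatigue factor combines a minimum over the drop-indicating factors with the ramp of Figure 13. A minimal sketch, with the threshold values treated as assumed inputs:

```python
def fatigue_probability(factors, lo, hi):
    """Fatigue factor sketch: the minimum of the player speed, ball
    speed, and shot placement factors gives the fatigue level, which
    is then mapped through the Figure 13 ramp (0 below `lo`, 1 above
    `hi`, linear in between)."""
    fa = min(factors)
    if fa < lo:
        return 0.0
    if fa > hi:
        return 1.0
    return (fa - lo) / (hi - lo)
```

Taking the minimum means that a drop in any single factor, such as a sudden fall in player speed after an injury, is enough to pull the fatigue level down.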
[00168]It will be appreciated that a preferred method and system of extracting analytic data from a video of a tennis match and determining one or more predictions in relation to the tennis match has been described above. However, it will be appreciated that this embodiment can be extended to other sporting events.
[00169]The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.
[00170] In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.

Claims (28)

CLAIMS:
1. A method of extracting analytic data from a video of a sporting event and determining one or more predictions in relation to the sporting event, wherein the method includes steps of: receiving a video of the sporting event; extracting a background image from the video; generating, using the background image, a plurality of moving masks for a plurality of frames of the video; extracting, from the plurality of moving masks, analytic data related to at least a portion of the sporting event; and using the analytic data to generate a likelihood for one or more outcomes for the sporting event, or a portion of the sporting event.
2. The method according to claim 1, wherein the method includes: determining the background image repeatedly throughout the video of the sporting event.
3. The method according to claim 1, wherein each moving mask is generated by comparing each pixel value of a frame of the video with a corresponding pixel of one or more immediately prior frames of the video, wherein based on the comparison, the moving mask for the frame includes each pixel having a pixel value discrepancy which exceeds a pixel value threshold.
4. The method according to any one of claims 1 to 3, wherein the method includes converting colour frames of the video to greyscale.
5. The method according to any one of claims 1 to 4, wherein the method includes: partitioning a playing area of the sporting event, which is captured within the video, into a plurality of partitions; and identifying actions performed during the sporting event using the moving masks; wherein the analytic data extracted from the video includes each identified action which is assigned to have been performed in a partition of the playing area.
6. The method according to any one of claims 1 to 5, wherein the method includes at least one of: calculating participant parameters based on participant detection using the moving masks, wherein the analytic data includes the participant parameters; and calculating non-participant parameters based on moving non-participant object detection using the moving masks, wherein the analytic data includes the non-participant parameters.
7. The method according to claim 6, wherein the participant parameters include at least one of a position of the participant in a playing area of the sporting event, a speed of the participant, and action type.
8. The method according to claim 6 or 7, wherein the non-participant parameters include at least one of a position of the non-participant object and a speed of a non-participant object.
9. The method according to any one of claims 1 to 8, wherein the method includes running a computerised model using the analytic data to determine the likelihood for the one or more outcomes of the sporting event, or a portion thereof.
10. A processing system for extracting analytic data from a video of a sporting event and determining one or more predictions in relation to the sporting event, wherein the processing system is configured to: receive a video of the sporting event; extract a background from the video; determine, using the background, a plurality of moving masks for a plurality of frames of the video; extract, from the plurality of moving masks, analytic data related to at least a portion of the sporting event; and use the analytic data to generate a likelihood for one or more outcomes for the sporting event, or a portion of the sporting event.
11. The processing system according to claim 10, wherein the processing system is configured to determine the background image repeatedly throughout the video of the sporting event.
12. The processing system according to claim 11, wherein each moving mask is generated by comparing each pixel value of a frame of the video with a corresponding pixel of one or more immediately prior frames of the video, wherein based on the comparison, the moving mask for the frame includes each pixel having a pixel value discrepancy which exceeds a pixel value threshold.
13. The processing system according to any one of claims 10 to 12, wherein the processing system is configured to convert colour frames of the video to greyscale prior to determining the moving masks.
14. The processing system according to any one of claims 10 to 13, wherein the processing system is configured to: partition a playing area of the sporting event, which is captured within the video, into a plurality of partitions; and identify actions performed during the sporting event using the moving masks; wherein the analytic data extracted from the video includes each identified action which is assigned to have been performed in a partition of the playing area.
15. The processing system according to any one of claims 10 to 14, wherein the processing system is configured to: calculate participant parameters based on participant detection using the moving masks, wherein the analytic data includes the participant parameters; and/or calculate non-participant parameters based on moving non-participant object detection using the moving masks, wherein the analytic data includes the non-participant parameters.
16. The processing system according to claim 15, wherein the participant parameters include at least one of a position of the participant in a playing area of the sporting event, a speed of the participant, and action type.
17. The processing system according to claim 15 or 16, wherein the non-participant parameters include at least one of a position of the non-participant object and a speed of a non-participant object.
18. The processing system according to any one of claims 10 to 17, wherein the processing system is configured to run a computerised model using the analytic data to determine the likelihood for the one or more outcomes of the sporting event, or a portion thereof.
19. A system including: a processing system configured according to any one of claims 10 to 18; and a camera for capturing the video; wherein the processing system is configured to receive the video from the camera.
20. A non-transitory computer readable medium, having executable instructions stored therein or thereon, wherein execution of the executable instructions by a processor of a processing system causes the processing system to: receive a video of a sporting event; extract a background from the video; determine, using the background, a plurality of moving masks for a plurality of frames of the video; extract, from the plurality of moving masks, analytic data related to at least a portion of the sporting event; and use the analytic data to generate a likelihood for one or more outcomes for the sporting event, or a portion of the sporting event.
21. The non-transitory computer readable medium according to claim 20, wherein execution of the executable instructions configures the processor to determine the background image repeatedly throughout the video of the sporting event.
22. The non-transitory computer readable medium according to claim 21, wherein execution of the executable instructions configures the processor to generate each moving mask by comparing each pixel value of a frame of the video with a corresponding pixel of one or more immediately prior frames of the video, wherein based on the comparison, the moving mask for the frame includes each pixel having a pixel value discrepancy which exceeds a pixel value threshold.
23. The non-transitory computer readable medium according to any one of claims 20 to 22, wherein execution of the executable instructions configures the processor to convert colour frames of the video to greyscale prior to determining the moving masks.
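Claims 21 to 23 together describe greyscale conversion followed by a per-pixel comparison against the immediately prior frame, with pixels whose value discrepancy exceeds a threshold forming the moving mask. A minimal sketch of that scheme follows; the BT.601 luma weights for greyscale conversion and the threshold value of 25 are illustrative assumptions, since the claims do not fix either:

```python
import numpy as np

def moving_mask(prev_frame, frame, threshold=25):
    """Boolean moving mask via greyscale frame differencing.

    prev_frame, frame: H x W x 3 RGB arrays for consecutive video frames.
    A pixel belongs to the mask when its greyscale value differs from the
    corresponding pixel of the immediately prior frame by more than the
    threshold. Illustrative sketch only.
    """
    def to_grey(rgb):
        # Luma-weighted greyscale conversion (BT.601 coefficients).
        return rgb.astype(float) @ np.array([0.299, 0.587, 0.114])
    diff = np.abs(to_grey(frame) - to_grey(prev_frame))
    return diff > threshold
```

In practice the background extracted per claim 20 (and refreshed per claim 21) would also be differenced against each frame, so that slowly moving participants are not lost when consecutive frames barely differ.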
24. The non-transitory computer readable medium according to any one of claims 20 to 23, wherein execution of the executable instructions configures the processor to: partition a playing area of the sporting event, which is captured within the video, into a plurality of partitions; and identify actions performed during the sporting event using the moving masks; wherein the analytic data extracted from the video includes each identified action which is assigned to have been performed in a partition of the playing area.
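Claim 24 (like claim 5's method counterpart) assigns each identified action to a partition of the playing area. One possible partitioning, offered only as a sketch, is a regular grid over the playing area in image coordinates; the 3 x 2 grid shape is an assumption, as the claims leave the number and shape of partitions open:

```python
def assign_partition(x, y, width, height, cols=3, rows=2):
    """Return the grid-partition index for an action detected at (x, y).

    The playing area is taken to span width x height pixels and is split
    into a cols x rows grid, indexed row-major from the top-left.
    Illustrative sketch only; the grid shape is an assumption.
    """
    col = min(int(x * cols / width), cols - 1)
    row = min(int(y * rows / height), rows - 1)
    return row * cols + col
```

A broadcast-camera deployment would first map pixel coordinates onto the playing area with a homography before indexing partitions, since the playing area is rarely axis-aligned in the frame.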
25. The non-transitory computer readable medium according to any one of claims 20 to 24, wherein execution of the executable instructions configures the processor to: calculate participant parameters based on participant detection using the moving masks, wherein the analytic data includes the participant parameters; and/or calculate non-participant parameters based on moving non-participant object detection using the moving masks, wherein the analytic data includes the non-participant parameters.
26. The non-transitory computer readable medium according to claim 25, wherein the participant parameters include at least one of a position of the participant in a playing area of the sporting event, a speed of the participant, and an action type.
27. The non-transitory computer readable medium according to claim 25 or 26, wherein the non-participant parameters include at least one of a position of the non-participant object and a speed of the non-participant object.
28. The non-transitory computer readable medium according to any one of claims 20 to 27, wherein execution of the executable instructions configures the processor to run a computerised model using the analytic data to determine the likelihood for the one or more outcomes of the sporting event, or a portion thereof.
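Claims 18 and 28 recite a computerised model for determining the outcome likelihood without fixing a model family. As an illustrative stand-in only, a logistic score over analytic-data features (for example speeds, positions and per-partition action counts) would yield a likelihood in [0, 1]; the feature set and weights below are assumptions, not disclosed values:

```python
import math

def outcome_likelihood(features, weights, bias=0.0):
    """Map analytic-data features to a likelihood in [0, 1].

    A minimal logistic-score stand-in for the claimed computerised
    model; the claims do not fix the model family, features or weights.
    """
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))
```

Any trained classifier producing a calibrated probability would satisfy the same role; the logistic form is chosen here purely for brevity.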
EDWORKZ PTY LTD Patent Attorneys for the Applicant/Nominated Person SPRUSON & FERGUSON
AU2021204421A 2021-04-27 2021-06-28 Data extraction and prediction generation method and system for sporting events Abandoned AU2021204421A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2021901238A AU2021901238A0 (en) 2021-04-27 Data extraction and prediction generation method and system for sporting events
AU2021901238 2021-04-27

Publications (1)

Publication Number Publication Date
AU2021204421A1 true AU2021204421A1 (en) 2022-11-10

Family

ID=83902072

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021204421A Abandoned AU2021204421A1 (en) 2021-04-27 2021-06-28 Data extraction and prediction generation method and system for sporting events

Country Status (1)

Country Link
AU (1) AU2021204421A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020030542A1 (en) * 2018-08-07 2020-02-13 Wingfield GmbH Game monitoring


Legal Events

Date Code Title Description
MK5 Application lapsed section 142(2)(e) - patent request and compl. specification not accepted