GB2567800A - Detecting ball strike position - Google Patents

Detecting ball strike position

Info

Publication number
GB2567800A
GB2567800A GB1713818.1A GB201713818A
Authority
GB
United Kingdom
Prior art keywords
ball
strike
analysis system
images
playing surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1713818.1A
Other versions
GB201713818D0 (en)
Inventor
Jonathon Penzik Dov
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Social Entertainment Ventures Ltd
Original Assignee
Social Entertainment Ventures Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Social Entertainment Ventures Ltd filed Critical Social Entertainment Ventures Ltd
Priority to GB1713818.1A priority Critical patent/GB2567800A/en
Publication of GB201713818D0 publication Critical patent/GB201713818D0/en
Publication of GB2567800A publication Critical patent/GB2567800A/en
Withdrawn legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021 Tracking a path or terminating locations
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0087 Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B43/00 Balls with special arrangements
    • A63B43/008 Balls with special arrangements with means for improving visibility, e.g. special markings or colours
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B67/00 Sporting games or accessories therefor, not provided for in groups A63B1/00 - A63B65/00
    • A63B67/04 Table games physically beneficial for the human body, modelled on outdoor sports, e.g. table tennis
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/38 Training appliances or apparatus for special sports for tennis
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/251 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47B TABLES; DESKS; OFFICE FURNITURE; CABINETS; DRAWERS; GENERAL DETAILS OF FURNITURE
    • A47B25/00 Card tables; Tables for other games
    • A47B25/003 Card tables; Tables for other games for table tennis
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021 Tracking a path or terminating locations
    • A63B2024/0037 Tracking a path or terminating locations on a target surface or at impact on the ground
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021 Tracking a path or terminating locations
    • A63B2024/0037 Tracking a path or terminating locations on a target surface or at impact on the ground
    • A63B2024/0043 Systems for locating the point of impact on a specific surface
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0658 Position or arrangement of display
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2102/00 Application of clubs, bats, rackets or the like to the sporting activity; particular sports involving the use of balls and clubs, bats, rackets, or the like
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/806 Video cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/808 Microphones
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30221 Sports video; Sports image
    • G06T2207/30224 Ball; Puck

Abstract

An apparatus for detecting ball strike position in table tennis comprises: one or more cameras 170 to generate images of a table tennis ball 150 moving above a playing surface 102; one or more microphones 180 that generate one or more audio signals containing strike sounds of the ball 150 hitting the surface 102 and an analysis system that determines, from the images, the position 104 of the ball coincident with a strike sound. The apparatus preferably includes a table tennis table 100 and the ball 150 may be detected in the images based on the hue of the ball, the size of the ball or an expected position of the ball. The apparatus may include a UV light to illuminate the playing surface and a display apparatus 140 to display visual elements on the playing surface.

Description

The present invention relates to detecting positions of ball strikes, for example the strike of a table tennis ball or similar on a playing surface.
Background of the invention
Table tennis is a widely popular game played around the world. It is played by professionals, amateurs, and casual players alike due to its accessibility. In recent years, there has been a rise in the playing of table tennis in more casual environments, such as bars or clubs, where table tennis tables are provided as part of a social experience.
The traditional rules of table tennis are dependent on being able to identify impacts or strikes of the table tennis ball upon the table tennis table surface. Given the typically fast paced nature of the game, and high speed of the table tennis ball, trained referees are typically used in higher level competitions to score and adjudicate the game. Such referees, however, are still capable of making mistakes, especially when it comes to the impact positions of the table tennis ball, which can prove to be important, for example in doubles games. In casual play, it is typically not feasible to use trained referees and instead the players themselves are usually relied upon to score the game.
The rise of casual play, in venues such as bars or clubs, has also led to greater interest in extending the game of table tennis, such as through variants of the rules and/or scoring system to better suit the casual nature of the environment. Further automated functionality can also be desirable, such as the provision of visual effects and other functionality reactive to the progress of the game.
To this end it has been recognised that “smart” table tennis tables capable of determining the impact positions of the ball automatically, through electronic means, are desirable. An example of this is given in FR3018036, which seeks to provide a smart table capable of identifying the impact positions of the ball through the use of piezoelectric vibration sensors incorporated in the underside of the table itself. In this document, three piezoelectric vibration sensors are used, allowing the position of the impact of the ball to be determined through a process of triangulation.
The invention addresses limitations of the related prior art.
Summary of the invention
The invention provides a method of automatically determining strike (or bounce or impact) locations of a playing object such as a table tennis ball from in-flight tracking (or determination) of ball position by one or more cameras, in combination with audio identification of strike events.
This advantageously provides improved determination of strike locations of table tennis balls on a playing surface of a table where the tables themselves may be the subject to external sources of vibration, such as loud music, jostling by players or spectators, and impacts from other objects.
According to one aspect, the invention provides an analysis system configured to: receive a sequence of images containing a ball moving above a playing surface; receive at least one audio signal containing one or more strike sounds of the ball hitting the surface; and determine, from at least one of the images, a position of the ball coincident with a strike sound in the audio signal, to thereby determine a strike location of the ball on the playing surface.
There is also provided, according to a further aspect, apparatus comprising: a playing surface; one or more imaging devices (such as cameras or video cameras); one or more microphones; and an analysis system (or analyzer). The cameras and/or the microphones may be connected to the analysis system via one or more wireless connections. The one or more cameras are arranged to generate images containing a ball (such as a table tennis ball) moving above the playing surface. The one or more cameras may be high frame rate cameras, such as cameras generating images at or above 120 frames per second. The one or more microphones are arranged to generate one or more audio signals containing strike (or impact) sounds of the ball hitting the surface. The analysis system is arranged to determine, from the images, a position of the ball coincident with a strike sound in the audio signals. The position of the ball may be determined by the analysis system tracking the position of the ball through the images in real time. Typically the analysis system, in determining the position of the ball, is arranged to use image processing techniques, such as edge detection, to identify the ball in the image. In this way a strike location of the ball on the playing surface is determined.
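By way of illustration only, the time-alignment at the heart of this aspect, choosing the tracked ball position recorded nearest in time to an identified strike sound, might be sketched as follows; the names and data structures here are hypothetical and do not form part of the claimed apparatus:

```python
import bisect

def strike_location(track, strike_time):
    """Pick the (x, y) ball position whose frame timestamp is closest
    to the time of the identified strike sound.  `track` is a list of
    (timestamp, x, y) tuples sorted by timestamp."""
    times = [t for t, _, _ in track]
    i = bisect.bisect_left(times, strike_time)
    # The nearest frame is either the one just before or just after
    # the strike time; compare whichever of the two candidates exist.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(track)]
    best = min(candidates, key=lambda j: abs(track[j][0] - strike_time))
    _, x, y = track[best]
    return (x, y)

# Ball tracked at roughly 120 fps (a frame every ~8 ms); a strike sound
# at t = 0.016 s selects the frame captured at t = 0.017 s:
track = [(0.000, 0.10, 0.50), (0.008, 0.20, 0.30), (0.017, 0.30, 0.10)]
print(strike_location(track, 0.016))
```

In a real system the track would be produced by the image processing described below and the strike time by processing of the audio signals.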
In some examples the microphones are arranged to only generate audio signals while the analysis system is tracking the table tennis ball. Additionally, or alternatively, the one or more microphones may be arranged to selectively filter for sounds within a predetermined frequency range, for example about the frequency of an expected strike sound (or comprising a characteristic frequency of the expected strike sound). In particular, the one or more microphones may be arranged to preferentially filter for sounds within a frequency range of 1 to 2.5 kHz.
This may advantageously improve the accuracy of the apparatus by ignoring any possible strike sounds that may occur when the ball is not in flight above the table and/or removing background noise allowing strike sounds present to be identified more reliably.
In some examples at least one of the microphones may be located under the table comprising the playing surface. In particular, the microphones may be attached (or attachable) to the underside of the table.
In some examples the analysis system is arranged to determine the position of the ball based on one or more of: an expected hue of the ball in the images; an expected size of the ball in the images; and an expected position of the ball in the images.
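A minimal sketch of detection based on an expected hue and expected size, using hypothetical names and a toy image, might look like the following; a practical implementation would typically use a computer vision library rather than per-pixel loops:

```python
def find_ball(hues, expected_hue, hue_tol, min_px, max_px):
    """Locate a ball candidate in an image given as a 2-D list of hue
    values (0-179, the OpenCV convention).  Pixels within hue_tol of
    the expected hue are collected; the blob is accepted only if its
    pixel count matches the expected apparent size of the ball.
    Returns the blob centroid, or None if no plausible ball is found."""
    matches = [(x, y)
               for y, row in enumerate(hues)
               for x, h in enumerate(row)
               if abs(h - expected_hue) <= hue_tol]
    if not (min_px <= len(matches) <= max_px):
        return None          # blob too small or too large to be the ball
    cx = sum(x for x, _ in matches) / len(matches)
    cy = sum(y for _, y in matches) / len(matches)
    return (cx, cy)

# A 4x4 image with a 2x2 orange (hue ~15) "ball" on a blue background:
img = [[120, 120, 120, 120],
       [120,  15,  16, 120],
       [120,  14,  15, 120],
       [120, 120, 120, 120]]
print(find_ball(img, expected_hue=15, hue_tol=5, min_px=3, max_px=8))
```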
In some examples the analysis system is arranged to determine the strike location of the ball on the playing surface by mapping the determined position of the ball in the image to the strike location of the ball on the playing surface, using the angle of view of the camera and the position of the camera relative to the playing surface.
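For example, for an idealised camera looking straight down at the playing surface, the mapping from image position to surface position can be derived from the angle of view and the camera height alone. The following sketch assumes this simplified geometry and illustrative parameter values; a real installation with an angled camera would need a full perspective mapping:

```python
import math

def pixel_to_table(px, py, img_w, img_h, cam_height, fov_h_deg, fov_v_deg):
    """Map a pixel coordinate to surface coordinates in metres, with the
    origin at the point directly beneath an idealised downward-looking
    camera mounted cam_height metres above the playing surface."""
    # Half-extent of the surface area visible along each image axis.
    half_w = cam_height * math.tan(math.radians(fov_h_deg) / 2)
    half_h = cam_height * math.tan(math.radians(fov_v_deg) / 2)
    # Normalise the pixel position to [-1, 1] about the image centre.
    nx = (px - img_w / 2) / (img_w / 2)
    ny = (py - img_h / 2) / (img_h / 2)
    return (nx * half_w, ny * half_h)

# Camera 2 m above the surface with a 90 degree horizontal angle of view;
# the right-hand edge of the image maps to 2 m from the camera axis:
x, y = pixel_to_table(1920, 540, 1920, 1080, cam_height=2.0,
                      fov_h_deg=90, fov_v_deg=59)
print(round(x, 3), round(y, 3))
```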
The apparatus may further comprise a UV light arranged to illuminate the area directly above the table tennis playing surface to enable a UV fluorescent table tennis ball to fluoresce. This may aid the determination of the position of the ball by increasing the contrast of the ball in the image.
In some examples the apparatus may comprise display apparatus arranged to cause one or more visual elements to be displayed in dependence on the strike location determined by the analysis system. The display may be arranged to cause at least one of the visual elements to be displayed on the playing surface at, proximal to, or dependent upon, the strike location determined by the analysis system. Additionally, or alternatively, the display (or a further display of the apparatus) may be arranged to cause a visual element to be displayed on the playing surface, wherein the analysis system is further arranged to cause a gameplay action to be triggered upon determining a strike location of the ball on the playing surface coinciding with the position of the displayed visual element.
In some examples the analysis system may be arranged to track, based on determined strike positions, any one or more of the following: a score for a game; the number of consecutive shots in a rally; and the number of rallies in a game. The analysis system may be arranged to output any of: a score for a game; the number of consecutive shots in a rally; and the number of rallies in a game, using (or via) the display apparatus.
There is also provided a kit of parts for apparatus such as those described above. The kit of parts comprises one or more video cameras; one or more microphones; and an analysis system.
The invention also provides a method of operating an analysis system of apparatus such as those described above. The method comprises: receiving images containing a table tennis ball moving above a playing surface; receiving one or more audio signals containing strike sounds of the ball hitting the surface; and determining, from the images, a position of the ball coincident with a strike sound in the audio signals, to thereby determine a strike location of the ball on the playing surface.
The invention provides methods of operation of apparatus such as those described above. The invention also provides one or more computer programs suitable for execution by one or more processors, such computer program(s) being arranged to put into effect the methods outlined above and described herein. The invention also provides one or more computer readable media, and/or data signals, carrying such computer program(s).
Brief description of the drawings
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 illustrates an embodiment of the invention in use for a table tennis gameplay scenario involving two or more players;
Figure 2 illustrates a schematic diagram showing in more detail how the arrangement of figure 1 may be implemented;
Figure 3 schematically illustrates a logical arrangement for an example analysis system, such as that which may be used in the systems of figure 1 or 2;
Figure 4 schematically illustrates an example of a computer system, such as that which may be used in the systems of figures 1, 2 or 3.
Detailed description of embodiments of the invention
In the description that follows and in the figures, certain embodiments of the invention are described. However, it will be appreciated that the invention is not limited to the embodiments that are described and that some embodiments may not include all of the features that are described below. It will be evident, however, that various modifications and changes may be made herein without departing from the broader spirit and scope of the invention as set forth in the appended claims.
Figure 1 illustrates an embodiment of the invention in use for a table tennis gameplay scenario involving two or more players 110; 112. During the gameplay scenario, each player 110; 112 typically takes it in turns to hit (or strike) a ball 150 towards the other player 110; 112 across a table tennis table 100.
Once struck, the ball 150 follows a flight path or trajectory 155. As part of the flight path 155 the ball 150 may strike (or impact or hit) the upper surface 102 of the table 100 one or more times at one or more respective strike positions 104. The upper surface 102 of the table 100 (typically referred to as the playing surface) is usually divided into two halves (or zones) by a net 130 stretched across the table 100 between the two players 110; 112. The particular rules of the game of table tennis being played will dictate the number of strikes and the respective strike positions 104 required for a legal shot to be played. In the example scenario shown in figure 1 the player 112 has just served the ball 150, which in the traditional rules of table tennis requires the ball to strike the surface 102 of the table 100 only once at a strike position 104 in the half nearest to the player 112, and then at least once at a strike position in the other half.
Also shown in figure 1 is at least one video camera 170 positioned above the table 100. The video camera 170 is arranged to film (or record) the ball 150 moving above the playing surface 102. In this way, the video camera 170 is arranged to generate a plurality (or sequence) of images. Each of these images will typically contain at least a part of the playing surface 102, and will also show the ball 150 when it is in view.
One or more microphones 180 are also provided to record the sound of the ball 150 striking the playing surface 102 of the table 100. In particular, the microphones 180 are arranged to generate one or more audio signals containing strike sounds of the ball 150 hitting the playing surface 102 of the table 100. The microphones may be positioned above, below, around, and/or in contact with the playing surface 102 or some other surface or structure of the table tennis table 100, depending on convenience of positioning, for example to avoid interference with game play or unwanted tampering with the microphones 180, and to ensure that the one or more audio signals adequately comprise the strike sound for subsequent determination of strike positions 104 on the playing surface 102 using the image data, as described in more detail below. Typically, the one or more microphones 180 are attached underneath (or to the underside of) the table 100.
The video camera 170 and the microphones 180 are connected to an analysis system (or analyzer) 190. The analysis system 190 may be arranged to use video and/or image processing techniques to track the ball 150 through the images generated by the video camera 170, or to otherwise identify the position of the ball 150 in one or more particular images. The analysis system 190 is also arranged to identify or locate strike sounds in the one or more audio signals. The analysis system 190 is then arranged to determine the position of the ball 150 coincident with an identified strike sound in the audio signal, using one or more of the images. In this way the analysis system 190 can determine a strike location 104 of the ball 150 on the surface of the table 100.
This is typically achieved by determining the position of the ball in an image captured by the video camera 170 at substantially the same time as the identified strike sound was recorded by the microphone 180. As the ball 150 would have been touching or very near to the surface 102 of the table 100 at the time the identified strike sound was recorded, the position of the ball relative to the surface 102 of the table 100 in the image captured by the video camera 170 gives a strike location 104 of the ball 150 on the surface of the table 100. Interpolation or averaging of the ball 150 position between two or more images may also be used, as may determination of the ball position from a track or flight path (itself determined from a sequence of images) and the relevant time point indicated by the strike sound, as well as other suitable techniques.
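The interpolation option mentioned above can be illustrated with hypothetical frame tuples of (timestamp, x, y); the strike time taken from the audio signal selects a fraction of the way between the two frames straddling the impact:

```python
def interpolated_strike(frame_before, frame_after, strike_time):
    """Linearly interpolate the ball position between the two frames
    straddling the strike sound.  Each frame is (timestamp, x, y);
    coordinates are in arbitrary surface units."""
    (t0, x0, y0), (t1, x1, y1) = frame_before, frame_after
    f = (strike_time - t0) / (t1 - t0)   # fraction of the interval elapsed
    return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))

# Strike sound heard a quarter of the way between two 120 fps frames:
print(interpolated_strike((0.000, 0.40, 0.20), (0.008, 0.48, 0.28), 0.002))
```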
In other words, the analysis system 190 may derive positions of the ball 150 with respect to the playing surface 102 from the images. The analysis system 190 may then use the audio signals to determine the timing of any strike of the ball 150 with the playing surface 102. The analysis system 190 uses this timing to extract the position of the ball 150 from the images which corresponds to the ball 150 striking the surface 102, thus providing the strike position 104.
In the scenario shown in figure 1 a light projector 140 or other display apparatus can be triggered by the analysis system 190 to display information based on the strike positions 104, such as by projecting visual effects onto the surface 102 of the table 100 when the ball 150 strikes the surface of the table 100. As the analysis system 190 can determine the strike location 104 of the ball 150, the projector 140 can be triggered to project the visual elements and effects at positions dependent on said strike location 104, such as visual elements encompassing, centred on, or separated from but otherwise related to a strike location 104. For example, the analysis system 190 may trigger the projector 140 to project a water ripple effect onto the playing surface, centred on the strike location 104.
It will also be appreciated that in some scenarios, the analysis system 190 may additionally or alternatively be used as part of a scoring, automatic refereeing, or other gameplay system, for example in which strike locations 104 determined by the analysis system 190 may be used to determine a game score and/or identify if a shot played by a player 110; 112 is legal or illegal. The analysis system 190 may be arranged to keep (or track) a score (or scores) for a game. Additionally, or alternatively, the analysis system 190 may be arranged to count, or track, the number of consecutive shots between players in a continuous sequence (known as a rally). The analysis system 190 may track the number of rallies in a game. The analysis system 190 may be arranged to display any or all of these tracked values using the display apparatus.
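Such rally and shot tracking reduces to counting strike events between rally boundaries. The following deliberately simplified sketch (hypothetical class and method names, and no scoring rules) illustrates the idea:

```python
class RallyTracker:
    """Track consecutive shots and completed rallies from a stream of
    strike events reported by an analysis system.  Illustrative only;
    a real system would also apply the rules of the game."""
    def __init__(self):
        self.shots_in_rally = 0
        self.rallies = 0
        self.longest_rally = 0

    def ball_struck_table(self):
        # Each detected strike counts as the next shot in the rally.
        self.shots_in_rally += 1

    def rally_over(self):
        # Close the current rally and reset the per-rally shot count.
        self.rallies += 1
        self.longest_rally = max(self.longest_rally, self.shots_in_rally)
        self.shots_in_rally = 0

tracker = RallyTracker()
for _ in range(5):
    tracker.ball_struck_table()
tracker.rally_over()
print(tracker.rallies, tracker.longest_rally)  # -> 1 5
```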
Further description of ways in which arrangements such as that illustrated in figure 1 may be put into effect to provide tracking of a ball above a table tennis playing surface, are discussed in the following figures.
Figure 2 illustrates a schematic diagram showing in more detail how the arrangement of figure 1 may be implemented. The apparatus 300 shown in figure 2 comprises an analysis system 190, a camera 170, a microphone 180, and display apparatus (such as a projector) 140.
The camera 170 is arranged to capture video (or a plurality of images) 372 of the playing surface 102 of a table tennis table 100, such as that shown in figure 1. The camera 170 is typically positioned above the playing surface 102. However, it will be appreciated that the camera 170 need not be positioned directly above the playing surface 102 in order to capture video 372 of the playing surface 102. Although figure 2 shows a single camera 170, it will also be appreciated that a plurality of cameras 170 may be used in order to capture video 372 of the surface 102. In particular, a plurality of cameras 170 may be arranged such that each camera 170 captures video of a respective portion of the surface 102 to thereby allow video of the entire surface 102 to be reconstructed from the output of the plurality of cameras 170. Different cameras 170 may also or instead be arranged to provide multiple viewing angles of all or particular parts of the surface 102 to enable more accurate ball position determination.
In any case it will be appreciated that video camera 170 is arranged to generate, when a ball 150 is in play above the surface 102, a plurality of images 372 each showing the ball 150 and the playing surface 102.
The video camera 170 is communicatively coupled to the analysis system 190. The camera 170 is also arranged to provide the plurality of images 372 to the analysis system 190. It will be appreciated that the camera 170 may be coupled to the analysis system 190 by any kind of data communication link suitable for communicating or transferring data (such as the plurality of images 372). Thus, the data communication link may comprise any of: a universal serial bus (USB) link, a FireWire link, an Ethernet link, an eSATA link, a Thunderbolt link, an infrared data link (such as any IrDA link), a high definition multimedia interface (HDMI) link, a digital visual interface (DVI) link, a DisplayPort link, and so on and so forth.
The camera 170 may be a camera capable of capturing high frame rate video, such as frame rates above 24 frames per second (fps). In particular the camera 170 may capture video at 120fps or above.
It will be appreciated that by increasing the frame rate of the camera 170 the strike locations 104, discussed above in reference to figure 1, can be determined by the analysis system 190 to a greater accuracy. This is because with more images being captured per unit time, images will be captured closer to the instant of impact of the ball 150 with the surface 102. However, it should also be appreciated that increasing the frame rate beyond a certain degree may lead to a sufficient increase in image noise that the precision of the determined ball position is compromised.
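The benefit of a higher frame rate can be quantified with simple arithmetic: the worst-case distance between the true impact point and the ball position in the nearest captured frame is bounded by the ball speed divided by the frame rate. The figures below are illustrative, assuming a fast shot of around 15 m/s:

```python
def worst_case_gap_mm(ball_speed_m_per_s, frames_per_second):
    """Upper bound, in millimetres, on how far the ball can travel
    between two consecutive frames; this also bounds how far the
    nearest captured frame can be from the true impact point."""
    return ball_speed_m_per_s / frames_per_second * 1000

# An illustrative 15 m/s shot at three candidate frame rates:
for fps in (24, 60, 120):
    print(fps, "fps ->", round(worst_case_gap_mm(15, fps), 1), "mm")
```

At 120 frames per second the bound is 125 mm, five times tighter than at 24 frames per second, before any interpolation between frames is applied.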
The plurality of images 372 will typically form video content. As such, the plurality of images 372 may be encoded as video, using suitable video encoding. Examples of this include MPEG-2, MPEG-4, VP6, VP7, VP8, VP9, and so on and so forth. Additionally, or alternatively, the plurality of images 372 may be encoded as individual images, using suitable image encoding. Examples of this include JPEG, TIFF, GIF, and so on and so forth.
The microphone 180 is typically arranged near to the surface 102 of the table 100 so as to be able to capture the sounds emanating from the surface 102 of the table 100. As with the video camera 170 above, whilst only a single microphone 180 is shown in figure 2, it will be appreciated that a plurality of microphones 180 may be used, for example spaced around the table 100 or suspended in a constellation above the table 100 so as to capture sound emanating from the surface 102 of the table 100.
The microphone 180 is arranged to generate one or more audio signals 382 containing strike sounds of the ball 150 hitting the playing surface 102. One or more audio filters may be applied to (or incorporated in) the microphone 180 in order to filter the audio signals 382. The audio signals 382 may be filtered based on a frequency, or frequency range, of an expected strike sound (or a characteristic frequency of a strike sound). For example, the microphone 180 may be calibrated for use with a particular combination of playing surface 102 and ball 150 which is known to produce a strike sound with a given characteristic frequency or frequencies. This would form the expected strike sound to which the microphone 180 could be adapted. For example, it has been observed that filtering out sounds generally outside of the frequency range 1 - 2.5 kHz leads to an increased accuracy in identifying strike sounds by the analysis system 190. However, other filtering parameters may advantageously be used, for example a band pass filter centred on a frequency within a frequency range of 1 - 2.5 kHz or 500 Hz - 3.0 kHz, a low pass filter with a cut off of around 2.5 kHz or 3 kHz, a high pass filter with a cut off of around 500 Hz or 1 kHz.
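A band-pass filter of the kind described can be illustrated with a single biquad section using the well-known RBJ audio-EQ-cookbook coefficients. This is a sketch only (the sample rate, centre frequency and Q below are illustrative, not prescribed), and a practical system might instead use a dedicated DSP library or an analogue filter in the microphone itself:

```python
import math

def bandpass_biquad(samples, fs, f0, q):
    """Apply one biquad band-pass stage (RBJ cookbook, constant 0 dB
    peak gain) centred on f0 Hz to a list of samples at fs Hz."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b0, b1, b2 = alpha, 0.0, -alpha
    a0, a1, a2 = 1 + alpha, -2 * math.cos(w0), 1 - alpha
    b0, b1, b2, a1, a2 = (c / a0 for c in (b0, b1, b2, a1, a2))
    out = []
    x1 = x2 = y1 = y2 = 0.0          # direct form I filter state
    for x in samples:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1, y2, y1 = x1, x, y1, y
        out.append(y)
    return out

fs = 48000
# 50 Hz rumble (e.g. loud music or table jostling) is strongly
# attenuated by a filter centred at 1.75 kHz, while a strike-like
# 1.75 kHz tone passes through largely unchanged:
rumble = [math.sin(2 * math.pi * 50 * n / fs) for n in range(4800)]
tone = [math.sin(2 * math.pi * 1750 * n / fs) for n in range(4800)]
print(max(abs(s) for s in bandpass_biquad(rumble, fs, 1750, 1.0)[1000:]) < 0.05)
print(max(abs(s) for s in bandpass_biquad(tone, fs, 1750, 1.0)[1000:]) > 0.5)
```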
The microphone 180 is communicatively coupled to the analysis system 190. The microphone 180 is also arranged to provide the one or more audio signals 382 to the analysis system 190. It will be appreciated that the microphone 180 may be coupled to the analysis system 190 by any kind of data communication link suitable for communicating or transferring audio data (such as one or more audio signals 382). Thus, the data communication link may comprise any of: a universal serial bus (USB) link, a FireWire link, an Ethernet link, an eSATA link, a Thunderbolt link, an infrared data link (such as any IrDA link), a high definition multimedia interface (HDMI) link, a DisplayPort link, an S/PDIF link, and so on and so forth.
The analysis system 190 may comprise one or more computer systems such as computer system 1000 described below with reference to figure 4. As set out above, with reference to figure 1, the analysis system 190 is arranged to determine, from the images 372, a position of the ball coincident with a strike sound in the audio signals 382, to thereby determine a strike location 104 of the ball 150 on the playing surface 102. The analysis system 190 is discussed in further detail below.
In some embodiments, display apparatus 140 may be communicatively coupled to the analysis system 190. The display apparatus 140 may be arranged to display information to people (such as the players 112; 114 and/or spectators) based on, or dependent on, the strike location determined by the analysis system 190. The information displayed may be, or relate to, any one or more of: a score; a graphical indication of the strike position 104; a graphical effect based on the strike position 104; an animation based on the strike position 104. Typically, the display apparatus 140 may comprise a video or light projector arranged to project images or other visual elements or effects onto the surface 102 of the table 100, as already discussed above in respect of figure 1. Such visual elements may be related to the game being played. For example, the display 140 may project a target onto the surface 102 indicating a score achieved by hitting the target with the ball 150. In response to the analysis system 190 determining a strike position of the ball 150 coincident with the target, the display 140 may alter the projected target. Thus, visual cues may be provided to players of the game.
Figure 3 schematically illustrates a logical arrangement for an example analysis system 190, such as that which may be used in the systems of figure 1 or 2. The analysis system 190 comprises a receiver module 420, an image (or
video) processing module 430, an audio processing module 440, and a further processing module 450.
The receiver module 420 is arranged to receive (or obtain or acquire) the plurality of images 372 showing the ball 150 moving above the playing surface 102. The receiver module 420 is also arranged to receive (or obtain or acquire) the one or more audio signals 382 containing strike sounds of the ball 150 hitting the surface 102. The plurality of images 372 and/or the one or more audio signals 382 are typically received in real-time. As such, the plurality of images 372 and/or the one or more audio signals 382 may be streamed to the receiver module 420.
The audio processing module 440 is communicatively coupled to the receiver module 420 and obtains (or receives) the audio signals 382 from the receiver module 420. The audio processing module 440 is arranged to identify one or more strike sounds in the audio signals 382. In particular, for identifying one or more strike sounds in the audio signals 382, the audio processing module 440 may be arranged to use acoustic fingerprinting or similar techniques. For example, known strike sounds of a ball 150 striking a playing surface 102 may be used to build up an acoustic fingerprint (for example, a set of characteristic frequencies or waveform features). Using this acoustic fingerprint, the analysis system 190 can then search or monitor for matching strike sounds in the audio signals 382.
It will be appreciated that the generation of the acoustic fingerprint may be carried out as part of a research and development, calibration, or similar process not involving the final apparatus described by this document. Suitable acoustic fingerprint data may then be stored in the analysis system 190 for use by the audio processing module 440. As acoustic fingerprinting and such related techniques are well known in the art, they are not discussed herein further.
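As a rough illustration of the kind of acoustic fingerprint the audio processing module 440 might use, the sketch below fingerprints a known strike sound as its strongest spectral bins and tests candidate windows for sufficient overlap. The function names, peak count, and overlap threshold are illustrative assumptions, not the patent's specified method.

```python
import numpy as np

def spectral_fingerprint(window, n_peaks=5):
    """Fingerprint a short audio window as the indices of its strongest
    spectral bins (a Hanning window reduces spectral leakage)."""
    mags = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    return set(np.argsort(mags)[-n_peaks:].tolist())

def matches_fingerprint(window, fingerprint, n_peaks=5, min_overlap=3):
    """Does the candidate window share enough dominant spectral bins
    with the stored fingerprint of a known strike sound?"""
    return len(spectral_fingerprint(window, n_peaks) & fingerprint) >= min_overlap
```

In practice the fingerprint would be built during the calibration process mentioned above and stored in the analysis system 190 for later comparison against incoming audio windows.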
Strike sounds detected by the audio processing module 440 typically give timings for each identified strike. In this way, the audio processing module 440 may be arranged to determine timing information (such as a time) for the respective impact of the ball 150 corresponding to (or causing) each identified strike sound. Typically, the timing information will be based on the position of the
strike sound in the audio signal, as the audio signal will usually be a time dependent (or time series) signal.
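Deriving timing information from a strike's position in the time-series signal can be sketched as follows. A simple amplitude threshold stands in here for the fingerprint-based detection described above; the function name, threshold, and refractory window are illustrative assumptions.

```python
def strike_times(audio, sample_rate, threshold=0.5, refractory_s=0.05):
    """Times (in seconds) at which the audio first exceeds an amplitude
    threshold, ignoring re-triggers within a short refractory window.

    The timing itself comes directly from the sample's position in the
    time-series signal: time = sample index / sample rate.
    """
    times = []
    last = float("-inf")
    for index, sample in enumerate(audio):
        t = index / sample_rate
        if abs(sample) >= threshold and t - last >= refractory_s:
            times.append(t)
            last = t
    return times
```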
It will be appreciated that some or all of the audio filtering described above in relation to the microphone 180 may be carried out by the audio processing module 440. Such audio filtering may be carried out in hardware, or software, or a combination of hardware and software audio filtering.
The image processing module 430 is communicatively coupled to the receiver module 420 and obtains (or receives) the plurality of images 372 from the receiver module 420. The image processing module 430 is communicatively coupled to the audio processing module 440. The image processing module 430 is arranged to identify a position of the ball 150 coincident with a strike sound identified by the audio processing module 440.
Usually, the image processing module receives the timing information of the strike sound from the audio processing module 440. The image processing module 430 may then identify the image in the plurality of images 372 that corresponds to the impact. It will be appreciated that the plurality of images 372 effectively form a time series set of images (or a video) of the flight of the ball 150. The image processing module 430 can find the point in time in this video that is coincident with the impact. In other words, using the time information of the strike sound the image processing module 430 may identify the image that was taken at (or around) the same time as the impact.
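Finding the image coincident with the impact reduces to converting the strike time into a frame index. A minimal sketch, assuming the camera and microphone share a common clock (a real system would need to measure and subtract any audio/video offset); the names are our own.

```python
def frame_for_time(strike_time_s, frame_rate_hz, first_frame_time_s=0.0):
    """Index of the image captured nearest to the given strike time.

    Assumes frames are evenly spaced at frame_rate_hz starting from
    first_frame_time_s on the same clock as the audio timestamps.
    """
    return round((strike_time_s - first_frame_time_s) * frame_rate_hz)
```

For example, at the 120 frames per second mentioned in claim 9, a strike at 0.5 s corresponds to frame 60.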
The image processing module 430 is arranged to automatically identify the ball 150 in the image of the plurality of images 372 corresponding to the impact. By identifying the ball in the image a position of the ball 150 in the image is usually produced. In particular, the image processing module 430 may be arranged to use the expected shape of the ball 150 in the image and/or the expected hue of the ball 150 in the image to locate the ball 150 in the image. It will be appreciated that both the expected shape of the ball 150 and the expected hue of the ball 150 in the image can be calculated based on the known properties of the camera 170 and the ball 150. In an example, the image processing module 430 may use edge matching techniques to identify the ball 150 in the image. Such edge matching techniques typically involve using an edge detection
algorithm on a template (such as a predetermined image (or outline) of the ball 150) and/or the subject image (such as an image of the plurality of images 372) and comparing found edges to determine a location where the template would match in the image. It will be appreciated that in some examples the image processing module 430 may, for each impact, identify the position of the ball in a respective single image. In these examples the image processing module 430 may not need to process any of the other images in the plurality of images. As such, the image processing module 430 may avoid receiving (or requesting) said other images. In some cases, the receiver module 420 may avoid receiving (or requesting) said images. The image processing module 430 and/or the receiver module 420 may be arranged to request (or obtain) the respective single image.
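The hue-based identification described above might look like the following minimal sketch: pixels close to the ball's expected colour are selected and their centroid taken as the ball position. The expected colour, tolerance, and function name are illustrative assumptions; a real implementation would combine this with the shape/edge matching mentioned in the text.

```python
import numpy as np

def find_ball(image, expected_rgb=(1.0, 0.5, 0.0), tolerance=0.2):
    """Centroid (row, col) of pixels close to the ball's expected
    colour, or None if no such pixels exist.

    A colour-distance threshold stands in for the hue and edge
    matching described in the text; image is an H x W x 3 float array.
    """
    distance = np.linalg.norm(image - np.asarray(expected_rgb), axis=-1)
    rows, cols = np.nonzero(distance < tolerance)
    if rows.size == 0:
        return None
    return float(rows.mean()), float(cols.mean())
```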
Additionally, or alternatively, to identifying the ball 150 in the image of the plurality of images 372 corresponding to the impact the image processing module 430 may be arranged to track the position of the ball 150 through the plurality of images 372. As such, the image processing module 430 may be arranged to generate (or calculate) a position track (or path) for the ball 150 through the plurality of images. Such a position track may be a time series of coordinates (or ball positions).
In this way, the image processing module 430 may identify the ball 150 in any given image based also upon the identified ball 150 position in a previous image. In another example, the image processing module 430 may use edge matching techniques to track the ball 150 through the plurality of images 372. It will be appreciated that the tracking of objects in a video (or sequence of images) is well known, and therefore not described herein further.
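Using the identified ball position in a previous image, as described above, can be sketched as a nearest-neighbour tracker: in each frame, pick the candidate detection closest to the last known position. This is a deliberate simplification (names and the jump limit are assumptions); practical trackers often add a motion model such as a Kalman filter.

```python
def track_ball(detections_per_frame, max_jump=50.0):
    """Link per-frame candidate detections (x, y) into a single track
    by choosing, in each frame, the candidate nearest the last known
    ball position; None marks frames with no plausible detection."""
    track = []
    for candidates in detections_per_frame:
        if not candidates:
            track.append(None)
        elif not track or track[-1] is None:
            track.append(candidates[0])  # no history: take first candidate
        else:
            px, py = track[-1]
            best = min(candidates, key=lambda c: (c[0] - px) ** 2 + (c[1] - py) ** 2)
            jump_sq = (best[0] - px) ** 2 + (best[1] - py) ** 2
            track.append(best if jump_sq <= max_jump ** 2 else None)
    return track
```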
The image processing module 430 is arranged to determine a strike location 104 of the ball 150 on the playing surface 102 from the identified position of the ball 150 coincident with a strike sound. Usually, the image processing module 430 will be arranged to map positions in images of the plurality of images 372 (such as the identified position of the ball 150) to the corresponding locations on the surface 102 of the table 100. It will be appreciated that since the position of the ball 150 in the image is identified
coincident with the strike sound the ball 150 in the image is touching (or very near to touching) the surface 102. As such the position of the ball 150 in the image may be considered as a position on the surface 102 in the image. In some cases the location on the surface 102 corresponding to the position in the image may be obtained by applying one or more transforms to the position. Such transforms are to take account of the optics of the camera (such as the camera 170) which produced the image. Typically, these transforms are applied based on one or more of: the field of view of the camera 170, the distance between the camera 170 and the surface 102, the focal length of the camera 170 etc. Such mapping and image transformation techniques are well known and not described herein further.
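One standard way to realise the image-to-surface mapping described above is a planar homography (projective transform), estimated once during calibration from known correspondences such as the table's corner positions. A minimal sketch, with names of our own choosing:

```python
import numpy as np

def image_to_table(point_px, homography):
    """Map an image pixel (x, y) to table surface coordinates using a
    3x3 homography in homogeneous coordinates: [u, v, w] = H [x, y, 1],
    with the result (u/w, v/w)."""
    x, y = point_px
    u, v, w = homography @ np.array([x, y, 1.0])
    return u / w, v / w
```

The homography accounts for the camera's pose and optics in a single matrix, which is why it subsumes the field-of-view, distance, and focal-length factors listed above.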
It will be appreciated that the determined strike location 104 may be used in a multitude of ways. Typically, further processing will be carried out based on the strike location 104. Such further processing may involve the calculation of a score contingent on the strike location 104. Additionally, or alternatively, the further processing may involve calculating the number of shots in a rally, and/or the number of rallies in a game (or set period). More generally, the further processing may relate to enforcing the rules of the particular game (which may vary from the traditional rules of table tennis). For example, the rules defining a legal shot based on the number and position of bounces may be used in the further processing to determine the number of shots in a rally, with the rally detected as over when an illegal shot occurs in the sequence of shots.
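A deliberately simplified sketch of this kind of rule-based rally analysis: given the sequence of table halves on which strike locations were detected, each bounce on the side opposite the previous one counts as a legal shot, and a repeated side (a double bounce) ends the rally. Real table-tennis rules (service, edges, nets) have many more cases; this is illustrative only.

```python
def shots_in_rally(bounce_sides):
    """Count shots from the sequence of table halves ("A"/"B") where
    bounces were detected: alternating sides are legal shots, and a
    repeated side means a double bounce, ending the rally."""
    shots = 0
    previous = None
    for side in bounce_sides:
        if side == previous:
            break  # double bounce on one half: rally over
        shots += 1
        previous = side
    return shots
```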
As mentioned previously, the strike location 104 may be used to cause visual elements to be displayed on a screen and/or projected on the playing surface 102. The further processing may comprise driving of the display (or projector) 140 to effect the display of such visual elements. It will be appreciated that the further processing may be carried out by a system or systems external to the analysis system 190 (such as computer system 1000). In such a case the analysis system 190 may be arranged to provide the strike locations 104 to the external system or systems. Additionally, or alternatively the analysis system may be arranged to carry out further processing, such as with the optional further processing module 450.
As set out in the discussion of figures 1, 3, and 4 above, the analysis system 190 may use image processing techniques to identify the ball 150 in the plurality of images 372. Such techniques are most effective when the object to be identified has a significant contrast with the surrounding image. As such, it has been realized that the apparatus shown above may be further modified by the inclusion of one or more UV lights arranged to illuminate the area directly above the table tennis playing surface. This enables the use of a ball 150 which fluoresces under UV light. Cameras, such as the camera 170, will then generate an image with an increased contrast of the ball with respect to the rest of the image. This is because the UV light does not increase the overall illumination of the scene in the image, but simply increases the brightness of the ball 150.
Figure 4 schematically illustrates an example of a computer system 1000, that may be used in embodiments of the invention, for example to implement aspects of the analysis system 190 described above. The system 1000 comprises a computer 1020. The computer 1020 comprises: a storage medium 1040, a memory 1060, a processor 1080, an interface 1100, a user output interface 1120, a user input interface 1140 and a network interface 1160, which are all linked together over one or more communication buses 1180.
The storage medium 1040 may be any form of non-volatile data storage device such as one or more of a hard disk drive, a magnetic disc, an optical disc, a ROM, etc. The storage medium 1040 may store an operating system for the processor 1080 to execute in order for the computer 1020 to function. The storage medium 1040 may also store one or more computer programs (or software or instructions or code).
The memory 1060 may be any random access memory (storage unit or volatile storage medium) suitable for storing data and/or computer programs (or software or instructions or code).
The processor 1080 may be any data processing unit suitable for executing one or more computer programs (such as those stored on the storage medium 1040 and/or in the memory 1060), some of which may be computer programs according to embodiments of the invention or computer programs that,
when executed by the processor 1080, cause the processor 1080 to carry out a method according to an embodiment of the invention and configure the system 1000 to be a system according to an embodiment of the invention. The processor 1080 may comprise a single data processing unit or multiple data processing units operating in parallel or in cooperation with each other. The processor 1080, in carrying out data processing operations for embodiments of the invention, may store data to and/or read data from the storage medium 1040 and/or the memory 1060.
The interface 1100 may be any unit for providing an interface to a device 1220 external to, or removable from, the computer 1020. The device 1220 may be a data storage device, for example, one or more of an optical disc, a magnetic disc, a solid-state-storage device, etc. The device 1220 may have processing capabilities - for example, the device may be a smart card. The interface 1100 may therefore access data from, or provide data to, or interface with, the device 1220 in accordance with one or more commands that it receives from the processor 1080.
The user input interface 1140 is arranged to receive input from a user, or operator, of the system 1000. The user may provide this input via one or more input devices of the system 1000, such as a mouse (or other pointing device) 1260 and/or a keyboard 1240, that are connected to, or in communication with, the user input interface 1140. However, it will be appreciated that the user may provide input to the computer 1020 via one or more additional or alternative input devices (such as a touch screen). The computer 1020 may store the input received from the input devices via the user input interface 1140 in the memory 1060 for the processor 1080 to subsequently access and process, or may pass it straight to the processor 1080, so that the processor 1080 can respond to the user input accordingly.
The user output interface 1120 is arranged to provide a graphical/visual and/or audio output to a user, or operator, of the system 1000. As such, the processor 1080 may be arranged to instruct the user output interface 1120 to form an image/video signal representing a desired graphical output, and to provide this signal to a monitor (or screen or display unit) 1200 of the system
1000 that is connected to the user output interface 1120. Additionally or alternatively, the processor 1080 may be arranged to instruct the user output interface 1120 to form an audio signal representing a desired audio output, and to provide this signal to one or more speakers 1210 of the system 1000 that are connected to the user output interface 1120.
Finally, the network interface 1160 provides functionality for the computer 1020 to download data from and/or upload data to one or more data communication networks.
It will be appreciated that the architecture of the system 1000 illustrated in figure 4 and described above is merely exemplary and that other computer systems 1000 with different architectures (for example with fewer components than shown in figure 4 or with additional and/or alternative components than shown in figure 4) may be used in embodiments of the invention. As examples, the computer system 1000 could comprise one or more of: a personal computer; a server computer; a mobile telephone; a tablet; a laptop; a television set; a set top box; a games console; other mobile devices or consumer electronics devices; etc.
It will be appreciated that the methods described have been shown as individual steps carried out in a specific order. However, the skilled person will appreciate that these steps may be combined or carried out in a different order whilst still achieving the desired result.
The functionality of the analysis system 190 is described above logically. It will be appreciated that the analysis system 190 may be distributed in various ways. In other words, functionality of the analysis system 190 may be implemented across different devices. For example, some or all of the functionality of the image processing module 430 may be implemented at (or as part of) the camera 170. Additionally, or alternatively, some or all of the functionality of the audio processing module 440 may be implemented at (or as part of) the microphone 180. Some or all of the modules of the analysis system 190 may be implemented as one or more separate systems (such as system 1000).
Equally the functionality of the analysis system 190 may be carried out at various different times in relation to the motion of the ball 150. Typically, the processing carried out by the analysis system 190 as described previously is carried out substantially at the same time as the motion of the ball 150. This would be understood to be “real time” processing. As such, the impact positions 104 of the ball 150 are determined substantially at the same time as the respective impacts of the ball 150. However, it will be appreciated that the processing may occur sometime after the motion of the ball. For example, the analysis system 190 may process audio signals and/or a plurality of images collected previously, in order to provide impact positions 104 of a previously played game.
Whilst the above discussion has related to table tennis tables, it will be appreciated that the invention may be applied to any situation where objects strike a surface. For example, the invention may be applied to other racquet sports, such as squash, tennis, etc. Equally, the invention may be applied to other table ball sports, such as snooker, billiards etc. where the invention may be used to determine the position on the table where balls strike each other, thereby giving rise to a strike sound between balls which can be used as described above instead of the strike sound of the strike of the ball on the surface.
It will be appreciated that embodiments of the invention may be implemented using a variety of different information processing systems. In particular, although the figures and the discussion thereof provide an exemplary computing system and methods, these are presented merely to provide a useful reference in discussing various aspects of the invention. Embodiments of the invention may be carried out on any suitable data processing device, such as a personal computer, laptop, personal digital assistant, mobile telephone, set top box, television, server computer, etc. Of course, the description of the systems and methods has been simplified for purposes of discussion, and they are just one of many different types of system and method that may be used for embodiments of the invention. It will be appreciated that the boundaries between logic blocks are merely illustrative and that alternative embodiments may merge
logic blocks or elements, or may impose an alternate decomposition of functionality upon various logic blocks or elements.
It will be appreciated that the above-mentioned functionality may be implemented as one or more corresponding modules as hardware and/or software. For example, the above-mentioned functionality may be implemented as one or more software components for execution by a processor of the system. Alternatively, the above-mentioned functionality may be implemented as hardware, such as on one or more field-programmable-gate-arrays (FPGAs), and/or one or more application-specific-integrated-circuits (ASICs), and/or one or more digital-signal-processors (DSPs), and/or other hardware arrangements. Method steps implemented in flowcharts contained herein, or as described above, may each be implemented by corresponding respective modules; multiple method steps implemented in flowcharts contained herein, or as described above, may be implemented together by a single module.
It will be appreciated that, insofar as embodiments of the invention are implemented by a computer program, then a storage medium and a transmission medium carrying the computer program form aspects of the invention. The computer program may have one or more program instructions, or program code, which, when executed by a computer carries out an embodiment of the invention. The term “program” as used herein, may be a sequence of instructions designed for execution on a computer system, and may include a subroutine, a function, a procedure, a module, an object method, an object implementation, an executable application, an applet, a servlet, source code, object code, a shared library, a dynamic linked library, and/or other sequences of instructions designed for execution on a computer system. The storage medium may be a magnetic disc (such as a hard drive or a floppy disc), an optical disc (such as a CD-ROM, a DVD-ROM or a BluRay disc), or a memory (such as a ROM, a RAM, EEPROM, EPROM, Flash memory or a portable/removable memory device), etc. The transmission medium may be a communications signal, a data broadcast, a communications link between two or more computers, etc.

Claims (20)

1. Apparatus comprising:
a table tennis playing surface;
one or more cameras arranged to generate images containing a table tennis ball moving above the playing surface;
one or more microphones arranged to generate one or more audio signals containing strike sounds of the ball hitting the surface; and
an analysis system arranged to determine, from one or more of the images, a position of the ball coincident with a strike sound in the one or more audio signals, to thereby determine a strike location of the ball on the playing surface.
2. The apparatus of claim 1, wherein the analysis system is arranged to track a position of the ball through a sequence of the images.
3. The apparatus of claim 2, wherein the microphones are arranged to only generate audio signals for transmission to the analysis system while the analysis system is tracking the table tennis ball.
4. The apparatus of any one of claims 1 to 3, wherein the analysis system is arranged to determine the position of the ball based on one or more of:
an expected hue of the ball in the images;
an expected size of the ball in the images; and
an expected position of the ball in the image.
5. The apparatus of any one of claims 1 to 4, wherein the analysis system is arranged to determine the strike location of the ball on the playing surface by:
mapping the determined position of the ball in the image to the strike location of the ball on the playing surface using one or more of: an angle of view of the camera; a distance between the camera and the ball; and a relative position of the camera to the playing surface.
6. The apparatus of any one of claims 1 to 5 further comprising a UV light arranged to illuminate the area directly above the table tennis playing surface to enable a UV fluorescent table tennis ball to fluoresce.
7. The apparatus of any one of claims 1 to 6, wherein the one or more microphones are arranged to selectively filter for sounds within a predetermined frequency range comprising a characteristic frequency of an expected strike sound.
8. The apparatus of claim 7, wherein the one or more microphones are arranged to selectively filter for sounds within a frequency range of 1 to 2.5 kHz.
9. The apparatus of any one of claims 1 to 8, wherein the camera is arranged to generate the images at a frame rate of 120 frames per second or higher.
10. The apparatus of any one of claims 1 to 9, wherein the cameras and/or the microphones are arranged to deliver the images and one or more audio signals to the analysis system via one or more wireless connections.
11. The apparatus of any one of claims 1 to 10 further comprising display apparatus arranged to cause one or more visual elements to be displayed in dependence on the strike location determined by the analysis system.
12. The apparatus of claim 11 wherein the display apparatus is arranged to cause at least one of the visual elements to be displayed on the playing surface coinciding with the strike location determined by the analysis system.
13. The apparatus of any one of claims 1 to 10 further comprising display apparatus arranged to cause a visual element to be displayed on the playing surface, wherein the analysis system is further arranged to cause a gameplay
action to be triggered upon determining that the determined strike location of the ball on the playing surface coincides with the displayed visual element.
14. The apparatus of any one of claims 1 to 13 wherein the analysis system is further arranged to track, based on determined strike positions, any one or more of the following:
a score for a game;
the number of consecutive shots in a rally; and
the number of rallies in a game.
15. An analysis system configured to:
receive a sequence of images containing a ball moving above a playing surface;
receive at least one audio signal containing one or more strike sounds of the ball hitting the surface; and
determine, from at least one of the images, a position of the ball coincident with a strike sound in the audio signal, to thereby determine a strike location of the ball on the playing surface.
16. Apparatus, for use with a table tennis playing surface, the apparatus comprising:
one or more video cameras operable to generate images containing a table tennis ball moving above the playing surface;
one or more microphones operable to generate one or more audio signals containing strike sounds of the ball hitting the surface; and
an analysis system according to claim 15.
17. The apparatus of claim 16 wherein the one or more video cameras and the one or more microphones are operably connected to the analysis system so as to deliver the images and the one or more audio signals to the analysis system.
18. A computer implemented method comprising:
receiving images containing a table tennis ball moving above a playing surface;
receiving one or more audio signals containing strike sounds of the ball hitting the surface; and
determining, from the images, a position of the ball coincident with a strike sound in the audio signals, to thereby determine a strike location of the ball on the playing surface.
19. A computer program which, when executed by a processor, causes the processor to carry out a method according to claim 18.
20. A computer-readable medium storing a computer program according to claim 19.
GB1713818.1A 2017-08-29 2017-08-29 Detecting ball strike position Withdrawn GB2567800A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1713818.1A GB2567800A (en) 2017-08-29 2017-08-29 Detecting ball strike position


Publications (2)

Publication Number Publication Date
GB201713818D0 GB201713818D0 (en) 2017-10-11
GB2567800A true GB2567800A (en) 2019-05-01

Family

ID=60037064

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1713818.1A Withdrawn GB2567800A (en) 2017-08-29 2017-08-29 Detecting ball strike position

Country Status (1)

Country Link
GB (1) GB2567800A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111840966A (en) * 2020-07-29 2020-10-30 山东科技大学 Intelligent evaluation method and system for table tennis training based on image recognition
EP4087666A4 (en) * 2020-01-06 2023-05-24 Topgolf International, Inc. Identifying a location for a striker of an object

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2403362A (en) * 2003-06-27 2004-12-29 Roke Manor Research Calculating the location of an impact event using acoustic and video based data
US20100090811A1 (en) * 2008-10-10 2010-04-15 Ahem Frank W Pong scores
US20100198528A1 (en) * 2009-02-03 2010-08-05 Mccauley Jack J Systems and methods for an impact location and amplitude sensor
CN202438121U (en) * 2012-02-14 2012-09-19 江南大学 Intelligent table tennis score recording system
CN103170115A (en) * 2011-12-20 2013-06-26 西安天动数字科技有限公司 Interactive table tennis system
FR3018036A1 (en) * 2014-03-03 2015-09-04 Cornilleau Sas TABLE TENNIS TABLE
US20160037139A1 (en) * 2014-08-01 2016-02-04 Smart Billiard Lighting LLC Billiard Table Lighting and Game Play Monitor
RU160073U1 (en) * 2015-10-19 2016-02-27 Владимир Иванович Меркулов TENNIS TABLE
CN105426444A (en) * 2015-11-06 2016-03-23 河海大学常州校区 Ping-pong competition information statistical system based on video processing technology
CN205569688U (en) * 2016-04-05 2016-09-14 安徽机电职业技术学院 A table tennis table for imparting knowledge to students training


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4087666A4 (en) * 2020-01-06 2023-05-24 Topgolf International, Inc. Identifying a location for a striker of an object
US11786783B2 (en) 2020-01-06 2023-10-17 Topgolf International, Inc. Identifying a location for a striker of an object
CN111840966A (en) * 2020-07-29 2020-10-30 山东科技大学 Intelligent evaluation method and system for table tennis training based on image recognition

Also Published As

Publication number Publication date
GB201713818D0 (en) 2017-10-11

Similar Documents

Publication Publication Date Title
US11715214B1 (en) Systems and methods for indicating user performance in launching a basketball toward a basketball hoop
US9694277B2 (en) Client side processing of character interactions in a remote gaming environment
TWI441669B (en) Virtual golf simulation apparatus and method
JP2021514750A (en) Deinterleave of gameplay data
JP2021514753A (en) Statistically defined game channels
TW201936241A (en) Enhanced gaming systems and methods
JP6673221B2 (en) Information processing apparatus, information processing method, and program
JP2018506205A (en) Control virtual reality content
JP6249706B2 (en) Information processing apparatus, information processing method, and program
US11040287B2 (en) Experience-oriented virtual baseball game apparatus and virtual baseball game control method using the same
US11138744B2 (en) Measuring a property of a trajectory of a ball
GB2567800A (en) Detecting ball strike position
JP5399966B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
CN111184994A (en) Batting training method, terminal equipment and storage medium
JP5318016B2 (en) GAME SYSTEM, GAME SYSTEM CONTROL METHOD, AND PROGRAM
JP5629364B2 (en) GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US20220366573A1 (en) Apparatus, method and computer program product for generating location information of an object in a scene
US20230230376A1 (en) Method, computer program, apparatus and system
WO2011013550A1 (en) Golf practice device
JPWO2004013812A1 (en) Image recognition apparatus and image recognition program
WO2023106201A1 (en) Play analysis device, play analysis method, and computer-readable storage medium
JP2022040665A (en) Video processing device, video processing method, and model generation device
TW202305664A (en) Method for analyzing image for sensing moving ball and sensing device using the same
JP2018191760A (en) Performance device, performance system, and program
JP2022168633A (en) Ball game video analyzer, ball game video analyzing method, and computer program

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)