GB2457674A - Determining the time and location of an event, for example whether a ball hit a bat in cricket - Google Patents
- Publication number
- GB2457674A (Application GB0803080A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- event
- time
- location
- audio signals
- video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/0015—Training appliances or apparatus for special sports for cricket
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/36—Training appliances or apparatus for special sports for golf
- A63B69/3614—Training appliances or apparatus for special sports for golf using electro-magnetic, magnetic or ultrasonic radiation emitted, reflected or interrupted by the golf club
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B71/0605—Decision makers and devices using detection means facilitating arbitration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/80—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using ultrasonic, sonic or infrasonic waves
- G01S3/802—Systems for determining direction or deviation from predetermined direction
- G01S3/808—Systems for determining direction or deviation from predetermined direction using transducers spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/80—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using ultrasonic, sonic or infrasonic waves
- G01S3/802—Systems for determining direction or deviation from predetermined direction
- G01S3/808—Systems for determining direction or deviation from predetermined direction using transducers spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems
- G01S3/8083—Systems for determining direction or deviation from predetermined direction using transducers spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems determining direction of source
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/18—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
- G01S5/22—Position of source determined by co-ordinating a plurality of position lines defined by path-difference measurements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2102/00—Application of clubs, bats, rackets or the like to the sporting activity ; particular sports involving the use of balls and clubs, bats, rackets, or the like
- A63B2102/20—Cricket
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/80—Special sensors, transducers or devices therefor
- A63B2220/808—Microphones
Abstract
A system for determining the time and location of an event includes at least four microphones M1-M4 arranged in a non-planar array, a video camera 16 arranged to capture a sequence of video frames, and a processor device 18 arranged to receive audio signals from each of the microphones M1-M4 and a video signal from the video camera 16. The processor device 18 identifies an event in the audio signals received from the microphones, determines the time and location of the event from the relative timing of the event in the audio signals, and generates a video output signal that includes an icon indicating the determined time and location of the event. This system can be used in cricket to help umpires, commentators and/or spectators make a decision on whether or not the batsman hit the ball.
Description
SYSTEM AND METHOD FOR DETERMINING
THE TIME AND LOCATION OF AN EVENT
The present invention relates to a system and a method for determining the time and location of an event. The invention is of particular but not exclusive application to televised sports, for
example cricket.
In recent years, television companies televising cricket matches have introduced a number of innovations that allow viewers and commentators to review the action that has taken place and assess the validity of decisions made by the umpires. Some of these innovations, such as slow motion action replays, have also been adopted by the cricketing authorities and are sometimes used by the umpires to help them make their decisions (usually by reference to an off-field video umpire).
One such innovation that has been used by television companies for a number of years is the system described in the inventor's British Patent No. 2358755, which is commonly referred to as the "Snickometer". This system is used when the batsman plays at the ball and either misses it completely or just touches it as the ball goes through to the wicketkeeper, who then claims a catch. The umpire must then decide whether the batsman hit the ball, in which case he is out, or whether he missed it completely, in which case he is not out.
The Snickometer helps viewers and commentators (and potentially also umpires) to decide whether the batsman hit the ball by comparing the video images captured by the television cameras with the sound captured by a microphone located close to the batsman (usually at the base of the wicket). The sound is synchronised to the video images and displayed on screen as an audio waveform (similar to an oscilloscope trace), to show the sound detected by the microphone during each frame of the video footage. The video frames and the associated audio waveforms captured around the time when the ball passed the bat are then viewed and compared. If the batsman hit the ball, the audio waveform will generally show a distinctive spike at the moment of impact. Therefore, if such a spike is present at the moment when the ball appears to pass the bat, this suggests that the batsman did in fact hit (or "snick") the ball.
If no such spike is present, this suggests that the batsman did not hit the ball.
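The frame-by-frame waveform display described above relies on the audio being synchronised to the video. As a rough sketch (25 fps and 48 kHz are assumed broadcast-typical rates, not figures taken from the patent), the audio samples belonging to any one video frame can be located as follows:

```python
# Map a video frame index to the audio samples captured during that
# frame, so each frame can be shown with its slice of the waveform.
# 25 fps and 48 kHz are assumed, broadcast-typical values.
FRAME_RATE = 25        # video frames per second
SAMPLE_RATE = 48_000   # audio samples per second

def samples_for_frame(frame_index):
    """Return (first_sample, end_sample_exclusive) for one video frame."""
    samples_per_frame = SAMPLE_RATE // FRAME_RATE  # 1920 samples per frame
    start = frame_index * samples_per_frame
    return start, start + samples_per_frame
```

The waveform trace shown under each frame is then simply the slice of the recorded soundtrack between those two sample indices.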
Although this system is very helpful, one weakness is that the presence of a spike at the critical moment does not provide categorical proof of a "snick", since it is possible that the detected sound came from another source and was detected by the microphone at the critical moment purely by coincidence. The "snick" may for example have been caused by the bat hitting the ground, or the ball hitting the batsman's pads just after passing the bat. The sound may also have originated further away, and only by coincidence arrived at the microphone at the critical moment. In such circumstances, the Snickometer system can give misleading results.
Although the incidence of false "snicks" can be reduced, for example by analysing the audio waveform and excluding sounds that are not typical of the sound made by a bat hitting a ball, these techniques cannot completely overcome the reliability problem.
It is an object of the present invention to provide a method and system for determining the time and location of a sound event, which mitigates at least some of the aforesaid problems.
According to the present invention there is provided a system for determining the time and location of an event, the system including at least four microphones arranged in a non-planar array, a video camera arranged to capture a sequence of video frames, and a processor device that is constructed and arranged to receive audio signals from each of the microphones and a video signal from the video camera, identify an event in the audio signals received from the microphones, determine the time and location of the event from the relative timing of the event in the audio signals, and generate a video output signal that includes an icon indicating the determined time and location of the event.
The system makes it possible to generate an image or a sequence of video images to illustrate the location of a sound event relative to the positions of other objects at the time of the event.
For example, in a cricket game, the image can show the position of the event relative to the players, the bat and the ball. It may then be possible to form an opinion as to whether the detected sound event might have been caused by the batsman hitting the ball or by some other event, such as the bat hitting the ground or the ball hitting the batsman's clothing or pads.
Advantageously, the event is identified in each of the audio signals by selecting a relevant portion of the audio signal and searching for the event in the selected portion. This reduces the amount of processing required by the processor. The relevant portion of the audio signal may be selected by estimating the time and location of the event and determining the travel time of sound from the estimated location to the microphone.
Estimating the time of the event preferably includes selecting a video frame that depicts the event and determining the time of that video frame.
Advantageously, the event is identified in the audio signals by comparing and matching the waveforms of the audio signals. This allows the processor automatically to detect the same sound event in each of the audio signals.
The system may include filter means for filtering the audio signals to exclude predetermined portions of those signals. This makes it easier to distinguish a particular kind of sound event
from other random background sounds.
Advantageously, three microphones are located substantially at ground level and a fourth microphone is located in a fixed position above the ground. In a cricket game, the fourth microphone is preferably located at or near the top of a stump, as this provides a convenient fixed location for the microphone.
The input signals are preferably synchronised.
Advantageously, the processor device is constructed and arranged to generate an image illustrating the time and location of the event relative to the locations of other objects. The processor device preferably generates an image by selecting a video frame that corresponds to the time of the event and superimposing an icon on the selected video frame at the location of the event. The processor device preferably selects a plurality of video frames occurring before and after the time of the event and generates a video sequence illustrating a sequence of events before and after the event. Alternatively, the processor device may be constructed and arranged to create a computer generated image representing the location of the event relative to the locations of other objects. The generated image or video sequence can then be broadcast to a viewing public.
According to another aspect of the invention there is provided a method of determining the time and location of an event, the method including providing a video camera and at least four microphones arranged in a non-planar array, receiving a video signal from the video camera and audio signals from the microphones, identifying an event in the audio signals, determining the time and location of the event from the relative timing of the event in the audio signals, and generating a video output signal that includes an icon indicating the determined time and location of the event.
An embodiment of the invention will now be described by way of example with reference to the accompanying drawings, wherein:
Figure 1 is a perspective diagram of a cricket pitch illustrating schematically an array of microphones and a video camera;
Figure 2 is a schematic diagram illustrating the components of a system according to the invention;
Figure 3 is a polar coordinate diagram, showing the relative positions of an array of microphones and an event for which the time and location are to be determined;
Figure 4 illustrates a set of audio signals recorded by the array of microphones; and
Figure 5 shows a video image generated by the system.
Figure 1 illustrates a cricket field, which includes a pitch 2 (typically a prepared strip of short mown grass) having a wicket 4 comprising three wooden stumps at either end. The outfield 6 lies beyond the pitch 2, the outer periphery of the outfield 6 being defined by a roughly circular boundary 8. The pitch 2 is 22 yds (20.12 metres) long and is marked at each end by a line (or "bowling crease") 10 that extends perpendicular to the length of the pitch 2. Another line (or "popping crease") 12 runs parallel to the bowling crease and is located 4 ft (1.22 metres) in front of it. These elements are all conventional, being required by the laws of cricket.
The system for detecting the time and location of an event includes a set of microphones 14 and one or more video cameras 16. In the arrangement shown in Figure 1, six microphones 14 are installed, although only four of these are used at any one time. These four microphones, referred to herein by the designations M1 to M4, are located as follows:
Microphone M1 is located at the base of the middle stump of the wicket 4;
Microphone M2 is located at the top of the middle stump of the wicket 4;
Microphone M3 is located on the bowling crease 10 approximately 10 ft (3.05 metres) from microphone M1;
Microphone M4 is located at the base of the middle stump of the wicket at the opposite end of the pitch 2.
Microphones M1, M2 and M3 are therefore located at one end of the pitch 2, whereas microphone M4 is located at the opposite end. The two unused microphones 14 are located at the same end of the pitch as microphone M4 and replicate the positions of microphones M2 and M3, thus allowing an equivalent array of microphones to be used when the bowling switches from one end of the pitch to the other. The microphone M3 is preferably located at the position of the middle stump on an adjacent pitch, so that when play takes place on that adjacent pitch this microphone becomes the microphone M1 that is located at the base of the middle stump. This arrangement avoids unnecessary duplication when installing the microphones.
The video camera 16 is located approximately on the centre line of the pitch 2, some distance beyond the boundary 8. It is positioned to provide a clear view of play. Additional video
cameras may be located elsewhere around the field.
It should be noted that the array of microphones shown in Figure 1 is only exemplary: any other arrangement may be used, provided that the four microphones M1 to M4 are set out in a non-planar array. Usually, three of the microphones will be located substantially at ground level and one will be located in a fixed position above the ground, the preferred position of this microphone being at or near the top of one of the stumps.
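The non-planarity requirement can be checked mechanically: four points are non-planar exactly when the tetrahedron they span has non-zero volume. A minimal sketch (the coordinates used in testing are illustrative values based on the geometry described, not figures from the patent):

```python
def is_non_planar(m1, m2, m3, m4, tol=1e-9):
    """True if the four microphone positions do not all lie in one plane.

    Uses the scalar triple product of the three edge vectors from m1:
    its magnitude is six times the volume of the tetrahedron the points
    span, which is zero exactly when the points are coplanar.
    """
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    u, v, w = sub(m2, m1), sub(m3, m1), sub(m4, m1)
    triple = (u[0] * (v[1] * w[2] - v[2] * w[1])
              - u[1] * (v[0] * w[2] - v[2] * w[0])
              + u[2] * (v[0] * w[1] - v[1] * w[0]))
    return abs(triple) > tol
```

Three ground-level microphones plus one on top of a stump pass this test; four microphones all at ground level do not.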
Further components of the system are shown in Figure 2. The system includes a set of microphones 14 and one or more video cameras 16. The microphones and cameras are connected to a processor 18, which receives and records the audio signals from the microphones and video signals from the cameras. These input signals are all synchronised to a single synchronisation clock (this is conventional in a television recording system that includes multiple cameras and/or microphones).
The processor 18 is connected to a user interface 20 and a monitor 22, which allow an operator to control the processor 18 and view the input and output signals. The processor 18 also has a video output terminal 24 for an output signal, which may for example be routed to a television broadcasting system.
A typical arrangement of the four microphones is illustrated in Figure 3. The positions of the microphones may be defined with respect to an origin using a spherical polar coordinate system, where the location of a point X relative to an origin is defined by the values of the parameters (R, θ, φ), where R is the radial distance of the point X from the origin, θ is the azimuth angle and φ is the zenith angle. In the arrangement shown in Figure 3, where the origin (0, 0, 0) is located at the base of the middle stump, the microphones are located as follows:
M1 is at (0, 0, 0)
M2 is at (R2, 0, π/2)
M3 is at (R3, π, 0)
M4 is at (R4, π/2, 0)
R2 is the height of the microphone M2 above ground level, which may for example be the height of the stump. R3 is arbitrary but may for example be the width of a pitch (usually 10 feet / 3.05 m). R4 may be the length of the pitch (22 yards / 20.12 m). Although these are preferred locations, the microphones may be located elsewhere provided that they are sufficiently close to the wicket to detect the sound of a "snick" and they are not all located in a single plane.
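The coordinate convention above can be expressed as a small conversion helper, following the (R cos φ cos θ, R cos φ sin θ, R sin φ) mapping used later in the description. (Note that although the text calls φ the zenith angle, the mapping treats it as an angle measured up from the horizontal plane, so φ = π/2 points straight up.)

```python
import math

def sph_to_cart(r, theta, phi):
    """Convert (R, theta, phi) to Cartesian coordinates using the
    convention of the description: theta is the azimuth and phi is
    measured up from the horizontal, so z = R sin(phi)."""
    return (r * math.cos(phi) * math.cos(theta),
            r * math.cos(phi) * math.sin(theta),
            r * math.sin(phi))
```

For example, M2 at (R2, 0, π/2) converts to (0, 0, R2), a point R2 above the origin, and M3 at (R3, π, 0) converts to (−R3, 0, 0).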
If the batsman plays a shot but only "snicks" the ball, this will create a sound that is detected by the four microphones. The time taken by the sound to reach each microphone depends on the distance of the microphone from the "event" (the point where the bat touched the ball).
The location and the time of the event can then be determined by comparing the relative timings of the sound as recorded in the soundtracks of the microphones.
This principle is illustrated in Figure 4, which depicts as an example the audio soundtracks of the four microphones, M1 to M4. These audio signals are time-coded in a synchronised ("genlocked") television system and are recorded continuously by the processor 18. The "snick" appears as a spike 30 in the waveform of each soundtrack. In the soundtrack of the first microphone M1 the spike 30 appears at a time t1, in the soundtrack of the second microphone M2 it appears at a time t2, for the third microphone M3 it appears at t3 and for the fourth microphone M4 it appears at t4.
The approximate timing of the relevant event in each of the audio signals can be estimated by noting the time of the event in the video signal and estimating the time the sound would take to reach each microphone. For example, if the batsman plays a shot at the ball but only snicks it, the time at which the ball appears to pass the edge of the bat will be noted and the time taken by the sound to reach the microphones will be estimated from this noted time.
Normally, the batsman stands about 2 m in front of the wicket and, as the speed of sound in air is about 340 m/s, the sound will usually take about 6 ms to reach the wicket microphones M1 and M2, about 10 ms to reach M3 and about 55 ms to reach M4. Of course, the actual timing of the event at each microphone may be a few milliseconds earlier or later than the estimated figure, depending on the exact location of the event, wind strength and atmospheric conditions.
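These estimates follow directly from the microphone geometry. A minimal sketch, assuming illustrative coordinates (the stump height of 0.71 m is an assumed value; 3.05 m and 20.12 m follow the distances given earlier in the description):

```python
import math

SPEED_OF_SOUND = 340.0  # m/s, the approximate figure used in the text

# Assumed microphone coordinates in metres: M1 at the base of the middle
# stump (origin), M2 at the top of the stump (0.71 m is an assumed stump
# height), M3 on the bowling crease, M4 at the far end of the pitch.
MICS = {
    "M1": (0.0, 0.0, 0.0),
    "M2": (0.0, 0.0, 0.71),
    "M3": (-3.05, 0.0, 0.0),
    "M4": (0.0, 20.12, 0.0),
}

def expected_arrivals(event_pos, event_time):
    """Estimate when the sound of an event reaches each microphone."""
    return {name: event_time + math.dist(event_pos, m) / SPEED_OF_SOUND
            for name, m in MICS.items()}
```

For an event about 2 m in front of the wicket at bat height (roughly 0.8 m), this reproduces the order of the figures quoted above: about 6 ms to M1 and M2, about 11 ms to M3 and about 53 ms to M4.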
The processor 18 looks for a spike 30 in the waveform of each soundtrack at approximately the expected arrival time, which is estimated as set out above. It may also apply a frequency filter to the waveforms to exclude sounds (such as low frequency sounds) that are unlikely to be associated with a ball hitting a bat. The processor 18 then uses an auto-correlation function to confirm that the spikes detected in the soundtracks all relate to the same sound event. This auto-correlation function compares the waveforms of the spikes or the frequency components of the waveforms to ensure they substantially match one another, indicating that they all came from the same source. This function is conventional and so will not be described in detail.
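The patent describes this matching step only as a conventional correlation of the spike waveforms. The sketch below illustrates the idea with a simple peak search and a normalised correlation coefficient between two equal-length snippets (the exact function used by the system is not specified; this is an assumption):

```python
import math

def find_spike(samples, start, end):
    """Index of the largest absolute amplitude in the search window."""
    return max(range(start, end), key=lambda i: abs(samples[i]))

def normalised_correlation(a, b):
    """Correlation coefficient of two equal-length snippets; values
    close to 1.0 indicate that the waveforms substantially match."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    if na == 0.0 or nb == 0.0:
        return 0.0
    return sum(x * y for x, y in zip(a, b)) / (na * nb)
```

A spike found near the expected arrival time in each soundtrack would be accepted only if its snippet correlates strongly with the snippets found in the other soundtracks, indicating a common source.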
The processor 18 then reads and stores the timings t1 to t4 of the sound event, as recorded by the four microphones. These timings can then be used as follows to calculate the location and time of the event.
Referring to Figure 3, an event taking place at a location X and time t, as defined by the parameters (R, θ, φ, t), gives rise to a set of timing values (t1, t2, t3, t4).
The distance of the event from each microphone may be represented by the values D1 to D4, where: D1 is the distance from the location X to microphone Ml, D2 is the distance from the location X to microphone M2, D3 is the distance from the location X to microphone M3, and D4 is the distance from the location X to microphone M4.
From the timings of the spikes 30 in the microphone soundtracks of Figure 4 and the coordinate system shown in Figure 3, these distances may be represented as follows:
D1 = R
D2 = D1 + S(t2 − t1) = R + Δd21
D3 = D1 + S(t3 − t1) = R + Δd31
D4 = D1 + S(t4 − t1) = R + Δd41
where S is the speed of sound at pitch level, Δd21 is the difference in the distance of location X from microphones M1 and M2, and so on.
The positions of the microphones in spherical polar coordinates can be converted to rectangular Cartesian coordinates as follows:
(R, θ, φ) → (R cos φ cos θ, R cos φ sin θ, R sin φ)
(0, 0, 0) → (0, 0, 0)
(R2, 0, π/2) → (0, 0, R2)
(R3, π, 0) → (−R3, 0, 0)
(R4, θ4, 0) → (R4 cos θ4, R4 sin θ4, 0)
where θ4 is the azimuth angle of microphone M4 (π/2 in the arrangement described above). The 4-tuple (t1, t2, t3, t4) enables the following relations to be established:
D2 = R + Δd21 = √(R²cos²φ cos²θ + R²cos²φ sin²θ + (R sin φ − R2)²)  (1)
D3 = R + Δd31 = √((R cos φ cos θ + R3)² + R²cos²φ sin²θ + R²sin²φ)  (2)
D4 = R + Δd41 = √((R cos φ cos θ − R4 cos θ4)² + (R cos φ sin θ − R4 sin θ4)² + R²sin²φ)  (3)
These three equations (1), (2), (3), which can be simplified, include three unknowns (R, θ, φ).
They can be solved numerically to provide the values of the parameters (R, θ, φ), thus indicating the location X of the event. The time t of the event is then given by t = t1 − R/S. It is thus possible to determine both the time t and the location X of the event.
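The numerical solution can be sketched as follows. Rather than solving equations (1) to (3) in spherical coordinates, this illustrative version works in Cartesian coordinates and minimises the mismatch in the path differences D_i − D_1 = S(t_i − t_1) with a simple coarse-to-fine search. A production system would use a proper nonlinear least-squares solver; the microphone coordinates here are assumed values consistent with the geometry described above.

```python
import math

S = 340.0  # speed of sound at pitch level, m/s

# Assumed Cartesian microphone coordinates in metres (M1 at the origin,
# M2 above it, M3 along the crease, M4 at the far end of the pitch).
MIC_POS = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.71),
           (-3.05, 0.0, 0.0), (0.0, 20.12, 0.0)]

def cost(x, timings):
    """Squared mismatch between the measured path differences
    S * (t_i - t_1) and those predicted for a candidate location x."""
    d1 = math.dist(x, MIC_POS[0])
    return sum((math.dist(x, m) - d1 - S * (ti - timings[0])) ** 2
               for m, ti in zip(MIC_POS[1:], timings[1:]))

def locate(timings, guess=(0.0, 2.0, 1.0), span=2.0, iters=60):
    """Coarse-to-fine pattern search for the event location; the event
    time then follows from t = t1 - D1 / S."""
    best = guess
    for _ in range(iters):
        step = span / 4.0
        # Evaluate the 27 points of a small grid around the current best
        # estimate, keep the cheapest, then shrink the grid.
        best = min(
            ((best[0] + i * step, best[1] + j * step, best[2] + k * step)
             for i in (-1, 0, 1) for j in (-1, 0, 1) for k in (-1, 0, 1)),
            key=lambda c: cost(c, timings))
        span *= 0.8
    return best, timings[0] - math.dist(best, MIC_POS[0]) / S
```

Feeding in timings synthesised from a known event position recovers both that position and the event time, which is the behaviour the text relies on.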
Having determined the location X and the time t, this information can be used to generate an image or a sequence of video images to illustrate the location of the event relative to the positions of the players, the bat and the ball at the time of the event. This may be done, for example, by locating within a video sequence captured by the camera 16 the frame that corresponds to the determined time t of the event, generating an icon that represents the calculated location of the event and superimposing the icon on the selected video frame at the determined location. This allows the calculated location of the event to be compared with the positions of the players, the bat and the ball at the time of the event. It may then be possible to form an opinion as to whether the detected sound event might have been caused by the batsman hitting the ball or by some other event, such as the bat hitting the ground or the ball hitting the batsman's clothing or pads.
An example is shown in Figure 5. Here, the batsman 40 is standing in front of the wicket 4 and playing a shot at the ball 42, which is passing very close to the edge of the bat 44. These objects are all present in the video frame captured at the time t of the detected event. The determined location of the event is represented by a spherical icon 46, the radius of the sphere representing the estimated error (typically ±0.5 ms) associated with determining the time values t1, t2, t3 and t4. In this example, it can be seen that the icon 46 encompasses the ball 42 and overlaps the edge of the bat 44. This suggests that the detected sound might have been caused by the ball 42 hitting the edge of the bat 44.
In this example, the icon 46 is larger than the ball 42, to indicate a certain margin of error in the determined location of the event. The size of the icon may be varied according to the estimated margin of error. Alternatively, a point icon may be used, if preferred.
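The relationship between timing error and icon size is direct: a timing uncertainty of Δt corresponds to a spatial uncertainty of roughly S × Δt, the distance sound travels in Δt. A trivial sketch (340 m/s is the approximate speed of sound used earlier):

```python
SPEED_OF_SOUND = 340.0  # m/s

def icon_radius(timing_error_s):
    """Spatial uncertainty implied by a timing error: sound travels
    SPEED_OF_SOUND * dt metres in dt seconds, so the icon radius
    scales in the same way."""
    return SPEED_OF_SOUND * timing_error_s
```

With the ±0.5 ms error figure, this gives a radius of about 0.17 m, comfortably larger than a cricket ball, which is why the icon 46 appears bigger than the ball 42.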
By selecting a few video frames occurring before and after the time t of the detected event, it is possible to show a video sequence illustrating the positions of the players, the bat and the ball before and after the detected event. In this case, the icon may for example be superimposed only on the frame corresponding to the determined time of the event, so providing further information about the possible cause of the detected sound event.
The icon representing the location and time of the event may also be superimposed on the video output of various other cameras, to provide even more information. For example, it may be superimposed on the images captured by a camera showing a side view of the pitch, to indicate whether the sound event took place before or after the ball had passed the bat.
Alternatively, instead of superimposing the icon on video frames captured by the cameras, it may be combined with a computer generated display, to illustrate the time and location of the event relative to, for example, the pitch and the wicket. It may also be combined with images showing, for example, the flight path of the ball.
Claims (23)
CLAIMS
- 1. A system for determining the time and location of an event, the system including at least four microphones arranged in a non-planar array, a video camera arranged to capture a sequence of video frames, and a processor device that is constructed and arranged to receive audio signals from each of the microphones and a video signal from the video camera, identify an event in the audio signals received from the microphones, determine the time and location of the event from the relative timing of the event in the audio signals, and generate a video output signal that includes an icon indicating the determined time and location of the event.
- 2. A system according to claim 1, in which the processor device is arranged to identify the event in each of the audio signals by selecting a relevant portion of the audio signal and searching for the event in the selected portion.
- 3. A system according to claim 2, in which the processor device is arranged to select the relevant portion of the audio signal by estimating the time and location of the event and determining the travel time of sound from the estimated location to the microphone.
- 4. A system according to claim 3, in which the processor device is arranged to estimate the time of the event by determining the capture time of a video frame that depicts the event.
- 5. A system according to any one of the preceding claims, in which the processor device is arranged to identify the event in the audio signals by comparing and matching the waveforms of the audio signals.
- 6. A system according to any one of the preceding claims, in which the processor device includes filter means for filtering the audio signals to exclude predetermined portions of those signals.
- 7. A system according to any one of the preceding claims, in which three microphones are located substantially at ground level and a fourth microphone is located in a fixed position above the ground.
- 8. A system according to claim 7, in which the fourth microphone is located at or near the top of a stump.
- 9. A system according to any one of the preceding claims, including means for synchronising the input signals.
- 10. A system according to any one of the preceding claims, in which the processor device is constructed and arranged to generate an image illustrating the time and location of the event relative to the locations of other objects.
- 11. A system according to claim 10, in which the processor device is constructed and arranged to generate an image by selecting a video frame that corresponds to the time of the event and superimposing an icon on the selected video frame at the location of the event.
- 12. A system according to claim 11, in which the processor device is constructed and arranged to select a plurality of video frames occurring before and after the time of the event and generate a video sequence illustrating a sequence of events before and after the event.
- 13. A system according to claim 10, in which the processor device is constructed and arranged to create a computer generated image representing the location of the event relative to the locations of other objects.
- 14. A method of determining the time and location of an event, the method including providing a video camera and at least four microphones arranged in a non-planar array, receiving a video signal from the video camera and audio signals from the microphones, identifying an event in the audio signals, determining the time and location of the event from the relative timing of the event in the audio signals, and generating a video output signal that includes an icon indicating the determined time and location of the event.
- 15. A method according to claim 14, in which the event is identified in each of the audio signals by selecting a relevant portion of the audio signal and searching for the event in the selected portion.
- 16. A method according to claim 15, in which the relevant portion of the audio signal is selected by estimating the time and location of the event and determining the travel time of sound from the estimated location to the microphone.
- 17. A method according to claim 16, in which determining the travel time includes selecting a video frame that corresponds to the time of the event.
- 18. A method according to any one of claims 14 to 17, in which the event is identified in the audio signals by comparing and matching the waveforms of the audio signals.
- 19. A method according to any one of claims 14 to 18, including filtering the audio signals to exclude predetermined portions of those signals.
- 20. A method according to any one of claims 14 to 19, including generating an image illustrating the time and location of the event relative to the locations of other objects.
- 21. A method according to claim 20, including generating an image by locating a video frame that corresponds to the time of the event and superimposing an icon on the selected video frame at the location of the event.
- 22. A method according to claim 21, including selecting a plurality of video frames occurring before and after the time of the event and generating a video sequence illustrating a sequence of events before and after the event.
- 23. A method according to claim 20, including creating a computer generated image representing the location of the event relative to the locations of other objects.
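Claims 2 to 4 (and 15 to 17) narrow the search by predicting when the event's sound should reach each microphone: estimate the event's time and place (claim 4 suggests the capture time of the video frame depicting it), then convert distance into acoustic travel time. A minimal sketch of that step, assuming a constant speed of sound; the function name, coordinates and margin value are illustrative, not from the patent:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (an assumed constant)

def search_window(event_pos, event_time, mic_pos, margin=0.005):
    """Time window (seconds) in which to search one microphone's audio
    signal for the event.  event_pos/mic_pos are (x, y, z) in metres;
    event_time is the estimated event time; margin widens the window
    to absorb estimation error."""
    travel = math.dist(event_pos, mic_pos) / SPEED_OF_SOUND
    arrival = event_time + travel
    return (arrival - margin, arrival + margin)
```

A microphone 34.3 m from the estimated event, for instance, hears it 0.1 s late, so only a 10 ms slice of that channel need be searched rather than the whole recording.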
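Claims 5 and 18 identify the event by comparing and matching the waveforms of the audio signals. One classical way to do that is cross-correlation: slide a reference snippet of the impact sound along another channel and take the lag with the highest correlation. The brute-force version below is a sketch only; a production system would use FFT-based, normalised correlation:

```python
def best_lag(ref, sig):
    """Lag (in samples) at which the short reference waveform `ref`
    best matches signal `sig`, by brute-force cross-correlation."""
    n, m = len(sig), len(ref)
    best, best_score = 0, float("-inf")
    for lag in range(n - m + 1):
        score = sum(ref[i] * sig[lag + i] for i in range(m))
        if score > best_score:
            best, best_score = lag, score
    return best
```

The lag differences between channels are exactly the relative timings from which claim 14 determines the event's time and location.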
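The heart of claim 14 is recovering the event's time and 3-D position from the relative arrival times at four microphones in a non-planar array. The patent does not publish its solver; as a didactic stand-in, the sketch below brute-forces candidate positions (a real system would solve the TDOA equations by least squares). The microphone layout and all names are illustrative:

```python
import math

C = 343.0  # assumed speed of sound, m/s

def locate(mics, arrivals, candidates):
    """Return (position, event_time) minimising the squared mismatch
    between predicted and observed arrival times.  For a candidate
    position p, the event time that best explains the arrivals is the
    mean of t_i - |p - m_i| / C over the microphones."""
    best = None
    for p in candidates:
        delays = [math.dist(p, m) / C for m in mics]
        t = sum(a - d for a, d in zip(arrivals, delays)) / len(mics)
        err = sum((t + d - a) ** 2 for a, d in zip(arrivals, delays))
        if best is None or err < best[0]:
            best = (err, p, t)
    return best[1], best[2]
```

With three microphones substantially at ground level and a fourth raised above it (claims 7 and 8 put it at or near the top of a stump), the array is non-planar, which is what makes the height of the event recoverable rather than ambiguous.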
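Claims 11, 12, 21 and 22 turn the determined time back into pictures: select the video frame corresponding to the event, superimpose an icon at its location, and optionally cut a sequence of frames before and after it. A sketch assuming a constant frame rate; the 25 fps figure and function names are illustrative:

```python
FPS = 25.0  # assumed constant frame rate

def frame_for_time(event_time, first_frame_time=0.0, fps=FPS):
    """Index of the frame whose capture time is closest to the event
    time -- the frame on which the event icon would be drawn."""
    return round((event_time - first_frame_time) * fps)

def replay_frames(event_time, first_frame_time=0.0, fps=FPS, n=10):
    """Frame indices spanning n frames either side of the event, for
    the before-and-after replay sequence."""
    centre = frame_for_time(event_time, first_frame_time, fps)
    return list(range(max(0, centre - n), centre + n + 1))
```

Because the acoustic fix is typically more precise than the frame interval, rounding to the nearest frame is the natural choice here.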
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0803080A GB2457674A (en) | 2008-02-19 | 2008-02-19 | Determining the time and location of an event, for example whether a ball hit a bat in cricket |
GBGB0808077.2A GB0808077D0 (en) | 2008-02-19 | 2008-05-06 | System and method for determining the time and location of an event |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0803080A GB2457674A (en) | 2008-02-19 | 2008-02-19 | Determining the time and location of an event, for example whether a ball hit a bat in cricket |
Publications (2)
Publication Number | Publication Date |
---|---|
GB0803080D0 GB0803080D0 (en) | 2008-03-26 |
GB2457674A true GB2457674A (en) | 2009-08-26 |
Family
ID=39271984
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0803080A Withdrawn GB2457674A (en) | 2008-02-19 | 2008-02-19 | Determining the time and location of an event, for example whether a ball hit a bat in cricket |
GBGB0808077.2A Ceased GB0808077D0 (en) | 2008-02-19 | 2008-05-06 | System and method for determining the time and location of an event |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GBGB0808077.2A Ceased GB0808077D0 (en) | 2008-02-19 | 2008-05-06 | System and method for determining the time and location of an event |
Country Status (1)
Country | Link |
---|---|
GB (2) | GB2457674A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2358755A (en) * | 1998-08-12 | 2001-08-01 | Allan Plaskett | Method of and system for analysing events |
US20030142210A1 (en) * | 2002-01-31 | 2003-07-31 | Carlbom Ingrid Birgitta | Real-time method and apparatus for tracking a moving object experiencing a change in direction |
GB2403362A (en) * | 2003-06-27 | 2004-12-29 | Roke Manor Research | Calculating the location of an impact event using acoustic and video based data |
- 2008-02-19 GB GB0803080A patent/GB2457674A/en not_active Withdrawn
- 2008-05-06 GB GBGB0808077.2A patent/GB0808077D0/en not_active Ceased
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2488991A (en) * | 2011-03-09 | 2012-09-19 | British Broadcasting Corp | A system for measuring audio signals generated during sporting activity |
AU2013100500B4 (en) * | 2012-06-08 | 2013-08-15 | Brennan Broadcast Group Pty Ltd | Football contact determination |
AU2013101177B4 (en) * | 2012-06-08 | 2014-01-16 | Brennan Broadcast Group Pty Ltd | Australian rules football goal post contact determination |
US20150377694A1 (en) * | 2014-06-25 | 2015-12-31 | The Board Of Trustees Of The University Of Alabama | Systems and methods for remotely sensing and assessing collision impacts |
WO2018099554A1 (en) * | 2016-11-30 | 2018-06-07 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and system for localization of ball hit events |
US11123625B2 (en) | 2016-11-30 | 2021-09-21 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and system for localization of ball hit events |
Also Published As
Publication number | Publication date |
---|---|
GB0803080D0 (en) | 2008-03-26 |
GB0808077D0 (en) | 2008-06-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6304665B1 (en) | System for determining the end of a path for a moving object | |
US20210350833A1 (en) | Play Sequence Visualization and Analysis | |
US10616663B2 (en) | Computer-implemented capture of live sporting event data | |
EP2850823A1 (en) | System and method for automatic video filming and broadcasting of sports events | |
JP2001518242A (en) | A system for improving the television presentation of objects in sports competitions | |
CN101027900A (en) | System and method for the production of composite images comprising or using one or more cameras for providing overlapping images | |
CN102015046B (en) | Game system and game device control method | |
US9602858B1 (en) | Method and system for synchronizing multiple data feeds associated with a sporting event | |
GB2457674A (en) | Determining the time and location of an event, for example whether a ball hit a bat in cricket | |
CN110270078B (en) | Football game special effect display system and method and computer device | |
JP2015070503A (en) | Information processing apparatus, information processing method, and program | |
US20140085478A1 (en) | Automatic Camera Identification from a Multi-Camera Video Stream | |
US20100195978A1 (en) | System to facilitate replay of multiple recordings of a live event | |
US20230289982A1 (en) | Methods and systems to track a moving objects trajectory using a single camera | |
AU2014295901B2 (en) | Synchronisation of video and audio capture | |
CN109446241A (en) | A kind of statistical method, device, equipment and the storage medium of sporter's technical parameter | |
US20030036887A1 (en) | System and method for searching selected content using sensory data | |
US11103760B2 (en) | Line fault detection systems and method for determining whether a sport gaming device has bounced off an area of a sports field | |
US20190304508A1 (en) | Apparatus and method to display event information detected from video data | |
AU2015291766A1 (en) | Systems for reviewing sporting activities and events | |
CN102752529B (en) | A kind of synchronism output method and system of in-situ match data acquisition | |
CN114500773B (en) | Rebroadcasting method, system and storage medium | |
EP4089636A1 (en) | Apparatus, method and computer program product for generating location information of an object in a scene | |
JPH0866506A (en) | Golf training device | |
WO2023089381A1 (en) | The method and system of automatic continuous cameras recalibration with automatic video verification of the event, especially for sports games |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |