GB2403362A - Calculating the location of an impact event using acoustic and video based data - Google Patents
- Publication number
- GB2403362A GB2403362A GB0319065A GB0319065A GB2403362A GB 2403362 A GB2403362 A GB 2403362A GB 0319065 A GB0319065 A GB 0319065A GB 0319065 A GB0319065 A GB 0319065A GB 2403362 A GB2403362 A GB 2403362A
- Authority
- GB
- United Kingdom
- Prior art keywords
- event
- time
- location
- video
- impact
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
- G01S3/7864—T.V. type tracking systems
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B71/0605—Decision makers and devices using detection means facilitating arbitration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/18—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
- G01S5/22—Position of source determined by co-ordinating a plurality of position lines defined by path-difference measurements
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Physical Education & Sports Medicine (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
Abstract
A means for determining the occurrence and location of an impact event is disclosed comprising a video-based trajectory tracking system for tracking the path of an object 110 and one or more microphones 20 placed in the expected vicinity of the impact event. An acoustic / vibration signal generated by an impact event is detected and a time delay between the event and the detection of sound determined in order to calculate the time of occurrence of the event. The location on the path 140 of the object is determined, corresponding to the time of occurrence of the event as determined by the video based trajectory tracking system. The invention is particularly suited for locating sporting impact events such as those occurring in cricket. The video signal corresponding to the calculated time and position of an event may be manipulated to enable corresponding video and audio signals representing the event to be synchronised in time. The received signals may be evaluated to differentiate between a predetermined set of different types of events expected to occur. Signals received from at least two microphones may be compared using time difference of arrival in order to calculate the spatial location of an event. The system may further adjust the calculated time and location of an event to compensate for the effects of meteorological factors.
Description
AN ACOUSTIC EVENT SYNCHRONISATION AND
CHARACTERISATION SYSTEM FOR SPORTS
The present invention relates to an acoustic synchronization, event location and classification system for use in cricket and other sports.
There is a growing requirement to collect and analyse data from sports matches such as cricket, baseball, tennis and other bat / racquet sports. When and where an impact happened (or whether one occurred at all), and what it consisted of in terms of which objects were involved in the impact, are questions of considerable interest for broadcast, umpiring and training purposes. This information may also be used to enhance the viewing or learning experience.
Current techniques for video-based ball localization include a trajectory tracking system known as "Hawk-Eye" (RTM), commercialized by Hawk-Eye Innovations Ltd., and publicised at www.hawki.tv. The system is partially described in the patent application published as WO0141884. Hawk-Eye is a system for tracking the trajectory of an object such as a cricket ball (or a tennis ball etc.), and extrapolating the relative ball position at a certain location, e.g. in the plane of the cricket stumps, based upon the early and middle parts of its travel. The drawback with this system is that it cannot always track the ball trajectory close to a batsman who may obscure the view of a camera used for calculating the trajectory. Hence the system does not always provide any information about when the ball was diverted from the Hawk-Eye predicted trajectory by impact with any other object, such as a bat, pad, body or ground. In addition, Hawk-Eye does not provide any information regarding what might have happened at the point of impact, such as what type of objects were involved in the impact.
In the current art for the game of cricket, an acoustic device called the 'Snickometer', described in the patent application published as WO0010333, aims to provide information about the impact of the cricket ball with the batsman's bat, glove or pad, or with the ground or the stumps. It displays a time waveform from a microphone, and attempts to show the correlations between the acoustic transients representing the impacts and events within the sequence of video frames showing the passage of the ball past the batsman. This enables one to see which objects the impacts are likely to have been between, and thus to make informed 'catch' and 'leg-before-wicket' (LBW) decisions.
However the main drawbacks of a practical system using this invention include: (i) that the system has to work offline, i.e. results may only be displayed after a certain delay; (ii) the system cannot determine the time accurately enough to provide high-resolution impact location; (iii) the system only provides video frame images nearest the moment of impact and cannot resolve events within one video frame period; (iv) the frame rate is too slow for the human eye to distinguish between very fast 'snick' sequences by studying a sequence of frames or to correlate accurately with the audio events; (v) the audio signature is not classified into permutations of ball with bat/glove/pad/ground/wicket although this is acknowledged in principle.
The technique of analysis proposed in WO0010333 (Fourier analysis) is in fact unlikely to be capable of reliable classification. WO0010333 explicitly states that the frequency-analysed signal is to be presented graphically, i.e. the bat/object permutations are to be determined by the observer and not automatically.
To illustrate the invention the game of cricket will be used but the scope of this invention is not limited solely to this sport.
Fig. 1 illustrates a cricketer 100 at the stumps 150 receiving a delivery. A cricket ball 110 approaches the cricketer along a trajectory 120 and the cricketer attempts to hit the ball 110 with bat 130. The trajectory 120 is tracked by a video ball trajectory tracking system such as Hawk-Eye (RTM). A particular feature of Hawk-Eye is that it calculates the future path 140 of the ball 110 using measured video information describing its past trajectory 120 and assuming that the ball does not impact on any objects. The position of the stumps 150 is known to the video trajectory tracking system.
The illustration of Fig. 1 accordingly represents a combination of video data, for example showing the cricketer 100, bat 130 and ball 110, together with data from the video trajectory tracking system, such as the representations of the ball's past trajectory 120, the ball's projected trajectory 140 and the position of the stumps 150. The video trajectory tracking system will typically monitor the position of the stumps 150 and the relative trajectory of the ball 110, at least to assist in LBW decisions.
As is known to those knowledgeable in the art, a microphone may be placed close to the cricketer, for example between the stumps 150, or embedded in one of the stumps themselves, possibly near the top of the stump. Such microphones have been used, for example in systems such as described in WO0010333, for providing a clear reproduction of sound at the wicket, such as the ball 110 hitting the bat 130, the stumps or the cricketer 100. Such microphones pick up the sound of an impact event between any of the above objects with a delay, which depends principally on the distance between the microphone and the impact event location, and the effective speed of sound, which will depend at least on meteorological factors such as wind speed and direction, presence of rain etc. Video images will also be delayed, by the video capture apparatus itself, leading to a difference in timing between the video images available for broadcasting and the associated audio information. The system described in WO0010333 attempts to remove this difference by simultaneously displaying a graphical representation of a sound and a selected video frame. By changing the selection of the video frame until one is found which shows the occurrence of the impact event represented by the audio signal, approximate synchronisation between the audio and video systems may be achieved. In other systems, a constant delay is assumed, representing a typical sound propagation time between the vicinity of the cricketer 100 and the position of the microphone. This delay is applied to the video signal, again to achieve approximate synchronization between audio and video signals.
The present invention provides methods and apparatus for the use of acoustics for the spatial and temporal location of sporting or other impacts such as those occurring in cricket, coupled with the use of this information to classify the event into a number of possible categories.
More particularly, the present invention provides a method for calculating the location of an impact event, comprising: a video-based trajectory tracking system for tracking the path of an object and possibly for predicting its future path; one or more microphones in the expected vicinity of the impact event; mechanisms for detecting an acoustic and/or vibration excitation generated by the impact event, and generating a signal representative thereof. The method further comprises the steps of: determining a time delay between the impact event and the detection of the sound, and accordingly calculating a time of occurrence of the impact event; determining the location on the path of the object, corresponding to the time of occurrence of the impact event, as determined by the video-based trajectory tracking system; and adopting that location as the location of the impact event.
The present invention also provides a method for acoustic temporal and spatial event detection and location for use in ball sports, comprising the steps of: providing acoustic and/or vibration sensors in known, mutually displaced locations within an environment of interest; receiving corresponding acoustic or vibration excitation caused by an impact event at each of the sensors; generating, in each of the sensors, a signal representing the corresponding excitation; and correlating the signals, thereby to calculate an estimated time and position of the event with respect to the location of the sensors. The event may be an impact event.
The method may be used in conjunction with a video-based trajectory tracking system to provide a more accurate position and time estimation of the event, and wherein the event is the impact of a ball with another object.
The method may also be used in conjunction with a video-based trajectory tracking system to provide a more accurate position and time estimation of the ball's location on the trajectory at a time derived from the impact between two or more objects other than the ball (such as the bat and pad).
The present invention also provides a further method, for dealing with the opposite of those situations that have been described above in relation to the game of cricket. The first method involves being prompted or alerted by the detection of an acoustic event representing an impact event and using this to determine the position and nature of the impact event. The further method involves checking whether an acoustic event occurred given the hypothesis that an impact of the types described above has occurred during a certain, defined range of time or space. For cricket, an example would be an appeal of Bowled Out or Caught when the umpire is unsure whether in fact the ball had touched either the bat or the batsman before respectively hitting the stumps or being caught by a fielder.
A video signal showing the event may be adjusted according to the calculated time and position of the event, to enable corresponding video and audio signals representing the event to be synchronized in time.
The signals may be evaluated to differentiate between a predetermined set of different types of event expected to occur.
Spatial detection may comprise comparing respective waveforms of the signals from at least two sensors using time-difference-of-arrival techniques.
The sensors may be arranged in a three-dimensional array, and the location of the event estimated in three dimensions.
Spatial interpolation may be carried out between frames of the video signal respectively representing times before and after the calculated time of the event, thereby providing improved diagnostics as to where and what the impact was.
The method may be used in detecting the impact of a cricket ball with another object. In this case, the sensors may comprise one or two sensors placed behind each wicket.
In a method for use in detecting the impact of a cricket ball with another object, at least one of the sensors may be placed within or attached to one of the stumps.
The method may further comprise the steps of measuring meteorological factors concerning the vicinity of the event, and adjusting the calculated time and location of the event to compensate for the effects of such meteorological factors.
According to another aspect of the present invention, a method of generating a computer generated environment may comprise the step of incorporating a representation of the event at time and location representations corresponding to the time and location of the event as determined according to any preceding claim.
According to another aspect of the present invention, a method of generating a virtual acoustic environment may comprise the step of incorporating a representation of the event at time and location representations corresponding to the time and location of the event, thereby enabling multiple channel audio playback providing realistic sound localization by suitable adjustment of amplitude, delay and characteristics of the sound indicating the impact event in one or more of the audio channels.
This acoustic environment may where appropriate be linked to the video representations.
The above, and further, objects, characteristics and advantages of the present invention will become more clear with reference to the following description of certain embodiments of the invention, together with the accompanying figures, wherein:
Fig. 1 schematically illustrates a composite image comprising a video image of a cricketer with a virtual image of the ball, the past and future trajectories of the ball, and the stumps;
Fig. 2 shows the ball, the trajectories and the stumps of Fig. 1;
Fig. 3 shows a block diagram of a system according to the present invention;
Fig. 4 shows an example layout of a cricket wicket, illustrating the location of an impact event, and the use of time-difference-of-arrival techniques to locate the impact event;
Fig. 5 shows an example of results of modelling the performance of one implementation of the present invention, in which multiple estimates of the location of a single impact event are shown, the estimates being calculated from a signal produced by a single acoustic sensor;
Fig. 6 shows a statistical distribution of errors in the estimations shown in Fig. 5;
Fig. 7 shows multiple impact events with corresponding position estimations, the estimates being calculated from signals produced by multiple acoustic sensors;
Fig. 8 shows a statistical distribution of errors in the estimations shown in Fig. 7;
Fig. 9 shows a simulated, partial computer-generated image showing the trajectory, the location of the impact event calculated according to the present invention, and a calculated post-impact trajectory; and
Fig. 10 shows a volume of interest defined around the location of a batsman, and a trajectory segment representing a ball's transit across the volume of interest, itself defining a time range of interest.
The present invention provides a system useful in the televising and umpiring of sporting events or the analysis of training sessions. In its simplest embodiment, the invention uses one or more sensors (for example microphones, seismometers or accelerometers) that pick up signals corresponding to an acoustic/vibration signal generated by an impact occurring during the activity. This is explained here using the example of the game of cricket, although it can be used in the same or similar form for other games or even non-sport events.
In cricket, an impact might be between the ball, the cricketer's bat, pad, clothing, the ground or the wicket. Whether the ball has hit the bat or the pad is important in determining leg before wicket (LBW) decisions, which is one of the key benefits provided by the present invention.
An audio signal waveform from an impact is picked up by one or more microphones or other sensors installed near to the pitch. The time of the impact event is calculated from the audio signal waveform, and is used to determine the location of the ball at the time of impact in relation to the position of the batsman and his bat. This may be done most simply by synchronizing the impact event with a video frame on a video recording of the game by calculating the timing of the impact event. If several microphones are provided, then their signals may be processed to provide an indication of the location of the impact event. The distance of the impact location from the microphones may then be calculated, and this will provide the delay of the audio signals due to the time taken for them to travel through air. Alternatively, particularly if only one microphone is provided, a standard delay may be assumed, and this used to provide approximate alignment with the video signal.
In another embodiment of the present invention, the precise position of the impact event is determined by a ball-tracking algorithm, for example using a ball trajectory tracking system such as Hawk-Eye, which uses a video image sequence as its input. In this example, the audio signal waveform provides only the timing of the impact event. With this approach, the time of the impact event can be determined to an accuracy of a fraction of a video frame, something that cannot be done using simple frame-synchronising techniques. The location of the ball at the time of the impact event may also be calculated very accurately. The present invention is accordingly able to offer accurate information on whether the ball hit the bat or something else.
The present invention provides means and apparatus for providing improved impact detection and classification of impact events, and improved audio/visual synchronization for coverage of ball sports such as cricket. Using a trajectory tracking system such as Hawk-Eye, the present invention allows the location of an impact event to be computed more precisely than is possible with known systems. By accurately calculating the position of an impact event, it is then possible to accurately calculate the relative delay between audio and video records of the event, to achieve accurate synchronization between audio and video. Furthermore, the accurate location of the impact may be taken into consideration when determining the nature of the impact event. An impact of the ball 110 with the bat 130 will take place at a different location from an impact with the stumps 150, or an impact with the cricketer 100.
In an advanced embodiment of the present invention, multiple video cameras such as those used for the trajectory prediction system may provide images that are used to provide a computer-generated three-dimensional model of the cricketer 100, the bat 130, the ball 110 with its past and projected trajectories 120, 140, and the stumps 150. Once the location of an impact event is accurately known, that impact event location may be included in the three-dimensional model. By comparing the modelled positions of the impact event, the cricketer 100, the bat 130 and the stumps 150, a valuable further insight may be obtained to determine the nature of the impact, and so to determine, for example, whether the ball hit the bat or part of the batsman, essential in determining LBW decisions, or whether the impact involved the ball at all. Such three-dimensional models may provide interesting images for broadcast with commentary on the game, for improving a viewer's enjoyment of the game.
Alternatively, such images may be used as training devices for cricketers. Such models could also be used as an aid to umpires in making difficult decisions as to whether the cricketer should be given out LBW, where the ball must have hit the cricketer but would otherwise have hit the stump; caught, where the ball must be caught by a fielder after hitting the bat, no matter how gently; or bowled, where the ball must cause the bails to be dislodged.
As is already known, the various possible impact events, such as bat-ball, stump-ball, cricketer-ball, ground-ball, bat-pad and bat-ground, each have a typical associated sound. This may also be included in a system according to the present invention for assisting in classifying the nature and location of the impact event.
Reference will now be made to Fig. 2, which shows the ball 110, its past 120 and projected 140 trajectory, the stumps 150 and a microphone 20. Suppose that the ball 110 enters into an impact at the position shown in Fig. 2. After a delay introduced by the time taken for the sound generated by the impact to reach the microphone 20, and for associated audio processing circuitry to process the sound, the impact is detected by the equipment of the present invention. At that time, the trajectory prediction system may predict the position of the ball to be at the location shown at 160. It should be noted that the ball may not actually be at that location, it having been diverted from its former course by an impact event at location 110.
In an embodiment of the present invention, a standard delay time may be assumed for the time taken for the sound of the impact event to reach the audio processing circuitry. This standard delay may be the typical time taken for the sound of an impact in the region of the cricketer 100 to reach the audio circuitry in average weather conditions and in still air. This assumed, standard delay will be known to the system of the present invention. By first taking the predicted position 160 of the ball at the time that the audio representation of the impact event was detected, the system of the present invention can deduct the standard delay from the time of calculation of the location 160 of the ball, and may provide the location 110 calculated for the resulting, earlier time as the location of the impact event. In the illustrated example, the audio signal representing the impact event is detected at a time when the predicted position of the ball is at 160. By "counting back" by the standard delay, the trajectory prediction system will give the position 110 as the ball's position at this earlier time. This position may be plotted on, for example, the appropriate video frame of that point in time, enabling an accurate view of the moment of impact. This should allow an observer to determine the type of impact: that is, the object that the ball hit. This analysis may also be combined with an analysis of the sound made by the impact to arrive at a conclusion as to the nature of the object hit by the ball. The standard delay may be varied according to data supplied by optional meteorological sensors such as wind speed, temperature, humidity and air pressure sensors.
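The "counting back" step described above can be sketched in a few lines. This is an illustrative sketch only: the function names, the 10 m stand-off distance and the fixed processing delay are assumptions, not values taken from the patent, and the temperature correction uses the standard dry-air approximation for the speed of sound.

```python
def speed_of_sound(temperature_c: float) -> float:
    """Approximate speed of sound in dry air (m/s) at a given temperature,
    one way the standard delay could be varied with meteorological data."""
    return 331.3 + 0.606 * temperature_c

def impact_time(detection_time: float,
                mic_to_impact_m: float = 10.0,
                temperature_c: float = 20.0,
                processing_delay_s: float = 0.002) -> float:
    """'Count back' from the time the sound was detected to the estimated
    time of the impact itself, by subtracting the acoustic propagation
    delay and a fixed audio-processing delay (both assumed values)."""
    propagation_delay = mic_to_impact_m / speed_of_sound(temperature_c)
    return detection_time - propagation_delay - processing_delay_s
```

The impact time recovered this way is then handed to the trajectory tracker, which reports the ball's position at that earlier instant.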
Use of a single microphone can only provide a single indication of the sound, and reliance will need to be made on the assumed, standard delay. To enhance the accuracy in impact event timing, the position of the impact event is estimated by using the known positions of two or more sensors, such as microphones, seismometers or accelerometers, and carrying out a process of location of the impact event, using time-difference-of-arrival information derived from the audio or vibration signals detected by the sensors.
This impact event location information may be used for a number of purposes, including correcting for the propagation delay of the sound to a sensor for broadcast purposes or for placing the impact event in a computer-generated environment.
In an embodiment of the invention, by providing two microphones, for example, each in a gap between two stumps, then a degree of direction and range finding may be employed. Once again, assuming a standard delay t, the sound is picked up at the microphones at respective delays t+dt and t-dt. The difference in timings, 2dt, provides an indication of how much further the impact event was away from one microphone than from the other. This may be used to plot a locus of possible locations for the impact event. By calculating the point at which this locus intersects, or at least approximately intersects, the ball trajectory 120 as calculated by the trajectory tracking system, the location of the impact event may be calculated. This will only apply if the impact involved the ball. An impact between the bat and pad may not correspond to a point on the trajectory, and hence this can be used as another mechanism for classification. In the case where the ball was involved, verification of this location may be provided by calculating the time delay from the calculated impact event location to the audio processing circuitry and using the "counting back" method to retrace the predicted trajectory of the ball back to the location 110 of the impact event, as described in the preceding paragraph.
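One way to realise the locus-and-trajectory intersection described above is to search the tracked trajectory samples for the point whose path-length difference to the two microphones best matches the measured timing difference. A minimal sketch, in which the function name, sign convention and fixed speed of sound are illustrative assumptions:

```python
import math

def locate_on_trajectory(mic_a, mic_b, delta_t, trajectory, c=343.0):
    """Find the trajectory sample whose difference in distance to the two
    microphones best matches the measured arrival-time difference (the
    2dt of the text).  delta_t is positive when the sound reaches mic_a
    later than mic_b; trajectory is a sequence of position tuples."""
    target = delta_t * c  # path-length difference implied by the timing

    def mismatch(point):
        return abs((math.dist(point, mic_a) - math.dist(point, mic_b)) - target)

    return min(trajectory, key=mismatch)
```

In effect this intersects the hyperbolic locus defined by the timing difference with the one-dimensional curve of tracked ball positions; if no sample comes close, that itself suggests the impact did not involve the ball.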
In another embodiment, a single microphone 1 is placed at each wicket 20.
The two microphones will receive the audio from the impact event one after the other.
The microphone nearest the wicket 20 being played will detect the sound at a time t. The other microphone, at the other wicket 20, will receive the sound at time t+Dt. As the distance between the microphones is known and constant, representing a reference value of time delay Dt, the value of Dt will provide a one-dimensional indication of the location of the impact event. For example, if sound under assumed meteorological conditions takes a time tw to traverse the distance between the microphones, Dt=0 will indicate that the impact event took place midway between the two microphones, whereas Dt=tw would indicate that the impact event took place at the wicket 20 in play.
Dt=tw/2 would indicate that the impact event took place midway between the wicket 20 being played and the midpoint between the two sets of wickets. The relationship is linear. By locating a point of the ball's trajectory 120, as calculated by the trajectory tracking system, at the calculated distance from the wicket 20, a position for the impact event may be calculated. Verification of this location may be provided by calculating the time delay from the calculated impact event location to the audio processing circuitry and using the "counting back" method to retrace the predicted trajectory of the ball back to the location 110 of the impact event, as described earlier.
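The linear Dt-to-distance relationship just described can be written out directly. In this sketch the inter-microphone spacing of 20.12 m (the standard cricket pitch length, an illustrative assumption) and the function name are not taken from the patent, and still air is assumed:

```python
def distance_from_played_wicket(Dt, tw, mic_spacing_m=20.12):
    """Linear mapping from the arrival-time difference Dt to distance from
    the wicket in play: Dt = 0 places the event midway between the two
    microphones, Dt = tw places it at the wicket being played.  tw is the
    time sound takes to traverse the full inter-microphone spacing."""
    return (1.0 - Dt / tw) * mic_spacing_m / 2.0
```

The three worked cases from the text (Dt = 0, tw and tw/2) fall out of the formula as half, zero and one quarter of the spacing respectively.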
Additional microphones may be used, and the sound received by such microphones may be compared together, using the time-difference-of-arrival technique.
An example of such an arrangement is shown in Fig. 4. Two microphones are placed at each end of the wicket, and these microphones receive an audio signal indicating the impact event at times t1, t1+a, t1+b and t1+c. The distance of the impact event from each microphone will accordingly be y, y+a.x, y+b.x and y+c.x, where y is the (unknown) distance from the first microphone to the impact event location, and x is the speed of sound in metres per second. Possible locations for the impact event will be indicated by points of intersection of spheres centred on the respective microphones and having respective radii of y, y+a.x, y+b.x and y+c.x, as illustrated for example in Fig. 4. These spheres may intersect at only one single point 22, which may be determined as the location of the impact event. Alternatively, two or more points 22 may appear equally suitable. Logic may eliminate some of the possible solutions, e.g. ones that are behind the wicket, wildly off to one side or the other, or those that are below ground level.
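The sphere-intersection localisation described above can be approximated numerically. The sketch below uses a coarse brute-force grid search purely for illustration (a real implementation would more likely solve the multilateration equations in closed form or by least squares), and it builds in the "logic" filter by only considering points above ground level and within an assumed pitch-sized search box:

```python
import itertools
import math

def locate_impact(mics, offsets, c=343.0, step=0.25):
    """Multilateration sketch: find the grid point whose distances to the
    microphones best reproduce the measured arrival-time offsets (the a,
    b, c of the text, measured relative to the first microphone).  Grid
    bounds are illustrative assumptions: 0-20 m along the pitch, +/-5 m
    across it, 0-2 m above ground (below-ground points are never tried)."""
    xs = [i * step for i in range(int(20 / step) + 1)]
    ys = [-5 + i * step for i in range(int(10 / step) + 1)]
    zs = [i * step for i in range(int(2 / step) + 1)]
    best, best_err = None, float("inf")
    for p in itertools.product(xs, ys, zs):
        d0 = math.dist(p, mics[0])  # the unknown range y to the first mic
        err = sum((math.dist(p, m) - d0 - off * c) ** 2
                  for m, off in zip(mics[1:], offsets))
        if err < best_err:
            best, best_err = p, err
    return best
```

The squared-residual sum plays the role of the sphere intersection: it is (near) zero only where all the range differences are simultaneously satisfied.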
To obtain an accurate location for the impact event, the ball trajectory calculated by the trajectory prediction system is preferably compared with the calculated positions of the impact event. The closest point on the trajectory to the point calculated by the time-difference-of-arrival method may be taken to be the location of the impact event, given that the ball was involved.
A verification step may be performed by calculating the delay of the audio signal from the calculated impact location 22, and "counting back" along the trajectory as illustrated in Fig. 2, to an estimated location 110. Locations 110 and 22 should at least approximately coincide, given that the ball was involved.
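The closest-point selection and the coincidence check of the last two paragraphs can be sketched together. The 0.5 m tolerance and all names here are assumptions for illustration, not values from the patent:

```python
import math

def verify_impact_location(trajectory, acoustic_estimate, tolerance_m=0.5):
    """Pick the trajectory sample nearest the acoustically estimated
    location (point 22 in the text) and check that the two at least
    approximately coincide, as expected when the ball was involved in
    the impact.  Returns (nearest_point, coincides)."""
    nearest = min(trajectory, key=lambda p: math.dist(p, acoustic_estimate))
    return nearest, math.dist(nearest, acoustic_estimate) <= tolerance_m
```

A failed check is itself informative: it suggests the sound came from an impact not involving the ball, such as bat on pad.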
It may also be possible to classify the signal waveform or waveforms of the detected audio or vibration signal to enable differentiation to be carried out on the basis of the audio waveform, in order to determine the object impacted by the ball at the detected impact event.
The system does not rely upon having line-of-sight between each sensor 1 and the impact event, thus overcoming the problems of the prior art trajectory tracking systems caused by obstruction of line-of-sight to the impact event. Even though other systems exist for manually aligning broadcast audio to video, the system of the present invention uses a sensor array comprising at least one sensor to perform automatic alignment with greater alignment accuracy than is otherwise possible. This also allows the process to be considerably shortened and allows a direct feed-through into broadcast equipment for direct transmission, direct links into existing 3D computer generated environments, and statistical analysis databases for display and also for training.
The present invention seeks to advance the art by using acoustic information available from installed or other sensors, such as microphones, in conjunction with the data available from an image-based trajectory tracking system, such as Hawk-Eye, which provides ball trajectory information in games such as cricket. Such trajectory data can enable automatic, high-resolution synchronization of the video frames and audio signal to be achieved, thus avoiding the long delay in manual synchronization typified by the current state of the art, and providing additional benefits as follows.
By employing a known accurate acquisition and tracking algorithm embedded in such image tracking systems, it is possible to ensure that interpolation of the time between frames is accurate, even if such things as frame-to-frame variations and blur noise due to the length of the pixel integration period are present, because the tracker can average over many more frames as long as certain constraints on the ballistics of the ball can be assumed.
The video frames adjacent to the impact event recorded by the audio channel enable some spatial interpolation to be carried out of the movement of the bat, the batsman's pads and other parts of his body in relation to the ball's trajectory, the bat's movement being the one of most significance. This enhances the present invention's ability to resolve multiple, close-by impacts or doubts as to which object the ball hit.
The use of an array of multiple sensors, such as microphones, within the limits allowed by cricket ground rules or other constraints, enables some degree of directivity and location to be achieved in detecting the acoustic events. The use of close-in microphones contributes to an enhanced ability to locate the source of the impact sound and to correct for propagation issues, and reduces the effect of acoustic 'clutter' and noise on the reliability of the system.
In certain embodiments of the invention, a suitably robust sensor and transmitter may be embedded within the cricket ball, or other object participating in the impact of interest. For that sensor, the time delay will be very small, providing a further or alternative indication of the timing of the impact event.
The use of modern signal processing and analysis techniques allows more reliable acoustic classification of the impact type (e.g. ball to bat / pad / glove / ground / wicket) to be made, automatically.
The use of other types of sensor, such as seismometers or accelerometers as well as microphones, enables a better discrimination to be achieved between, say, ball-to-ground impacts and those to a cricketer's bat/pad/glove.
Detailed Examples
The speed of a cricket ball from a fast bowler can vary typically from 34 to 40 m/s. Thus, with a high frame rate camera (150 fields per second), the ball will travel 22 to 27 cm between fields. The spacing (parallel to the pitch) between the cricketer's bat and pad may vary from zero to 60 cm or so.
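The inter-field spacing quoted above is simple arithmetic, sketched below using the speeds and field rate given in the text:

```python
# Distance the ball travels between successive fields of a
# high frame rate camera running at 150 fields per second
FIELD_RATE = 150.0  # fields per second

def cm_per_field(speed_m_s):
    """Centimetres the ball moves between consecutive fields."""
    return 100.0 * speed_m_s / FIELD_RATE

for speed in (34.0, 40.0):  # typical fast-bowler delivery speeds, m/s
    print(f"{speed:.0f} m/s -> {cm_per_field(speed):.1f} cm per field")
# -> 34 m/s gives about 22.7 cm; 40 m/s gives about 26.7 cm
```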
Inter-Frame Interpolation
After the impact of a ball on another object, its trajectory may be diverted between frames. The present invention can beneficially be used to interpolate the post-impact trajectory to an impact event position and time which occurred in between video images. According to an aspect of the present invention, the precise timings at each video frame, together with location estimates, are used to estimate trajectories between frames and to build up a more detailed account of the ball's movement during and after the impact event.
The camera integration time in these timeframes is likely to be such that some blurring of the fast-moving ball is evident, so reliance on adjacent frames for ball interpolation gives inferior results.
Blind Side Trajectory Estimation
The cameras available to the Hawk-Eye, or other such image-based trajectory tracking systems, may not be positioned so that they can always see the ball trajectory clearly. Even if it becomes impossible to see the ball as it approaches the batsman, the earlier trajectory gives sufficient information for the position on impact to be determined with accuracy.
Correcting for the propagation delay of the sound to a sensor
After correction, where necessary, for such meteorological factors as wind speed and temperature, the delay in event timing caused by the propagation time through the atmosphere to the, or each, microphone can be computed. This correction may involve the use of collateral data from meteorological sensors deployed around the pitch or at least in the vicinity of the equipment. The ability to measure time-of-arrival differences between audio or vibration signals generated by a single impact event and detected in a plurality of sensors, typically comprising up to four microphones normally installed in pairs at each wicket, enables a fairly accurate estimate of the position of the impact event to be made.
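The meteorological correction can be sketched as follows. The sketch uses the standard linear approximation for the speed of sound in air (331.3 + 0.606 T m/s) plus the wind component along the propagation path; the distance and temperature values are illustrative, not taken from the patent.

```python
def speed_of_sound(temp_c, wind_along_path=0.0):
    """Approximate speed of sound in air (m/s): the standard
    331.3 + 0.606*T linearisation, plus any wind component
    along the propagation path (positive = tailwind)."""
    return 331.3 + 0.606 * temp_c + wind_along_path

def propagation_delay(distance_m, temp_c, wind_along_path=0.0):
    """Time (s) for the impact sound to reach a microphone."""
    return distance_m / speed_of_sound(temp_c, wind_along_path)

# Sound from an impact 20 m from a microphone on a 20 C day
print(f"{propagation_delay(20.0, 20.0) * 1000:.1f} ms")
```

A few degrees of temperature error or a few m/s of unmodelled wind shifts this delay by fractions of a millisecond, which is why the text later argues for monitoring the conditions along the propagation path.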
Placing the event in a computer-generated environment
The position and time of an impact event can be illustrated in a computer-generated environment, for example an environment based on data from the trajectory tracking system. This application can be used for statistical analysis, for example of the performance of a particular sportsman, or for use with other information about the location of other objects at this point in time, for example the batsman, or the stumps in a game of cricket.
Line of Sight
The system does not rely upon having line-of-sight between the sensor and the impact event location, so it overcomes issues with obstruction and uses signal processing to calculate propagation times instead of using a human estimate, which is slow and prone to error.
An embodiment of the system for use in cricket matches is described below, with reference to Fig. 3, showing a system block diagram, and Fig. 4, showing an explanation of time alignment techniques.
The system comprises a number of microphones 1, or other sensors for detecting audio or vibration signals, located about a cricket pitch 18. One or more microphones 1 are located approximately 50mm behind each of the sets of stumps 20, aligned with the gaps between the stumps. The microphones should be amplitude- and phase-matched at the frequencies of interest such that each microphone records the components of the acoustic event with equal weighting in amplitude and time.
Alternatively, a calibration stage involving a controlled acoustic impulse from a known location can be used to account for any differences between the sensors 1.
The microphones 1 record and relay audio signals 4 back to a central processing computer 5. These audio signals 4 are collected 6 and are processed 8. The signals are relayed via the central control desk 26 of the broadcast system using the audio channels of the Super Slow-motion cameras 2 that are used to monitor the wicket 20. A master clock signal 3 is also provided from the control desk to the cameras 2 and the capture card 6.
Event Synchronisation
The signal processing performed in the computer 5 creates a cross-correlation function 8 between all of the microphone signals 4. This provides useful information about the time and amplitude similarity of signals recorded at each of the microphones.
When a signal of an impulsive nature is recorded, such as is typically generated by an impact event, the cross-correlation function 8 can be used to identify the time difference of arrival (TDOA) of the signals at each of the microphones 1. Sound travels at a finite speed (typically 340 m/s), so it takes a measurable time to cover a set distance. Taking the situation shown in Fig. 4 as an example, by identifying the difference in arrival time of the same sound at each microphone, the location of the event at location 22, down the wicket, can be identified. This may be simply achieved as the solution of a set of simultaneous equations. As the relative locations of the microphones are known, and the sound detected by each microphone must have originated at the same time and place, a locus 23 of possible impact locations may be plotted for each microphone.
Typically, that locus will be a sphere of radius y. The dimension y may be found as the solution of a set of simultaneous equations for spheres based on the respective other microphones, of radii y+a, y+b, y+c, where a, b and c are the distances travelled by sound during the respective time differences of arrival.
Since, at the time and place of the impact event, all of these loci must intersect, a point in time and space may be found where the loci for all microphones intersect, providing a calculated time and 2D (at least) location 22 for the impact event (Fig. 2). The difference in arrival times between the microphones at the same end of the wicket, in particular, can be used to give a bearing to the location of the sound, thus giving a two-dimensional region showing where the event is most likely to have happened. This calculated location 22 may then be used to calculate a sound delay between the impact location and a video camera, enabling the video signal to be accurately delayed so that the audio and video signals can be synchronized.
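The TDOA estimation by cross-correlation described above can be sketched on a synthetic impulsive signal. The sample rate and the short 'click' waveform below are assumed for illustration only; real microphone signals would of course be longer and noisier.

```python
def tdoa_by_cross_correlation(sig_a, sig_b, sample_rate):
    """Lag (seconds) of sig_b relative to sig_a at the peak of
    their cross-correlation; positive means b hears the event later."""
    n = len(sig_a)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-n + 1, n):
        # correlate a[i] against b[i + lag] over the overlapping samples
        val = sum(sig_a[i] * sig_b[i + lag]
                  for i in range(max(0, -lag), min(n, n - lag)))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag / sample_rate

# Synthetic impulsive 'click' arriving 5 samples later at microphone B
rate = 48000
click = [0.0, 1.0, 0.6, -0.4, 0.1]
a = click + [0.0] * 20
b = [0.0] * 5 + click + [0.0] * 15

print(f"{tdoa_by_cross_correlation(a, b, rate) * 1e6:.1f} microseconds")
```

At 48 kHz a 5-sample lag corresponds to roughly 104 microseconds, i.e. about 3.5 cm of extra path length at 340 m/s, which illustrates why high sample rates help the localisation.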
The system can be automatically triggered to flag up events to the user, based upon the calculated location of each event. By matching the calculated position of the impact event to the position of the bat, the wicket and the batsman, the system can provide an indication of whether the ball hit the bat, the batsman or the wicket.
Using One Microphone at Each End
If only coarse accuracy is required, an average propagation time can be used to give a fixed delay to the video signal, allowing approximate alignment between audio and video signals. The microphone to be used in any over would be switched in, on the basis of minimum time delay, so that it would normally be the one at the batsman's end of the pitch. In such an embodiment, a standard time delay would need to be assumed, and used to approximately align audio and video, and to deduce the position of the impact from the image-based trajectory tracking system. Such an embodiment does not require multiple microphones and hence is simpler to implement, but provides a less accurate alignment or synchronization, as it is not possible to accurately determine the position of the impact event from the audio signal alone. Modelling of this scenario has been performed to identify the accuracy of this approach. Fig. 5 shows that, by choosing an arbitrary location for the impacts, a statistical analysis can be performed to identify the error between the actual impact event location 31 and each calculated impact event location 33.
Fig. 4 shows the statistical results for 1000 calculated impact event locations, as a spread around a mean propagation time error and as a positional accuracy error.
Assuming a rather idealistic 1-in-200,000 false alarm rate, this simple system could estimate the propagation time to within 4.6 milliseconds (notwithstanding any other errors due to time synchronization). This corresponds to a positional accuracy error of about 1 metre. This may sound large, but consider that, by comparison, to align audio to a frame of video sampled at the rate of 75 frames per second, the error must be within half a frame, or ±6.67 milliseconds, as shown at 42 in Fig. 6.
Using Two Microphones at Each End
Modelling the situation where two microphones are located at each end, between each of the stumps, the calculation of estimated impact event position and timing is significantly improved, as illustrated in Figs. 5 and 6. Here, the estimated locations vary, based upon a time difference of arrival (TDOA) calculation. One hundred simulated events are shown as crosses 53, with their corresponding estimates as circles 55. A statistical analysis of the spread of these estimation errors is shown in Fig. 6.
The alignment error (propagation time accuracy) is now reduced to 1.3 ms, or to within 79cm accuracy.
The above results are given assuming no prior knowledge about the environmental conditions, but the simulated events assume a random temperature between 14 and 30 degrees Celsius, a random wind speed between 1 and 8 m/s and a random wind direction, from any angle. Given that the propagation time is dependent upon sound propagation speed and distance, to improve accuracy it may be necessary to monitor the wind speed and direction and the local temperature and humidity in the vicinity of the propagation path. Standard physical relationships can be used to calculate the speed of sound from this information. This could potentially improve the alignment accuracy for the multiple microphone case above from 1.3 ms to 0.4 ms, if the temperature, wind speed and wind direction could be estimated to within 1%.
A 3-dimensional array of microphones 1 or other sensors would improve this even further, allowing more accurate impact event location in the vertical direction.
Use of Timing Information
The calculated time and position of impact event information can be used in a variety of ways. Some of these ways are described below.
Broadcast Audio-Video Alignment
Referring again to Fig. 1, the video signals from the super slow-motion cameras 2 can be acquired by the computer 5 using high specification video capture cards 7. The super slow-motion cameras 2 capture data at a relatively high frame rate, hence multiple cards 7 are required to acquire and convert the data into normal broadcast rate (25 frames per second). Once the location of the impact event has been calculated according to the present invention, an acoustic propagation time adjustment based upon the speed of sound may be applied 9 to adjust the video signal such that it is aligned to the audio at the time the impact event occurred. This simulates what would be experienced if the viewer were located exactly at the point of impact.
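The propagation time adjustment at step 9 amounts to holding the video back by the sound's travel time, which can be sketched as follows (the distance, frame rate and sound speed below are illustrative assumptions, not values from the patent):

```python
def video_delay_frames(distance_m, frame_rate, speed_of_sound=340.0):
    """Number of whole video frames to hold back the video so it
    lines up with the later-arriving impact sound."""
    propagation_time = distance_m / speed_of_sound  # seconds
    return round(propagation_time * frame_rate)

# Impact 25 m from the effects microphone, 150 frames/s capture:
# the sound arrives ~74 ms late, i.e. about 11 high-rate frames
print(video_delay_frames(25.0, 150.0))
```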
Such a system can be arranged to perform time-alignment only for events that happened close to the wickets, defined as a certain range of TDOA, such that it can automatically align events that are of interest, i.e. those around the location of the batsman. The aligned audio and video 10 can then be broadcast so the viewers can judge the characteristics of the impact event.
Trajectory Calculation
A further application of the present invention involves the use of a computer-generated environment. We shall use Hawk-Eye as an example of one such system, although other such systems exist, and may be used according to this aspect of the invention. If the computer system generating the environment is aligned to a broadcast signal using a master clock signal 13 (Fig. 1) sent out by the control desk 26, the impact event time and position information 14 identified by the acoustic method of the present invention can be compared to a ball trajectory 15 predicted by the Hawk-Eye system 28. This allows the location of the point of impact to be identified on the trajectory (something that is not possible with systems such as Hawk-Eye). A timing accuracy of 0.4 ms is estimated to be feasible, as mentioned above, if the temperature, wind speed and wind direction can be measured or estimated to within 1%. With a 40 m/s ball delivery, this corresponds to a location uncertainty of around 6 cm and thus is compatible with the accuracy of the Hawk-Eye system itself. Slower balls are clearly easier to locate in terms of inter-frame position, and are also likely to be more accurately determined in speed and direction, since more samples of the video picture will be available with which to do the calculations.
A rendition of the ball-object impact may be superimposed on a single-frame video backdrop of the batsman at the crease and the surroundings. A simulated, partially computer-generated image is presented in this context as shown in Fig. 9. The ball's trajectory 72 before an impact 74 with the batsman's pad is provided by the
Hawk-Eye system. The ellipsoidal estimation area 74 for the impact event location is provided by the acoustic system of the present invention in conjunction with the Hawk-Eye system. The projection 76, 78 to the position of the ball 79 at the instant of the frame shown in Fig. 9 (almost in the wicket keeper's gloves) is calculated from the previously known data.
Computer Generated Impact Classification
A yet further application of the present invention also involves the use of a fully computer-generated environment.
Referring again to Fig. 1, by using a number of cameras 11 that have a known location, pan, tilt and zoom, such as the cameras provided for the Hawk-Eye system, a 3D representation of a batsman can be created. The cameras 11 take images of the batsman from a variety of angles. Image processing is then used to map the pixels of the individual images taken at the calculated time of the impact event from the variety of angles in such a way that a computer-generated solid representing the batsman can be placed accurately onto the wicket used in the Hawk-Eye system's trajectory representation.
By identifying where the batsman (with bat) and the ball were at the time of the detected impact, a 3D computer generated environment can be created, which can be manipulated and viewed from any angle to aid the viewer in deciding what caused the impact. The entire sequence of the ball approaching, hitting an object and being deflected by the object can then be played to audiences or an umpire, or for training purposes, with whatever additional annotations are required.
Audio Impact Classification
After the audio signals have been captured, and according to another aspect of the present invention, analysis 16 can be performed to identify what could have happened at the time of impact. Amongst the various techniques available to analyse the audio waveforms, Wavelet Transforms provide an appropriate means of identifying the time- and frequency-varying properties of audio impulses. If a template wavelet is chosen to suit the type of impulses to be analysed, a Wavelet transform can be produced for the time region surrounding an impact, and this can then be compared to a lookup table of Wavelet transforms corresponding to events of known origin.
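The lookup-table comparison can be sketched in a simplified form. Here a normalised-correlation match against stored template waveforms stands in for the comparison of Wavelet transforms described above, and the template and recorded waveforms are invented purely for illustration:

```python
import math

def norm_corr(x, y):
    """Normalised correlation between two equal-length waveforms."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return dot / (nx * ny)

def classify_impact(recorded, templates):
    """Return the stored impact type whose template waveform best
    matches the recorded impulse (a simplified stand-in for
    comparing transforms against a lookup table)."""
    return max(templates, key=lambda name: norm_corr(recorded, templates[name]))

# Hypothetical short templates: a sharp 'bat' click vs a dull 'pad' thud
templates = {
    "bat": [0.0, 1.0, -0.8, 0.5, -0.2, 0.05],
    "pad": [0.1, 0.4, 0.5, 0.4, 0.3, 0.15],
}
recorded = [0.0, 0.9, -0.7, 0.45, -0.15, 0.02]  # noisy bat-like impulse
print(classify_impact(recorded, templates))  # -> bat
```

In the full scheme the same best-match logic would operate on wavelet coefficients rather than raw samples, which gives the time-frequency discrimination the text describes.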
This provides an assessment of the similarity of the recorded impact to those stored in the database, which can be used to give judgement on the nature of the impact, for example on a display device 17.
Fourier analysis could also be used, but is more suited to continuous signals and is unlikely to be of benefit on its own.
If required, renditions of the transformed waveform can be made available on a transmitted picture, to allow viewers or an umpire to compare this with the set of patterns representing the various types of impact.
Virtual Acoustic World
In another application of the present invention, the impact location system can also be used to generate stereo or other multi-channel audio signals to produce a virtual acoustic environment. In a simple embodiment, by changing the level of the recorded audio signal of an impact event in either the left or the right channel of a multi-channel (e.g. stereo) broadcast audio signal, the viewer experiences the sound coming from the direction of the impact event, given that the position of the camera providing the transmitted video signal is the viewer's frame of reference.
This application may be implemented by a simple system that calculates the relative position of the ball to the viewing position of the camera, in either the real broadcast video or the 3D computer-generated world, and pans the audio in a stereo or other multi-channel audio signal accordingly.
Even though other systems exist for aligning broadcast audio to video, the present invention provides a system using a microphone array to achieve greater alignment accuracy than is otherwise possible. It also allows the alignment process to be considerably shortened and allows a direct feed-through into broadcast equipment for direct transmission, statistical analysis databases for training, and direct links into existing 3D computer generated environments.
The embodiments of the invention described above use a video tracking trajectory to provide the path of the ball. The audio detection provides an accurate timing for the event. By determining the exact location on the calculated trajectory that corresponds to the calculated time, the nature of the impact may be inferred from the surrounding artefacts, e.g. bat, stumps, batsman, at that time. These surrounding artefacts are preferably shown on a video image, which also includes an illustration of the calculated trajectory, and the calculated position of the ball at the time of the impact.
Referring to Fig. 10, further embodiments of the present invention permit the reverse operation to be carried out. That is, if the position of a potential impact is known, the system determines whether any impact actually took place in the relevant position at the relevant time, and preferably also performs some degree of evaluation on any detected sound, in an attempt to identify the objects involved in any detected impact.
For example, if the umpire, or wicket-keeper, believes that the ball 110 hit the bat 130, a system of the present invention may respond by determining time limits between which the ball was in a position where such an impact could have occurred.
The range of such positions will be referred to as the volume of interest 80. By comparing a ball trajectory 82 calculated by the video-based ball tracking system with the defined volume of interest 80, the time period corresponding to the ball's trajectory through the volume of interest may be determined. This time period will be referred to as the time period of interest. A recording of the sound picked up by the microphone(s) of the system during the time period of interest may be scrutinised in an attempt to detect any impact-indicating sounds, and thereby detect whether any impact in fact took place.
In a first step, the sound recording from the time period of interest may be re-applied to the impact-detecting algorithm described. However, it is likely that no impact would be detected using this algorithm, since no impact was detected as the ball originally travelled through the volume of interest. It may accordingly be necessary to apply a different, or amended, detection algorithm to the selected sound recording.
Various signal processing techniques, well known in themselves, may be applied to the selected sound recording in order to enhance any sounds of interest, such as sounds typical of impact events, while attenuating any interfering sounds, such as crowd noise, noise due to nearby road or rail traffic, noise from overhead aircraft and so on. It may not be possible to apply such extensive signal processing techniques to the received sound in real time. However, since the present aspect of the invention is concerned only with examining a relatively short time frame, after the event, much more time is available for processing and enhancement of the detected sounds than for the real-time processing described in earlier embodiments.
The volume of interest may be defined as a fixed volume. For example, for ease of processing, the volume of interest 80 may be defined as a cuboid, having a height approximately equivalent to the greatest likely height of a batsman, for example 2 metres; a width equal to the width of the crease plus one bat's length in either direction; and a depth equal to the maximum distance between the stumps and a batsman with outstretched arms holding a bat. Such a volume is likely to be no larger than a cube of side 2 metres. The maximum path length through such a volume would accordingly be 3.46 metres. Typically, the cricket ball 110 may be travelling at a speed of 31 metres per second, so the maximum time that the ball may remain within the volume of interest is 0.11 seconds. Since the time period of interest is so short, very complex analysis may be performed on this short section of audio signal, which would be impossible in real-time processing. In an important decision such as whether a batsman is out, in a game of cricket, a processing delay of 10 seconds could be acceptable. A great amount of data processing could be performed on 0.11 seconds' worth of audio signal in a processing time of 10 seconds.
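The 0.11-second figure follows directly from the cube geometry, as the sketch below shows (the 2 m cube and the 31 m/s ball speed are the values given in the text):

```python
import math

def time_window_of_interest(ball_speed_m_s, cube_side_m=2.0):
    """Worst-case time the ball spends inside a cubic volume of
    interest: the cube's space diagonal divided by the ball speed."""
    max_path = cube_side_m * math.sqrt(3.0)  # ~3.46 m for a 2 m cube
    return max_path / ball_speed_m_s

print(f"{time_window_of_interest(31.0):.2f} seconds")  # -> 0.11 seconds
```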
Supposing that the enhanced data processing applied to the selected section of audio signal reveals a potential impact event, the time of occurrence of that event may be determined, and a corresponding location on the ball trajectory 82 may be calculated. By comparing this location to video data representing the position of bat, ball, cricketer and stumps at that time, the objects involved in the impact event may be deduced.
Alternatively, or in addition, the sounds indicating the potential impact event may be compared with stored typical sounds made by expected types of impact, such as bat-ball, bat-ground, bat-cricketer, ball-wicket, ball-wicket-keeper, ball-ground, and so on. These sounds may beneficially be represented in other than the time domain, for example the wavelet domain, in which better discrimination between targets may be possible.
The enhanced analysis offered by such embodiments of the present invention assists in the resolution of certain disputes. For example, the fielding team may believe that the ball hit the bat, while the umpire may not be sure. By examining the sound recording of the short time period when the ball 110 was in the volume of interest 80, being the only time when the ball could have hit the bat, enhanced sound analysis may be performed, to provide a more definitive answer as to whether an impact occurred in the volume of interest, and the nature of that impact.
While the present invention has been principally described as using microphones installed on or near the ground in the vicinity of the wickets, it may be advantageous to attach microphones to, or embed microphones in, certain articles of interest. For example, wireless microphones could be installed in or on one or more of: the bat, the ball, the stumps, the cricketer's pads. Such microphones would be particularly sensitive to impact events involving the associated object. The relative signal strengths returned by each of such microphones would, in themselves, provide an indication of which objects were involved in an impact event.
The preceding example uses a very simple model of the volume of interest, defining a simple cuboid. A more accurate volume of interest may be defined. For example, the volume of interest in respect of the cricketer may be more accurately modelled as a vertical cylinder, having a radius corresponding to the maximum reach of the cricketer and bat, and a height corresponding to the cricketer's shoulder height.
The stumps may be included within this volume of interest, or a separate volume of interest may be defined for the stumps. Video data may be used to calculate the position of the cricketer at any particular moment. The corresponding volume of interest may then be redefined, centred on the actual position of the cricketer at the time of interest. It may be advantageous to calculate a volume of interest that follows the contours of the cricketer and the bat, for example defining a volume of 0.5m around the batsman and bat. However, this may not be required. The simple cuboid approach is able to reduce the sound signal of interest to about 0.11 seconds in length. This may well be sufficient for most applications.
The above description has been given in the context of the Hawk-Eye system, but it will be clear to those skilled in the art that the present invention could be used with any system intended to carry out similar functions.
Additional microphones could be added to the system at more remote locations, but their positions would have to be chosen so that they did not interfere with the game. This would improve the ability of the system to locate the impact event, and possibly increase the accuracy of the location and timing calculations. It is also possible to mount the microphones within the stumps. This would provide elevation and allow the generation of a 3D region of possibility.
Claims (19)
- CLAIMS: 1. A method for determining the occurrence and location of an event, comprising: - providing a video-based position measuring system capable of tracking the path of an object; - providing at least one microphone in the expected vicinity of the event; and - detecting an acoustic and/or vibration excitation generated by the event, and generating a signal representative thereof; characterized in that the method further comprises the steps of: - determining a time delay between the event and the detection of the sound, and accordingly calculating a time of occurrence of the event; - determining the location on the path of the object, corresponding to the time of occurrence of the event, as determined by the video-based trajectory tracking system; and - adopting that location as the location of the event.
- 2. A method for determining the occurrence and location of an event, comprising: - providing a video-based position measuring system capable of tracking the path of an object; - providing at least one microphone in the expected vicinity of the event; - calculating a time period of interest; - detecting an acoustic and/or vibration excitation generated by an event, by examining signal(s) representative thereof provided by the at least one microphone in the time period of interest, in search of a sound indicating that an event has occurred; and - calculating the time and location of a detected sound; characterized in that the method comprises the steps of: - defining a volume of interest within which any events of interest would have occurred; - identifying a part of the tracked path of the object which passes through the volume of interest; - calculating the time period of interest, within which the object was within the volume of interest; - examining only that part of the signal(s) provided by the microphone(s) which correspond(s) to the time period of interest, to detect any event, and accordingly calculating a time of occurrence of the event; - determining the location on the tracked path of the object, as determined by the video-based position measuring system, which corresponds to the time of occurrence of the event; and - adopting that time and location as the time and location of the event.
- 3. A method according to claim 2, further comprising the step of determining a time delay between the event and the detection of the sound, and compensating for this delay in the calculation of the time of occurrence of the event.
- 4. A method according to any preceding claim, wherein the video-based position measuring system is capable of tracking respective paths of two or more objects, and correlating the movements in space of the objects.
- 5. A method according to any preceding claim, wherein the event comprises at least one of the following combinations of objects participating in an impact: - an object whose path is being tracked with at least one object whose path is not tracked; - two or more objects whose paths are not tracked.
- 6. A method according to any preceding claim, wherein the step of determining the time delay comprises the sub-steps of: - providing acoustic and/or vibration sensors in known, mutually displaced locations within an environment of interest; - receiving, at each of the sensors, any corresponding acoustic or vibration excitation caused by an event; - generating, in each of the sensors, a signal representing the corresponding excitation; and - correlating the signals, thereby to calculate an estimated time and position of the event with respect to the location of the sensors.
- 7. A method according to any preceding claim, used in conjunction with a video-based trajectory tracking system to provide a more accurate position and time estimation of the event.
- 8. A method according to any preceding claim wherein a video signal showing the event is manipulated according to the calculated time and position of the event, to enable corresponding video and audio signals representing the event to be synchronized in time.
- 9. A method according to any preceding claim wherein the signals are evaluated to differentiate between a predetermined set of different types of events expected to occur, one of the means of evaluation being the use of the Wavelet Transform.
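As an illustration of the kind of feature a wavelet evaluation yields, here is a single-level Haar decomposition, far simpler than anything a production system would use: the fraction of the signal's energy carried by the detail (high-frequency) coefficients tends to separate sharp, click-like impacts (bat on ball) from duller, slower ones (ball on pad).

```python
import math

def haar_detail_energy(signal):
    """Single-level Haar wavelet decomposition; returns the fraction of the
    signal's energy in the detail (high-frequency) coefficients."""
    pairs = list(zip(signal[::2], signal[1::2]))
    approx = [(a + b) / math.sqrt(2) for a, b in pairs]   # low-pass half
    detail = [(a - b) / math.sqrt(2) for a, b in pairs]   # high-pass half
    total = sum(c * c for c in approx) + sum(c * c for c in detail)
    return sum(c * c for c in detail) / total if total else 0.0
```

A smooth waveform scores near 0, a rapidly alternating one near 1; a classifier comparing such features against thresholds learned from known event types is one plausible realisation of the discrimination step.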
- 10. A method according to any preceding claim comprising comparing respective waveforms of the signals from at least two sensors using time-difference-of-arrival techniques, thereby calculating a spatial location of the event.
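A toy version of the time-difference-of-arrival idea, under the simplifying assumption of two sensors on a line with the event between them (the brute-force correlator and all numbers are illustrative; a real system would use an FFT-based correlator and more sensors):

```python
def estimate_delay(a, b, fs):
    """Lag of signal b relative to signal a, in seconds, by brute-force
    cross-correlation over all integer sample lags."""
    n = len(a)
    def corr(k):
        return sum(a[i] * b[i + k] for i in range(max(0, -k), min(n, n - k)))
    return max(range(-(n - 1), n), key=corr) / fs

def tdoa_position_1d(sensor_gap_m, dt, c=343.0):
    """Event position between sensors at x=0 and x=sensor_gap_m, where dt is
    the arrival time at the x=0 sensor minus that at the other sensor."""
    # (x - 0) - (gap - x) = c * dt  =>  x = (gap + c * dt) / 2
    return (sensor_gap_m + c * dt) / 2.0
```

With sensors 20 m apart and the sound arriving 6/343 s later at the near sensor than the far one, the event is placed 13 m from the first sensor; in two or three dimensions the same arrival-time differences define hyperbolae whose intersection is solved numerically.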
- 11. A method according to any preceding claim wherein the sensors are arranged in a two or three-dimensional array, and the location of the event is estimated in three dimensions.
- 12. A method according to any preceding claim further comprising the use of a video trajectory tracking system, wherein spatial interpolation is carried out between frames of the video signal respectively representing time before and after the calculated time of the event, thereby providing improved diagnostics as to where and what the event was.
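The point of this claim is that an acoustically determined event time is far finer than the video frame interval (40 ms at 25 fps), so the position at the event instant can be interpolated between the two frames that bracket it. A sketch, with invented frame rate and positions:

```python
import math

def bracketing_frames(t_event, fps):
    """Indices of the frames immediately before and after the event time."""
    return math.floor(t_event * fps), math.ceil(t_event * fps)

def interpolate(pos_before, pos_after, frame_before, frame_after, t_event, fps):
    """Linear interpolation of position between the two bracketing frames."""
    if frame_before == frame_after:  # event fell exactly on a frame
        return pos_before
    a = (t_event * fps - frame_before) / (frame_after - frame_before)
    return tuple(p0 + a * (p1 - p0) for p0, p1 in zip(pos_before, pos_after))
```

An event at t = 0.13 s in 25 fps video falls a quarter of the way between frames 3 and 4, so the reported impact point lies a quarter of the way along the ball's motion between those frames.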
- 13. A method according to any preceding claim wherein the event is an impact event.
- 14. A method according to any preceding claim, for use in detecting the impact of one or more objects occurring in the game of cricket, wherein the sensors comprise two sensors placed behind each wicket.
- 15. A method according to any preceding claim, for use in detecting the impact of one or more objects occurring in the game of cricket, wherein at least one of the sensors is placed within or attached to one of the stumps.
- 16. A method according to any preceding claim, further comprising the steps of measuring meteorological factors concerning the vicinity of the event, and adjusting the calculated time and location of the event to compensate for the effects of such meteorological factors.
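The dominant meteorological factor in the delay calculation is air temperature, acting through the speed of sound; a standard approximation for dry air is c = 331.3 · sqrt(1 + T/273.15) m/s (humidity and wind would need further, smaller corrections). A sketch of the compensation this claim describes:

```python
import math

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air (m/s) at temperature temp_c."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)

def corrected_event_time(detection_time, distance_m, temp_c):
    """Event time with the propagation delay evaluated at the measured
    temperature rather than at a fixed nominal speed of sound."""
    return detection_time - distance_m / speed_of_sound(temp_c)
```

Between 0 °C and 30 °C the speed of sound varies by roughly 18 m/s, which over a 20 m microphone-to-event distance shifts the computed event time by about 3 ms, enough to matter when interpolating within a 40 ms frame interval.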
- 17. A method of generating a computer generated environment comprising the step of incorporating a representation of the event at time and location representations corresponding to the time and location of the event as determined according to any preceding claim.
- 18. A method of generating a virtual acoustic environment comprising the step of incorporating a representation of the event at time and location representations corresponding to the time and location of the event as determined according to any preceding claim, thereby enabling multiple channel audio playback providing realistic sound localisation by suitable adjustment of amplitude, delay and characteristics of the sound indicating the event in one or more of the audio channels.
- 19. Apparatus adapted and arranged to perform the method according to any preceding claim.
- 20. A method substantially as described and/or as illustrated in the accompanying drawings.

Amendments to the claims have been filed as follows:

1. A method for determining the occurrence and location of an event, comprising: - providing a video-based position measuring system capable of tracking the path of an object; - providing at least one microphone in the expected vicinity of the event; and - detecting an acoustic and/or vibration excitation generated by the event, and generating a signal representative thereof; - determining a time delay between the event and the detection of the sound, and accordingly calculating a time of occurrence of the event; - determining the location on the path of the object, corresponding to the time of occurrence of the event, as determined by the video-based trajectory tracking system; and - adopting that location as the location of the event, characterized in that the step of determining the time delay comprises the sub-steps of: - providing acoustic and/or vibration sensors in known, mutually displaced locations within an environment of interest; - receiving, at each of the sensors, any corresponding acoustic or vibration excitation caused by an event; - generating, in each of the sensors, a signal representing the corresponding excitation; and - correlating the signals, thereby to calculate an estimated time and position of the event with respect to the location of the sensors.

2. A method for determining the occurrence and location of an event, comprising: - providing a video-based position measuring system capable of tracking the path of an object; - providing at least one microphone in the expected vicinity of the event; - defining a volume of interest within which any events of interest would have occurred; - identifying a part of the tracked path of the object which passes through the volume of interest; - calculating the time period of interest, within which the object was within the volume of interest; - detecting an acoustic and/or vibration excitation generated by an event, by examining signal(s) representative thereof provided by at least one microphone in the time period of interest, in search of a sound indicating that an event has occurred; - examining only that part of the signal(s) provided by the microphone(s) which correspond(s) to the time period of interest, to detect any event, and accordingly calculating a time of occurrence of the event; - determining the location on the tracked path of the object, as determined by the video-based position measuring system, which corresponds to the time of occurrence of the event; and - adopting that time and location as the time and location of the event, characterized in that the step of calculating a time of occurrence of the event comprises the sub-steps of: - providing acoustic and/or vibration sensors in known, mutually displaced locations within an environment of interest; - receiving, at each of the sensors, any corresponding acoustic or vibration excitation caused by an event; - generating, in each of the sensors, a signal representing the corresponding excitation; and - correlating the signals, thereby to calculate an estimated time and position of the event with respect to the location of the sensors.

3. A method according to claim 2 further comprising the step of determining a time delay between the event and the detection of the sound, and compensating for this delay in the calculation of the time of occurrence of the event.

4. A method according to any preceding claim, wherein the video-based position measuring system is capable of tracking respective paths of two or more objects, and correlating the movements in space of the objects.

5. A method according to any preceding claim, wherein the event comprises at least one of the following combinations of objects participating in an impact: - an object whose path is being tracked with at least one object whose path is not tracked; - two or more objects whose paths are not tracked.

6. A method according to any preceding claim, used in conjunction with a video-based trajectory tracking system to provide a more accurate position and time estimation of the event.

7. A method according to any preceding claim wherein a video signal showing the event is manipulated according to the calculated time and position of the event, to enable corresponding video and audio signals representing the event to be synchronised in time.

8. A method according to any preceding claim wherein the signals are evaluated to differentiate between a predetermined set of different types of events expected to occur, one of the means of evaluation being the use of the Wavelet Transform.

9. A method according to any preceding claim comprising comparing respective waveforms of the signals from at least two sensors using time-difference-of-arrival techniques, thereby calculating a spatial location of the event.

10. A method according to any preceding claim wherein the sensors are arranged in a two or three-dimensional array, and the location of the event is estimated in three dimensions.

11. A method according to any preceding claim further comprising the use of a video trajectory tracking system, wherein spatial interpolation is carried out between frames of the video signal respectively representing time before and after the calculated time of the event, thereby providing improved diagnostics as to where and what the event was.

12. A method according to any preceding claim wherein the event is an impact event.

13. A method according to any preceding claim, for use in detecting the impact of one or more objects occurring in the game of cricket, wherein the sensors comprise two sensors placed behind each wicket.

14. A method according to any preceding claim, for use in detecting the impact of one or more objects occurring in the game of cricket, wherein at least one of the sensors is placed within or attached to one of the stumps.

15. A method according to any preceding claim, further comprising the steps of measuring meteorological factors concerning the vicinity of the event, and adjusting the calculated time and location of the event to compensate for the effects of such meteorological factors.

16. A method of generating a computer generated environment comprising the step of incorporating a representation of the event at time and location representations corresponding to the time and location of the event as determined according to any preceding claim.

17. A method of generating a virtual acoustic environment comprising the step of incorporating a representation of the event at time and location representations corresponding to the time and location of the event as determined according to any preceding claim, thereby enabling multiple channel audio playback providing realistic sound localisation by suitable adjustment of amplitude, delay and characteristics of the sound indicating the event in one or more of the audio channels.

18. A method substantially as described and/or as illustrated in the accompanying drawings.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0315007A GB0315007D0 (en) | 2003-06-27 | 2003-06-27 | An acoustic event synchronisation and characterisation system for sports |
Publications (3)
Publication Number | Publication Date |
---|---|
GB0319065D0 GB0319065D0 (en) | 2003-09-17 |
GB2403362A true GB2403362A (en) | 2004-12-29 |
GB2403362B GB2403362B (en) | 2005-05-11 |
Family
ID=27637458
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0315007A Ceased GB0315007D0 (en) | 2003-06-27 | 2003-06-27 | An acoustic event synchronisation and characterisation system for sports |
GB0319065A Expired - Fee Related GB2403362B (en) | 2003-06-27 | 2003-08-14 | An acoustic event synchronisation and characterisation system for sports |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0315007A Ceased GB0315007D0 (en) | 2003-06-27 | 2003-06-27 | An acoustic event synchronisation and characterisation system for sports |
Country Status (1)
Country | Link |
---|---|
GB (2) | GB0315007D0 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2457674A (en) * | 2008-02-19 | 2009-08-26 | Allan Plaskett | Determining the time and location of an event, for example whether a ball hit a bat in cricket |
EP2455911A1 (en) * | 2010-11-23 | 2012-05-23 | Vicomtech-Visual Interaction and Communication Technologies Center | Method for detecting the point of impact of a ball in sports events |
CN103926562A (en) * | 2014-04-04 | 2014-07-16 | 中国计量学院 | Tennis tactic analysis instrument based on sound localization principle |
WO2015013752A1 (en) * | 2013-08-01 | 2015-02-05 | Brennan Broadcast Group Pty Ltd | Synchronisation of video and audio capture |
WO2016051021A1 (en) * | 2014-10-03 | 2016-04-07 | Zenniz Oy | Determination of at least one parameter relating to a trajectory of an object |
CN105759246A (en) * | 2016-04-14 | 2016-07-13 | 中国计量学院 | Precision-adjustable tennis hitting point positioning device with self-calibration function |
GB2567800A (en) * | 2017-08-29 | 2019-05-01 | Social Entertainment Ventures Ltd | Detecting ball strike position |
EP3637042A1 (en) * | 2018-10-12 | 2020-04-15 | Swiss Timing Ltd. | Video acoustical method and system for determining an impact point of a thrown body on a landing area |
GB2606539A (en) * | 2021-05-12 | 2022-11-16 | Sony Group Corp | Apparatus, method and computer program product for generating location information of an object in a scene |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2009208401A1 (en) * | 2008-01-31 | 2009-08-06 | Susan G. Forrester | A cricket bat and ball contact detection system and indicator |
US8447559B2 (en) * | 2009-02-03 | 2013-05-21 | R0R3 Devices, Inc. | Systems and methods for an impact location and amplitude sensor |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5489099A (en) * | 1992-10-30 | 1996-02-06 | Accu-Sport International, Inc. | Apparatus and method for tracking the flight of a golf ball |
US5521634A (en) * | 1994-06-17 | 1996-05-28 | Harris Corporation | Automatic detection and prioritized image transmission system and method |
US5768151A (en) * | 1995-02-14 | 1998-06-16 | Sports Simulation, Inc. | System for determining the trajectory of an object in a sports simulator |
US5953056A (en) * | 1996-12-20 | 1999-09-14 | Whack & Track, Inc. | System and method for enhancing display of a sporting event |
GB2358755A (en) * | 1998-08-12 | 2001-08-01 | Allan Plaskett | Method of and system for analysing events |
WO2001082626A1 (en) * | 2000-04-13 | 2001-11-01 | Koninklijke Philips Electronics N.V. | Method and apparatus for tracking moving objects using combined video and audio information in video conferencing and other applications |
-
2003
- 2003-06-27 GB GB0315007A patent/GB0315007D0/en not_active Ceased
- 2003-08-14 GB GB0319065A patent/GB2403362B/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5489099A (en) * | 1992-10-30 | 1996-02-06 | Accu-Sport International, Inc. | Apparatus and method for tracking the flight of a golf ball |
US5521634A (en) * | 1994-06-17 | 1996-05-28 | Harris Corporation | Automatic detection and prioritized image transmission system and method |
US5768151A (en) * | 1995-02-14 | 1998-06-16 | Sports Simulation, Inc. | System for determining the trajectory of an object in a sports simulator |
US5953056A (en) * | 1996-12-20 | 1999-09-14 | Whack & Track, Inc. | System and method for enhancing display of a sporting event |
GB2358755A (en) * | 1998-08-12 | 2001-08-01 | Allan Plaskett | Method of and system for analysing events |
WO2001082626A1 (en) * | 2000-04-13 | 2001-11-01 | Koninklijke Philips Electronics N.V. | Method and apparatus for tracking moving objects using combined video and audio information in video conferencing and other applications |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2457674A (en) * | 2008-02-19 | 2009-08-26 | Allan Plaskett | Determining the time and location of an event, for example whether a ball hit a bat in cricket |
EP2455911A1 (en) * | 2010-11-23 | 2012-05-23 | Vicomtech-Visual Interaction and Communication Technologies Center | Method for detecting the point of impact of a ball in sports events |
GB2532154A (en) * | 2013-08-01 | 2016-05-11 | Brennan Broadcast Group Pty Ltd | Synchronisation of video and audio capture |
WO2015013752A1 (en) * | 2013-08-01 | 2015-02-05 | Brennan Broadcast Group Pty Ltd | Synchronisation of video and audio capture |
AU2014295901B2 (en) * | 2013-08-01 | 2017-08-31 | Brennan Broadcast Group Pty Ltd | Synchronisation of video and audio capture |
CN103926562B (en) * | 2014-04-04 | 2016-04-27 | 中国计量学院 | Based on the tennis tactical analysis instrument of sound localization principle |
CN103926562A (en) * | 2014-04-04 | 2014-07-16 | 中国计量学院 | Tennis tactic analysis instrument based on sound localization principle |
WO2016051021A1 (en) * | 2014-10-03 | 2016-04-07 | Zenniz Oy | Determination of at least one parameter relating to a trajectory of an object |
CN105759246A (en) * | 2016-04-14 | 2016-07-13 | 中国计量学院 | Precision-adjustable tennis hitting point positioning device with self-calibration function |
GB2567800A (en) * | 2017-08-29 | 2019-05-01 | Social Entertainment Ventures Ltd | Detecting ball strike position |
EP3637042A1 (en) * | 2018-10-12 | 2020-04-15 | Swiss Timing Ltd. | Video acoustical method and system for determining an impact point of a thrown body on a landing area |
US20200114202A1 (en) * | 2018-10-12 | 2020-04-16 | Swiss Timing Ltd | Video acoustical method and system for determining an impact point of a thrown body on a landing area |
US11628335B2 (en) | 2018-10-12 | 2023-04-18 | Swiss Timing Ltd | Video acoustical method and system for determining an impact point of a thrown body on a landing area |
GB2606539A (en) * | 2021-05-12 | 2022-11-16 | Sony Group Corp | Apparatus, method and computer program product for generating location information of an object in a scene |
EP4089636A1 (en) * | 2021-05-12 | 2022-11-16 | Sony Group Corporation | Apparatus, method and computer program product for generating location information of an object in a scene |
Also Published As
Publication number | Publication date |
---|---|
GB2403362B (en) | 2005-05-11 |
GB0319065D0 (en) | 2003-09-17 |
GB0315007D0 (en) | 2003-07-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11305174B2 (en) | Automated or assisted umpiring of baseball game using computer vision | |
US7030905B2 (en) | Real-time method and apparatus for tracking a moving object experiencing a change in direction | |
JP5442261B2 (en) | Automatic event detection method and system in sports stadium | |
US6304665B1 (en) | System for determining the end of a path for a moving object | |
AU2021202451A1 (en) | Methods circuits devices systems and associated computer executable code for multi factor image feature registration and tracking | |
US8335345B2 (en) | Tracking an object with multiple asynchronous cameras | |
US8600116B2 (en) | Video speed detection system | |
US5768151A (en) | System for determining the trajectory of an object in a sports simulator | |
US8189857B2 (en) | Methods and processes for detecting a mark on a playing surface and for tracking an object | |
CN109522854A (en) | A kind of pedestrian traffic statistical method based on deep learning and multiple target tracking | |
GB2403362A (en) | Calculating the location of an impact event using acoustic and video based data | |
JPH11508099A (en) | Scene Motion Tracking Method for Raw Video Insertion System | |
JP2004500756A (en) | Coordination and composition of video sequences with space-time normalization | |
US10751569B2 (en) | System and method for 3D optical tracking of multiple in-flight golf balls | |
CN106446002A (en) | Moving target-based video retrieval method for track in map | |
CN110298864B (en) | Visual sensing method and device for golf push rod equipment | |
KR20190063153A (en) | System and method for simultaneous reconsttuction of initial 3d trajectory and velocity using single camera images | |
CN111970434A (en) | Multi-camera multi-target athlete tracking shooting video generation system and method | |
Gyemi et al. | Three-dimensional video analysis of helmet-to-ground impacts in North American youth football | |
KR20040041297A (en) | Method for tracing and displaying multiple object movement using multiple camera videos | |
KR101703316B1 (en) | Method and apparatus for measuring velocity based on image | |
Seidl et al. | Evaluating the indoor football tracking accuracy of a radio-based real-time locating system | |
Lee et al. | Moving object performance analysis system using multi-camera video and position sensors | |
JP2021184540A (en) | Motion capture camera system and video data acquisition method using the same | |
CN117495899B (en) | Method, device, equipment and chip for detecting motion trail and round start |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20140814 |