AU2014295901A1 - Synchronisation of video and audio capture - Google Patents
- Publication number
- AU2014295901A1
- Authority
- AU
- Australia
- Prior art keywords
- sound
- video
- test
- test sound
- time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43072—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42203—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4341—Demultiplexing of audio and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4344—Remultiplexing of multiplex streams, e.g. by modifying time stamps or remapping the packet identifiers
Abstract
A method/apparatus for detecting a contact of interest occurring in the performance of an event or activity, particularly a sporting or entertainment event or activity, the method including the steps of capturing and storing sequences of video images of the event for selective later replay by a visual display, detecting and recording sounds propagating in the vicinity of the target area for selective later manifestation of sound information such as an audible replay of the sounds, and processing the video image data and/or the sound data by a processing system so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times. For calibration, a test sound is generated while video is captured and stored, and the video replay is analysed to determine the correct time shift for synchronisation.
Description
WO 2015/013752 PCT/AU2014/000772

SYNCHRONISATION OF VIDEO AND AUDIO CAPTURE

Field of Invention

The present invention relates to methods and apparatus for enabling determination of parameters of a contact of interest occurring in the performance of events or activities, particularly sporting or entertainment events or activities.

Background

In the game of cricket, determining whether the ball has contacted the bat can be essential to ensure correct umpiring decisions are made. For example, the batsman is dismissed or "out" under the rules of cricket if the ball that is bowled towards the batsman touches the bat and then continues its travel to be caught (without having bounced after touching the bat) by the wicket keeper standing behind the stumps. It can be difficult for an umpire to decide if the ball did touch the bat, particularly if there is little or no discernible deviation of the path of the ball. Umpires however frequently can hear the sound of the ball contacting the bat, so that, in making a decision, the umpire takes into account visual cues (including the position and actions of the batsman as the ball reaches and passes), the trajectory of the ball before and after reaching the batsman and passing the bat, and also the sounds detected.

In order to enhance the information and entertainment experience for viewers of a televised cricket match, broadcasters have not only utilised television cameras capturing video images of the playing of the cricket match. Television cameras are placed behind the stumps at both ends of the pitch so as to capture images of the flight of the ball that is bowled at the batsman, hence providing a similar view of the action as the umpire standing behind the stumps at the bowler's end of the pitch.
In addition, broadcasters have placed a microphone at the base of the middle stump so as to detect and record sounds during the play of a cricket match, particularly from the area immediately in front of the stumps where the batsman is located and including regions where the bat used by the batsman strikes the ball. The television images captured can be stored for later replay in conjunction with manifestations of the sounds recorded during the relevant sequences of replayed video images. The captured and replayed sounds, or replayed sound information such as a graphical trace of sound amplitude detected, can frequently verify that the ball has contacted the bat from the synchronicity of the detected sound with the bat being moved or located in close proximity to the ball in the captured images. This capture and presentation of both video and sound information can enhance the experience and enjoyment for viewers, and can enable commentators to add further information or remarks to their commentary which is broadcast.

Sometimes the officials administering the cricket match have an agreement with the television broadcasters to access their captured video and sound information. Thus when umpires officiating on the field of play are called upon to make a decision whether the ball has contacted the bat, or perhaps the batsman's glove (which is equivalent to contacting the bat under the rules of cricket), and the officiating umpire is uncertain, the television and audio recordings can be replayed, enabling a further umpire off the field of play to view a replay of the images and the synchronised sound information captured and determine if the relevant ball-bat contact occurred.
In patent specification WO00/10333 in the name of one of the present inventors, Allan Plaskett, there is described and illustrated such a system of capturing video images and simultaneously recording sounds in proximity to the microphone located at the cricket stumps. This system has been implemented and is known in the cricket world as the "Snickometer" (TM). As mentioned in passing in the patent specification relating to the Snickometer, the time delay between the video signal and the audio signal (caused, for example, by the time taken for the sound to reach the microphone) is compensated. However, no further detail is provided in that patent specification of how that compensation factor is determined and applied. Although not public knowledge, the compensation in the Snickometer system utilised in the past in television broadcasts has been determined by synchronisation checks performed during live play when the ball strikes the bat at midriff height over the popping crease. The operator relatively time shifts the video and audio outputs for the observed and recorded contact of the ball with the bat until they are in synchronisation in the display. That time shift is then applied to future video and audio capturing during continued play of the cricket game.

To understand this problem of synchronising the recorded sound with the captured video images, the speed of sound in air and its influence on the data capture needs to be considered. The speed of sound in air is approximately 343m per second (but this can vary depending on atmospheric conditions, including air temperature and humidity). The standard television frame capture rate for most purposes has been 25 frames per second. For slow motion replays of events or activities in which there is considerable movement, the frame capture rate is 75 frames per second.
Thus when a sequence of video frames captured at 75fps is replayed at the standard screen refresh rate of 25fps, the motion is "slowed" to one third of the actual speed. A cricket ball can be bowled by some fast bowlers at speeds significantly in excess of 100km per hour, although for exemplifying the relevant factors in the following analysis, a cricket ball travelling at 100kph will be used.

For video capture at 25fps, a ball travelling at 100kph travels about 1.1m in the interval between successive image frames (in 0.04 seconds). Hence between successive image frames at standard image capture rates of 25fps, the ball can have moved 1.1m. In the same time interval of 0.04 seconds, the sound, e.g. generated by contact of cricket ball with the bat, travels about 13.7m. Hence if the ball contacted the bat 2m in front of the stumps where the microphone is located, the sound will reach the microphone within about 15% of the time interval between successive video frames. Hence a time shift to adjust to this time delay is only about 15% of the interval between successive frames. However, because of the considerable distance that the ball travels between successive frames, it is always difficult or even impossible to deduce with any certainty whether the instant that the ball is in closest proximity to the bat is indeed synchronous with a detected sound. As a result the Snickometer system is of little or no practical use at standard frame capture rates of 25fps to verify a ball-bat contact with considerable certainty.

At the standard slow motion frame capture rate of 75fps, a cricket ball travelling at 100kph will travel 0.37m or 37cm between successive captured image frames. In the time interval between successive frames captured at 75fps, i.e. in 0.0133 seconds, sound travels about 4.6 metres.
Hence for a ball-bat contact occurring 2m in front of the microphone, the sound will take about 40% of the time interval between image capture frames before reaching the microphone. This can produce a very noticeable break in synchronicity between the image captured and the sound recorded if there is no relative adjustment. In particular, without any time compensation a ball will appear at a considerable distance from the bat when the sound is detected.

Advances in camera technology have now enabled a "super slow motion" video camera to be available. This "super slomo" camera captures frames at 150fps. At such frame capture speeds, a cricket ball travelling at 100kph will travel about 18.5cm between successive frames (i.e. in 0.0067 seconds), and sound travels about 2.3m. This means that if a sound from a ball-bat contact occurs 2.3m in front of the stumps where the microphone is located, the sound will be detected one complete video frame interval later than a frame captured at the instant of ball-bat contact. Of course, the ball-bat contact can occur anywhere in a time continuum and indeed will normally be in the interval between successive frames; however it is clear from this example that the time delay for the sound from a bat-ball impact to reach the sound detector microphone creates a necessity for compensation to be accurately applied so that the greatly enhanced accuracy of contact determinations that is theoretically possible with "super slomo" image capture is achievable.

In this discussion, it has been assumed that the microphone or other sound detector is effectively instantaneous in detecting and signalling when the sound travelling from a ball-bat contact reaches the sound detector, but if there is any response time delay, that will be a fixed characteristic of the instrument and can be adjusted using the present calibration system.

Also in this description of background, only the game of cricket has been discussed.
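The frame-interval arithmetic above can be checked with a short calculation. This is an illustrative sketch only (the constants and names are not from the patent), assuming sound at 343 m/s and a ball at 100 kph:

```python
# Illustrative arithmetic reproducing the worked figures in the text:
# per-frame ball travel and sound travel at each frame capture rate.
SPEED_OF_SOUND = 343.0              # m/s, approximate; varies with conditions
BALL_SPEED = 100.0 / 3.6            # 100 kph in m/s (~27.8 m/s)

def per_frame_distances(fps):
    """Return (frame interval s, ball travel m, sound travel m) per frame."""
    dt = 1.0 / fps
    return dt, BALL_SPEED * dt, SPEED_OF_SOUND * dt

for fps in (25, 75, 150):
    dt, ball_m, sound_m = per_frame_distances(fps)
    print(f"{fps:>3} fps: interval {dt:.4f}s, ball {ball_m:.2f}m, sound {sound_m:.2f}m")

# Delay for a contact 2 m in front of the microphone, as a fraction of one
# frame interval at 75 fps (the text's "about 40%").
delay = 2.0 / SPEED_OF_SOUND        # ~0.0058 s
fraction = delay / (1.0 / 75)       # ~0.44 of a frame interval
```

At 25 fps the same 2 m delay is only about 15% of a frame interval, matching the earlier figure in the text.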
Although the present invention has been developed particularly for cricket, there are other sports and activities where the method and apparatus of the invention can be applicable. For example, in baseball determining whether a ball-bat contact has occurred can be important for ruling whether a "foul tip" has occurred. In Australian Rules football, determining whether the football has touched the goal post can decide the score awarded. In tennis, determining whether the ball when served has touched the tape across the top of the net is important to decide if the serve is valid or if the serve needs to be replayed. The present invention can be used in such other circumstances. For convenience however, the description herein will continue to refer to cricket.

Objects of the Invention

It is an object of the present invention to provide a method and apparatus for enabling determination of parameters of a contact occurring in the performance of an event or activity, particularly a sporting or entertainment event or activity, involving capturing video images of the event or activity for later replay with an accurately time shifted manifestation of detected sounds generated during the event or activity. An object is also to provide a related data set recording the video and sound information enabling later replay with an accurately time shifted manifestation of detected sounds generated during the event or activity. It is a further and preferred object to provide a method and apparatus in which different time shifts can be selectively applied for improving or maintaining accurate synchronicity of replayed video and audio information.
Summary of the Invention

According to a first aspect of the present invention there is provided a method of enabling determination of parameters of a contact of interest occurring in the performance of an event or activity, particularly a sporting or entertainment event or activity, the method including the steps of:

capturing sequences of video images of the event or activity occurring in an imaged target area where the contact of interest is to be encountered and storing the captured video image data for selective later replay by a visual display;

detecting sounds propagating in the vicinity of the target area and recording the detected sound data for selective later manifestation of sound information such as an audible replay of the sounds, and/or a visual display representing the sounds or parameters of the sounds, and/or a display of an analysis of parameters of the sounds;

processing the video image data and/or the sound data by a processing system so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times;

and wherein the processing step includes calibrating the processing system by:

(a) generating a test sound in the target area while capturing and storing sequences of video images of the test sound generation,

(b) detecting and recording the test sound generated,

(c) analysing a video replay by the visual display so as to determine the instant in the sequence of video images of the test sound generation,

(d) determining a time shift to be relatively applied to the actual detection time of the test sound so as to synchronise the video replay of video images captured at the instant of the test sound generation with the manifestation of sound information relating to the actual test sound generated and detected.

Preferably the analysing step (c) of the calibration includes determining an instant
along a time continuum including times both at and between successive sequential captured video images or frames.

The test sound generating step (a) of the calibration may include operating a test sound generator having mechanical means whose mechanical operation is visually observable to indicate by the mechanical configuration the moment of generation of the test sound. In one embodiment, the mechanical means of the test sound generator is operative to change its mechanical configuration and to generate the test sound upon its reaching a predetermined observable mechanical configuration. In this embodiment, the analysing step (c) of the calibration may comprise making measurements of the mechanical means in at least two different video frames and determining by calculation from the measurements when the predetermined mechanical configuration was reached in the time continuum. For example, the mechanical means may comprise a clapper board having a movable striker hinged to a base member and angularly moveable by an operator until it strikes the base member and generates the test sound. In this embodiment, the analysing step (c) of the calibration comprises measuring the angular position of the movable striker of the clapper board and calculating from the changes in the angular position in different video frames the instant when the striker has struck the base member and generated the test sound.

In an alternative embodiment, the mechanical means may comprise a linearly moveable striker movable along an observable path until it strikes a stop member and generates the test sound. In this arrangement the analysing step (c) of the calibration comprises measurement in different video frames of the position of the striker along its linear movement path and calculating from those position measurements the instant in the time continuum when the striker struck the stop member and generated the test sound.
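The clapper-board analysis just described can be sketched as a simple sub-frame interpolation. This is a hypothetical illustration, not the patent's implementation; it assumes the striker closes at a roughly constant angular speed between the two measured frames:

```python
# Hypothetical sketch of analysing step (c) with a clapper board: the striker
# angle is measured in two captured frames and the strike instant (angle = 0)
# is found by linear extrapolation along the time continuum. Constant angular
# speed between frames is an assumption made for illustration.
def strike_instant(t1, angle1, t2, angle2):
    """Extrapolate the time (s) at which the striker angle reaches zero.

    t1, t2         -- capture times of two frames before the strike
    angle1, angle2 -- measured striker angles (degrees) in those frames
    """
    rate = (angle2 - angle1) / (t2 - t1)   # degrees/second (negative: closing)
    return t2 - angle2 / rate              # time at which angle extrapolates to 0

# Frames at 75 fps: striker measured at 12 degrees, then 4 degrees.
t = strike_instant(0.0, 12.0, 1 / 75, 4.0)
# The strike falls half a frame interval after the second measured frame.
```

A linearly moving striker would be handled identically, with position along its path in place of angle.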
In a further alternative embodiment, the test sound generating step (a) of the calibration includes operating a test sound generator having an associated count-down time display which visually displays, in a manner captured in the video images, the time remaining before the instant of generation of the test sound, whereby successive captured video image frames show the countdown time display and the last captured image of the visual display before the test sound generation enables determination of the time interval in the time continuum after that image capture that elapses before the test sound generation.

Preferably the step of calibrating the processing system includes repeating the performance of steps (a) to (d) at each of a plurality of predetermined test locations at different separation distances from the sound detector so as to establish a stored set of differing time shifts to be selectively applied depending on an estimation of the vicinity of a contact of interest during an event or activity. There may be three repetitions of the performance of steps (a) to (d), the predetermined test locations including an expected average or most common location at which a contact of interest occurs, a farther test location at a greater separation from the sound detector, and a nearer test location closer to the sound detector than the average most common location, whereby an operator monitoring the event or activity or monitoring captured video images thereof can select, from the approximate distance of expected separation of a contact of interest from the sound detector, whichever of the stored set of differing time shifts is most applicable to synchronise the video display and sound manifestation relating to a contact of interest.

According to a second aspect of the present invention there is provided an apparatus for use in determination of parameters of a contact of interest occurring in the performance of an event or
activity, particularly a sporting or entertainment event or activity, the apparatus including:

a video camera for capturing sequences of video images of the event or activity occurring in an imaged target area where the contact of interest is to be encountered and a video store for storing the captured video image data for selective later replay by a visual display;

a sound detector for detecting sounds propagating in the vicinity of the target area and a sound data store for recording the detected sound data for selective later manifestation of sound information such as an audible replay of the sounds, and/or a visual display representing the sounds or parameters of the sounds, and/or a display of an analysis of parameters of the sounds;

a processing system for processing the video image data and/or the sound data so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times;

and wherein the processing system is calibrated by:

(a) generating a test sound in the target area while the video camera is capturing and the video store is storing sequences of video images of the test sound generation,

(b) detecting by the sound detector and recording in the sound data store the test sound generated,

(c) analysing a video replay by the visual display so as to determine the instant in the sequence of video images of the test sound generation,

(d) determining a time shift to be relatively applied to the actual detection time of the test sound so as to synchronise the video replay of video images captured at the instant of the test sound generation with the manifestation of sound information relating to the actual test sound generated and detected.
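Calibration step (d) can be sketched numerically. The function and variable names here are assumptions for illustration, not from the patent; the shift is simply the difference between the video-derived instant of test-sound generation and the microphone's detection time:

```python
# Illustrative sketch of determining step (d): the calibration offset is the
# difference between the instant the test sound was generated (determined by
# analysing the video replay, possibly between frames) and the time at which
# the sound was actually detected by the microphone.
def calibration_shift(video_instant, audio_detection_time):
    """Time shift (s) by which the audio should be advanced on replay so the
    manifested sound aligns with the video images.

    video_instant        -- sub-frame instant of test-sound generation,
                            determined from the video replay
    audio_detection_time -- timestamp at which the microphone recorded it
    """
    return audio_detection_time - video_instant

# Test sound generated at t = 1.000 s (from video analysis), detected at
# 1.0058 s (roughly 2 m of sound travel): advance the audio ~5.8 ms on replay.
shift = calibration_shift(1.000, 1.0058)
```

The same offset, once determined, is then applied to subsequent captures until the geometry changes.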
In one possible embodiment according to this second aspect, the analysing step (c) in the calibration of the processing system may include determining the instant of the test sound generated along a time continuum including times both at and between successive sequential captured video images or frames, and wherein the test sound generating step (a) of the calibration includes operating a test sound generator having mechanical means whose mechanical operation is visually observable to indicate by the mechanical configuration the moment of generation of the test sound. The mechanical means of the test sound generator may be operative to change its mechanical configuration and to generate the test sound upon its reaching a predetermined observable mechanical configuration.

In another possible embodiment according to this second aspect, the analysing step (c) in the calibration of the processing system may include determining the instant of the test sound generated along a time continuum including times both at and between successive sequential captured video images or frames, and the test sound generating step (a) of the calibration may include operating a test sound generator having an associated count-down time display which visually displays, in a manner captured in the video images, the time remaining before the instant of generation of the test sound, whereby successive captured video image frames show the countdown time display and the last captured image of the visual display before the test sound generation enables determination of the time interval in the time continuum after that image capture that elapses before the test sound generation.
Preferably the processing system is calibrated by repeating the performance of steps (a) to (d) at each of a plurality of predetermined test locations at different separation distances from the sound detector so as to establish a stored set of differing time shifts to be selectively applied depending on an estimation of the vicinity of a contact of interest during an event or activity. Preferably there are three repetitions of the performance of steps (a) to (d), the predetermined test locations including an expected average or most common location at which a contact of interest occurs, a farther test location at a greater separation from the sound detector, and a nearer test location closer to the sound detector than the average most common location, whereby an operator monitoring the event or activity or monitoring captured video images thereof can select, from the approximate distance of expected separation of a contact of interest from the sound detector, whichever of the stored set of differing time shifts is most applicable to synchronise the video display and sound manifestation relating to a contact of interest.
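The stored set of differing time shifts might be sketched as a small lookup keyed by test location. The zones, distances, and derivation of each shift (distance divided by the speed of sound) are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch of the stored set of calibrated time shifts: calibration
# (steps (a) to (d)) is repeated at three test locations, and the operator
# selects the shift whose test distance best matches the estimated separation
# of the contact from the microphone. Distances/shifts here are illustrative.
SPEED_OF_SOUND = 343.0  # m/s, approximate

CALIBRATED_SHIFTS = {   # zone -> (test distance m, measured shift s)
    "near":    (1.0, 1.0 / SPEED_OF_SOUND),
    "average": (2.0, 2.0 / SPEED_OF_SOUND),
    "far":     (3.0, 3.0 / SPEED_OF_SOUND),
}

def select_shift(estimated_distance):
    """Pick the calibrated shift whose test distance is closest to the
    operator's estimate of the contact's separation from the microphone."""
    zone = min(CALIBRATED_SHIFTS,
               key=lambda z: abs(CALIBRATED_SHIFTS[z][0] - estimated_distance))
    return CALIBRATED_SHIFTS[zone][1]
```

In practice the three shifts would come from actual calibration measurements rather than the distance/speed formula used here for illustration.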
According to a third aspect of the present invention there is provided a method of enabling determination of parameters of a contact of interest occurring in the performance of an event or activity, particularly a sporting or entertainment event or activity, the method including the steps of:

capturing sequences of video images of the event or activity occurring in an imaged target area where the contact of interest is to be encountered and storing the captured video image data for selective later replay by a visual display;

detecting sounds propagating in the vicinity of the target area and recording the detected sound data for selective later manifestation of sound information such as an audible replay of the sounds, and/or a visual display representing the sounds or parameters of the sounds, and/or a display of an analysis of parameters of the sounds;

processing the video image data and/or the sound data by a processing system so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times;

and wherein the relative time shift is determined by determining the location relative to the sound detector of the actual or expected or suspected contact of interest along a spatial continuum and deriving from that location the expected instant of the incident of interest along a time continuum including times both at and between successive sequential captured video images or frames, followed by applying the determined time shift to a slow motion replay so as to synchronise the video replay of video images captured at the instant of sound generation from an incident of interest with the manifestation of sound information relating to the actual sound detected from the incident of interest.
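The derivation in this third aspect can be illustrated with a short calculation: subtract the sound's travel time from its detection time to obtain the expected instant of the contact, then locate that instant on the time continuum of captured frames. All names and figures below are assumptions for illustration:

```python
# Illustrative sketch: derive the expected instant of the contact from its
# distance to the microphone, then express that instant as a frame index plus
# a fractional position between frames (the "time continuum" of the text).
SPEED_OF_SOUND = 343.0  # m/s, approximate

def contact_instant(sound_detection_time, distance_m):
    """Instant (s) at which the contact occurred, before the sound arrived."""
    return sound_detection_time - distance_m / SPEED_OF_SOUND

def frame_position(instant, fps):
    """(frame index, fractional position within that frame interval)."""
    frames = instant * fps
    index = int(frames)
    return index, frames - index

# Sound detected at 10.0 s from a contact 2.3 m away, video captured at 150 fps:
t = contact_instant(10.0, 2.3)     # contact ~6.7 ms before detection
idx, frac = frame_position(t, 150)
```

The fractional part is what allows the synchronised replay to place the sound between frames rather than snapping it to the nearest captured image.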
According to a fourth aspect of the present invention there is provided a set of stored data values recording captured image and sound data relating to a sporting or entertainment event or activity in which parameters of a contact of interest occurring in the performance of the event or activity are to be determined,

the video data comprising recorded sequences of video images captured by a video camera directed towards the event or activity occurring in an imaged target area where the contact of interest was encountered and then being stored for selective later replay by a visual display;

the sound data comprising recorded sounds propagating in the vicinity of the target area where the contact of interest was encountered and detected by a sound detector and being stored for selective later manifestation of sound information such as an audible replay of the sounds, and/or a visual display representing the sounds or parameters of the sounds, and/or a display of an analysis of parameters of the sounds;

wherein the set of stored data values further includes time shift data enabling a processing system for processing the video image data and/or the sound data so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times; and wherein the time shift data has been determined by:

(a) generating a test sound in the target area while the video camera is capturing and the video store is storing sequences of video images of the test sound generation,

(b) detecting by the sound detector and recording in the sound data store the test sound generated,

(c) analysing a video replay by the visual display so as to determine the instant in the sequence of video images of the test sound generation,

(d) determining a time shift to be relatively applied to the actual detection time of the test sound so as to synchronise the video replay of video images captured at
the instant of the test sound generation with the manifestation of sound information relating to the actual test sound generated and detected.

Brief Description of the Drawings

Possible and preferred features of the present invention will now be described with particular reference to the accompanying drawings. However it is to be understood that the features illustrated in and described with reference to the drawings are not to be construed as limiting on the scope of the invention. In the drawings:

Fig. 1 is a schematic diagram showing apparatus for capturing and displaying video and synchronised audio information usable in the present invention;

Fig. 2 shows hypothetical readings and computations to determine a time shift to be applied to audio data utilising a clapper board for calibration;

Fig. 3 shows an alternative mechanical device for use in calibration;

Fig. 4 shows an electro-mechanical device usable for calibration purposes;

Fig. 5 shows a possible electronic device usable in calibration;

Fig. 6 shows plots of time shifts usable for different ball-bat contact locations.

Description of Preferred Embodiments

Referring to Fig. 1, the system illustrated schematically is shown being used for a cricket game where the imaged target area 10 comprises an area in the vicinity of the stumps 11 at one end of cricket pitch 12 with the so-called "popping crease" 13 marked on the surface 1.22m (4 feet) in front of the stumps. The batsman commonly stands and plays strokes at zone 15 approximately at the position of the popping crease 13; however sometimes a batsman will stand in front of the popping crease, e.g. at about the position of zone 16, or will move forward to that zone during play of a stroke. Alternatively, a batsman sometimes steps back to about zone 17 when playing a stroke at the ball.
The apparatus includes a camera 20 directed towards the target area 10 so as to capture sequences of video images of the activity including movement of the ball bowled towards the batsman as well as movement of the batsman and particularly movements of the bat. Video image data captured by the camera 20 is stored by the video data store 21 and this data will include information concerning the time of recordal of the image frames. Video data from the store 21 can be selectively retrieved later by processor 30 for replay by a visual display 41 constituting part of the AV output 40.

The system also includes a sound detector 25, such as a microphone 26 located at the base of the stumps 11 and arranged to detect sounds propagating in the vicinity of the target area 10. The detected sound data is stored by the audio data storage 27 which will include data concerning the time of recordal. Correlation of the time of recordal of the video data in the store 21 and the time of recordal of the audio data in the store 27 can be achieved by any conventional means such as by synchronising the clock data produced by the micro-processor. Alternatively the video data store 21 and audio data store 27 may be constituted by a single memory device in which video data from the camera 20 and audio data from the sound detector 25 are being stored with real time data links between the respective video and audio data. The audio data stored is capable of retrieval selectively for producing manifestations of sound information by a sound output 45 which is part of the AV output 40.
The sound manifestation may comprise one or more of such manifestations as:
(i) an audible replay of the sounds detected and recorded (frequency shifted if desired, particularly if replayed in "slow motion" which would otherwise greatly reduce the frequency of the sound produced, possibly to sub-audible frequencies);
(ii) a visual display representing the sounds, such as a trace generated by AV output 40 or on visual display 41 showing detected sound amplitude on a vertical axis and time on a horizontal axis (see schematically illustrated sound trace 46); and
(iii) a visual display of an analysis of parameters of the sounds, such as a moving upright bar to indicate sound amplitude, a pulsing light of intensity related to sound amplitude, or an instantaneous light pulse or sound generated only when a pre-determined threshold amplitude of the detected sound is exceeded.

As described and illustrated in the Plaskett patent specification WO00/10333, the video display and the manifested sound information are relatively time shifted so that they are better synchronised than the actual video capture times and sound detection times. The objective is for the sound manifestations relating to sound producing events (particularly the cricket ball contacting the cricket bat) to be better synchronised with the video images being displayed by the visual display 41 during replay of activities in the target area 10. For this purpose, the processor 30 which retrieves video data from the storage 21 and the audio data from the storage 27 and processes them for presentation by the AV output 40 includes a time shifter module 31 which is operative to apply the time shift for improving the synchronisation.

According to the present invention, the system is calibrated before use in real time to monitor and display activities during the cricket match.
In a first part of the calibration process, a test sound is generated in the target area 10 while the camera 20 is capturing images of the test sound generation and the store 21 is storing video image data during the test sound generation. As schematically illustrated in Fig. 1, a clapper board 50 can be held directly above the popping crease 13 during the calibration process so that the clapper board 50 is clearly imaged by the camera 20. The striker arm 51 which is hinged to the base member 52 is angularly moved by an operator until it strikes the base member 52 and generates a test sound which will be a sharp report or clap produced by the impact of the striker 51 with the base 52. The test sound generated is detected by the microphone 26 and recorded by store 27. The next step of the calibration comprises producing and presenting by the visual display 41 a video replay of the test sound generation so as to enable determination of the instant in the sequence of video images of the test sound generation. From an analysis of the video replay, a time shift is determined to be applied to the actual detection time of the test sound so as to synchronise the video replay of video images captured at the instant of test sound generation with the manifestation of sound information relating to the actual test sound that was generated and detected.

The analysis of the video replay and determination of the time shift using the clapper board 50 in Fig. 1 is exemplified in the presentation of Fig. 2. Because the instant at which the striker 51 impacts the base 52 to create the test sound is unlikely to occur with exact co-incidence to the instant of an image frame capture, the steps of the calibration method exemplified in Fig.
2 include measuring the angular position of the moveable striker 51 of the clapper board 50 and calculating by the processor 30, from the changes in the angular position in different video frames, the instant when the striker 51 has impacted the base member 52. Referring to Fig. 2, from a first frame ("Frame 0") displayed by display 41 the position of the striker 51 in relation to the base 52 can be measured, e.g. by reading or determining Cartesian co-ordinates of the hinge point 55, the outer point 56 of the impact surface of the base 52 and the outer point 57 of the hinged striker. From these three sets of Cartesian co-ordinates, the internal angle θ can be calculated. These Cartesian co-ordinates can be "read" from the display for example by using a point and click procedure using a mouse to move a cursor onto the three points 55, 56 and 57 successively and a simple algorithm or software to record the click locations and convert these to Cartesian co-ordinates. Likewise the computation of the internal angle θ is a simple algorithm.

A similar procedure is repeated for at least one further captured frame in the sequence of video frames captured as the striker 51 of the clapper board 50 is moved down to the impact point. Fig. 2 shows a table of Cartesian co-ordinate readings for three image frames (which can be, but are not necessarily, successive frames) and also shows computed internal angles in radians (and approximate degrees). Typically about 25 frames may pass between successive samplings, e.g. between "Frame 0" and "Frame 1" and between "Frame 1" and "Frame 2".

These internal angles can then be used as inputs of a regression analysis for solving a quadratic equation to represent the real time continuous movement of the striker of the clapper board during the video capture leading up to the instant of generation of the test sound.
The use of such sampled data as inputs to solve a quadratic equation and generate regression co-efficients for the curve that best fits the data points is a known mathematical analysis frequently used to find a line or curve of best fit for sampled or collected data and hence enable interpolation or extrapolation for other data points. In particular, as shown in Fig. 2, the regression analysis of the sampled angles can produce a curve, as shown in the graph of θ as a function of frame number, which fits the sample data points. Thereafter solution of this quadratic equation for θ = 0 (i.e. the internal angle of the clapper board reaching zero, corresponding to the impact and hence the test sound generation) produces two solutions, but the first of these is the one relevant to predict the exact time instant in between successive frames when the impact and test sound generation occurred. In the example of Fig. 2, the quadratic equation solutions of 2.29 and 8.96 indicate that the impact (test sound generation) occurred 0.29 cycles or frame sampling intervals after Frame No. 2. This figure of 0.29 cycles therefore gives the time shift to be applied to audio signals arising from events occurring at the position of the popping crease 13 where the clapper board was located during the calibration. For example in the case of super slow motion capture speeds of 150fps, the time shift will be 0.29 ÷ 150 ≈ 0.0019 seconds. This time shift may not be humanly discernible in real time, but that is irrelevant because the time shift is applied during the slow motion replay of the video and audio data captured and stored.
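The sub-frame analysis just described can be sketched in a few lines of code. The angle readings below are hypothetical values constructed so that the fitted quadratic has roots at 2.29 and 8.96 frame cycles, matching the Fig. 2 example; the function names are illustrative only, not part of the patented apparatus:

```python
import math

def clapper_impact_frame(y0, y1, y2):
    """Fit theta(n) = a*n^2 + b*n + c exactly through angle samples taken at
    frames 0, 1 and 2, then return the earlier root of theta(n) = 0, i.e. the
    sub-frame instant at which the striker meets the base member."""
    a = (y2 - 2.0 * y1 + y0) / 2.0   # curvature from the second difference
    b = (y1 - y0) - a                # linear coefficient
    c = y0                           # angle at Frame 0
    disc = b * b - 4.0 * a * c
    # the smaller of the two roots is the physically relevant impact instant
    return (-b - math.sqrt(disc)) / (2.0 * a)

# hypothetical internal angles (radians) read from Frames 0, 1 and 2
impact = clapper_impact_frame(1.02592, 0.51342, 0.10092)
fps = 150.0                          # super slow motion capture rate
shift = (impact - 2.0) / fps         # seconds after Frame 2, about 0.0019 s
drift = (100.0 / 3.6) * shift        # metres a 100 km/h ball moves in that time
print(round(impact, 2), round(shift, 5))
```

With these figures the impact lands 0.29 of a frame cycle after Frame 2, and during the corresponding real-time delay a 100 km/h ball travels on the order of 5-6 cm, the disconformity discussed in the text.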
What is significant is that the actual sound detection after sound generation is delayed by 0.3 of one frame capture cycle, and during that time interval, a cricket ball travelling at 100kph travels 6cm which, if not corrected, can produce a significant disconformity between the slow motion replay of the ball movement and the event producing the audible sound (particularly a ball contacting bat).

Turning to Fig. 3, there is illustrated an alternative possible test sound generator having mechanical means whose mechanical operation is visually observable to indicate by the mechanical configuration the moment of generation of the test sound. In Fig. 3 the mechanical means comprises a linearly movable striker 61 which is movable along the path 63 until it strikes a stop member 65 and thereby generates the test sound. The striker 61 in Fig. 3 is shown as a ball 62 made of a hard material such as metal which is caused during the calibration operation to travel along the path 63, such as a track having a linear groove to guide the ball 62, until it strikes the stop member 65. The ball can be caused to travel along the path by an impeller 66 which is schematically illustrated as a manually released spring-loaded pin 67 enabling the user to draw back the pin 67 against the action of the compression spring 68 and, upon release, the pin returns to the position shown to impel the ball 62 along the path 63. The capture of video images continues as the ball travels along the path.

To determine the instant of generation of the test sound (the ball 62 impacting the stop 65), measurement of the position of the striker 61 along its linear movement path 63 in different captured video frames can be performed. For example an operator monitoring the replayed successive image frames can use a pointer such as a cursor operating under mouse control to capture a reading of the position of the striker 61.
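For the linear striker of Fig. 3, two position readings suffice if the striker moves at substantially constant speed. The sketch below uses invented frame numbers, positions and stop location purely for illustration:

```python
def linear_impact_frame(f0, p0, f1, p1, p_stop):
    """Extrapolate, assuming constant striker speed, the frame-continuum
    instant at which the striker reaches the stop member and so generates
    the test sound."""
    speed = (p1 - p0) / (f1 - f0)        # distance units per frame cycle
    return f1 + (p_stop - p1) / speed

# hypothetical readings: striker at 3.0 cm in frame 10 and 7.0 cm in frame 12,
# with the stop member 8.5 cm along the track
instant = linear_impact_frame(10, 3.0, 12, 7.0, 8.5)
print(instant)   # 12.75: impact 0.75 of a frame cycle after frame 12
```

The fractional part (0.75 of a cycle here) plays the same role as the 0.29-cycle figure obtained with the clapper board.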
Using such measurements from several video frames (at least two, but not necessarily consecutive, frames), the substantially linear movement of the striker 61 at (assumed) substantially constant speed enables easy calculation of the instant in the time continuum when the striker 61 impacts the stop member 65 and generates the test sound. The calculation of the time shift to be applied can be tested after being stored by replaying the captured images and captured sound of the calibration test sound generation with the calculated time shift applied to verify synchronisation.

In Fig. 4, the test sound generation for the calibration comprises operating a test sound generator 70 having an associated countdown time display 73 which visually displays, in a manner captured in the video images, the time remaining before the instant of generation of the test sound. In Fig. 4 the test sound generator 70 includes a speaker 71 and associated electronic driver circuitry (not shown) operated so that a sharp report or noise is generated at an instant determined by the driver circuit while the countdown time display 73 displays by the moving pointer 74 the time remaining before the instant of sound generation. In the illustrated embodiment in Fig. 4, the countdown time display is operated so that the pointer 74 requires 0.02 seconds to scan through 360° and the display is calibrated in increments of 0.001 seconds. With this embodiment used for super slow motion video capture, the pointer will take three captured image frames to travel 360°. The last of the captured image frames before the pointer 74 returns to the starting point, when the speaker 71 generates the test sound, will enable direct reading from the scale of the time remaining after that image frame capture when the test sound is generated.
In the illustrated example where the pointer 74a is shown in the last captured image pointing to 0.0033, the test sound generation would occur exactly half way in the time interval between the relevant captured image frame and the next successive captured image frame. The time shift would therefore be one half cycle of the image frame replay rate.

In Fig. 5 there is shown an alternative electronic test sound generator 80 having an associated digital countdown time display 83 which visually displays, in a manner captured in the video images, the time remaining before the instant of generation of the test sound, which can be the sharp report or noise produced by loud speaker 81 as in Fig. 4. With this arrangement, the digital display 83 may start, as illustrated, at 0.2 seconds before the test sound generation and count down in increments of 0.0001 seconds so that the last captured image of the visual display before the test sound generation (at 0.000 seconds) will provide a direct read out of the time interval in the time continuum after that last image capture that elapses before the test sound is generated.

It will be seen from the foregoing description and the various possible embodiments of the test sound generator for calibrating the apparatus and determining the time shift required, that the apparatus can then be used in a cricket game with the time shift accurately determined for the venue and surrounding conditions including effects of ambient temperature, ambient humidity, and instrument response time variability. However the calibration process described and illustrated so far is based on a calibration carried out at or above the popping crease 13, i.e. 1.22m in front of the microphone 26.
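Converting a countdown reading into the required shift is direct arithmetic; a sketch assuming the 150 fps capture rate and a pointer reading near 0.0033 s as in the example above:

```python
fps = 150.0
frame_interval = 1.0 / fps            # about 0.00667 s between captured frames
reading = 0.003333                     # countdown shown in the last captured frame (s)
shift_cycles = reading / frame_interval
print(shift_cycles)                    # about 0.5, i.e. half a frame cycle
```

The same conversion applies to the digital display of Fig. 5, whose 0.0001-second increments simply give a finer reading.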
In the preferred system, the step of calibrating the processing system includes repeating the performance of the calibration steps at each of a plurality of predetermined test locations at different separation distances from the sound detector 26 so as to establish a stored set of differing time shifts to be selectively applied depending on an estimation of the vicinity of a contact of interest during an event or activity. For example as shown in Fig. 1 there can be three repetitions of the performance of the calibration steps, the predetermined test locations including the expected average or most common location 15 at which a contact of interest occurs, a farther test location 16 at a greater separation from the sound detector 26, and a nearer test location 17 closer to the sound detector 26 than the average most common location. An operator monitoring the event or activity, or monitoring captured video images thereof, can select from the approximate distance of expected separation of a contact of interest from the sound detector (i.e. at location 15, 16, or 17) whichever of the stored set of differing time shifts from the three calibrations is most applicable to synchronise the video display and sound manifestation relating to a contact of interest.

As a further extension of the system enabling selection of differing time shifts to be applied to the sound manifestations, beyond the performance of multiple calibrations and the resulting determination of multiple time shifts for different separations between the test sound source and the sound detector, it can also be possible to enable operator selection of a continuously variable time shift. In particular, by conducting three calibrations at the median zone 15, forward zone 16 and nearer zone 17, the resulting calculated time shifts can enable interpolation of time shifts for an event occurring in the cricket game between zones 15 and 16 or between zones 15 and 17.
Likewise extrapolation beyond the time shift values calculated from the multiple calibrations is possible to determine time shifts to be applied for events occurring further from the sound detector than zone 16, e.g. if a batsman has moved forward by a considerable distance beyond zone 16 when playing the stroke.

Applied to the cricket game, upon viewing the cricket game and particularly the events in the target area 10, preferably by monitoring a slow motion replay of the captured video data on the visual display 41, the operator can input into the processor an approximate location of the event of interest (the possible contact of the ball with the bat), e.g. by a point and click operation conducted when a video frame at or close to the time of the event of interest is being displayed. For example, if the batsman has moved considerably forward and the bat at the time of the ball passing by is apparently approximately 1m forward of the popping crease, i.e. is about 2.2m from the sound detector, the operator can point to the location on the video display using a mouse controlling the cursor on the screen and, by clicking at the approximate location of the suspected ball-bat contact, the distance of that selected point can be compared to the multiple calibration points and associated calculated time shifts to thereby determine a time shift to be applied for that particular incident. The determination of the time shift by the processor 30 by interpolation within or extrapolation beyond the time shifts determined during the calibration can be performed by calculation at each instance of replay for analysis or can be earlier calculated and values stored in a look-up table.

In Fig. 6 there is illustrated in schematic form plots of time shifts to be applied for a range of distances in front of the stumps and, by inputting the distance of the event of interest from the stumps, the time shift can be read off or calculated from the graph.
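A piecewise-linear interpolation over the calibrated (distance, time shift) pairs, extrapolating beyond the end points, could look like the sketch below. The three calibration distances stand in for zones 17, 15 and 16, and the shift values are hypothetical, derived here from a nominal 343 m/s speed of sound rather than from a real calibration:

```python
def interpolated_shift(distance, calib):
    """calib: list of (distance_m, shift_s) pairs sorted by distance.
    Interpolates between the bracketing pair, or extrapolates linearly
    from the nearest end segment for distances outside the calibrated range."""
    for (da, sa), (db, sb) in zip(calib, calib[1:]):
        if distance <= db:
            break
    return sa + (sb - sa) * (distance - da) / (db - da)

# hypothetical calibrated shifts at the nearer zone, popping crease and forward zone
calib = [(0.60, 0.60 / 343.0), (1.22, 1.22 / 343.0), (2.00, 2.00 / 343.0)]
# batsman about 1 m forward of the popping crease, i.e. roughly 2.2 m from the mic
print(round(interpolated_shift(2.2, calib) * 1000, 2))   # extrapolated, in ms
```

In a deployed system these pairs would of course come from the calibration procedure itself (or a pre-computed look-up table), not from an assumed speed of sound.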
The multiple calibrations enable the calculation of at least one time shift, e.g. at the distance of the popping crease, enabling the graph to be drawn. The time shift at the stumps where the microphone is located will be 0 since there will be no time delay for the sound to travel from the contact (e.g. ball striking the stumps) to the microphone. In fact a test calibration taken at the location of the stumps can determine if there is any response time delay of the sound detector and, if so, this can be compensated by having a positive time shift at the location of the stumps along the horizontal axis.

In the system of calibration described and illustrated herein, the linear distance along the line from stumps at one end to stumps at the other end has been considered the predominant variable affecting the sound travel time and this is expected to be a legitimate approximation for most incidents in a cricket game. However a sound of interest (ball contacting bat) may occur at a height significantly above ground level, e.g. if the batsman has lifted the bat attempting to strike the ball travelling past at a significant height, sometimes as much as 2m above the ground. Even if the bat and ball touch substantially at the line of the popping crease, the contact 2m above the ground will add significantly to the sound travel time to the microphone. Hence the calibration process may include a generation of test sounds at multiple heights above ground level enabling calculation of time shifts for incidents occurring at different heights. For example in Fig. 6, there are shown additional plots enabling determination of time shifts to be applied if the incident occurred at different heights above ground level. The point marked 91 for example would enable determination of the time shift applicable for an incident occurring in front of the stumps at the line of the popping crease but at a height 2m above ground.
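The height (and, as noted next, lateral) contribution reduces to a slant-distance computation. A sketch assuming straight-line propagation at a nominal 343 m/s; a real installation would take these figures from the calibrations themselves, which also absorb ambient temperature, humidity and instrument response effects:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, nominal value assumed for this sketch only

def shift_3d(dx, dy, dz):
    """Sound travel time (s) from a contact offset (dx, dy, dz) metres from
    the microphone: along the pitch, laterally, and vertically."""
    return math.sqrt(dx * dx + dy * dy + dz * dz) / SPEED_OF_SOUND

ground = shift_3d(1.22, 0.0, 0.0)   # contact at the popping crease, ground level
lifted = shift_3d(1.22, 0.0, 2.0)   # same line, but 2 m above ground (cf. point 91)
print(round(ground * 1000, 2), round(lifted * 1000, 2))   # milliseconds
```

The 2m-high contact nearly doubles the travel time relative to the ground-level contact at the same crease line, which is why the additional plots in Fig. 6 matter.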
Similarly, distances of the incident (ball-bat contact) spaced laterally (transverse to the line of the cricket pitch) will also affect the time delay and hence calibrations and resulting time shifts can be determined accordingly. Essentially time shifts can be determined to compensate for events of interest occurring at variable locations in three dimensions relative to the location of the microphone using the present invention.

It will be seen that the present invention enables more accurate synchronisation of manifestations of sounds generated during an activity of interest with the associated video images, particularly when high speed video image capture is being used and slow motion replays are being generated and presented to enable determination whether there has been a contact of interest occurring which generated a sound subsequently detected. When utilised with cricket matches and particularly super slow motion video image capture, the compensation by relative time shifting of the audio sound manifestation will enable more accurate assessment of synchronicity and hence determination whether a relevant contact of interest occurred or not. Variations in ambient conditions that can affect the speed of sound, such as temperature and humidity, and even variations in instrument response times can be compensated. Varying distances of the contact of interest producing the sound can also be accurately compensated so that the determination of the parameters of the contact can be more accurate than in the past. When referring to "parameters" of contact, the main parameter of interest is of course whether the contact occurred or not, but other "parameters" may also be relevant such as the characteristics of the sound produced, which may differ if the ball strikes the bat compared to the ball striking the batsman's glove, or the ball striking the protective leg pads, etc.,
and such parameters therefore can be better assessed if they are accurately synchronised.

It is to be understood that various alterations, modifications and/or additions may be made to the features of the possible and preferred embodiment(s) of the invention as herein described without departing from the spirit and scope of the invention.
Claims
1. A method of enabling determination of parameters of a contact of interest occurring in the performance of an event or activity, particularly a sporting or entertainment event or activity, the method including the steps:
capturing sequences of video images of the event or activity occurring in an imaged target area where the contact of interest is to be encountered and storing the captured video image data for selective later replay by a visual display;
detecting sounds propagating in the vicinity of the target area and recording the detected sound data for selective later manifestation of sound information such as an audible replay of the sounds, and/or a visual display representing the sounds or parameters of the sounds, and/or a display of an analysis of parameters of the sounds;
processing the video image data and/or the sound data by a processing system so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times; and
wherein the processing step includes calibrating the processing system by:
(a) generating a test sound in the target area while capturing and storing sequences of video images of the test sound generation,
(b) detecting and recording of the test sound generated,
(c) analysing a video replay by the visual display so as to determine the instant in the sequence of video images of the test sound generation,
(d) determining a time shift to be relatively applied to the actual detection time of the test sound so as to synchronise the video replay of video images captured at the instant of the test sound generation with the manifestation of sound information relating to the actual test sound generated and detected.
2. A method as claimed in claim 1 wherein the analysing step (c) of the calibration includes determining an instant along a time continuum including times both at and between successive sequential captured video images or frames.

3. A method as claimed in claim 2 wherein the test sound generating step (a) of the calibration includes operating a test sound generator having mechanical means whose mechanical operation is visually observable to indicate by the mechanical configuration the moment of generation of the test sound.
4. A method as claimed in claim 3 wherein the mechanical means of the test sound generator is operative to change its mechanical configuration and to generate the test sound upon its reaching a predetermined observable mechanical configuration.

5. A method as claimed in claim 4 wherein the analysing step (c) of the calibration comprises making measurements of the mechanical means in at least two different video frames and determining by calculation from the measurements when the predetermined mechanical configuration was reached in the time continuum.

6. A method as claimed in claim 5 wherein the mechanical means comprises a clapper board having a movable striker hinged to a base member and angularly moveable by an operator until it strikes the base member and generates the test sound.
7. A method as claimed in claim 6 wherein the analysing step (c) of the calibration comprises measuring the angular position of the movable striker of the clapper board and calculating from the changes in the angular position in different video frames the instant when the striker has struck the base member and generated the test sound.
8. A method as claimed in claim 4 wherein the mechanical means comprises a linearly moveable striker movable along an observable path until it strikes a stop member and generates the test sound.
9. A method as claimed in claim 8 wherein the analysing step (c) of the calibration comprises measurement in different video frames of the position of the striker along its linear movement path and calculating from those position measurements the instant in the time continuum when the striker struck the stop member and generated the test sound.
10. A method as claimed in claim 2 wherein the test sound generating step (a) of the calibration includes operating a test sound generator having an associated count-down time display which visually displays, in a manner captured in the video images, the time remaining before the instant of generation of the test sound, whereby successive captured video image frames show the count-down time display and the last captured image of the visual display before the test sound generation enables determination of the time interval in the time continuum after that image capture that elapses before the test sound generation.

11. A method as claimed in any preceding claim wherein the step of calibrating the processing system includes repeating the performance of steps (a) to (d) at each of a plurality of predetermined test locations at different separation distances from the sound detector so as to establish a stored set of differing time shifts to be selectively applied depending on an estimation of the vicinity of a contact of interest during an event or activity.
12. A method as claimed in claim 11 wherein there are three repetitions of the performance of steps (a) to (d), the predetermined test locations including an expected average or most common location at which a contact of interest occurs, a farther test location at a greater separation from the sound detector, and a nearer test location closer to the sound detector than the average most common location, whereby an operator monitoring the event or activity or monitoring captured video images thereof can select from the approximate distance of expected separation of a contact of interest from the sound detector whichever of the stored set of differing time shifts is most applicable to synchronise the video display and sound manifestation relating to a contact of interest.
13. Apparatus for use in determination of parameters of a contact of interest occurring in the performance of an event or activity, particularly a sporting or entertainment event or activity, the apparatus including:
a video camera for capturing sequences of video images of the event or activity occurring in an imaged target area where the contact of interest is to be encountered and a video store for storing the captured video image data for selective later replay by a visual display;
a sound detector for detecting sounds propagating in the vicinity of the target area and a sound data store for recording the detected sound data for selective later manifestation of sound information such as an audible replay of the sounds, and/or a visual display representing the sounds or parameters of the sounds, and/or a display of an analysis of parameters of the sounds;
a processing system for processing the video image data and/or the sound data so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times; and
wherein the processing system is calibrated by:
(a) generating a test sound in the target area while the video camera is capturing and the video store is storing sequences of video images of the test sound generation,
(b) detecting by the sound detector and recording in the sound data store the test sound generated,
(c) analysing a video replay by the visual display so as to determine the instant in the sequence of video images of the test sound generation,
(d) determining a time shift to be relatively applied to the actual detection time of the test sound so as to synchronise the video replay of video images captured at the instant of the test sound generation with the manifestation of sound information relating to the actual test sound generated and detected.
14. Apparatus as claimed in claim 13 wherein the analysing step (c) in the calibration of the processing system includes determining the instant of the test sound generated along a time continuum including times both at and between successive sequential captured video images or frames, and wherein the test sound generating step (a) of the calibration includes operating a test sound generator having mechanical means whose mechanical operation is visually observable to indicate by the mechanical configuration the moment of generation of the test sound.

15. Apparatus as claimed in claim 14 wherein the mechanical means of the test sound generator is operative to change its mechanical configuration and to generate the test sound upon its reaching a predetermined observable mechanical configuration.

16. Apparatus as claimed in claim 13 wherein the analysing step (c) in the calibration of the processing system includes determining the instant of the test sound generated along a time continuum including times both at and between successive sequential captured video images or frames, and wherein the test sound generating step (a) of the calibration includes operating a test sound generator having an associated count-down time display which visually displays, in a manner captured in the video images, the time remaining before the instant of generation of the test sound, whereby successive captured video image frames show the count-down time display and the last captured image of the visual display before the test sound generation enables determination of the time interval in the time continuum after that image capture that elapses before the test sound generation.

17.
Apparatus as claimed in any one of claims 13 to 16 wherein the processing system is calibrated by repeating the performance of steps (a) to (d) at each of a plurality of predetermined test locations at different separation distances from tie sound detector so as to establish a stored set of differing time shifts to be selectively applied depending on an estimation of the vicinity of a contact of interest during an event or activity. 35 18, Apparatus as claimed in claim 17 wherein there are three repetitions of the performance of steps (a) to (d), the predetermined test locations including an expected average or most common 19 WO 2015/013752 PCT/AU2014/000772 location at which a contact of interest occurs, a farther test location at a greater separation from the sound detector, and a nearer test location closer to the sound detector than the average most common location, Whereby an operator monitoring the event or activity or monitoring captured video images thereof can select from the approximate distance of expected separation of a contact 5 of interest from the sound detector whichever of the stored set of differing time shifts is most applicable to synchronise the video display and sound manifestation relating to a contact of interest.
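The three-location calibration of claims 17 and 18 can be pictured as a small lookup table keyed by separation distance, from which the operator's distance estimate selects the most applicable stored shift. A hedged Python sketch (the table values are invented for illustration; sound travels at roughly 343 m/s, so each metre of separation adds about 2.9 ms of delay):

```python
# Hypothetical calibrated (distance_m, time_shift_s) pairs for the nearer,
# average and farther test locations of claim 18. Values are illustrative.
CALIBRATED_SHIFTS = [
    (5.0, 0.015),   # nearer test location
    (20.0, 0.058),  # expected average / most common contact location
    (60.0, 0.175),  # farther test location
]

def select_time_shift(estimated_distance_m: float) -> float:
    """Pick the stored shift calibrated at the test location closest to
    the operator's estimated distance of the contact of interest."""
    _distance, shift = min(
        CALIBRATED_SHIFTS,
        key=lambda pair: abs(pair[0] - estimated_distance_m))
    return shift

print(select_time_shift(25.0))  # 0.058 (average location is the closest match)
```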
19. A method of enabling determination of parameters of a contact of interest occurring in the performance of an event or activity, particularly a sporting or entertainment event or activity, the method including the steps of:
capturing sequences of video images of the event or activity occurring in an imaged target area where the contact of interest is to be encountered and storing the captured video image data for selective later replay by a visual display;
detecting sounds propagating in the vicinity of the target area and recording the detected sound data for selective later manifestation of sound information such as an audible replay of the sounds, and/or a visual display representing the sounds or parameters of the sounds, and/or a display of an analysis of parameters of the sounds; and
processing the video image data and/or the sound data by a processing system so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times;
wherein the relative time shift is determined by determining the location relative to the sound detector of the actual or expected or suspected contact of interest along a spatial continuum and deriving from that location the expected instant of the incident of interest along a time continuum including times both at and between successive sequential captured video images or frames, followed by applying the determined time shift to a slow motion replay so as to synchronise the video replay of video images captured at the instant of sound generation from an incident of interest with the manifestation of sound information relating to the actual sound detected from the incident of interest.
20. A set of stored data values recording captured image and sound data relating to a sporting or entertainment event or activity in which parameters of a contact of interest occurring in the performance of the event or activity are to be determined,
the video data comprising recorded sequences of video images captured by a video camera directed towards the event or activity occurring in an imaged target area where the contact of interest was encountered and then being stored for selective later replay by a visual display;
the sound data comprising recorded sounds propagating in the vicinity of the target area where the contact of interest was encountered and detected by a sound detector and being stored for selective later manifestation of sound information such as an audible replay of the sounds, and/or a visual display representing the sounds or parameters of the sounds, and/or a display of an analysis of parameters of the sounds;
wherein the set of stored data values further includes time shift data enabling a processing system for processing the video image data and/or the sound data so as to relatively time shift the video display and the manifested sound information so that they are better synchronised than the actual video capture times and the sound detection times; and
wherein the time shift data has been determined by:
(a) generating a test sound in the target area while the video camera is capturing and the video store is storing sequences of video images of the test sound generation,
(b) detecting by the sound detector and recording in the sound data store the test sound generated,
(c) analysing a video replay by the visual display so as to determine the instant in the sequence of video images of the test sound generation, and
(d) determining a time shift to be relatively applied to the actual detection time of the test sound so as to synchronise the video replay of video images captured at the instant of the test sound generation with the manifestation of sound information relating to the actual test sound generated and detected.
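The "time continuum including times both at and between successive sequential captured video images" that the claims rely on can be illustrated with the count-down display of claim 16: the countdown reading on the last frame captured before the test sound gives the sub-frame offset to add to that frame's capture time. A minimal sketch in Python (function name and example values are assumptions for illustration):

```python
def sub_frame_instant(last_frame_index: int,
                      frame_rate_hz: float,
                      countdown_remaining_s: float) -> float:
    """Instant of test sound generation on the video time continuum, given
    the countdown value read from the last frame captured before the sound."""
    frame_time_s = last_frame_index / frame_rate_hz
    # The countdown reading is the time that elapses after this frame
    # capture before the test sound is generated (claim 16).
    return frame_time_s + countdown_remaining_s

# Example: frame 99 of a 50 fps sequence (captured at t = 1.98 s) shows
# 8 ms remaining on the countdown display.
print(round(sub_frame_instant(99, 50.0, 0.008), 4))  # 1.988
```

This places the test sound between frames 99 and 100, which is the sub-frame resolution the calibration needs when the sound does not coincide with a frame capture.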
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2014295901A AU2014295901B2 (en) | 2013-08-01 | 2014-08-01 | Synchronisation of video and audio capture |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2013902873A AU2013902873A0 (en) | 2013-08-01 | Calibration for video and audio capture | |
AU2013902873 | 2013-08-01 | ||
AU2014295901A AU2014295901B2 (en) | 2013-08-01 | 2014-08-01 | Synchronisation of video and audio capture |
PCT/AU2014/000772 WO2015013752A1 (en) | 2013-08-01 | 2014-08-01 | Synchronisation of video and audio capture |
Publications (2)
Publication Number | Publication Date |
---|---|
AU2014295901A1 true AU2014295901A1 (en) | 2016-02-11 |
AU2014295901B2 AU2014295901B2 (en) | 2017-08-31 |
Family
ID=52430751
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2014295901A Ceased AU2014295901B2 (en) | 2013-08-01 | 2014-08-01 | Synchronisation of video and audio capture |
Country Status (4)
Country | Link |
---|---|
AU (1) | AU2014295901B2 (en) |
GB (1) | GB2532154A (en) |
NZ (1) | NZ631342A (en) |
WO (1) | WO2015013752A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6996384B2 (en) * | 2018-03-27 | 2022-01-17 | 富士通株式会社 | Display program, display method and display device |
CN108881992A (en) * | 2018-07-09 | 2018-11-23 | 深圳市潮流网络技术有限公司 | A kind of multimedia audio-video data synchronization calculation method |
EP3879507A1 (en) * | 2020-03-12 | 2021-09-15 | Hexagon Technology Center GmbH | Visual-acoustic monitoring system for event detection, localization and classification |
CN112083246A (en) * | 2020-09-11 | 2020-12-15 | 四川长虹教育科技有限公司 | Touch display device system delay measuring device and method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6831729B1 (en) * | 2001-12-06 | 2004-12-14 | Matthew L. Davies | Apparatus and method of using same for synchronizing film with sound |
GB0315007D0 (en) * | 2003-06-27 | 2003-07-30 | Roke Manor Research | An acoustic event synchronisation and characterisation system for sports |
WO2007098537A1 (en) * | 2006-03-01 | 2007-09-07 | Brennan Broadcast Group Pty Ltd | Detecting contacts during sporting activities etc |
WO2009094728A1 (en) * | 2008-01-31 | 2009-08-06 | Molyneux William M | A cricket bat and ball contact detection system and indicator |
US8447559B2 (en) * | 2009-02-03 | 2013-05-21 | R0R3 Devices, Inc. | Systems and methods for an impact location and amplitude sensor |
AU2013100500B4 (en) * | 2012-06-08 | 2013-08-15 | Brennan Broadcast Group Pty Ltd | Football contact determination |
- 2014
- 2014-08-01 WO PCT/AU2014/000772 patent/WO2015013752A1/en active Application Filing
- 2014-08-01 NZ NZ631342A patent/NZ631342A/en not_active IP Right Cessation
- 2014-08-01 GB GB1601448.2A patent/GB2532154A/en not_active Withdrawn
- 2014-08-01 AU AU2014295901A patent/AU2014295901B2/en not_active Ceased
Also Published As
Publication number | Publication date |
---|---|
AU2014295901B2 (en) | 2017-08-31 |
NZ631342A (en) | 2017-10-27 |
WO2015013752A1 (en) | 2015-02-05 |
GB201601448D0 (en) | 2016-03-09 |
GB2532154A (en) | 2016-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2007219710B2 (en) | Detecting contacts during sporting activities etc | |
AU2014295901B2 (en) | Synchronisation of video and audio capture | |
US20070021242A1 (en) | Method and system for optimiza of baseball bats and the like | |
US7978217B2 (en) | System for promoting physical activity employing impact position sensing and response | |
CN103990279B (en) | Based on the golf ball-batting analogy method of internet | |
CN104001330B (en) | Based on the golf ball-batting simulation system of internet | |
CN102075680A (en) | Image processing apparatus, image processing method and program | |
US10786742B1 (en) | Broadcast synchronized interactive system | |
WO2020010040A1 (en) | Systems and methods for determining reduced player performance in sporting events | |
JP2020030190A (en) | Position tracking system and position tracking method | |
JP5606222B2 (en) | Measuring apparatus and measuring method | |
US10950276B2 (en) | Apparatus and method to display event information detected from video data | |
JP6760610B2 (en) | Position measurement system and position measurement method | |
KR20000064088A (en) | Analysis Broadcasting System And Method Of Sports Image | |
RU2530863C1 (en) | Method of training and assessment of accuracy of free throws in basketball | |
JP7007165B2 (en) | Darts game device, darts fraud judgment method and program | |
AU2015291766A1 (en) | Systems for reviewing sporting activities and events | |
KR101276054B1 (en) | Ball mark tracking system | |
AU2013100500B4 (en) | Football contact determination | |
KR20230080954A (en) | Foul judgment system in badminton matches and method of operation of system | |
US11103760B2 (en) | Line fault detection systems and method for determining whether a sport gaming device has bounced off an area of a sports field | |
AU2012200201B8 (en) | Detecting contacts during sporting activities etc | |
JP7032948B2 (en) | Tactical analyzer | |
AU2013101177B4 (en) | Australian rules football goal post contact determination | |
KR101707626B1 (en) | Apparatus and method for comparing moving pictures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) | ||
MK14 | Patent ceased section 143(a) (annual fees not paid) or expired |