GB2341997A - Object enhancement for viewing sporting images - Google Patents

Object enhancement for viewing sporting images

Info

Publication number
GB2341997A
GB2341997A
Authority
GB
United Kingdom
Prior art keywords
image
images
interpolated
detection
enhancing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9916612A
Other versions
GB9916612D0 (en)
GB2341997B (en)
Inventor
Ian Galbraith Hay
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of GB9916612D0 publication Critical patent/GB9916612D0/en
Publication of GB2341997A publication Critical patent/GB2341997A/en
Application granted granted Critical
Publication of GB2341997B publication Critical patent/GB2341997B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2625Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect

Abstract

An image manipulation apparatus and method for enhancing an object in a TV image, particularly applicable to displaying sporting events, comprises: a detection means to detect an object such as a tennis ball in a first and second image, the detection algorithm being potentially one of the following - Fast Fourier transform, motion detection, boundary detection or colour detection; an interpolation means for interpolating between the first and second images; and an enhancing means for enhancing the object in the first, second and interpolated images. The enhancing means comprises an overlay generator that generates an intermediate image consisting of a neutral background and a highlighted portion corresponding to the location of the object. This intermediate image is superimposed with the interpolated, first and second images to produce an enhanced image of the object with a trail along its trajectory.

Description

This invention relates to an image manipulation apparatus.
This invention has particular, but not exclusive, utility in relation to televised sports broadcasts. One of the difficulties television viewers experience with televised sports broadcasts is that tracking the ball or other object can prove difficult when the ball or object is moving rapidly or is depicted against a low-contrast background. This problem is experienced in relation to many sports, including tennis, cricket, golf, football and baseball. In each of these sports, tracking the ball can be difficult if the ball is depicted against the sky or a crowd of people, or where the ball is simply travelling quickly with respect to the camera.
Australian Patent Specification 79750194 discloses a sports event video manipulation system in which objects are tracked and may also be highlighted. Whilst the system described in Application 79750194 assists in tracking large, slow-moving objects, where the object is small and moving rapidly across the image, highlighting the position of the object within each image may still leave it difficult for viewers to discern the object.
In accordance with one aspect of this invention, there is provided an image manipulation apparatus comprising: detection means arranged to locate an object in a first image and a second image and to produce an output corresponding to said object's location in each image; interpolation means responsive to at least one of the first image and the second image and to the output of said detection means for each of said first and second images, arranged to produce at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancement means arranged to enhance the visibility of the object in at least one of the first image, the second image, or the interpolated image.
Preferably, the interpolation means includes trajectory interpolation means responsive to the output of said detection means for each of said first and second images to calculate therefrom a path taken by the object, and to interpolate at least one position along said path.
Preferably, the interpolation means further includes image creation means responsive to one of the first and second images, the position interpolated from the trajectory interpolation means, and to the output of said detection means for the first or second image to produce the interpolated image therefrom.
Preferably, the trajectory interpolation means is also responsive to the position of the object in at least one image before or after said first and second images in calculating the path of the object.
Preferably, said trajectory interpolation means includes means for locating discontinuities in the path of said object, such as those produced by said object bouncing.
Preferably, the enhancement means is responsive to one of the first or second images and to the output of said detection means for the first or second images to produce therefrom an enhanced image with the visibility of the object therein enhanced.
Preferably, the enhancement means is arranged to increase the brightness of the object with respect to its surroundings.
Preferably, the brightness of the object with respect to its surroundings is increased by surrounding the object with a low brightness value.
Preferably, the enhancement means is arranged to increase the colour contrast of the object with respect to its surroundings.
Preferably, the contrast of the object with respect to its surroundings is increased by surrounding the object with a contrasting colour.
Preferably, the enhancement means further enhances the visibility of the object by elongating said object along its path.
Preferably, the enhancement means includes superimposition means arranged to superimpose the interpolated images onto at least one of the first or second images.
Preferably, the enhanced image is input to the interpolation means whereby the interpolated images also have the visibility of the object enhanced.
Preferably, the detection means is responsive to the first and second images and to a database storing at least one parameter which controls operation of the detection means in locating the object.
Preferably, the parameter is selected from the following list: typical object size, maximum and minimum object sizes, maximum rate of change of object size between subsequent images, maximum object speed, typical object colour and shape, proximity to the edges of the image, proximity to large objects.
In accordance with a second aspect of this invention, there is provided a method of manipulating an image, comprising the steps of: locating an object in a first image and a second image; producing at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancing the visibility of the object in at least one of the first image, the second image, or the interpolated image.
Preferably, the method further comprises the step of calculating a path taken by the object from its position in the first and second images, and interpolating at least one position along said path.
Preferably, the position of the object in at least one image before or after said first and second images is also used in the step of calculating the path of the object.
Preferably, said step of calculating the path taken by the object includes the step of locating discontinuities in the path of said object, such as those produced by said object bouncing.
Preferably, the step of enhancing comprises enhancing the visibility of the object in the first or second images according to its location.
Preferably, the step of enhancing includes increasing the brightness of the object with respect to its surroundings.
Preferably, the brightness of the object with respect to its surroundings is increased by surrounding the object with a low brightness value.
Preferably, the step of enhancing includes increasing the colour contrast of the object with respect to its surroundings.
Preferably, the contrast of the object with respect to its surroundings is increased by surrounding the object with a contrasting colour.
Preferably, the step of enhancing includes elongating said object along its path.
Preferably, the step of enhancing includes superimposing the interpolated images onto the first or second image.
Preferably, the step of locating the object is performed with reference to at least one parameter selected from the following list: typical object size, maximum and minimum object sizes, maximum rate of change of object size between subsequent images, maximum object speed, typical object colour and shape, proximity to the edges of the image, proximity to large objects.
The invention will now be described with reference to two embodiments and the accompanying drawings, in which:
Figure 1 is a block diagram of an image manipulation apparatus according to the first embodiment; Figures 2a and 2b show consecutive images input to the apparatus shown in figure 1; Figures 2c - 2f show interpolated images produced by the apparatus shown in figure 1; Figure 2g shows the output image produced by the apparatus shown in figure 1; Figures 3a - 3d illustrate the manner of operation of the detection means; and Figure 4 is a block diagram of the apparatus of the second embodiment.
The first embodiment shown in figure 1 is directed toward an image manipulation apparatus 10 embodied as a computer (not shown) running software. The image manipulation apparatus of the embodiment is intended for use in television broadcasts of sporting events, such as tennis matches. The apparatus 10 of the embodiment receives digital images from a source 12. The source 12 can be the output from a digital camera, or an analog-to-digital converted output from an analog camera.
Diagrammatic representations of a first digital image 14 and a second digital image 16 are shown in figures 2a and 2b, respectively. The digital images 14 and 16 include an object 18 and other imagery.
The digital images 14 and 16 shown in figures 2a and 2b represent an object 18 moving relatively fast across the digital images 14 and 16. As such, the position of the object 18 has significantly altered between the digital image 14 and the digital image 16, whilst the remainder of the image has remained substantially constant. In use, the source 12 produces a continuous stream of digital images, whereby the digital images 14 and 16 would represent adjacent images in the stream.
The image manipulation apparatus 10 comprises a detection module 20, an interpolation module 22 and an enhancement module 24.
The detection module 20 processes each digital image and produces an output 26 corresponding to the location of the object in the digital image. In doing so, the detection module 20 references a database 28 containing parameters that control operation of the detection module 20. The database 28 may contain parameters for more than one sport, in which case the detection module 20 references the database 28 according to the sport being recorded by the television camera, as selected by an operator.
The parameters stored in the database 28 include such things as the detection algorithm used by the detection module 20, the colour of the target object, the ideal shape of the target object, the range of sizes of the target object and the range of speed of the target object, proximity to the edges of the image, proximity to large objects. The database 28 may also contain parameters regarding objects to disregard in the image.
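Purely as an illustration, the per-sport parameters held in the database 28 might resemble the following sketch. The field names and values are invented for the example and are not taken from the specification.

```python
# Hypothetical parameter record for one sport; values are illustrative only.
DETECTION_PARAMETERS = {
    "tennis": {
        "algorithm": "scan_line_difference",   # detection algorithm to use
        "object_colour": (220, 255, 0),        # approximate ball colour (RGB)
        "ideal_shape": "circle",
        "min_size_pixels": 2,                  # smallest plausible ball size
        "max_size_pixels": 40,                 # largest plausible ball size
        "max_speed_pixels_per_frame": 60,
        "edge_margin_pixels": 10,              # disregard detections very near the image edges
    },
}
```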
There are many algorithms that can be used by the detection module 20 to locate the object, including Fast Fourier Transform (FFT), motion detection algorithms, boundary detection algorithms and colour detection algorithms. The decision as to which algorithm to employ will be made according to which algorithm proves most efficient for the particular parameters and sport in question. Further, other algorithms may be equally effective, in which case such algorithms are also applicable to the embodiment.
An example of one form of detection algorithm will now be described with reference to figures 3a - 3d. This detection algorithm utilises the fact that television images are raster based images, that is they comprise a series of successive scan lines. Each scan line consists of colour and brightness information for each picture element in the scan line. Typically, scan lines consist of several hundred picture elements. In a digital form, the colour and brightness of each picture element in each scan line is represented as one or more numbers.
Figure 3a shows a portion of a scan line taken from twelve successive images.
The brightness is shown along the Y-axis with the picture element position in the scan line shown along the X-axis. Figure 3a shows the brightness of the same scan line in twelve adjacent images taken from the same source. As such, the brightness information varies slightly from scan line to scan line, but is generally of the same magnitude, representing a background.
Figure 3b shows the same scan lines shown in figure 3a, this time with a moving object included. The object moves ten picture elements between each successive scan line. That is, the moving object is at position 10 in the first scan line, position 20 in the second scan line, and so on. It should be appreciated that figures 3a and 3b represent a single scan line only. In practice, an object would occupy multiple scan lines; however, for the present example algorithm each scan line can be initially processed independently.
The presence of the moving object will result in the brightness and colour values where the object is located being different from those normally present in the background information. Accordingly, the object is most simply detected by subtracting the brightness information of the preceding scan line from the current scan line. The results are shown in figure 3c, with low-level noise throughout the scan line, along with peaks corresponding to the position of the moving object in successive scan lines.
Figure 3d shows the results of figure 3c with the noise truncated therefrom, leaving the information as to the position of the object in each scan line.
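By way of illustration only, the scan-line differencing and noise truncation described above might be sketched as follows. The function name, the NumPy representation and the threshold value are assumptions made for the sketch, not part of the specification.

```python
import numpy as np

def detect_in_scan_line(current_line, previous_line, noise_threshold=20):
    """Locate a moving object in one scan line by differencing.

    current_line, previous_line: 1-D arrays of brightness values for the same
    scan line in successive images (figures 3a and 3b). Returns the indices of
    picture elements whose change in brightness exceeds the noise threshold.
    """
    # Subtract the preceding scan line's brightness from the current one (figure 3c).
    difference = np.abs(current_line.astype(int) - previous_line.astype(int))
    # Truncate the low-level noise, leaving only the object's position (figure 3d).
    return np.flatnonzero(difference > noise_threshold)
```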
When this algorithm is applied to an entire image, each scan line may contain one or more picture elements corresponding to the position of the object. If desired, this information can be further processed to detect the boundary of the object, compute its centroid, or perform other processing as desired. Alternatively, the position within each scan line of the picture elements which relate to the object can be output. Further, additional processing can be performed to provide more consistent detection of the object. For instance, other objects in the image may be moving, such as players of the sport in question. By contrast with the object of interest, which is typically the ball, the players are relatively large, and so can be removed by using a size-limiting filter in accordance with information contained in the database 28. Further, the "shape" of the picture elements indicated as belonging to the object can be compared with an ideal shape for the object. Objects which are vastly different from the ideal shape of the object can be ignored as not relevant.
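A minimal sketch of the size-limiting filter and centroid computation mentioned above, assuming candidate pixels have already been grouped into connected regions; the size limits stand in for values that would come from the database 28.

```python
def filter_candidates(regions, min_size, max_size):
    """Discard candidate regions too large or too small to be the object.

    regions: list of candidate objects, each a list of (x, y) pixel coordinates.
    Returns (centroid, pixels) for each region within the size limits.
    """
    results = []
    for pixels in regions:
        if min_size <= len(pixels) <= max_size:      # size-limiting filter
            xs = [p[0] for p in pixels]
            ys = [p[1] for p in pixels]
            centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
            results.append((centroid, pixels))
    return results
```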
It should be appreciated that the above algorithm is an example of one algorithm only, and other algorithms may be adopted. Further, enhancements to the above algorithm may be applied as desired. For instance, the above algorithm provides best results if the camera is not moving. Where the camera is zooming or panning, some correlation of the position of the background with preceding images may be necessary to ensure correct operation. It is envisaged that the zooming and panning information from each source may be provided as input variables to the detection module 20 to assist in correlation.
A further enhancement involves the use of trajectory prediction or extrapolation, whereby based on the position of the object in previous images, the position of the object in the current image is predicted. This may be used to create an initial field of search, thereby avoiding the need to search the entire picture if the object is located in the initial field of search.
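The trajectory-prediction enhancement could be sketched along these lines. The constant-velocity assumption and the window size are illustrative choices, not requirements of the specification.

```python
def predict_search_window(prev_positions, margin=30):
    """Predict the object's next position and return an initial field of search.

    prev_positions: list of (x, y) object locations in the preceding images.
    Assumes roughly constant velocity between consecutive images.
    """
    (x1, y1), (x2, y2) = prev_positions[-2], prev_positions[-1]
    predicted = (2 * x2 - x1, 2 * y2 - y1)           # extrapolate one frame ahead
    left, top = predicted[0] - margin, predicted[1] - margin
    right, bottom = predicted[0] + margin, predicted[1] + margin
    return predicted, (left, top, right, bottom)     # search this region first
```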
The output 26 of the detection module 20 corresponds with the location of the object 18 within the digital image 16. The output 26 may be in any convenient format, such as co-ordinates of pixels considered to comprise the object 18, a boundary definition for the object 18, or an image which includes only those pixels considered to be within the object 18.
The enhancement module 24 of the embodiment comprises an overlay generation module 30 and an image superimposition module 32.
The overlay generation module 30 receives the digital image 16 and the output 26 from the detection module 20 and produces therefrom an intermediate image 34.
The intermediate image 34 comprises a neutral background, such as black, and a highlighted portion in a location corresponding to that of the object 18 in the digital image 16. It should be appreciated that enhancement of the object 18 can be performed in a number of ways, for instance by increasing the brightness or contrast of the object, altering the colour of the object within the image, surrounding the object with a contrasting halo, positioning one or more arrows adjacent to the object in the image, and so forth.
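A sketch of one way the overlay generation module 30 could form the intermediate image 34. The NumPy image representation and the brightness value used for the highlight are assumptions made for the example.

```python
import numpy as np

def make_overlay(image_shape, object_pixels, highlight_value=255):
    """Build an intermediate image: a neutral (black) background with a
    highlighted portion at the detected object location (output 26)."""
    overlay = np.zeros(image_shape, dtype=np.uint8)   # neutral background
    for (x, y) in object_pixels:
        overlay[y, x] = highlight_value               # highlighted portion
    return overlay
```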
The trajectory interpolation module 22 receives the output 26 from the detection module 20. The trajectory interpolation module 22 includes a buffer (not shown) which stores the output 26 corresponding to a preceding image, in the embodiment the first digital image 14. The trajectory interpolation module 22 interpolates a path between the location of the object 18 in the preceding image and the location of the object 18 in the digital image 16. The trajectory interpolation module 22 then interpolates four points along the path of the object 18 and produces therefrom four outputs 36a, 36b, 36c and 36d, each corresponding to one of said points.
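For the straight-line case, the four interpolated points could be produced as in the sketch below. The even linear spacing is an assumption; the specification also contemplates discontinuities in the path, such as bounces.

```python
def interpolate_positions(start, end, count=4):
    """Interpolate `count` object positions between its location in the
    preceding image (start) and the current image (end)."""
    (x0, y0), (x1, y1) = start, end
    points = []
    for i in range(1, count + 1):
        t = i / (count + 1)                      # evenly spaced along the path
        points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return points                                # corresponds to outputs 36a - 36d
```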
Each of the outputs 36a, 36b, 36c and 36d is input to a corresponding interpolation overlay module 38a, 38b, 38c and 38d, respectively. Each interpolation overlay module 38a, 38b, 38c and 38d is also responsive to the output 26 from the detection module 20 and the digital image 16. Each interpolation overlay module 38a - 38d produces an interpolated image 40a - 40d respectively, each of which includes a highlighted portion 42. The highlighted portions 42 are located within each of the interpolated images 40a - 40d at a location corresponding to the interpolated location in the respective output 36a - 36d. The highlighted portion 42 in each image 40a - 40d is formed by translating the object 18 from the digital image 16 and the output 26. The interpolated images 40a - 40d are shown at figures 2c - 2f, respectively.
The image superimposition module 32 superimposes the intermediate image 34, the interpolated images 40a - 40d and the digital image 16 to produce an output image 44, as shown in figure 2g. As can be seen from figure 2g, the output image 44 has the visibility of the object 18 enhanced both in terms of contrast and by providing a tail 46 along the trajectory of the object.
As described above in relation to the enhancement module 24, the interpolation overlay modules 38a - 38d may also highlight the interpolated positions of the object in a number of ways. Further, it is envisaged that each module 38a - 38d may highlight the corresponding portion by a different amount so as to produce a fading effect.
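The superimposition and optional fading might be sketched as below. Combining the images by taking the pixel-wise maximum, and the per-overlay fading weights, are assumptions for the sketch rather than details given in the specification.

```python
import numpy as np

def superimpose(base_image, overlays, fade_weights=None):
    """Combine the digital image 16 with the intermediate and interpolated
    overlays (34, 40a - 40d) to produce the output image 44.

    fade_weights: optional per-overlay scale factors (older positions dimmer),
    giving the fading tail effect described above.
    """
    result = base_image.astype(float)
    if fade_weights is None:
        fade_weights = [1.0] * len(overlays)
    for overlay, weight in zip(overlays, fade_weights):
        # Keep the brighter of the two values wherever the overlay is lit.
        result = np.maximum(result, overlay.astype(float) * weight)
    return result.astype(np.uint8)
```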
A second embodiment is shown in figure 4 with like reference numerals denoting like parts to those shown in figure 1. In contrast to the first embodiment, the image manipulation apparatus 50 of the second embodiment includes four additional image superimposition modules 32a, 32b, 32c and 32d, one for each of the interpolated overlay modules 38a, 38b, 38c and 38d.
The apparatus 50 also includes an image editing module 52 which receives the output 26 from the detection module 20 and the digital image 16. The image editing module 52 produces therefrom an edited image 54 of the same form as the image 16, in which the object 18 has been removed and replaced with surrounding colours. This can be achieved by reference to the preceding image 14.
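One way the image editing module 52 could remove the object, by copying background from the preceding image, is sketched below; this is an illustration of the idea, assuming NumPy-style image arrays, rather than the implementation.

```python
def remove_object(current_image, preceding_image, object_pixels):
    """Produce an edited image 54 with the object replaced by background,
    taken here from the corresponding pixels of the preceding image 14."""
    edited = current_image.copy()
    for (x, y) in object_pixels:
        edited[y, x] = preceding_image[y, x]
    return edited
```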
Each image superimposition module 32a - 32d is responsive to the edited image 54 and to a corresponding interpolated image 40a - 40d, respectively to produce therefrom an interpolated output image 44a - 44d, respectively.
The interpolated output images 44a - 44d can then be replayed as normal images to simulate slow motion with interpolated positions of the object. In contrast to simply slowing the original images, the interpolated output images 44a - 44d include additional information as to the movement of the object from the image 14 to the image 16.
A variant of this embodiment is to make the trajectory interpolation module 22 and the overlay generation modules 38a - 38d responsive to the overlay image 34 or the output image 44 rather than the image 26. In this variation, the interpolated output images 44a - 44d each have a highlighted object 18. It should be appreciated that the scope of the present invention need not be limited to the particular scope of the embodiments described above.
In particular, it is envisaged that the invention can be embodied either as hardware, software or a combination of both.
In other embodiments, it is envisaged that the image superimposition module 32 may receive a further image from another source such as the database 28, which would be combined with the original image and the overlay image. The further image, it is envisaged, may contain information to be overlaid on the image, such as the speed of the object, or may contain portions of the image to be enhanced in addition to said object. One example of the latter is where the further image comprises a stored image of an empty playing area to enhance the visibility of the object relative to the playing area markings. Alternatively, the overlay generation module 30 may be arranged to insert the information into the overlay image 34.
Where the images are taken from a known reference point, the speed of the object can be calculated from the location and size of the object in two or more subsequent images and said speed superimposed on the output image.
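Where the camera geometry is known, the speed estimate mentioned above could look like the following sketch. The frame interval and the scale factor converting pixels to metres are assumed inputs, derived outside the sketch.

```python
def estimate_speed(pos1, pos2, frame_interval_s, metres_per_pixel):
    """Estimate object speed from its location in two subsequent images.

    frame_interval_s: time between the two images (e.g. 0.04 s at 25 frames/s).
    metres_per_pixel: scale derived from the known reference point and the
    apparent size of the object in the images.
    """
    dx = (pos2[0] - pos1[0]) * metres_per_pixel
    dy = (pos2[1] - pos1[1]) * metres_per_pixel
    distance = (dx ** 2 + dy ** 2) ** 0.5
    return distance / frame_interval_s            # speed in metres per second
```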
Further, other methods of determining the location of the object within an image and enhancing the visibility of the object may be employed. The trajectory interpolation module may refer to several preceding images and corresponding object positions to better interpolate the position of the object.
Additionally, the apparatus may be arranged to process images from several sources. This is particularly advantageous if the sources are positioned orthogonally to each other, e.g. side, end and top views.
The claims defining the invention are as follows:
1. An image manipulation apparatus comprising: detection means arranged to locate an object in a first image and a second image and to produce an output corresponding to said object's location in each image; interpolation means responsive to at least one of the first image and the second image and to the output of said detection means for each of said first and second images, arranged to produce at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancement means arranged to enhance the visibility of the object in at least one of the first image, the second image, or the interpolated image.
2. An image manipulation apparatus as claimed in claim 1, wherein the interpolation means includes trajectory interpolation means responsive to the output of said detection means for each of said first and second images to calculate therefrom a path taken by the object, and to interpolate at least one position along said path.
3. An image manipulation apparatus as claimed in claim 2, wherein the interpolation means further includes image creation means responsive to one of the first and second images, the position interpolated from the trajectory interpolation means, and to the output of said detection means for the first or second image to produce the interpolated image therefrom.
4. An image manipulation apparatus as claimed in claim 2 or 3, wherein the trajectory interpolation means is also responsive to the position of the object in at least one image before or after said first and second images in calculating the path of the object.
5. An image manipulation apparatus as claimed in any one of claims 2 to 4, wherein said trajectory interpolation means includes means for locating discontinuities in the path of said object, such as those produced by said object bouncing.
6. An image manipulation apparatus as claimed in any one of the preceding claims, wherein the enhancement means is responsive to one of the first or second images and to the output of said detection means for the first or second images to produce therefrom an enhanced image with the visibility of the object therein enhanced.
7. An image manipulation apparatus as claimed in claim 6, wherein the enhancement means is arranged to increase the brightness of the object with respect to its surroundings.
8. An image manipulation apparatus as claimed in claim 7, wherein the brightness of the object with respect to its surroundings is increased by surrounding the object with a low brightness value.
9. An image manipulation apparatus as claimed in claim 6, 7 or 8, wherein the enhancement means is arranged to increase the colour contrast of the object with respect to its surroundings.
10.An image manipulation apparatus as claimed in claim 9, wherein the contrast of the object with respect to its surroundings is increased by surrounding the object with a contrasting colour.
11.An image manipulation apparatus as claimed in any one of claims 6 to 10, wherein the enhancement means further enhances the visibility of the object by elongating said object along its path.
12.An image manipulation apparatus as claimed in claim 11, wherein the enhancement means includes superimposition means arranged to superimpose the interpolated images onto the first or second image.
13.An image manipulation apparatus as claimed in any one of claims 6 to 10, wherein the enhanced image is input to the interpolation means whereby the interpolated images also have the visibility of the object enhanced.
14.An image manipulation apparatus as claimed in any one of the preceding claims, wherein the detection means is responsive to the first and second images and to a database storing at least one parameter which controls operation of the detection means in locating the object.
15.An image manipulation apparatus as claimed in claim 14, wherein the parameter is selected from the following list: typical object size, maximum and minimum object sizes, maximum rate of change of object size between subsequent images, maximum object speed, typical object colour and shape, proximity to the edge of the image, proximity to large objects.
16.An image manipulation apparatus substantially as herein described with reference to figures 1 to 3d or figure 4 of the accompanying drawings.
17.A method of manipulating an image, comprising the steps of: locating an object in a first image and a second image; producing at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancing the visibility of the object in at least one of the first image, the second image, or the interpolated image.
18.A method of manipulating an image as claimed in claim 17, further comprising the step of calculating a path taken by the object from its position in the first and second images, and interpolating at least one position along said path.
19.A method of manipulating an image as claimed in claim 18, wherein the position of the object in at least one image before or after said first and second images is also used in the step of calculating the path of the object.
20.A method of manipulating an image as claimed in any one of claims 17 to 19, wherein said step of calculating the path taken by the object includes the step of locating discontinuities in the path of said object, such as those produced by said object bouncing.
21.A method of manipulating an image as claimed in any one of claims 17 to 19, wherein the step of enhancing comprises enhancing the visibility of the object in the first or second images according to its location.
22.A method of manipulating an image as claimed in claim 21, wherein the step of enhancing includes increasing the brightness of the object.
23.A method of manipulating an image as claimed in claim 21 or 22, wherein the step of enhancing includes increasing the colour contrast of the object with respect to its surroundings.
24.A method of manipulating an image as claimed in claim 23, wherein the contrast of the object with respect to its surroundings is increased by surrounding the object with a contrasting colour.
25.A method of manipulating an image as claimed in any one of claims 21 to 24, wherein the step of enhancing includes elongating said object along its path.
26.A method of manipulating an image as claimed in any one of claims 21 to 25, wherein the step of enhancing includes superimposing the interpolated images onto the first or second image.
27.A method of manipulating an image as claimed in any one of claims 17 to 26, wherein the step of locating the object is performed with reference to at least one parameter selected from the following list: typical object size, maximum and minimum object sizes, maximum rate of change of object size between subsequent images, maximum object speed, typical object colour and shape, proximity to the edge of the image, proximity to large objects.
28.A method of manipulating an image substantially as herein described with reference to figures 1 to 3d or figure 4 of the accompanying drawings.
GB9916612A 1998-07-20 1999-07-16 Image manipulation apparatus Expired - Fee Related GB2341997B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU77419/98A AU702724B1 (en) 1998-07-20 1998-07-20 Image manipulation apparatus

Publications (3)

Publication Number Publication Date
GB9916612D0 GB9916612D0 (en) 1999-09-15
GB2341997A true GB2341997A (en) 2000-03-29
GB2341997B GB2341997B (en) 2002-10-09

Family

ID=3757934

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9916612A Expired - Fee Related GB2341997B (en) 1998-07-20 1999-07-16 Image manipulation apparatus

Country Status (2)

Country Link
AU (1) AU702724B1 (en)
GB (1) GB2341997B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3097063B2 (en) * 1996-02-06 2000-10-10 日本電信電話株式会社 Real-time multiplex read / write method of video
AU5629998A (en) * 1997-02-24 1998-08-27 Redflex Traffic Systems Pty Ltd Imaging apparatus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2012515A * 1978-01-16 1979-07-25 Cbs Inc Television system for displaying or recording paths of motion
WO1995010915A1 (en) * 1993-10-12 1995-04-20 Orad, Inc. Sports event video
GB2316826A (en) * 1996-08-29 1998-03-04 Kokusai Denshin Denwa Co Ltd Moving object detection in a coded moving picture signal

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8059152B2 (en) 2008-07-16 2011-11-15 Sony Corporation Video detection and enhancement of a sport object
WO2012101542A1 (en) * 2011-01-28 2012-08-02 Koninklijke Philips Electronics N.V. Motion vector based comparison of moving objects
RU2602792C2 (en) * 2011-01-28 2016-11-20 Конинклейке Филипс Электроникс Н.В. Motion vector based comparison of moving objects

Also Published As

Publication number Publication date
GB9916612D0 (en) 1999-09-15
AU702724B1 (en) 1999-03-04
GB2341997B (en) 2002-10-09

Similar Documents

Publication Publication Date Title
US5892554A (en) System and method for inserting static and dynamic images into a live video broadcast
EP3127321B1 (en) Method and system for automatic television production
US8077917B2 (en) Systems and methods for enhancing images in a video recording of a sports event
US10205889B2 (en) Method of replacing objects in a video stream and computer program
US5682437A (en) Method of converting two-dimensional images into three-dimensional images
US5673081A (en) Method of converting two-dimensional images into three-dimensional images
US10515471B2 (en) Apparatus and method for generating best-view image centered on object of interest in multiple camera images
US8798151B2 (en) Video display device, interpolated image generation circuit and interpolated image generation method
EP1798691A2 (en) Method and apparatus for generating a desired view of a scene from a selected viewpoint
KR20090006068A (en) Method and apparatus for modifying a moving image sequence
JPH11508099A (en) Scene Motion Tracking Method for Raw Video Insertion System
WO1997000582A1 (en) System and method of real time insertions into video using adaptive occlusion with a synthetic reference image
JPH09185720A (en) Picture extraction device
US20180035076A1 (en) Video processing apparatus, video processing system, and video processing method
EP0832537A4 (en) System and method for inserting static and dynamic images into a live video broadcast
US7091989B2 (en) System and method for data assisted chroma-keying
JP3740394B2 (en) High dynamic range video generation method and apparatus, execution program for the method, and recording medium for the execution program
JP2000048211A (en) Movile object tracking device
KR20030002919A (en) realtime image implanting system for a live broadcast
AU702724B1 (en) Image manipulation apparatus
WO2016199418A1 (en) Frame rate conversion system
JPH04213973A (en) Image shake corrector
JP2825863B2 (en) Moving object detection device
Chacón-Quesada et al. Evaluation of different histogram distances for temporal segmentation in digital videos of football matches from tv broadcast
CN110278439A (en) De-watermarked algorithm based on inter-prediction

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20030716