AU702724B1 - Image manipulation apparatus - Google Patents

Image manipulation apparatus Download PDF

Info

Publication number
AU702724B1
AU702724B1 AU77419/98A
Authority
AU
Australia
Prior art keywords
image
images
interpolated
manipulation apparatus
manipulating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU77419/98A
Inventor
Ian Galbraith Hay
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to AU77419/98A priority Critical patent/AU702724B1/en
Application granted granted Critical
Publication of AU702724B1 publication Critical patent/AU702724B1/en
Priority to GB9916612A priority patent/GB2341997B/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2625Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect

Description

Regulation 3.2
AUSTRALIA
Patents Act 1990
ORIGINAL
COMPLETE SPECIFICATION STANDARD PATENT

Name of Applicant: Ian Galbraith Hay
Actual Inventor: Ian Galbraith Hay
Address for service is: WRAY ASSOCIATES, 239 Adelaide Terrace, Perth, WA 6000
Attorney code: WR
Invention Title: "Image Manipulation Apparatus"

The following statement is a full description of this invention, including the best method of performing it known to me:

This invention relates to an image manipulation apparatus.
This invention has particular, but not exclusive, utility in relation to televised sports broadcasts. One of the difficulties television viewers experience with televised sports broadcasts is that tracking the ball or other object can prove difficult when the ball or object is moving rapidly or is depicted against a low-contrast background. This problem is experienced in relation to many sports, including tennis, cricket, golf, football and baseball. In each of these sports, tracking the ball can be difficult if the ball is depicted against the sky or a crowd of people, or where the ball is simply travelling quickly with respect to the camera.
Australian Patent Specification 79750/94 discloses a sports event video manipulation system in which objects are tracked and may also be highlighted.
Whilst the system described in Application 79750/94 assists in tracking large, slow-moving objects, where the object is small and moving rapidly across the image, highlighting the position of the object within each image may still leave it difficult for viewers to discern the object.
In accordance with one aspect of this invention, there is provided an image manipulation apparatus comprising: detection means arranged to locate an object in a first image and a second image and to produce an output corresponding to said object's location in each image; interpolation means responsive to at least one of the first image and the second image and to the output of said detection means for each of said first and second images, arranged to produce at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancement means arranged to enhance the visibility of the object in at least one of the first image, the second image, or the interpolated image.
Preferably, the interpolation means includes trajectory interpolation means responsive to the output of said detection means for each of said first and second images to calculate therefrom a path taken by the object, and to interpolate at least one position along said path.
Preferably, the interpolation means further includes image creation means responsive to one of the first and second images, the position interpolated from the trajectory interpolation means, and to the output of said detection means for the first or second image to produce the interpolated image therefrom.
Preferably, the trajectory interpolation means is also responsive to the position of the object in at least one image before or after said first and second images in calculating the path of the object.
Preferably, said trajectory interpolation means includes means for locating discontinuities in the path of said object, such as those produced by said object bouncing.
Preferably, the enhancement means is responsive to one of the first or second images and to the output of said detection means for the first or second images to produce therefrom an enhanced image with the visibility of the object therein enhanced.
Preferably, the enhancement means is arranged to increase the brightness of the object with respect to its surroundings.
Preferably, the brightness of the object with respect to its surroundings is increased by surrounding the object with a low brightness value.
Preferably, the enhancement means is arranged to increase the colour contrast of the object with respect to its surroundings.
Preferably, the contrast of the object with respect to its surroundings is increased by surrounding the object with a contrasting colour.
Preferably, the enhancement means further enhances the visibility of the object by elongating said object along its path.
Preferably, the enhancement means includes superimposition means arranged to superimpose the interpolated images onto at least one of the first or second images.
Preferably, the enhanced image is input to the interpolation means whereby the interpolated images also have the visibility of the object enhanced.
Preferably, the detection means is responsive to the first and second images and to a database storing at least one parameter which controls operation of the detection means in locating the object.
Preferably, the parameter is selected from the following list: typical object size, maximum and minimum object sizes, maximum rate of change of object size between subsequent images, maximum object speed, typical object colour and shape, proximity to the edges of the image, proximity to large objects.
In accordance with a second aspect of this invention, there is provided a method of manipulating an image, comprising the steps of: locating an object in a first image and a second image; producing at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancing the visibility of the object in at least one of the first image, the second image, or the interpolated image.
Preferably, the method further comprises the step of calculating a path taken by the object from its position in the first and second images, and interpolating at least one position along said path.
Preferably, the position of the object in at least one image before or after said first and second images is also used in the step of calculating the path of the object.
Preferably, said step of calculating the path taken by the object includes the step of locating discontinuities in the path of said object, such as those produced by said object bouncing.
Preferably, the step of enhancing comprises enhancing the visibility of the object in the first or second images according to its location.
Preferably, the step of enhancing includes increasing the brightness of the object with respect to its surroundings.
Preferably, the brightness of the object with respect to its surroundings is increased by surrounding the object with a low brightness value.
Preferably, the step of enhancing includes increasing the colour contrast of the object with respect to its surroundings.
Preferably, the contrast of the object with respect to its surroundings is increased by surrounding the object with a contrasting colour.
Preferably, the step of enhancing includes elongating said object along its path.
Preferably, the step of enhancing includes superimposing the interpolated images onto the first or second image.
Preferably, the step of locating the object is performed with reference to at least one parameter selected from the following list: typical object size, maximum and minimum object sizes, maximum rate of change of object size between subsequent images, maximum object speed, typical object colour and shape, proximity to the edges of the image, proximity to large objects.
The invention will now be described with reference to two embodiments and the accompanying drawings, in which:
Figure 1 is a block diagram of an image manipulation apparatus according to the first embodiment;

Figures 2a and 2b show consecutive images input to the apparatus shown in figure 1;

Figures 2c to 2f show interpolated images produced by the apparatus shown in figure 1;

Figure 2g shows the output image produced by the apparatus shown in figure 1;

Figures 3a to 3d illustrate the manner of operation of the detection means; and

Figure 4 is a block diagram of the apparatus of the second embodiment.
The first embodiment shown in figure 1 is directed toward an image manipulation apparatus 10 embodied as a computer (not shown) running software. The image manipulation apparatus of the embodiment is intended for use in television broadcasts of sporting events, such as tennis matches. The apparatus 10 of the embodiment receives digital images from a source 12. The source 12 can be the output from a digital camera, or an analog-to-digital converted output from an analog camera.
Diagrammatic representations of a first digital image 14 and a second digital image 16 are shown in figures 2a and 2b, respectively. The digital images 14 and 16 include an object 18 and other imagery.
The digital images 14 and 16 shown in figures 2a and 2b represent an object 18 moving relatively fast across the digital images 14 and 16. As such, the position of the object 18 has significantly altered between the digital image 14 and the digital image 16, whilst the remainder of the image has remained substantially constant. In use, the source 12 produces a continuous stream of digital images, whereby the digital images 14 and 16 would represent adjacent images in the stream.
The image manipulation apparatus 10 comprises a detection module 20, an interpolation module 22 and an enhancement module 24.
The detection module 20 processes each digital image and produces an output 26 corresponding to the location of the object in the digital image. In doing so, the detection module 20 references a database 28 containing parameters that control operation of the detection module 20. The database 28 may contain parameters for more than one sport, in which case the detection module references the database 28 according to the sport being recorded by the television camera, as selected by an operator.
The parameters stored in the database 28 include such things as the detection algorithm used by the detection module 20, the colour of the target object, the ideal shape of the target object, the range of sizes of the target object and the range of speed of the target object, proximity to the edges of the image, proximity to large objects. The database 28 may also contain parameters regarding objects to disregard in the image.
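By way of illustration only, a per-sport parameter record of the kind the database 28 might hold could take the following form. All field names and values here are assumed examples for a tennis configuration, not part of the specification:

```python
# Illustrative sketch of a per-sport parameter record for database 28.
# Every key and value is an assumption made for this example.
TENNIS_PARAMETERS = {
    "detection_algorithm": "scan_line_difference",
    "object_colour": (223, 255, 79),   # approximate tennis-ball yellow (RGB)
    "object_shape": "circle",
    "object_size_range": (2, 12),      # picture elements
    "max_size_change": 3,              # picture elements between images
    "max_object_speed": 60,            # picture elements per image
    "edge_proximity_limit": 5,         # picture elements from image edge
    "disregard_large_objects": True,   # e.g. players
}
```

An operator selecting a different sport would simply cause the detection module to load a different record of the same shape.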
There are many algorithms that can be used by the detection module 20 to locate the object, including Fast Fourier Transform (FFT), motion detection algorithms, boundary detection algorithms and colour detection algorithms. The decision as to which algorithm to employ will be made according to which algorithm proves most efficient for the particular parameters and sport in question. Further, other algorithms may be equally effective, in which case such algorithms are also applicable to the embodiment.
An example of one form of detection algorithm will now be described with reference to figures 3a to 3d. This detection algorithm utilises the fact that television images are raster-based images, that is, they comprise a series of successive scan lines. Each scan line consists of colour and brightness information for each picture element in the scan line. Typically, scan lines consist of several hundred picture elements. In digital form, the colour and brightness of each picture element in each scan line is represented as one or more numbers.
Figure 3a shows a portion of a scan line taken from twelve successive images.
The brightness is shown along the Y-axis, with the picture element position in the scan line shown along the X-axis. Figure 3a shows the brightness of the same scan line in twelve adjacent images taken from the same source. As such, the brightness information varies slightly from scan line to scan line, but is generally of the same magnitude, representing a background.
Figure 3b shows the same scan lines shown in figure 3a, this time with a moving object included. The object moves ten picture elements between each successive scan line. That is, the moving object is at position 10 in the first scan line, position 20 in the second scan line, and so on. It should be appreciated that figures 3a and 3b represent a single scan line only. In practice, an object would occupy multiple scan lines; however, for the present example algorithm, each scan line can be initially processed independently.
The presence of the moving object will result in the brightness and colour values where the object is located being different from those normally present in the background information. Accordingly, the object is most simply detected by subtracting the brightness information of the preceding scan line from the current scan line. The results are shown in figure 3c, with low-level noise throughout the scan line, along with peaks corresponding to the position of the moving object in successive scan lines.
Figure 3d shows the results of figure 3c with the noise truncated therefrom, leaving the information as to the position of the object in each scan line.

When this algorithm is applied to an entire image, each scan line may contain one or more picture elements corresponding to the position of the object. If desired, this information can be further processed to detect the boundary of the object, compute its centroid, or perform other processing as desired. Alternatively, the position within each scan line of the picture elements which relate to the object can be output.
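The scan-line differencing and noise truncation illustrated in figures 3c and 3d can be sketched as follows. This is a minimal illustration only; the function name, the threshold value, and the use of NumPy arrays are assumptions, not part of the specification:

```python
import numpy as np

def detect_object_positions(prev_line, curr_line, noise_threshold=10):
    """Locate a moving object within one scan line by subtracting the
    brightness of the same scan line in the preceding image (as in
    figure 3c), then truncating low-level noise (as in figure 3d)."""
    diff = np.abs(curr_line.astype(int) - prev_line.astype(int))
    diff[diff < noise_threshold] = 0          # truncate noise
    return np.nonzero(diff)[0]                # picture-element positions

# A background scan line, then the same line with a bright object at
# picture element 30 in the current image.
prev_line = np.full(100, 50)
curr_line = prev_line.copy()
curr_line[30] = 200
print(detect_object_positions(prev_line, curr_line))  # -> [30]
```

In practice each pair of corresponding scan lines in the two images would be processed this way before any further filtering.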
Further, additional processing can be performed to provide more consistent detection of the object. For instance, other objects in the image may be moving, such as players of the sport in question. By contrast with the object of interest, which is typically the ball, the players are relatively large, and so can be removed by using a size-limiting filter in accordance with information contained in the database 28. Further, the "shape" of the picture elements indicated as belonging to the object can be compared with an ideal shape for the object. Objects which are vastly different from the ideal shape of the object can be ignored as not relevant.
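A minimal sketch of such a size-limiting filter, assuming the detector outputs picture-element positions within a scan line; the function name and the size bounds are illustrative assumptions:

```python
def filter_by_size(positions, min_size=1, max_size=5):
    """Group contiguous picture-element positions into runs and keep
    only runs whose length matches the expected object size, removing
    large moving regions such as players."""
    runs, run = [], []
    for p in positions:
        if run and p != run[-1] + 1:    # gap: close the current run
            runs.append(run)
            run = []
        run.append(p)
    if run:
        runs.append(run)
    return [r for r in runs if min_size <= len(r) <= max_size]

# A three-element run (ball-sized) survives; an eight-element run
# (player-sized) is removed.
print(filter_by_size([10, 11, 12, 40, 41, 42, 43, 44, 45, 46, 47]))
# -> [[10, 11, 12]]
```

The equivalent shape comparison across multiple scan lines would operate on the two-dimensional region rather than on single-line runs.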
It should be appreciated that the above algorithm is an example of one algorithm only, and other algorithms may be adopted. Further, enhancements to the above algorithm may be applied as desired. For instance, the above algorithm provides best results if the camera is not moving. Where the camera is zooming or panning, some correlation of the position of the background with preceding images may be necessary to ensure correct operation. It is envisaged that the zooming and panning information from each source may be provided as input variables to the detection module 20 to assist in correlation.
A further enhancement involves the use of trajectory prediction or extrapolation, whereby, based on the position of the object in previous images, the position of the object in the current image is predicted. This may be used to create an initial field of search, thereby avoiding the need to search the entire picture if the object is located in the initial field of search.
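Such a prediction might be sketched as a simple constant-velocity extrapolation from the two preceding positions; the function name and the window margin are assumed illustrative values:

```python
def predict_search_window(p1, p2, margin=20):
    """Linearly extrapolate the object's next position from its
    positions in the two preceding images, and return a rectangular
    initial field of search as (x_min, y_min, x_max, y_max)."""
    (x1, y1), (x2, y2) = p1, p2
    px, py = 2 * x2 - x1, 2 * y2 - y1     # constant-velocity prediction
    return (px - margin, py - margin, px + margin, py + margin)

print(predict_search_window((10, 10), (20, 15)))  # -> (10, 0, 50, 40)
```

Only if the object is not found within this window would the detection module fall back to searching the entire picture.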
The output 26 of the detection module 20 corresponds with the location of the object 18 within the digital image 16. The output 26 may be in any convenient format, such as co-ordinates of pixels considered to comprise the object 18, a boundary definition for the object 18 or an image which includes only those pixels considered to be within the object 18.
The enhancement module 24 of the embodiment comprises an overlay generation module 30 and an image superimposition module 32.
The overlay generation module 30 receives the digital image 16 and the output 26 from the detection module 20 and produces therefrom an intermediate image 34.
The intermediate image 34 comprises neutral background, such as black, and a highlighted portion in a location corresponding to that of the object 18 in the digital image 16.
It should be appreciated that enhancement of the object 18 can be performed in a number of ways: for instance, by increasing the brightness or contrast of the object, altering its colour within the image, surrounding the object with a contrasting halo, positioning one or more arrows adjacent to the object in the image, and so forth.
The trajectory interpolation module 22 receives the output 26 from the detection module 20. The trajectory interpolation module 22 includes a buffer (not shown) which stores the output 26 corresponding to a preceding image (in the embodiment, the first digital image 14). The trajectory interpolation module 22 interpolates a path between the location of the object 18 in the preceding image and the location of the object 18 in the digital image 16. The trajectory interpolation module 22 then interpolates four points along the path of the object 18 and produces therefrom four outputs 36a, 36b, 36c and 36d, each corresponding to one of said points.
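The interpolation of four evenly spaced points between the two detected locations can be sketched as follows; the function name, and the choice of a straight-line path, are illustrative assumptions (the path could equally be a fitted curve):

```python
def interpolate_positions(p_prev, p_curr, n=4):
    """Interpolate n positions, evenly spaced along a straight path,
    between the object's location in the preceding image and its
    location in the current image."""
    (x0, y0), (x1, y1) = p_prev, p_curr
    return [(x0 + (x1 - x0) * k / (n + 1), y0 + (y1 - y0) * k / (n + 1))
            for k in range(1, n + 1)]

print(interpolate_positions((0, 0), (50, 100)))
# -> [(10.0, 20.0), (20.0, 40.0), (30.0, 60.0), (40.0, 80.0)]
```

Each of the four returned positions would correspond to one of the outputs 36a to 36d.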
Each of the outputs 36a, 36b, 36c and 36d is input to a corresponding interpolation overlay module 38a, 38b, 38c and 38d, respectively. Each interpolation overlay module 38a, 38b, 38c and 38d is also responsive to the output 26 from the detection module 20 and the digital image 16. Each interpolation overlay module 38a to 38d produces an interpolated image 40a to 40d respectively, each of which includes a highlighted portion 42. The highlighted portions 42 are located within each of the interpolated images 40a to 40d at a location corresponding to the interpolated location in the respective output 36a to 36d. The highlighted portion 42 in each image 40a to 40d is formed by translating the object 18 from the digital image 16 and the output 26. The interpolated images 40a to 40d are shown at figures 2c to 2f, respectively.
The image superimposition module 32 superimposes the intermediate image 34, the interpolated images 40a to 40d and the digital image 16 to produce an output image 44, as shown in figure 2g. As can be seen from figure 2g, the output image 44 has the visibility of the object 18 enhanced both in terms of contrast and by providing a tail 46 along the trajectory of the object.
As described above in relation to the enhancement module 24, the interpolation overlay modules 38a to 38d may also highlight the interpolated positions of the object in a number of ways. Further, it is envisaged that each module 38a to 38d may highlight the corresponding portion by a different amount so as to produce a fading effect.
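A fading superimposition of this kind might be sketched as follows, assuming single-channel images and treating non-zero overlay values as the highlighted portion (both of which are assumptions made for illustration):

```python
import numpy as np

def superimpose_with_fade(base, overlays):
    """Superimpose highlight overlays onto a base image, blending each
    overlay by a different amount so that the positions nearer the start
    of the trajectory fade more, producing a trailing-tail effect."""
    out = base.astype(float)
    n = len(overlays)
    for i, overlay in enumerate(overlays):
        alpha = (i + 1) / (n + 1)        # later overlays are stronger
        mask = overlay > 0               # highlighted portion only
        out[mask] = (1 - alpha) * out[mask] + alpha * overlay[mask]
    return out.astype(np.uint8)
```

With a single overlay of brightness 100 over a black base, the blended result at the highlighted element is 50, i.e. half-faded.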
A second embodiment is shown in figure 4 with like reference numerals denoting like parts to those shown in figure 1. In contrast to the first embodiment, the image manipulation apparatus 50 of the second embodiment includes four additional image superimposition modules 32a, 32b, 32c and 32d, one for each of the interpolated overlay modules 38a, 38b, 38c and 38d.
The apparatus 50 also includes an image editing module 52 which receives the output 26 from the detection module 20 and the digital image 16. The image editing module 52 produces therefrom an edited image 54 of the same form as the image 16, in which the object 18 has been removed therefrom and replaced with surrounding colours. This can be achieved by reference to the preceding image 14.
Each image superimposition module 32a to 32d is responsive to the edited image 54 and to a corresponding interpolated image 40a to 40d, respectively, to produce therefrom an interpolated output image 44a to 44d, respectively.

The interpolated output images 44a to 44d can then be replayed as normal images to simulate slow motion with interpolated positions of the object. In contrast to simply slowing the original images, the interpolated output images 44a to 44d include additional information as to the movement of the object from the image 14 to the image 16.

A variant of this embodiment is to make the trajectory interpolation module 22 and the overlay generation modules 38a to 38d responsive to the overlay image 34 or the output image 44 rather than the output 26. In this variation, the interpolated output images 44a to 44d each have a highlighted object 18.
It should be appreciated that the scope of the present invention need not be limited to the particular scope of the embodiments described above.
In particular, it is envisaged that the invention can be embodied either as hardware, software or a combination of both.
In other embodiments, it is envisaged that the image superimposition module 32 may receive a further image from another source, such as the database 28, which would be combined with the original image and the overlay image. The further image, it is envisaged, may contain information to be overlaid on the image, such as the speed of the object, or may contain portions of the image to be enhanced in addition to said object. One example of the latter is where the further image comprises a stored image of an empty playing area to enhance the visibility of the object relative to the playing area markings. Alternatively, the overlay generation module 30 may be arranged to insert the information into the overlay image 34.
Where the images are taken from a known reference point, the speed of the object can be calculated from the location and size of the object in two or more subsequent images and said speed superimposed on the output image.
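As an illustrative sketch, assuming the image scale has already been established from the known reference point (the metres_per_pixel value below stands in for that calibration, and the default PAL frame interval of 1/25 s is likewise an assumption):

```python
import math

def object_speed(pos1, pos2, metres_per_pixel, frame_interval=1 / 25):
    """Estimate the object's speed (metres per second) from its
    locations in two subsequent images. metres_per_pixel is the scale
    derived from the known reference point and object size."""
    dx = (pos2[0] - pos1[0]) * metres_per_pixel
    dy = (pos2[1] - pos1[1]) * metres_per_pixel
    return math.hypot(dx, dy) / frame_interval

# A 3-4-5 displacement at 1 m/pixel over 1 s gives 5 m/s.
print(object_speed((0, 0), (3, 4), 1.0, frame_interval=1.0))  # -> 5.0
```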
Further, other methods of determining the location of the object within an image and enhancing the visibility of the object may be employed.
The trajectory interpolation module may refer to several preceding images and corresponding object positions to better interpolate the position of the object.
Additionally, the apparatus may be arranged to process images from several sources. This is particularly advantageous if the sources are positioned orthogonally to each other, e.g. side, end and top views.

Claims (23)

1. An image manipulation apparatus comprising: detection means arranged to locate an object in a first image and a second image and to produce an output corresponding to said object's location in each image; interpolation means responsive to at least one of the first image and the second image and to the output of said detection means for each of said first and second images, arranged to produce at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancement means arranged to enhance the visibility of the object in at least one of the first image, the second image, or the interpolated image.
2. An image manipulation apparatus as claimed in claim 1, wherein the interpolation means includes trajectory interpolation means responsive to the output of said detection means for each of said first and second images to calculate therefrom a path taken by the object, and to interpolate at least one position along said path.
3. An image manipulation apparatus as claimed in claim 2, wherein the interpolation means further includes image creation means responsive to one of the first and second images, the position interpolated from the trajectory interpolation means, and to the output of said detection means for the first or second image to produce the interpolated image therefrom.
4. An image manipulation apparatus as claimed in claim 2 or 3, wherein the trajectory interpolation means is also responsive to the position of the object in at least one image before or after said first and second images in calculating the path of the object.

5. An image manipulation apparatus as claimed in any one of claims 2 to 4, wherein said trajectory interpolation means includes means for locating discontinuities in the path of said object, such as those produced by said object bouncing.
6. An image manipulation apparatus as claimed in any one of the preceding claims, wherein the enhancement means is responsive to one of the first or second images and to the output of said detection means for the first or second images to produce therefrom an enhanced image with the visibility of the object therein enhanced.
7. An image manipulation apparatus as claimed in claim 6, wherein the enhancement means is arranged to increase the brightness of the object with respect to its surroundings.
8. An image manipulation apparatus as claimed in claim 7, wherein the brightness of the object with respect to its surroundings is increased by surrounding the object with a low brightness value.
9. An image manipulation apparatus as claimed in claim 6, 7 or 8, wherein the enhancement means is arranged to increase the colour contrast of the object with respect to its surroundings.

10. An image manipulation apparatus as claimed in claim 9, wherein the contrast of the object with respect to its surroundings is increased by surrounding the object with a contrasting colour.
11. An image manipulation apparatus as claimed in any one of claims 6 to 10, wherein the enhancement means further enhances the visibility of the object by elongating said object along its path.
12. An image manipulation apparatus as claimed in claim 11, wherein the enhancement means includes superimposition means arranged to superimpose the interpolated images onto the first or second image.
13. An image manipulation apparatus as claimed in any one of claims 6 to 12, wherein the enhanced image is input to the interpolation means whereby the interpolated images also have the visibility of the object enhanced.
14. An image manipulation apparatus as claimed in any one of the preceding claims, wherein the detection means is responsive to the first and second images and to a database storing at least one parameter which controls operation of the detection means in locating the object.

15. An image manipulation apparatus as claimed in claim 14, wherein the parameter is selected from the following list: typical object size, maximum and minimum object sizes, maximum rate of change of object size between subsequent images, maximum object speed, typical object colour and shape, proximity to the edges of the image, proximity to large objects.
16.An image manipulation apparatus substantially as herein described with reference to figures 1 to 3d or figure 4 of the accompanying drawings.
17.A method of manipulating an image, comprising the steps of: locating an object in a first image and a second image; producing at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancing the visibility of the object in at least one of the first image, the second image, or the interpolated image.
18.A method of manipulating an image as claimed in claim 17, further comprising the step of calculating a path taken by the object from its position in the first and second images, and interpolating at least one position along said path.
19. A method of manipulating an image as claimed in claim 18, wherein the position of the object in at least one image before or after said first and second images is also used in the step of calculating the path of the object.

20. A method of manipulating an image as claimed in any one of claims 17 to 19, wherein said step of calculating the path taken by the object includes the step of locating discontinuities in the path of said object, such as those produced by said object bouncing.
21.A method of manipulating an image as claimed in any one of claims 17 to 19, wherein the step of enhancing comprises enhancing the visibility of the object in the first or second images according to its location.
22.A method of manipulating an image as claimed in claim 21, wherein the step of enhancing includes increasing the brightness of the object.
23.A method of manipulating an image as claimed in claim 21 or 22, wherein the step of enhancing includes increasing the colour contrast of the object with respect to its surroundings.
24. A method of manipulating an image as claimed in claim 23, wherein the contrast of the object with respect to its surroundings is increased by surrounding the object with a contrasting colour.

25. A method of manipulating an image as claimed in any one of claims 21 to 24, wherein the step of enhancing includes elongating said object along its path.
26. A method of manipulating an image as claimed in any one of claims 21 to 25, wherein the step of enhancing includes superimposing the interpolated images onto the first or second image.
27.A method of manipulating an image as claimed in any one of claims 17 to 26, wherein the step of locating the object is performed with reference to at least one parameter selected from the following list: typical object size, maximum and minimum object sizes, maximum rate of change of object size between subsequent images, maximum object speed, typical object colour and shape, proximity to the edge of the image, proximity to large objects. -18-
28. A method of manipulating an image substantially as herein described with reference to figures 1 to 3d or figure 4 of the accompanying drawings.

Dated this 20th day of July 1998.

Ian Galbraith HAY
APPLICANT
Wray Associates
Perth, Western Australia
Patent Attorneys for the Applicant

ABSTRACT

An image manipulation apparatus comprising: detection means arranged to locate an object, such as a tennis ball, in a first image and a second image; interpolation means responsive to the second image and to the location of the object in each of said first and second images, arranged to produce at least one interpolated image in which the location of said object is interpolated between its position in the first and second images; and enhancement means arranged to enhance the visibility of the object in at least one of the first image, the second image, or the interpolated image.
AU77419/98A 1998-07-20 1998-07-20 Image manipulation apparatus Ceased AU702724B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU77419/98A AU702724B1 (en) 1998-07-20 1998-07-20 Image manipulation apparatus
GB9916612A GB2341997B (en) 1998-07-20 1999-07-16 Image manipulation apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU77419/98A AU702724B1 (en) 1998-07-20 1998-07-20 Image manipulation apparatus

Publications (1)

Publication Number Publication Date
AU702724B1 true AU702724B1 (en) 1999-03-04

Family

ID=3757934

Family Applications (1)

Application Number Title Priority Date Filing Date
AU77419/98A Ceased AU702724B1 (en) 1998-07-20 1998-07-20 Image manipulation apparatus

Country Status (2)

Country Link
AU (1) AU702724B1 (en)
GB (1) GB2341997B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8059152B2 (en) 2008-07-16 2011-11-15 Sony Corporation Video detection and enhancement of a sport object
WO2012101542A1 (en) * 2011-01-28 2012-08-02 Koninklijke Philips Electronics N.V. Motion vector based comparison of moving objects

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995010915A1 (en) * 1993-10-12 1995-04-20 Orad, Inc. Sports event video
JPH09214869A (en) * 1996-02-06 1997-08-15 Nippon Telegr & Teleph Corp <Ntt> Method for real time multiplex read/write for moving image
AU5629998A (en) * 1997-02-24 1998-08-27 Redflex Traffic Systems Pty Ltd Imaging apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4168510A (en) * 1978-01-16 1979-09-18 Cbs Inc. Television system for displaying and recording paths of motion
JP3363039B2 (en) * 1996-08-29 2003-01-07 ケイディーディーアイ株式会社 Apparatus for detecting moving objects in moving images

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995010915A1 (en) * 1993-10-12 1995-04-20 Orad, Inc. Sports event video
JPH09214869A (en) * 1996-02-06 1997-08-15 Nippon Telegr & Teleph Corp <Ntt> Method for real time multiplex read/write for moving image
AU5629998A (en) * 1997-02-24 1998-08-27 Redflex Traffic Systems Pty Ltd Imaging apparatus

Also Published As

Publication number Publication date
GB9916612D0 (en) 1999-09-15
GB2341997A (en) 2000-03-29
GB2341997B (en) 2002-10-09

Similar Documents

Publication Publication Date Title
US5892554A (en) System and method for inserting static and dynamic images into a live video broadcast
US5353392A (en) Method and device for modifying a zone in successive images
EP3127321B1 (en) Method and system for automatic television production
EP1864505B1 (en) Real-time objects tracking and motion capture in sports events
US8077917B2 (en) Systems and methods for enhancing images in a video recording of a sports event
US9196043B2 (en) Image processing apparatus and method
EP1798691A2 (en) Method and apparatus for generating a desired view of a scene from a selected viewpoint
US10205889B2 (en) Method of replacing objects in a video stream and computer program
US8761497B2 (en) Removal of shadows from images in a video signal
US10515471B2 (en) Apparatus and method for generating best-view image centered on object of interest in multiple camera images
JPH11508099A (en) Scene Motion Tracking Method for Raw Video Insertion System
JPH09185720A (en) Picture extraction device
WO1997000581A1 (en) System and method for inserting static and dynamic images into a live video broadcast
AU702724B1 (en) Image manipulation apparatus
Kumar et al. The extraction of events and replay in cricket video
JP3569391B2 (en) Character appearance frame extraction device
Vanherle et al. Automatic Camera Control and Directing with an Ultra-High-Definition Collaborative Recording System
JP2005182402A (en) Field area detection method, system therefor and program
CN116524015A (en) Image processing apparatus, image processing method, and storage medium
WO2021261997A1 (en) Method for detecting and/or tracking moving objects within a certain zone and sports video production system in which such a method is implemented
JP2023161439A (en) Video processing device, control method for the same, and program
KR20050008246A (en) An apparatus and method for inserting graphic images using camera motion parameters in sports video
Chen et al. Motion-tolerance contextual visual saliency preserving for video retargeting
JPH02274179A (en) Automatic tracing device for moving object

Legal Events

Date Code Title Description
MK14 Patent ceased section 143(a) (annual fees not paid) or expired