WO2000000980A2 - Playback editing of stored a/v material - Google Patents


Info

Publication number
WO2000000980A2
Authority
WO
WIPO (PCT)
Prior art keywords
clips
clip
data
reference time
time axis
Prior art date
Application number
PCT/IB1999/001103
Other languages
French (fr)
Other versions
WO2000000980A3 (en)
Inventor
Octavius J. Morris
Ronald W. J. J. Saeijs
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Ab
Application filed by Koninklijke Philips Electronics N.V., Philips Ab filed Critical Koninklijke Philips Electronics N.V.
Priority to AU39520/99A
Publication of WO2000000980A2
Publication of WO2000000980A3

Classifications

    • G — PHYSICS
    • G11 — INFORMATION STORAGE
    • G11B — INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 — Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 — Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/102 — Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B 27/105 — Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • G11B 27/02 — Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031 — Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/034 — Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G11B 2220/00 — Record carriers by type
    • G11B 2220/20 — Disc-shaped record carriers

Definitions

  • a stored play object may define several clips that are to be presented simultaneously. This feature enables, for example, the application of dubbed audio that is recorded after the initial recording and for additional graphics files that are to be presented simultaneously.
  • the essential information in a stored play object is a schedule table that describes the play back order and time of each clip, and information about the characteristics of the connection between each clip. Points of discontinuity of an elementary or multiplexed stream at the boundary between two clips, generated for example by editing, are recorded in the stored play object.
  • a global time axis is defined that is used as a reference for all playback operations within the stored play object.
  • the presentation time of each clip is mapped onto this reference time axis.
  • the stored play object may describe a number of parallel streams of data (presentation paths) that can be presented synchronously.
  • the essential information in a stored play object is a schedule table that describes the play back order and time of the clip data that makes up each presentation path, and information about the characteristics of the connections between each clip.
  • the duration of an interval is the maximum period during which presentation of data on the presentation path can be continued without passing a discontinuity in the coded data.
  • Two types of connection between stream data are defined: those for which continuous supply of data and seamless presentation can be guaranteed, and those for which they cannot.
  • Information in the stored play object defines the properties of connections between streams in each interval so that the player replay subdevice can manage the fetching of data and decoding correctly.
  • t1 is the start of the reference time line (axis T) and t6 is the end of the reference time line.
  • the clips are composed onto the reference time line as follows:
    Clip 1 from t1 to t3
    Clip 2 from t3 to t6 (note that a gap between the end of Clip 1 and the start of Clip 2 is not allowed, because these clips contain video)
    Clip 3 from t1 to t4
    Clip 4 from t5 to t6 (note that a gap between the end of Clip 3 and the start of Clip 4 is allowed, because these clips contain no video)
  • For each presentation path, an ordered list of "Interval-Items" is defined. In the example of Figure 6 there are four presentation paths: one video path, two audio paths and a graphics path.
    Video presentation path:
    Interval-Item 1: Clip 1, t1 to t3
    Interval-Item 2: Clip 2, t3 to t6
    Audio presentation path 1:
    Interval-Item 1: Clip 1, t1 to t3
    Interval-Item 2: Clip 2, t3 to t6
    Audio presentation path 2:
    Interval-Item 1: Clip 3, t1 to t4
    Interval-Item 2: Clip 2, t4 to t5
    Interval-Item 3: Clip 4, t5 to t6
    Graphics presentation path 1:
    Interval-Item 1: Clip 5, t2 to t6
  • the interval items define the decoding and presentation of data from one clip, and are defined in time order for each presentation path.
  • each interval item defines a time window on a clip, that is to say it defines a part of a clip.
  • the overall condition that each part of a clip may be only referred to once may suitably be applied. If the same audio or video data needs to be used more than once in a stored play object, it is preferably accomplished by creating two or more clips containing the same underlying data.
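By way of illustration only (not part of the original disclosure), the interval-item tables above, together with the condition that each part of a clip is referred to only once, can be sketched as follows. The class and function names are invented, and the check is applied per presentation path, since different components of one clip (e.g. its video and its audio) legitimately appear on different paths:

```python
from dataclasses import dataclass

@dataclass
class IntervalItem:
    clip: str    # clip supplying the data for this window
    start: int   # window start on the reference time axis
    end: int     # window end on the reference time axis

# The four presentation paths of Figure 6.
paths = {
    "video":     [IntervalItem("Clip 1", 1, 3), IntervalItem("Clip 2", 3, 6)],
    "audio1":    [IntervalItem("Clip 1", 1, 3), IntervalItem("Clip 2", 3, 6)],
    "audio2":    [IntervalItem("Clip 3", 1, 4), IntervalItem("Clip 2", 4, 5),
                  IntervalItem("Clip 4", 5, 6)],
    "graphics1": [IntervalItem("Clip 5", 2, 6)],
}

def check_no_reuse(items):
    """Each part of a clip may be referred to only once on a path:
    two windows on the same clip must not overlap."""
    seen = {}
    for it in items:
        for other in seen.get(it.clip, []):
            if it.start < other.end and other.start < it.end:
                return False
        seen.setdefault(it.clip, []).append(it)
    return True

assert all(check_no_reuse(items) for items in paths.values())
```

If the same audio or video data were needed twice, the check would fail; per the text above, the remedy is to create a second clip over the same underlying data rather than reference the first clip twice.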


Abstract

A method for flexible creation of audio/video presentations by editing together extracts from pre-existing material stored in a memory device such as an optical disc. A series of audio, video and/or graphics clips (Clips 1-5) is extracted from storage as a single presentation item and applied to a single reference time-line (T). Respective presentation pathways are derived for the different components, identifying for each section along the time-line (T) the clip supplying that component.

Description

DESCRIPTION
PLAYBACK EDITING OF STORED A/V MATERIAL
The present invention relates to the storage, retrieval and editing of audio and/or video data particularly, but not essentially, in conjunction with optical disc storage for the data and the use of MPEG-compliant coding schemes.
In recent times, a need has arisen for both domestic and commercial audio and/or video (herein "A/V") devices and systems to support a greater amount of user interactivity, and arising from this is a need for simplified joining of A/V segments in which the transitions through a sequence of segments or clips may be handled smoothly by a decoder. This implies that from the user's point of view there is no severe disruption in the viewed frames and the audio continues substantially uninterrupted. Applications for sequential audio and video compositing are numerous, one example being a series of character-user interactions presented as short seamless clips where the outcome of the interaction will determine which clip appears next. A development of this is interactive motion pictures where the user (viewer) can influence the storyline. Branch points along the path a user chooses to take through the interactive movie should ideally appear seamless, otherwise the user will lose the suspension of disbelief normally associated with watching a movie. Other examples are the removal of advertisements from recordings of broadcast television programmes, or the editing of home video (camcorder) recordings.
It is therefore an object of the present invention to provide a means to enable the presentation of 'new' A/V material in the form of edited and arranged segments from an array of pre-existing stored material.
It is a further object to enable such presentation without the need for extensive copying of data during the assembly process.
In accordance with the present invention there is provided a method for controlling read-out of stored audio/video data from a storage device, comprising the steps of: identifying a number of separately stored data clips within the storage device to be read out in sequence; pinning the clips in respective positions along a single reference time axis; defining at least one presentation path through the clips, and reading out the same. By simply pinning the clips to a common time-line, such as to control the sequence and timing of readout, it is not necessary to buffer large numbers of clips as a sequence is built up.
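By way of illustration only, and not part of the original disclosure, the pinning step can be sketched in a few lines of Python; the names `pin`, `timeline` and the clip identifiers are invented for the example:

```python
def pin(timeline, clip_id, at):
    """Pin a clip at position `at` on the single reference time axis."""
    timeline.append((at, clip_id))
    timeline.sort()  # read-out order follows pin position, not pin order
    return timeline

timeline = []
pin(timeline, "clip2", 3)   # clips may be pinned in any order...
pin(timeline, "clip1", 0)
pin(timeline, "clip3", 6)

# ...but read-out walks the time axis, so no large buffer of clips is
# needed while the sequence is built up.
readout_order = [clip_id for _, clip_id in timeline]
```

Read-out then simply visits `clip1`, `clip2`, `clip3` in axis order.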
At least some of the clips pinned to the reference time axis may comprise more than one data component (for example, audio, video and graphics), with at least one separate presentation path defined through the clips for each such component. For some classes of component, clips not comprising common data components may be pinned to the reference time axis in overlapping manner, whilst others may be limited in dependence on the capabilities of the hosting device.
The clips stored may be discrete entities, or parts of longer sequences, in which case the step of identifying separately stored clips may suitably comprise identifying start and finish points in a stored data sequence longer than the clip, with the pinning to the reference time-line comprising placing those start and finish points along the time-line. The process may include the further step of duplicating a clip on determining that component data therefrom is required more than once along the length of the reference time axis.
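The idea that a clip may be merely a pair of start and finish points into a longer stored sequence, so that pinning copies no A/V data, might be sketched as follows (all names here are illustrative, not from the patent):

```python
# A long stored recording, stood in for here by a list of frame numbers.
store = {"recording": list(range(100))}

def make_clip(start, finish):
    """A clip is only a reference (name, start, finish); no data is copied."""
    return ("recording", start, finish)

def read_clip(store, clip):
    """Read out only the referenced part of the longer stored sequence."""
    name, start, finish = clip
    return store[name][start:finish]

clip = make_clip(20, 45)

# If the same data is required twice along the reference axis, a duplicate
# clip reference is created rather than copying the underlying data.
clip_again = make_clip(20, 45)
```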
Also in accordance with the present invention there is provided a stored data playback apparatus comprising memory access means coupled with a storage device containing a plurality of stored data objects, the memory access means being operable to identify a number of separately stored data clips within the storage device to be read out in sequence, to generate a single reference time axis and apply indicators thereto identifying when each identified clip is to be read out from the storage device, to generate at least one presentation path through the clips, and to read out the same.
Where at least some of the clips pinned to the reference time axis comprise more than one data component, the memory access means may suitably be configured to generate at least one separate presentation path defined through the clips for each such component. The memory access means may be operable to identify a separately stored clip by identifying start and finish points in a stored data sequence longer than said clip, and to apply to the reference time-line those start and finish points.
The apparatus may further comprise memory write means coupled with the storage device and operable to write data at storage locations therein, said write means being further operable to duplicating a clip on determination by the memory access means that component data therefrom is required more than once along the length of the reference time axis. This memory write means may be further arranged to write a sequence of clips assembled by the memory access means back to the storage means as a further clip.
Preferred embodiments will now be described by way of example only, and with reference to the accompanying drawings in which:
Figure 1 is a block schematic representation of an optical disc record/replay apparatus suitable to embody the invention;
Figure 2 is a more detailed schematic showing components within the apparatus of Figure 1;
Figure 3 represents the recording of blocks of information in sequence areas on an optical disc;
Figure 4 represents the playback of information stored on the disc in Figure 3;
Figure 5 generally illustrates the editing of stored video data, with bridge sequences omitted; and
Figure 6 represents the presentation of overlapping clips extracted from storage and applied to a reference time axis.
The following description considers in particular A/V devices operating according to the MPEG standards (ISO/IEC 11172 for MPEG1 and, in particular, ISO/IEC 13818 for MPEG2) although the skilled practitioner will recognise the applicability of the present invention to other A/V coding schemes not in conformance with the MPEG standard.
Figure 1 shows an embodiment of an apparatus suitable to host the present invention, in the form of an optical disc record and playback device. In the description of the apparatus, the handling of frame-based audio and video signals is concentrated upon, although it will be recognised that other types of signal may alternately or additionally be processed, such as data signals, and the invention is equally applicable to other memory devices such as magnetic data storage means and computer hard disc drives.
The apparatus comprises an input terminal 1 for receiving a video signal to be recorded on optical disc 3. Further, the apparatus comprises an output terminal 2 for supplying a video signal reproduced from the disc.
The data area of the disc 3 consists of a contiguous range of physical sectors, having corresponding sector addresses. This address space is divided into sequence areas, with a sequence area being a contiguous sequence of sectors, with a fixed length. The apparatus as shown in Figure 1 is decomposed into two major system parts, namely the disc subsystem DSS 6 and what is referred to herein as the video recorder subsystem VRS 8 controlling both recording and playback. The two subsystems are characterised by a number of features, as will be readily understood, including that the disc subsystem DSS can be addressed transparently in terms of logical addresses LA and can guarantee a maximum sustainable bit-rate for reading and/or writing.
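Since a sequence area is a fixed-length contiguous run of sectors, locating the sequence area for a given logical address is a single integer division. The sizes below are assumptions chosen for illustration, not values stated in the patent:

```python
AREA_SECTORS = 1024  # assumed fixed sequence-area length, in sectors

def sequence_area_of(logical_address):
    """Index of the sequence area containing a logical sector address."""
    return logical_address // AREA_SECTORS

def area_bounds(area_index):
    """First and last sector addresses of a sequence area."""
    first = area_index * AREA_SECTORS
    return first, first + AREA_SECTORS - 1
```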
Figure 2 shows a schematic version of the apparatus in more detail. The apparatus comprises a signal processing unit 100 which is incorporated in the VRS 8 of Figure 1. The signal processing unit 100 receives the video signal via the input terminal 1 and processes the video data into a channel signal for recording on the disc 3. A read/write unit indicated by dashed line 102 is provided, incorporated in the DSS 6 of Figure 1. The read/write unit 102 comprises a read/write head 104, which in the present example is configured for reading from/writing to optical disc 3. Positioning means 106 are present for positioning the head 104 in a radial direction across the disc 3. A read/write amplifier 108 is present in order to amplify the signals to and from the disc 3. A motor 110 rotates the disc 3 in response to a motor control signal supplied by signal generation unit 112. A microprocessor 114 is present for controlling all the circuits via control lines 116, 118, and 120.
The signal processing unit 100 is adapted to convert the video data received via the input terminal 1 into blocks BD of information in the channel signal: the size of the blocks of information can be variable but may (for example) be between 2MB and 4MB. The write unit 102 is adapted to write a block of information of the channel signal in a sequence area on the disc 3. The information blocks corresponding to the original video signal are written into many sequence areas that are not necessarily contiguous, as may be seen in the recording diagram of Figure 3, which arrangement is known as fragmented recording. It is a characteristic of the disc sub-system that it is able to read and write such fragmented recordings fast enough to meet real-time deadlines. In order to enable editing of the video data recorded in an earlier recording step on the disc 3, the apparatus is further provided with an input unit 130 for receiving an exit position (out-point) in a first video signal recorded on the disc 3 and for receiving an entry position (in-point) in a second video signal recorded on that same disc. Additionally, the apparatus comprises a bridging sequence generating unit 134, incorporated in the signal processing unit 100, for generating bridging sequences to link the two video streams.
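Fragmented recording can be illustrated with a toy allocator (the names are invented for this sketch): blocks of the channel signal land in whichever sequence areas are free, which need not be contiguous, and a file-sequence table records where each block went so that concatenating the areas in file order recovers the stream:

```python
def record_fragmented(blocks, free_areas):
    """Pair each channel-signal block with the next free sequence area."""
    return list(zip(blocks, free_areas))

def read_back(file_sequence, disc):
    """Concatenate sequence-area data in file order to recover the stream."""
    return [disc[area] for _, area in file_sequence]

free_areas = [7, 2, 9, 4]               # scattered free areas on the disc
blocks = ["BD0", "BD1", "BD2", "BD3"]   # blocks of the channel signal
file_sequence = record_fragmented(blocks, free_areas)

disc = {area: block for block, area in file_sequence}
recovered = read_back(file_sequence, disc)
```

Even though the areas 7, 2, 9, 4 are out of physical order, `recovered` matches the original block order, which is the essence of the re-ordering the PBC performs in Figure 4.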
Recording of a video signal will be briefly discussed with reference to Figure 3. In the video recorder subsystem, the video signal, which is a real time signal, is converted into a real-time file RTF as shown in the upper part of Figure 3. The real-time file consists of a succession of signal block sequences SEQ for recording in corresponding sequence areas. There is no constraint on the location of the sequence areas on the disc and, hence, any two consecutive sequence areas comprising portions of data of the video signal recorded may be anywhere in the logical address space LAS as shown in the lower part of Figure 3. Within each sequence area, real time data is allocated contiguously. Each real time file represents a single A/V stream. The data of the A/V stream is obtained by concatenating the sequence data in the order of the file sequence. Next, playback of a video signal recorded on the disc 3 will be briefly discussed with reference to Figure 4. Playback of a video signal is controlled by means of a playback control program (PBC). In general, each PBC program defines a new playback sequence, which may comprise an edited version of recorded video and/or audio segments, and may specify a sequence of segments from respective sequence areas. As may be seen from comparison of Figures 3 and 4, the PBC required to recreate the original file sequence (from Figure 3) re-orders the fragmented recorded segments to provide a playback frame succession PFS corresponding to the original sequence.
The editing of one or more video signals recorded on the disc 3 is discussed with reference to Figure 5, which shows two video signals indicated by two sequences of fragments named "file A" and "file B". For realising an edited version of one or more video signals recorded earlier, a new PBC program is generated for defining the A/V sequence obtained by concatenating parts from earlier A/V recordings in a new order. The parts may be from the same recording or from different recordings. In order to play back a PBC program, data from various parts of (one or more) real time files has to be delivered to a decoder. This implies a new data stream that is obtained by concatenating parts of the streams represented by each real-time file. In Figure 5, this is illustrated for a PBC program that uses three parts, one from file A and two from file B (respectively indicated as file B.1 and file B.2).
Figure 5 shows that the edited version starts at a point P1 in a sequence area in the succession of areas of file A and continues until point P2 in the next sequence area of file A. Then reproduction jumps over to the point P3 in the sequence area in file B.1 and continues until point P4 in the sequence area in file B.1. Next, reproduction jumps over to the point P5 in file B.2, which may be a point earlier in the succession of sequence areas of file B than the point P3, or a point later in the succession than the point P4. From the point P5 in the sequence area in file B.2, reproduction continues until point P6.
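The jumps of Figure 5 amount to playing an ordered list of (file, in-point, out-point) parts. A minimal sketch, using numeric indices as stand-ins for the points P1 to P6 (the values are invented; note the third part starts earlier in file B than the second, as the text permits):

```python
# Hypothetical contents standing in for two recorded real-time files.
files = {"A": list(range(0, 50)), "B": list(range(100, 150))}

# The Figure 5 PBC as (file, in_point, out_point) triples.
pbc = [
    ("A", 5, 20),   # file A, P1 to P2
    ("B", 30, 40),  # file B.1, P3 to P4
    ("B", 10, 18),  # file B.2, P5 to P6 (P5 earlier in file B than P3)
]

def play(files, pbc):
    """Concatenate the referenced parts into one playback stream."""
    stream = []
    for name, in_pt, out_pt in pbc:
        stream.extend(files[name][in_pt:out_pt])  # jump, then play the part
    return stream

edited = play(files, pbc)
```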
A clip is a portion of a multiplexed stream. The multiplexed stream of which it is a portion is compliant to its defining specification in all respects (for example in the case of an MPEG-2 multiplexed stream it complies with the MPEG-2 Program Stream specification). A clip begins with the data that must be fetched in order to start decoding correctly. Decoding of some elementary streams in the multiplex may need to start at a later address in the clip because partial access units may be present at the start and end of a clip. Presentation may need to start at an even later presentation unit.
A clip is not necessarily the longest extent of stream that is compliant with its defining specification. However, a clip contains no points of discontinuity or changes in basic MPEG coding parameters. When two clips are connected, there may be a discontinuity; alternatively, two clips may join with no discontinuity. When the application needs to know which of these two cases applies, for example when the playback system needs to make a connection between two clips, this is indicated in the higher-level control structures in the memory database.
Each clip is stored in its own real-time file. The physical allocation of the real-time file in memory (on disc) ensures a continuous supply of data for reading and writing. Parts of the data in a clip may be shared between two or more clips by using data sharing mechanisms defined in the file system. In general, an initial recording is made into a single clip, with the number of clips rising as edit operations are made.
A stored play object is the basic organisational element of the memory database. Stored play objects are not necessarily visible in the user interface of the system. They may be used by other objects that are visible to the user to define how a number of audio, video, teletext and graphical elements are composed on a time line for presentation.
The stored play object defines a time axis used in playback operations as a reference for playing clips. The presentation time of each clip is defined on this time axis. When a stored play object is played, sequences of clips are played back: the stored play object defines the order of the clips and the time at which each one should be presented, while the clip defines the data to be decoded in order to make the required presentation. In many presentations, only a part of a clip needs to be presented: start and end presentation times, defined in the time base of the clip, are recorded in the stored play object to define which part of the clip should be presented.
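A minimal sketch of one entry in such a schedule, assuming seconds as the time unit and illustrative field names: each entry pins a window of a clip, defined in the clip's own time base, to a position on the reference axis.

```python
from dataclasses import dataclass

@dataclass
class ScheduleEntry:
    clip_id: str
    ref_start: float  # where this clip's window begins on the reference axis
    clip_in: float    # start presentation time, in the clip's own time base
    clip_out: float   # end presentation time, in the clip's own time base

    def to_reference(self, clip_time: float) -> float:
        """Map a presentation time inside the clip window onto the reference axis."""
        assert self.clip_in <= clip_time <= self.clip_out
        return self.ref_start + (clip_time - self.clip_in)
```

For example, a clip window pinned at 10 s on the reference axis, covering clip times 2 s to 12 s, maps clip time 5 s to reference time 13 s.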
A stored play object may define several clips that are to be presented simultaneously. This feature enables, for example, the application of dubbed audio that is recorded after the initial recording and for additional graphics files that are to be presented simultaneously.
The essential information in a stored play object is a schedule table that describes the playback order and time of each clip, together with information about the characteristics of the connection between successive clips. Points of discontinuity of an elementary or multiplexed stream at the boundary between two clips, generated for example by editing, are recorded in the stored play object.
A global time axis is defined that is used as a reference for all playback operations within the stored play object. The presentation time of each clip is mapped onto this reference time axis.
The stored play object may describe a number of parallel streams of data (presentation paths) that can be presented synchronously. Its schedule table then describes the playback order and time of the clip data that makes up each presentation path, together with information about the characteristics of the connections between successive clips.
This is done by dividing the reference time axis into a number of intervals for each presentation path. There are no gaps between adjacent intervals, and the collection of intervals covers the entire reference time axis. For each interval, presentation data for each stream is described. The duration of an interval is the maximum period during which presentation of data on the presentation path can be continued without passing a discontinuity in the coded data.
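The tiling property described above, no gaps between adjacent intervals and full coverage of the reference time axis, can be checked mechanically. A sketch, assuming each interval is a (start, end) pair:

```python
def tiles_axis(intervals, axis_start, axis_end, eps=1e-9):
    """True if the intervals cover [axis_start, axis_end] exactly,
    with no gaps or overlaps between adjacent intervals."""
    ivs = sorted(intervals)
    if not ivs or abs(ivs[0][0] - axis_start) > eps:
        return False
    for (_, a_end), (b_start, _) in zip(ivs, ivs[1:]):
        if abs(a_end - b_start) > eps:  # gap or overlap at this boundary
            return False
    return abs(ivs[-1][1] - axis_end) <= eps
```

A stored play object builder could run such a check per presentation path before committing the schedule table.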
Two types of connections between stream data are defined. There are connections for which continuous supply of data and seamless presentation can be guaranteed, and those for which they cannot be guaranteed. Information in the stored play object defines the properties of connections between streams in each interval so that the player replay subdevice can manage the fetching of data and decoding correctly.
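The two connection classes can be represented as a flag that the player consults at each clip boundary. A sketch with assumed names (the patent does not prescribe any particular encoding):

```python
from enum import Enum

class Connection(Enum):
    SEAMLESS = 1      # continuous data supply and seamless presentation guaranteed
    NON_SEAMLESS = 2  # no guarantee: presentation may pause or restart at the join

def may_interrupt_presentation(conn: Connection) -> bool:
    # A player replay subdevice might branch on the flag like this
    # when scheduling the fetch and decode across a join.
    return conn is Connection.NON_SEAMLESS
```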
The following example, described with reference to Figure 6, shows the structure of audio, video and graphics data in a stored play object. Two clips containing multiplexed audio and video data (Clips 1 and 2) are connected together to form a single continuous presentation. An auxiliary audio channel (Clips 3 and 4) has been recorded after the original recording to overdub the original audio and may be presented in parallel with the original clips. An auxiliary graphics presentation path has also been recorded (Clip 5). In the Figure, t1 is the start of the reference time line (axis T) and t6 is the end of the reference time line.
The clips are composed onto the reference time line as follows:
Clip 1 from t1 to t3
Clip 2 from t3 to t6 (note that it is not allowed to have a gap between the end of Clip 1 and the start of Clip 2, because these clips contain video)
Clip 3 from t1 to t4
Clip 4 from t5 to t6 (note that it is allowed to have a gap between the end of Clip 3 and the start of Clip 4, because these clips contain no video)
For each presentation path, an ordered list of "Interval-Items" is defined. In the example of Figure 6 there are four presentation paths: one video path, two audio paths and a graphics path.

Video presentation path:
Interval-Item 1: Clip 1, t1 to t3
Interval-Item 2: Clip 2, t3 to t6

Audio presentation path 1:
Interval-Item 1: Clip 1, t1 to t3
Interval-Item 2: Clip 2, t3 to t6

Audio presentation path 2:
Interval-Item 1: Clip 3, t1 to t4
Interval-Item 2: Clip 2, t4 to t5
Interval-Item 3: Clip 4, t5 to t6

Graphics presentation path 1:
Interval-Item 1: Clip 5, t2 to t6
The interval items define the decoding and presentation of data from one clip, and are defined in time order for each presentation path.
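The per-path listings of the Figure 6 example can be written out as plain data, together with the time-order condition just stated. The numeric values chosen for t1 to t6 are illustrative only:

```python
t1, t2, t3, t4, t5, t6 = 0, 5, 10, 20, 30, 40  # illustrative times

# Interval items per presentation path, as listed for Figure 6.
paths = {
    "video":      [("Clip 1", t1, t3), ("Clip 2", t3, t6)],
    "audio 1":    [("Clip 1", t1, t3), ("Clip 2", t3, t6)],
    "audio 2":    [("Clip 3", t1, t4), ("Clip 2", t4, t5), ("Clip 4", t5, t6)],
    "graphics 1": [("Clip 5", t2, t6)],
}

def in_time_order(items):
    """Interval items must be defined in time order within a path."""
    return all(prev_end <= nxt_start
               for (_, _, prev_end), (_, nxt_start, _) in zip(items, items[1:]))
```

Checking `in_time_order` for every path confirms the composition is well formed under these assumed times.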
Various conditions may be imposed by the nature and capabilities of the hosting system; for example, it may be an imposed condition that there is never more than one video presentation path, whilst additional audio and data paths may be supported. Effectively, each interval item defines a time window on a clip, that is to say it defines a part of a clip. In order to avoid errors, the overall condition that each part of a clip may only be referred to once may suitably be applied. If the same audio or video data needs to be used more than once in a stored play object, this is preferably accomplished by creating two or more clips containing the same underlying data.
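The single-reference condition can be enforced mechanically by duplicating a clip whenever it would be referenced a second time. A sketch, in which the naming scheme for the duplicates is hypothetical (the duplicates would share the same underlying data via the file system's sharing mechanisms):

```python
from collections import Counter

def dedupe_references(items):
    """Replace repeated references to the same clip with fresh duplicates,
    so that each clip is referenced at most once."""
    counts = Counter()
    result = []
    for clip_id, start, end in items:
        counts[clip_id] += 1
        if counts[clip_id] > 1:
            clip_id = f"{clip_id}.copy{counts[clip_id] - 1}"  # hypothetical name
        result.append((clip_id, start, end))
    return result
```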
From reading the present disclosure, other variations will be apparent to persons skilled in the art. Such variations may involve other features which are already known in the methods and apparatuses for editing of audio and/or video signals and component parts thereof and which may be used instead of or in addition to features already described herein. Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present application also includes any novel feature or any novel combination of features disclosed herein either implicitly or explicitly or any generalisation thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.

Claims

1. A method for controlling read-out of stored audio/video data from a storage device, comprising the steps of:
identifying a number of separately stored data clips within the storage device to be read out in sequence;
pinning the clips in respective positions along a single reference time axis;
defining at least one presentation path through the clips, and
reading out the same.
2. A method as claimed in Claim 1, wherein at least some of the clips pinned to the reference time axis comprise more than one data component, with at least one separate presentation path defined through the clips for each such component.
3. A method as claimed in Claim 2, wherein clips not comprising common data components are pinned to the reference time axis in overlapping manner.
4. A method as claimed in any of Claims 1 to 3, wherein the step of identifying separately stored clips comprises identifying start and finish points in a stored data sequence longer than said clip, and the pinning to the reference time-line comprises placing those start and finish points along the time-line.
5. A method as claimed in any of Claims 1 to 4, comprising the further step of duplicating a clip on determining that component data therefrom is required more than once along the length of the reference time axis.
6. Stored data playback apparatus comprising memory access means coupled with a storage device containing a plurality of stored data objects, the memory access means being operable to identify a number of separately stored data clips within the storage device to be read out in sequence, to generate a single reference time axis and apply indicators thereto identifying when each identified clip is to be read out from the storage device, to generate at least one presentation path through the clips, and to read out the same.
7. Apparatus as claimed in Claim 6, wherein at least some of the clips pinned to the reference time axis comprise more than one data component, with the memory access means being configured to generate at least one separate presentation path defined through the clips for each such component.
8. Apparatus as claimed in Claim 6 or Claim 7, wherein the memory access means is operable to identify a separately stored clip by identifying start and finish points in a stored data sequence longer than said clip, and to apply to the reference time-line those start and finish points.
9. Apparatus as claimed in any of Claims 6 to 8, further comprising memory write means coupled with the storage device and operable to write data at storage locations therein, said write means being further operable to duplicate a clip on determination by the memory access means that component data therefrom is required more than once along the length of the reference time axis.
10. Apparatus as claimed in Claim 9, wherein said memory write means is further arranged to write a sequence of clips assembled by the memory access means back to the storage means as a further clip.
PCT/IB1999/001103 1998-06-27 1999-06-14 Playback editing of stored a/v material WO2000000980A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU39520/99A AU3952099A (en) 1998-06-27 1999-06-14 Playback editing of stored a/v material

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB9813838.1A GB9813838D0 (en) 1998-06-27 1998-06-27 Playback editing of stored A/V material
GB9813838.1 1998-06-27

Publications (2)

Publication Number Publication Date
WO2000000980A2 true WO2000000980A2 (en) 2000-01-06
WO2000000980A3 WO2000000980A3 (en) 2000-03-23

Family

ID=10834447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB1999/001103 WO2000000980A2 (en) 1998-06-27 1999-06-14 Playback editing of stored a/v material

Country Status (3)

Country Link
AU (1) AU3952099A (en)
GB (1) GB9813838D0 (en)
WO (1) WO2000000980A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004057613A1 (en) * 2002-12-19 2004-07-08 Koninklijke Philips Electronics N.V. Characteristic point information (cpi) for multilayer video

Citations (4)

Publication number Priority date Publication date Assignee Title
EP0564247A1 (en) * 1992-04-03 1993-10-06 Adobe Systems Inc. Method and apparatus for video editing
EP0801391A2 (en) * 1996-04-12 1997-10-15 Sony United Kingdom Limited Editing of recorded material
WO1998006098A1 (en) * 1996-08-06 1998-02-12 Applied Magic, Inc. Non-linear editing system for home entertainment environments
EP0843311A2 (en) * 1996-11-15 1998-05-20 Hitachi Denshi Kabushiki Kaisha Method for editing image information with aid of computer and editing system

Non-Patent Citations (1)

Title
Video Engineering, Andrew F. Inglis et al. 1996, pages 308-312, McGraw-Hill, XP002921919 *

Also Published As

Publication number Publication date
GB9813838D0 (en) 1998-08-26
WO2000000980A3 (en) 2000-03-23
AU3952099A (en) 2000-01-17

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

WWE Wipo information: entry into national phase

Ref document number: 2000/00506

Country of ref document: TR

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642