WO2002071213A2 - Interactive broadcasting editing system - Google Patents

Interactive broadcasting editing system

Info

Publication number
WO2002071213A2
Authority
WO
WIPO (PCT)
Prior art keywords
application
interactive
representations
audio
display field
Prior art date
Application number
PCT/GB2002/000969
Other languages
French (fr)
Other versions
WO2002071213A3 (en)
WO2002071213A8 (en)
Inventor
Neil Cashman
Scott Walker
Walter Perotto
Original Assignee
Digital Interactive Broadband Services Limited
Priority date
Filing date
Publication date
Application filed by Digital Interactive Broadband Services Limited
Priority to AU2002234784A1
Publication of WO2002071213A2
Publication of WO2002071213A8
Publication of WO2002071213A3

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 - Assembly of content; Generation of multimedia applications
    • H04N21/854 - Content authoring
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 - Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 - Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 - Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 - Indicating arrangements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 - Assembly of content; Generation of multimedia applications
    • H04N21/858 - Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Abstract

A system for editing an interactive application in conjunction with an audio/video sequence with which it is to be output, wherein the interactive application is decomposed into a core application, application resources and application triggers; a display field is provided with an axis representing time; representations of the audio/video sequence, the core application, the application resources and the application triggers are displayed side-by-side on the display field, the representations being displayed on and extending along the time axis according to their availability in time; the positions and extent of the representations with respect to the time axis are controlled; and the interactive application in the edited form as represented on the display field is reconstructed.

Description

INTERACTIVE BROADCAST EDITING SYSTEM
The present invention relates to an editing apparatus and a method for editing an interactive application, in particular in conjunction with an audio/video sequence with which the interactive application is to be output.
It is known to create an audio/video sequence for broadcast over a television channel and to broadcast with that sequence an interactive application. To do this, the content creators/directors of the audio/video sequence provide instructions to an interactive content programmer. The interactive content programmer then creates the interactive application accordingly. In particular, the interactive application will be designed to run for a predetermined time, with application resources becoming available at predetermined times during the progress of the interactive application and application triggers being provided to trigger predefined application events at predetermined times.
The present invention is based on a recognition that this system of creating interactive applications is unsatisfactory. In particular, it requires the content creators of the audio/video sequence to determine in advance exactly what they require for the interactive application. It may, therefore, be necessary to complete the audio/video sequence before providing instructions to the interactive content programmer, thereby creating significant additional delays.
It is known from WO 99/52045 to provide an editing system for synchronizing self contained interactive content with portions of a video presentation, the interactive content being in the form of markup documents. The system does not provide control over the interactive content itself.
According to the present invention, there is provided a method of editing an interactive application in conjunction with an audio/video sequence with which it is to be output, the method including: decomposing the interactive application into a core application, application resources and application triggers; providing a display field with an axis representing time; displaying side by side on the display field representations of the audio/video sequence, the core application, the application resources and the application triggers, the representations being positioned on and extending along the time axis according to their availability in time; controlling the positions and extent of the representations with respect to the time axis; and reconstructing the interactive application in the edited form as represented on the display field.
According to the present invention, there is also provided editing apparatus for editing an interactive application in conjunction with an audio/video sequence with which it is to be output, the apparatus including: a decomposer for decomposing the interactive application into a core application, application resources and application triggers; a controller for driving a display to show a display field with an axis representing time and to show side by side on the display field representations of the audio/video sequence, the core application, the application resources and the application triggers, the representations being positioned on and extending along the time axis according to their availability in time and the controller being responsive to a user input to control the positions and extent of the representations with respect to the time axis; and an interpreter for reconstructing the interactive application in the edited form as represented on the display field.
In this way, a content creator is able to view the component parts of the interactive application with respect to time and in conjunction with the audio/video sequence. By means of decomposing and reconstructing the interactive application, it then becomes possible for the creator to modify on screen the occurrences of the interactive application, resources and triggers with respect to time and thereby edit the interactive application to be transmitted to the end users. It thereby becomes a simple matter for the audio/video technician to synchronise parts of the interactive application with the audio/video sequence.
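By way of illustration only, the following sketch shows one way the decomposed form described above might be modelled in software: a core application with an availability window, resources with their own windows, and triggers fixed at instants on a shared time axis. All class and field names here are assumptions made for the example, not part of the invention or any real middleware.

```python
# A minimal, hypothetical model of a decomposed interactive application:
# a core application, resources with availability windows, and triggers
# at instants, all on a shared time axis (seconds). Names are assumed.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CoreApplication:
    name: str
    start: float   # time at which the core application becomes available
    end: float     # time at which it ceases to be available
    api: str       # development environment, e.g. "OpenTV" or "Liberate"

@dataclass
class Resource:
    name: str
    start: float   # availability window along the time axis
    end: float

@dataclass
class Trigger:
    action: str    # e.g. "show_resource" or "scene_change"
    at: float      # the instant at which the event fires

@dataclass
class DecomposedApplication:
    core: CoreApplication
    resources: List[Resource] = field(default_factory=list)
    triggers: List[Trigger] = field(default_factory=list)
```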
Preferably, a plurality of interactive applications are decomposed and the respective representations are displayed side-by-side on the display field so as to allow editing of each of the interactive applications simultaneously.
In this way, more than one interactive application can be amended in the same way.
This is particularly useful when a number of different interactive applications are produced for different respective television systems having different APIs. Hence, the plurality of interactive applications may include at least interactive applications for use with different predetermined APIs.
Thus, the decomposer and interpreter are preferably capable of converting between the representations for the display field and a plurality of predetermined APIs.
Preferably, the start and end points of representations may be controlled for the core application and the application resources.
In this way, it is possible to change the times at which the core application and resources are available with respect to the audio/video sequence. Preferably, it is possible to delete selectively application resources by deleting the corresponding representations from the display field and also to insert selectively application resources from a database by inserting corresponding representations on the display field.
In this way, the content creator is able to change his or her mind about the content of the interactive application. Hence, subsequent to instructing the interactive content programmer to create the basic interactive application with its resources and triggers, the creator can decide to associate different resources with the interactive application or merely to delete some resources from the interactive application. Preferably, it is possible to delete and insert selectively application triggers by deleting and inserting corresponding representations in the display field.
Hence, similarly, the creator can decide to delete certain triggers or introduce new triggers.
Preferably, a position is indicated selectively along the time axis and the state of the audio/video sequence and also the interactive application for the indicated time is displayed.
In this way, for any desired time during the audio/video sequence, it is possible for the creator to receive a preview of the audio/video sequence at that time and also the state of the interactive application at that time. In this way, the creator is able to determine easily whether or not events and resources of the interactive application are occurring at the correct time in relation to the audio/video sequence. With the various components of the interactive application and the audio/video sequence arranged side-by-side in the direction of the time axis, it becomes a simple matter for the creator to choose a particular time in the audio/video sequence and to align the components of the interactive application in the display field accordingly. Preferably, it is determined whether or not the state of the interactive application for the indicated time is dependent on data, which, when running the interactive application, would have been calculated previously and, if so, a process is conducted to calculate the data. Since an interactive application, unlike the audio/video sequence, may be different at any particular time depending on previous interactivity with end users, it may not be possible to display a definitive state for the interactive application at that time. However, by checking whether the current state of the interactive application is dependent on earlier states, then appropriate action can be taken. Hence, preferably, it is determined whether or not the data is dependent on values which, when running the interactive application, would have been input by a user and the data is selectively calculated on the basis of default values or selected values.
In this way, an example of the state of the interactive application at the indicated time may be given on the basis of default values. Alternatively, selected values may be provided so as to display the state of the interactive application at the indicated time on the basis of previous states determined by the selected values.
In this way, it is possible for the creator to view the state of the interactive application at the indicated time for any given set of earlier inputs. Preferably, representations of a plurality of audio/video sequences are displayed side-by-side in the display field and the representations are controlled so as to cut between the respective audio/video sequences in order to produce the output sequence.
Thus, the creator may use the same display field to move the start and end points of different streams of audio/video data so as to splice the various streams together at desired points in time and, hence, create the desired output sequence. Viewed from another aspect, the invention provides an editing system for creating or editing an interactive television programme, comprising data processing apparatus including a display on which is presented a time line with a series of tracks including at least one track for an audio/video sequence and a plurality of interactive application tracks which represent a plurality of versions of an interactive application programmed in different programming languages, the editing system being arranged to provide a plurality of interactive programme outputs for a corresponding plurality of television networks which require interactive content to be programmed using different interactive languages. Preferably, for any particular programming language, components of the interactive application are split across multiple tracks. Preferably, the components of an interactive application comprise a core application, application resources and application triggers.
Viewed from another aspect, the invention provides a computer readable storage medium having recorded thereon code components that, when loaded on a computer and executed, will cause that computer to operate according to the invention.
An embodiment of the invention will now be described by way of example only, and with reference to the accompanying drawings, in which: Figure 1 illustrates an apparatus embodying the present invention; Figure 2 illustrates schematically the process conducted by the apparatus of Figure 1;
Figure 3 illustrates a display field embodying the present invention; Figure 4 illustrates the process of inserting or deleting resources and triggers; and
Figure 5 illustrates the process of previewing an interactive application. The present invention may be embodied in an editing apparatus as illustrated schematically in Figure 1. The various functional blocks are intended to signify the various functions of the apparatus and could be embodied in different ways, for instance within combined units.
A controller 2 drives a display 4 and is responsive to an input unit 6. The input unit 6 may be of any convenient type, including a keyboard, a mouse or a plurality of dedicated control buttons, sliders, etc., for controlling the functions of the controller 2. Indeed, it is possible to combine the display 4 and input 6 into one unit, for instance, in the case of a touch-sensitive display.
The controller 2 is arranged to receive audio/video data and also interactive content data. The audio/video data may be provided from any suitable source, such as the illustrated memory 8. However, the data should be available in such a way that the controller can preview the audio/video data at any time within the sequence to be edited. Hence, the controller may include a return channel so as to select portions of the audio/video sequence from the memory 8, rather than downloading the entire sequence.
The interactive content is provided from a decomposing unit 10. This retrieves an interactive application from a memory 12 and comprises an interpreter for analysing the content of the interactive application as a whole and determining from this the core application, the application resources and the application triggers. These are provided to the controller 2 in component form.
It will be appreciated that the interpretation function of the decomposer 10 could be integrated with the functions of the controller 2 such that the controller 2 retrieves interactive applications directly from a memory without the need for a decomposer unit.
As will be discussed further below, the controller 2 provides representations of the component parts of the interactive application on the display 4. The input 6 allows editing of the representations on the screen 4 and the controller accordingly edits the actual interactive application components.
The edited components are then provided to an interpreter 14 which combines the components to form once again a complete interactive application. This may be stored in memory 16.
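Purely as an illustration of this recombination step, and reusing the hypothetical model sketched earlier, the interpreter's role might be expressed as serialising the edited components back into a single application package; the JSON layout below is an assumption made for the example, not the format of any real system.

```python
# Hypothetical counterpart to the decomposer 10: serialise the edited
# components back into one application package (layout assumed).
import json

def reconstruct(app: DecomposedApplication) -> str:
    return json.dumps({
        "core": vars(app.core),
        "resources": [vars(r) for r in app.resources],
        "triggers": [vars(t) for t in app.triggers],
    })
```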
Where the audio/video sequence is also edited by the controller, the resulting output sequence may be provided to a separate memory for future multiplexing with the interactive application or rewritten to memory 8 in the edited form.
The process conducted by the decomposer 10, controller 2 and interpreter 14 is controlled by the input 6 and displayed by the display 4. The combined arrangement can be described as a non-linear interactive content editor. This is summarised as a single functional block 20 in Figure 2.
As illustrated, video data may be provided from a number of different video sources 22, including tape and disc storage mediums. Similarly, audio data may be provided from a variety of similar sources 24.
As is well known, a variety of different interactive application authoring tools 26 are available for producing interactive applications. In this respect, it should be appreciated that different service providers use different forms of interactive application. Thus, there are different interactive application development environments, such as OpenAuthor (OpenTV), MediaHighway, MHEG, Liberate, etc. Various authoring tools 26 are available to provide interactive applications according to the different development environments.
The non-linear interactive content editor 20 is illustrated in conjunction with a digital multiplexer 28. This is intended to illustrate what happens after the editing of the interactive application is complete and the audio/video sequence and interactive application are ready for transmission. In particular, the digital multiplexer 28 is used for multiplexing the edited interactive application and the output audio/video sequence.
As illustrated, the multiplexed output could be provided to a broadcast network. Alternatively, it could be provided to a non-broadcast network with the audio/video sequence being streamed from a video server together with the triggers and the application and resources being provided on a server for on-demand delivery.
The controller 2 drives the display 4 to produce a display field having at least a time axis. An example of a suitable display field 40 is illustrated in Figure 3.
As illustrated, the time axis extends in the horizontal direction. Representations of various audio/video sequences and interactive application components are then displayed side-by-side, stacked along the vertical axis.
As illustrated, the representations comprise elongate bars representing the time over which their corresponding sequences or components will be available to the end user. The display field 40 may extend over the entire time of the required audio/video output sequence. Alternatively, the display field 40 may display only a section of the total audio/video output sequence and the controller 2 may provide a function by which the displayed portion may be moved freely within the total extent of the audio/video output sequence.
The display field 40 of Figure 3 displays an example in which two video sequences 42, 44 and two audio sequences 46 and 48 are displayed in conjunction with three interactive applications. The first interactive application is displayed as a representation 50 of the core application, representations 52 of the application resources and representations 54 of the triggers. The second and third applications are shown as representations 56 and 58 of the respective core applications. By way of example, the first application may be for use in the OpenTV environment whereas the second and third applications are for use in the Liberate and MediaHighway environments respectively.
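As a rough, non-authoritative illustration of such a display field, the snippet below renders each sequence or component as an elongate bar on a horizontal time axis, one track per row; a real editor would of course draw this graphically, and the track names and times are invented.

```python
# Toy text rendering of the display field: one track per row, each an
# elongate bar positioned along a horizontal time axis (values made up).
def render_track(label, start, end, total, width=60):
    a = int(start / total * width)
    b = max(a + 1, int(end / total * width))
    return f"{label:>10} |" + " " * a + "#" * (b - a) + " " * (width - b) + "|"

for label, s, e in [("video 42", 0, 18), ("video 44", 18, 30),
                    ("audio 46", 0, 30), ("core 50", 5, 28),
                    ("res 52", 10, 20)]:
    print(render_track(label, s, e, total=30.0))
```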
The controller could provide a display field 40 in which the second and third applications are also displayed in conjunction with their respective resources and triggers. However, in the illustrated embodiment, in order to simplify use, the user may select one of the interactive applications for which to display all of the components.
In operation, the input 6 is used to manipulate the various representations on the display field 40 as required. For instance, the start and end points of the core applications and the application resources may be moved along the time axis so as to change the times within which those components are available from the service provider to the end user. Similarly, the input 6 may allow the user merely to move representations along the time axis without changing their extent. Furthermore, the input 6 may be used to move the positions of the triggers along the time axis. In this way, the user can easily manipulate the timing of the various parts of the interactive application.
In a similar manner, the input 6 may be used to move the start and end points of the various audio and video sequences. Since only one output audio/video sequence is to be produced, no sensible meaning can be given to sequences which overlap along the time axis. Therefore, audio and video sequences from various sources are chosen such that only one video sequence and only one audio sequence is available at any one point along the time axis. In view of this, in a preferred embodiment, moving an end of one video sequence will automatically move the end of the video sequence adjacent in time. Also, since one video sequence is usually associated with a corresponding audio sequence, it is possible for the controller to retain a link between these two corresponding sequences such that editing one sequence along the time axis causes automatic editing of the other corresponding sequence.
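The two behaviours described in this paragraph, rippling the adjacent sequence when one end is moved and mirroring an edit onto a linked audio sequence, could be sketched as follows; the (start, end) pair representation and the function name are assumptions for the example.

```python
# Sketch of the preferred editing behaviour: trimming one video's end
# ripples the neighbouring sequence's start so sequences never overlap,
# and the controller mirrors the edit onto the linked audio track.
def trim_end(sequences, index, new_end):
    start, _ = sequences[index]
    sequences[index] = (start, new_end)
    if index + 1 < len(sequences):        # ripple the adjacent sequence
        _, next_end = sequences[index + 1]
        sequences[index + 1] = (new_end, next_end)

video = [(0.0, 12.0), (12.0, 30.0)]
audio = [(0.0, 12.0), (12.0, 30.0)]       # linked to the video track
trim_end(video, 0, 10.0)
trim_end(audio, 0, 10.0)                  # controller keeps the link
print(video)  # [(0.0, 10.0), (10.0, 30.0)]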
As well as manipulating existing representations of resources and triggers, it is also possible to insert or delete representations so as to add or remove resources and triggers to or from the interactive application.
Additional resources may be acquired from any appropriate source, for instance, an additional database memory as illustrated in Figure 1 as memory 18.
As illustrated in Figure 4, the process first imports in step 100 the audio/video content into the interactive content editor. Subsequently, or at the same time, in step 102, the process imports the required interactive application by adding it to the interactive content editor in conjunction with the audio/video content. As explained above, this is achieved (using the appropriate API and development environment) either in the decomposer 10 or within the controller 2. In step 104, it is determined whether or not the input 6 has been actuated to add a new resource or trigger event to the interactive application. If the addition of a resource has been selected, then the process proceeds to step 106 in which a list of resource types/options is displayed. The list may be displayed on the display 4 in place of the display field 40 or in any other way, for instance as a window superimposed on the display field 40 or on some other display device. The list of resources may correspond to any additional resources available to the controller 2. For instance, as illustrated in Figure 1, a memory 18 may be provided as a database of additional resources. Where those resources are intended for a variety of development environments, the controller can retrieve those resources via the interpretation of the decomposer 10 or apply its own interpretation in the same way as when retrieving the original interactive application.
Thus, following selection of a particular resource from the list by means of the input 6, that resource is added to the interactive application in step 108. In step 110, the new resource is multiplexed with the audio/video and application stream. Once multiplexed, the resultant stream can either be stored on a recording medium (ready for transmission) or sent directly to the broadcast chain. There, the stream may be multiplexed with different AV and data sources.
At step 112, it is determined whether or not the added resource is required to be triggered at a specific time. If no specific trigger time is required, then the process proceeds to step 114 where it is determined whether or not any further resources or triggers are required. If no further resources or triggers are required, then the process can end. However, otherwise the process returns to step 104.
At steps 104 and 112, if the input 6 indicates that a new trigger is required, then the process proceeds to step 116. At this point, a list of trigger options is displayed to the user. This display may take the same form as that discussed for step 106. The options may include known options for triggers including triggering resource display, scene change etc.
Once a trigger option has been selected by the user using the input 6, the process proceeds to step 118 in which the trigger is added to the interactive application.
In step 120, the new trigger is multiplexed with the audio/video and application stream. Once multiplexed, the resultant stream can either be stored on a recording medium (ready for transmission) or sent directly to the broadcast chain. There, the stream may be multiplexed with different AV and data sources.
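Condensing steps 104 to 120 into code, and again reusing the hypothetical model from the earlier sketch, the add-resource and add-trigger branches of Figure 4 might look like this; the request tuples merely stand in for actuations of the input 6.

```python
# Condensed sketch of the Figure 4 loop (steps 104-120). Each request
# stands in for a user actuation of the input 6.
def edit_loop(app, requests):
    for req in requests:
        if req[0] == "resource":                  # steps 106-110
            _, name, start, end = req
            app.resources.append(Resource(name, start, end))
        elif req[0] == "trigger":                 # steps 116-120
            _, action, at = req
            app.triggers.append(Trigger(action, at))
    return app   # ready for multiplexing with the A/V stream
```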
Referring again to Figure 3, it will be seen that a time line 60 is displayed vertically on the display field 40. In particular, the time line 60 provides an indication of a particular time for each of the displayed representations. It is then possible to display a preview of the audio/video sequence and one of the interactive applications for that indicated time. The preview image 70 may be provided in any suitable manner, for instance on a separate display screen, as a window superimposed over the display field 40 or as an image to replace selectively the display field 40 on the display 4.
Thus, by moving the time line 60 up and down the time axis, it is possible for the user to view the audio/video sequence and interactive application at any indicated time. This allows a creator to determine easily what portion of the audio/video sequence occurs at that time and to synchronise appropriate parts of the interactive application with that part of the audio/video sequence.
It will be appreciated that interactive applications by their nature often rely on interaction with input from the end users. Thus, at any particular time during operation of the interactive application, the particular state of the interactive application might be any one of a number of states depending on the earlier inputs made to it.
In order to deal with this, the controller employs a process to provide an appropriate preview. This will be explained with reference to Figure 5. At step 200, the interactive content editor contains at least part of the interactive application. Then, responsive to the input 6, the operator selects a preview option in step 202.
Assuming the system is arranged to handle a variety of development environments requiring different APIs, the process then awaits selection by the input 6 of an appropriate API. For instance, as illustrated in Figure 3, three equivalent interactive applications 50, 56 and 58 may be displayed on the display field 40 at the same time. The operator can select which of these interactive applications is to be previewed by selecting the development environment or API. For instance, by selecting OpenTV, the interactive application 50 will be previewed. Of course, a similar process may be used to select between different interactive applications which are to be broadcast at the same time using the same development environment.
Thus, in step 204, it is determined whether or not an API has been selected by the operator. If an API has been selected, then, in step 206, the process determines which portion of the interactive application is to be previewed on the basis of the time indicated by the time line 60.
In step 208, the interactive application is processed to determine whether or not the interactive application at that time is dependent on any data which would have been acquired previously had the full interactive application been run. If no such data is required, then the process can proceed to step 210 and step 212 in which the preview is built using the selected API and run using the appropriate virtual machine for the API.
In the preferred embodiment, the audio/video sequence is played and displayed simultaneously with the preview of the interactive application. This may be achieved by outputting previously acquired audio/video data or by instructing an external VTR device to play the appropriate portion.
If the selected portion of the interactive application does require previous data, then the process moves from step 208 to step 214 and processes the interactive application to determine the previous data. In step 216, when it is determined that previous data is required, the operator is prompted, for instance using the display 4, to choose between default data or user-choice data.
If the operator selects the use of default values, then, in step 218, the default values are loaded and, in step 220, the default values are used to create past state information based on the selected API. This past state information can then be used in step 210 to build the preview as discussed above.
On the other hand, if the operator selects the use of user-choice values in step 222, the operator is able to input the required user-choice data by means of the input 6 and/or display 4. The step 220 can then make use of these values to create the appropriate past state information. In this way, the operator can create a preview of the interactive application for any different combination of user inputs.
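A compact way to picture the Figure 5 logic (steps 204 to 222) is given below; needs_past_data stands in for the dependency check of step 208, and the default/user-choice branch of steps 216 to 222 simply selects which values seed the synthesised past state. Every name here is an illustrative assumption, and the sketch reuses the model classes from the earlier example.

```python
# Sketch of the Figure 5 preview decision. A state at time t that any
# earlier trigger could have altered needs synthesised past state.
def needs_past_data(triggers, t):
    return any(trigger.at < t for trigger in triggers)       # step 208

def build_preview(app, t, api, defaults=None, chosen=None):
    past_state = {}
    if needs_past_data(app.triggers, t):
        values = chosen if chosen is not None else defaults  # steps 216-222
        past_state = {"inputs": values}                      # step 220
    # steps 210-212: build for the selected API, run on its virtual machine
    return {"api": api, "time": t, "state": past_state}
```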
The preferred embodiment of the present invention is particularly for use in broadcast studio/post-production facilities, and can handle interactive applications (of any proprietary API) and their resource modules and trigger objects. It enables a professional editor to import an interactive application into a "standard" audio/video editing environment and alter the sequence/timing of application events (display of resources, triggers etc.) which would otherwise require software programming expertise. The editor therefore produces an audio/visual/application output stream which will create an interactive TV programme when broadcast, for submission to a digital TV transmission network.
The embodiment provides a group of tracks for an interactive application and its resource and trigger modules. It allows the timing and sequence of application events to be modified using the multiple tracks: one for the start and end points of the core application, one or more for application resources (interactive objects used by the application) and one for trigger events (to trigger the application to perform an action). The system is aimed particularly at digital TV systems which are able to carry video, audio and application streams to consumers.
The system handles non-self-contained interactive program objects and allows the non-programmer, i.e. the professional editor, to manipulate individual parts of a program or application by splitting the application and its contents across the several tracks. It is independent of the language used to program the interactive application and can display simultaneously the interactivity required as implemented in multiple languages and displayed in multiple tracks. Multiple output versions can be generated, each with the interactivity programmed in a different language and suitable for different digital TV networks.

Claims

1. Editing apparatus for editing an interactive application in conjunction with an audio/video sequence with which it is to be output, the apparatus including: a decomposer for decomposing the interactive application into a core application, application resources and application triggers; a controller for driving a display to show a display field with an axis representing time and to show side by side on the display field representations of the audio/video sequence, the core application, the application resources and the application triggers, the representations being positioned on and extending along the time axis according to their availability in time and the controller being responsive to a user input to control the positions and extent of the representations with respect to the time axis; and an interpreter for reconstructing the interactive application in the edited form as represented on the display field.
2. Editing apparatus as claimed in claim 1 wherein the decomposer is configured to decompose a plurality of interactive applications, the controller driving the display to show the respective representations side by side on the display field so as to allow editing of each of the interactive applications simultaneously.
3. Editing apparatus as claimed in claim 2 wherein the decomposer is configured to decompose a plurality of interactive applications for use with different respective predetermined APIs.
4. Editing apparatus as claimed in claim 1, 2 or 3 wherein the controller is responsive to the user input to control the start and end points of the representations of the core application and the application resources.
5. Editing apparatus as claimed in any preceding claim wherein the controller is responsive to the user input to selectively delete application resources by deleting the corresponding representations from the display field and to selectively insert application resources from a database by inserting corresponding representations on the display field.
6. Editing apparatus as claimed in claim 5 further including a database for storing additional application resources.
7. Editing apparatus as claimed in any preceding claim wherein the controller is responsive to the user input to selectively delete and insert application triggers by deleting and inserting corresponding representations in the display field.
8. Editing apparatus as claimed in any preceding claim wherein the controller is responsive to the user input to drive the display to indicate a selected position on the time axis and selectively to display the state of the audio/video sequence and the interactive application for the indicated time.
9. Editing apparatus as claimed in claim 8 wherein the controller is configured to determine whether or not the state of the interactive application for the indicated time is dependent on data which, when running the interactive application, would have been calculated previously and to conduct, if so, a process to calculate said data.
10. Editing apparatus as claimed in claim 9 wherein the controller is configured to determine whether or not said data is dependent on values which, when running the interactive application, would have been input by a user and to calculate selectively said data on the basis of default values or values selected under the control of the user input.
11. Editing apparatus as claimed in any preceding claim wherein the controller is additionally for driving the display to show representations of a plurality of audio/video sequences side by side in the display field and for cutting between the respective audio/video sequences so as to produce the output sequence.
12. Editing apparatus as claimed in any preceding claim including a display to be driven by the controller and a user input to which the controller is responsive.
13. Editing apparatus as claimed in any preceding claim further including an input application memory for providing the interactive application to the decomposer.
14. Editing apparatus as claimed in any preceding claim further including an output application memory for storing the reconstructed interactive application from the interpreter.
15. A method of editing an interactive application in conjunction with an audio/video sequence with which it is to be output, the method including: decomposing the interactive application into a core application, application resources and application triggers; providing a display field with an axis representing time; displaying side by side on the display field representations of the audio/video sequence, the core application, the application resources and the application triggers, the representations being positioned on and extending along the time axis according to their availability in time; controlling the positions and extent of the representations with respect to the time axis; and reconstructing the interactive application in the edited form as represented on the display field.
16. A method as claimed in claim 15 including: decomposing a plurality of interactive applications; and displaying the respective representations side by side on the display field so as to allow editing of each of the interactive applications simultaneously.
17. A method as claimed in claim 16 wherein the plurality of interactive applications at least include interactive applications for use with different predetermined APIs.
18. A method as claimed in claim 15, 16 or 17 wherein the step of controlling includes controlling the start and end points of the representations of the core application and the application resources.
19. A method as claimed in any one of claims 15 to 18 further including: selectively deleting application resources by deleting the corresponding representations from the display field; and selectively inserting application resources from a database by inserting corresponding representations on the display field.
20. A method as claimed in any one of claims 15 to 19 further including: selectively deleting and inserting application triggers by deleting and inserting corresponding representations in the display field.
21. A method as claimed in any one of claims 15 to 20 further including: indicating a position on the time axis; and displaying the state of the audio/video sequence and the interactive application for the indicated time.
22. A method as claimed in claim 21 further including: determining whether or not the state of the interactive application for the indicated time is dependent on data which, when running the interactive application, would have been calculated previously and, if so, conducting a process to calculate said data.
23. A method as claimed in claim 22 further including: determining whether or not said data is dependent on values which, when running the interactive application, would have been input by a user; and selectively calculating said data on the basis of default values or selected values.
24. A method as claimed in any of claims 15 to 23 further including: displaying representations of a plurality of audio/video sequences side by side in the display field; and using said step of controlling to cut between the respective audio/video sequences so as to produce the output sequence.
25. An editing system for creating or editing an interactive television programme, comprising data processing apparatus including a display on which is presented a time line with a series of tracks including at least one track for an audio / video sequence and a plurality of interactive application tracks which represent a plurality of versions of an interactive application programmed in different programming languages, the editing system being arranged to provide a plurality of interactive programme outputs for a corresponding plurality of television networks which require interactive content to be programmed using different interactive languages.
26. An editing system as claimed in claim 25, wherein for any particular programming language, components of the interactive application are split across multiple tracks.
27. An editing system as claimed in claim 26, wherein the components of an interactive application comprise a core application, application resources and application triggers.
28. A computer readable storage medium having recorded thereon code components that, when loaded on a computer and executed, will cause that computer to operate according to any one of the preceding claims.
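By way of illustration only, the pipeline recited in claim 1 might be sketched as follows, reusing the ApplicationTrackGroup structure sketched earlier in the description; decompose and apply are assumed names, not features recited in the claims.

    # Illustrative sketch of the claim 1 pipeline; `decompose` and `apply`
    # are assumed names, and ApplicationTrackGroup is the structure
    # sketched earlier in the description.
    def edit_interactive_application(app_stream, user_edits, api):
        """Decompose an application, apply timeline edits, reconstruct it."""
        core, resources, triggers = decompose(app_stream)  # decomposer
        timeline = ApplicationTrackGroup(core, resources, triggers)
        for edit in user_edits:                            # controller responds to user input
            edit.apply(timeline)
        return api.reconstruct(timeline)                   # interpreter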

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002234784A AU2002234784A1 (en) 2001-03-06 2002-03-05 Interactive broadcasting editing system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0105582.1 2001-03-06
GB0105582A GB0105582D0 (en) 2001-03-06 2001-03-06 An editing apparatus and a method for editing an interactive application

Publications (3)

Publication Number Publication Date
WO2002071213A2 true WO2002071213A2 (en) 2002-09-12
WO2002071213A8 WO2002071213A8 (en) 2003-11-27
WO2002071213A3 WO2002071213A3 (en) 2004-01-08

Family

ID=9910130

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2002/000969 WO2002071213A2 (en) 2001-03-06 2002-03-05 Interactive broadcasting editing system

Country Status (3)

Country Link
AU (1) AU2002234784A1 (en)
GB (2) GB0105582D0 (en)
WO (1) WO2002071213A2 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996019779A1 (en) * 1994-12-22 1996-06-27 Bell Atlantic Network Services, Inc. Authoring tools for multimedia application development and network delivery
WO1999052045A1 (en) * 1998-04-03 1999-10-14 Avid Technology, Inc. System and method for providing interactive components in motion video

Also Published As

Publication number Publication date
WO2002071213A3 (en) 2004-01-08
GB0205137D0 (en) 2002-04-17
GB2376364A (en) 2002-12-11
GB0105582D0 (en) 2001-04-25
GB2376364B (en) 2005-06-08
AU2002234784A1 (en) 2002-09-19
WO2002071213A8 (en) 2003-11-27

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
CFP Corrected version of a pamphlet front page
CR1 Correction of entry in section i

Free format text: IN PCT GAZETTE 37/2002 DUE TO A TECHNICAL PROBLEM AT THE TIME OF INTERNATIONAL PUBLICATION, SOME INFORMATION WAS MISSING (81). THE MISSING INFORMATION NOW APPEARS IN THE CORRECTED VERSION.

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 69(1) EPC, F1205A DATED 18.12.03

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP