CA2301935A1 - Streaming media control and synchronization application program interface (api) for a digital television receiver - Google Patents

Streaming media control and synchronization application program interface (api) for a digital television receiver

Info

Publication number
CA2301935A1
Authority
CA
Canada
Prior art keywords
api
presentation
video
audio
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002301935A
Other languages
French (fr)
Inventor
Ganesh Rajan
Branislav Meandzija
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Arris Technology Inc
Original Assignee
General Instrument Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by General Instrument Corp filed Critical General Instrument Corp
Publication of CA2301935A1 publication Critical patent/CA2301935A1/en
Abandoned legal-status Critical Current

Abstract

A streaming media control and synchronization application program interface (API) for a digital television receiver. The API provides a simple alternative to the Java Media Framework for JavaTV and other television software environments, including the Advanced Television Systems Committee Digital TV
Application Software Environment (ATSC T3/S17 DASE), and the Digital Video Broadcast Multimedia Home Platform (DVB MHP). This API enables playing back media, and controlling the playback. The API includes a media presentation control package, "MediaComponentPresenter," and a service presenter hierarchy. The API controls stopping, starting, suspending and resuming the presentation of the streaming audio and/or video at a media player, e.g., at a television set-top receiver. The API also provides a start method that attempts to synchronize the presentation of audio and/or video streams according to associated locators of the streams. The API also can take checkpoints of the audio and/or video as it is playing, where the checkpoints indicate temporal locations of the audio and/or video for initiating a rewind operation.

Description

STREAMING MEDIA CONTROL AND SYNCHRONIZATION APPLICATION
PROGRAM INTERFACE (API) FOR A DIGITAL TELEVISION
RECEIVER
This application claims the benefit of U.S.
Provisional Applications No. 60/125,786, filed March 23, 1999, and No. 60/127,753, filed April 5, 1999.
The present invention relates to software for terminals, such as television set-top terminals.
The following acronyms and terms are used:
API - Application Program Interface;
ATSC - Advanced Television Systems Committee;
AWT - Abstract Window Toolkit;
CCIR - Comite Consultatif Internationale de Radio;
DASE - ATSC T3/S17 Digital TV Application Software Environment;
DAVIC - Digital Audio-Visual Council;
DVB - Digital Video Broadcast;
GUI - Graphical User Interface;
HAVI - Home Audio Video Environment;
IRD - Integrated Receiver Decoder;
ISO - International Standards Organization;
MHP - Multimedia Home Platform;
UI - User Interface;
UML - Unified Modeling Language;
URL - Uniform Resource Locator.

A set-top box, also referred to as an IRD or a subscriber terminal, is a device that receives and decodes television signals for presentation by a television. The signals can be delivered over a satellite, through a cable plant, or by means of terrestrial broadcast. Modern set-tops also support video on demand (VOD), pay-per-view, interactive shopping, and electronic commerce, and enable Internet connectivity and possibly Internet-based telephony. The set-top functionality is enabled through specialized hardware and software.
In particular, all three of the currently emerging Java API standards for Digital Television receivers, i.e., ATSC DASE, DVB MHP, and JavaTV, rely on the Java Media Framework (JMF) for control of the streaming audio and video components of the presentation. However, the JMF is relatively large (consuming from 400 KB to 2 MB of memory) and complicated for downloadable applications (e.g., Xlets) in the embedded set-top environments.
For ATSC DASE, refer to "Advanced Television Systems Committee (ATSC) - T3/S17 Specialist Group - Digital Television Receiver Software Environment for Terrestrial and Cable Broadcast of Data and Interactive Services - Document No. 075 - February 1999."
For DVB MHP, refer to "Digital Video Broadcast (DVB) Multi-Media Home Platform (MHP) Specialist Group - Architecture of the DVB Java Platform - TAM 214 - March 1999."
For JavaTV, refer to "Sun Microsystems, JavaTV API
Specification, March 1999."
For JMF, refer to "Sun Microsystems, Java Media Framework Programmer Guide, v0.5, December 21, 1998".
For a broadcast environment, where currently most of the content decoding and presentation is performed in hardware, a complex framework such as JMF is not necessary and also often not feasible to implement due to processing power and memory constraints.
In such an environment, it is desirable to have a simple API for control of the presentation of different broadcast services. This API should also enable playing back media and controlling the playback. The API should provide a simple alternative to JMF for JavaTV and other television software environments that provides the above and other functions.
The present invention provides a system having the above and other advantages.
The present invention relates to software for terminals, such as television set-top terminals.
A multi-media terminal in accordance with the invention includes means for receiving control information from an external source, a computer readable medium having computer program code, and means for executing the computer program code to implement an application program interface (API). For example, the code may be Java-based, although any type of code may be used.
The API is responsive to the control information for controlling at least one presentation function of a media player associated with the terminal. For example, the terminal may have media players for audio, video and other data. The presentation function is related to the presentation of streaming audio and/or video.
The presentation function(s) can include stopping, starting, suspending and resuming the presentation of the streaming audio and/or video at the media player.
Moreover, the external source may be a user remote control, such as a signal from a hand-held infra-red transmitter.
Or, the external source of the control information may be an external applications program.
Preferably, the API controls the media player based on the control information regardless of a specific implementation of the media player, and is therefore compatible with different media players, e.g., from different manufacturers.
The API is compatible with the currently-proposed API standards for digital television receivers, including Java TV, Advanced Television Systems Committee Digital TV Application Software Environment, and Digital Video Broadcast Multimedia Home Platform.
In a specific feature, the API provides a start method to attempt to synchronize the presentation of a plurality of audio and/or video streams according to associated locators of the streams. The locators identify the associated streams. If the API cannot synchronize the streams within a period of time, which is implementation-dependent, it starts the streams unsynchronized. In this manner, the user receives some output. Depending on the degree of offset, the presentation of the unsynchronized streams to the user may be satisfactory.
If the associated locators are empty, the associated streams are started by default without the start method attempting to synchronize them. Again, this avoids having the user receive no output.
In a further feature, when currently-playing audio and/or video streams are already synchronized with each other, and at least one additional audio and/or video stream is to be synchronized with the already-synchronized streams, the start method attempts to synchronize the additional stream with the already-synchronized streams without stopping the playing of the already-synchronized streams. This avoids an unnecessary interruption in the output to the user.
In a further feature, the API takes checkpoints of the audio and/or video as it is playing. This occurs each time the method "checkpointMediaStream" is called, and provides a marker to enable the audio and/or video to be rewound to any of the checkpoints. The term "rewind" or the like refers to terminating the current playing, and restarting the playing at a previous point.
A corresponding method is also presented.
IN THE DRAWINGS, FIG. 1 illustrates a media presentation control package, named "MediaComponentPresenter," in accordance with the present invention.
FIG. 2 illustrates a service presenter hierarchy that includes the media presentation package of FIG. 1 in accordance with the present invention.
FIG. 3 illustrates a user terminal in accordance with the present invention.
The present invention relates to software for terminals, such as television set-top terminals.
The invention defines a simple API for streaming media control and synchronization based on three interfaces, "MediaStreamControl", "VisualControl" and "AudioControl", that are to be implemented by media service component presenters within ATSC DASE or other television software environments.
As an illustration only, the invention is described in terms of the Java(tm) programming language syntax, developed by Sun Microsystems, Inc. Moreover, FIGS. 1 and 2 are generated from Java code using the Unified Modeling Language (UML), developed by Rational Software Corporation, USA, to provide class diagrams. A class diagram represents the static structure of a system, and shows a pattern of behaviors that the system exhibits.
This is accomplished by showing the existence of classes and interfaces, and their relationships. Each class or interface is represented by a box with three sections. The top section lists the class or interface name. The middle section denotes a list of attributes, and the bottom section denotes a list of operations (e.g., methods). Moreover, a solid line between classes or interfaces denotes an association. A solid line with a white diamond tip denotes aggregation by reference. A triangular arrowhead denotes a restricted navigation, e.g., inheritance of operation but not of structure.
Within the context of JavaTV, service component presenters are returned as the result of a "get"
operation on the ServiceContext.class within the serviceSelection API. The service component presenter (MediaComponentPresenter) within the system described in this invention controls the media via the MediaStreamControl, VisualControl or the AudioControl.
A visual MediaComponentPresenter needs to be associated with user interface components that will be displaying the media being controlled, such as the HAVI UI
Components as prescribed by ATSC DASE and DVB MHP (refer to the consortium of Sony, Matsushita, Philips, Sharp, Grundig, Hitachi, Thomson, Toshiba, "The HAVI
Specification", December, 1998, and to "www.havi.org").
The HAVI UI Component extends Java's AWT Component.
This API provides a simple application interface to downloaded Xlets or other set top applications and enables them to control streaming audio and visual media. Synchronization between the code in an application and the time base of a media stream is done through a mechanism such as the DAVIC (refer to "DAVIC
1.4 Specification Part 9:1998 - Information Representation") MediaTimeEvent(s). Event-based synchronization is done through DAVIC media StreamEvent(s). The media-time-event and media-stream-event synchronization mechanisms are only two of the many mechanisms included in the API of the present invention. Others are stream synchronization, stream checkpoint synchronization, audio/video/data synchronization, etc.

1. Media Presentation
MediaComponentPresenter implements the Stream, Visual and Audio control interfaces. The MediaComponentPresenter controls a set of media streams, which is part of a presentation. The set of media streams controlled may consist of multiple visual, audio and other media streams, which may need to be synchronized. Any visual media stream, and some audio media streams, need to be associated with the visual presentation components, which can be the havi.ui.Component, which is an extension of awt.Component. There may be multiple components per MediaComponentPresenter. If there are multiple components, the control semantics are extended to work on all of the MediaComponentPresenter's components, with VisualControl applying to all visual components and AudioControl applying to all audio components.
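As a rough structural sketch only, a concrete presenter might be laid out as follows. The class name SetTopMediaPresenter and its field are illustrative assumptions, not part of the specification, and the org.atsc.presentation package name is taken from section 2 below:

import javax.tv.locator.Locator;
import org.atsc.presentation.AudioControl;
import org.atsc.presentation.MediaStreamControl;
import org.atsc.presentation.VisualControl;

// Hypothetical presenter: one object controls all of the streams named by its locators.
public abstract class SetTopMediaPresenter
        implements MediaStreamControl, VisualControl, AudioControl {

    // The full set of streams this presenter may control; the control methods
    // take subsets of this array as their Locator[ ] argument.
    protected final Locator[] presentationLocators;

    protected SetTopMediaPresenter(Locator[] locators) {
        this.presentationLocators = locators;
    }
}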
The following sections specify the media presentation package, and also the relationship of this package with JavaTV, which is at the core of both ATSC
DASE and DVB MHP. We also give a basic description of HAVI UI components, which are used for media presentation.
1.1 Media Presentation Package
FIG. 1 illustrates a media presentation control package, named "MediaComponentPresenter," in accordance with the present invention.
The class diagram 100 includes the classes PresentationException 105, Exception 110 (from the Java package "java.lang"), PresentationRescaleException 115, PresentationSynchronizationException 120, PresentationCheckpointException 125, VideoFormatException 130, FrameCaptureFailedException 135, RewindNotPossibleException 140, PresentationRateException 145, PresentationResourceException 150, and PresentationComponentException 155.
In accordance with the invention, the diagram 100 also includes the three interfaces of MediaComponentPresenter, namely MediaStreamControl 160, VisualControl 165, and AudioControl 170.
FIG. 2 illustrates a service presenter hierarchy that includes the media presentation package of FIG. 1 in accordance with the present invention.
Like-numbered elements correspond to one another in the figures.
Moreover, the operations (methods) of interfaces 160, 165 and 170 are not specified here to avoid redundancy, but are the same as specified in FIG. 1.
The class diagram 200 includes the following interfaces that are not in FIG. 1:
ServiceComponentPresenter 205, ApplicationComponentPresenter 210, MediaComponentPresenter 215, ApplicationProxy 220, StreamEventInterface 230, and MediaTimeEventInterface 240.
The MediaComponentPresenter 215 implements the MediaStreamControl interface 160. The purpose of this interface 215 is to enable control of the media streams specified. The interface 215 provides methods that can operate on several streams simultaneously. The MediaStreamControl interface 160 controls any single stream interface, or subset of stream interfaces, identified by the Locator[ ] that has been passed to the constructor of the media component presenter.
Locator[ ] is an array of Locators. All methods of the MediaStreamControl interface 160 (see also FIG. 1) take a subset of the Locator[ ] as an argument. This argument identifies the media streams that the method operates on.
The method startMediaStream starts the presentation of the media stream component(s) specified. For visual broadcast media, the meaning of this method is to make visible the associated visual components with the broadcast media presentations within them. For audio broadcast media, the meaning of the method is to unmute (allow to be audible) the audio components. The streams identified by the locators are best-effort synchronized.
If the MediaComponentPresenter 215 can not synchronize the streams, then it starts them unsynchronized. If the Locator[ ] passed to the start method (startMediaStream) is empty, all media streams are started by default.
There may be multiple start operations on the same media component presenter that affect different media streams. Each set of streams started together is best-effort synchronized. In the interface MediaTimeEventInterface 240, the method notifyWhen has a mediaTime parameter that refers to the last set of media streams started. For media where the presentation start is user-controllable, the start method (startMediaStream) additionally affects the start of the presentation.
The stopMediaStream method makes the associated visual components invisible, together with the media presentations, and stops the presentation of the media stream component(s) specified, if possible.
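A minimal usage sketch of these stream controls is given below. It assumes the application has already obtained a MediaStreamControl (for example, the MediaComponentPresenter returned by service selection) and a Locator[ ] for the streams of interest; the helper class name and the rate and mediaTime arguments of the full startMediaStream signature given in section 2.1 below are illustrative only:

import javax.tv.locator.Locator;
import org.atsc.presentation.MediaStreamControl;
import org.atsc.presentation.PresentationException;

// Hypothetical helper; names are illustrative, not part of the specification.
public final class PresentationStarter {

    // Start the streams identified by 'locators'; returns false if the start failed.
    public static boolean startAll(MediaStreamControl control, Locator[] locators) {
        try {
            // Illustrative arguments: rate 1.0f for normal speed, mediaTime 0L (assumed default).
            control.startMediaStream(locators, 1.0f, 0L);
            return true;
        } catch (PresentationException e) {
            // Covers the synchronization, resource, component and rate exceptions
            // declared by startMediaStream; all of them extend PresentationException.
            return false;
        }
    }

    // Stop the same streams; stopMediaStream declares no checked exceptions.
    public static void stopAll(MediaStreamControl control, Locator[] locators) {
        control.stopMediaStream(locators);
    }
}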
The MediaComponentPresenter interface 215 implements this VisualControl interface 165, which enables control of the visual portion of the media presentation. The VisualControl interface 165 controls any single visual stream, or subset of visual streams, identified by the Locator[ ] that has been passed to the constructor of the MediaComponentPresenter interface 215. All VisualControl 165 methods take a subset of the Locator[ ] as argument. This argument identifies the visual streams that the method operates on.
The MediaComponentPresenter interface 215 also implements the AudioControl interface 170, which enables control of the audio portion of the media presentation. The AudioControl interface 170 controls any single audio stream, or subset of audio streams, identified by the Locator[ ] that has been passed to the constructor of the MediaComponentPresenter interface 215. All AudioControl methods take a subset of the Locator[ ] as an argument. This argument identifies the audio streams that the method operates on.
1.2 Media Presenter within JavaTV
Service presentation components are associated with channel components (i.e., ChannelComponent) of a service (i.e., TvChannel) through the service selection process, i.e., ServiceSelectionAPI. These three components are semantically from MPEG. In the context of the present invention, they are literally from JavaTV. The select method call tunes to a channel, associates channel components with audio-visual component presenters, and creates HAVI UI components for the presenters. The HAVI
UI component is not visible until the associated component presenter's startVideo methods are called.
Time-based and event-based synchronization can be accomplished by the application by using the DAVIC
StreamEventInterface and MediaTimeEventInterface.
Moreover, an application can subscribe to media StreamEvent(s) and MediaTimeEvent(s) through these interfaces.
1.3 HAVI Component
Within HAVI, a "havlet" (comparable to an Xlet in JavaTV) must be able to be authored in some appropriate coordinate system(s) that match the author's requirements, with respect to the main pieces of content that are to be displayed. In particular, there may be instances where content/portions of applications have been authored in multiple coordinate spaces, while the graphical device may only support a single (different) output coordinate system. Hence, the HAVI Level 2 GUI
provides additional capabilities to those in conventional AWT components, namely allowing an explicit transformation to be made between logical pixels and device pixels.
The capabilities provided by the Level 2 GUI API
are extremely flexible in providing the ability for authors to perform transformations on an individual Component/Container basis. For example, a window containing graphics may be authored in an effectively square pixel aspect ratio, while a window containing video may be directly authored in a compatible CCIR-601 pixel space. Where an author requires exact registration between content with different aspect ratios, then he/she may author appropriately. Note that this scaling is not only required for rendering graphical output, but must also be performed for signaling any coordinate information associated with user inputs, e.g., x, y cursor location (where appropriate), mouse-enter, etc.
The org.havi.ui.Component class extends java.awt.Component to include:
1. A transformation that is used to map between logical (user-defined) pixels and the graphical device output pixels (input coordinates).
2. An association between a Component and a piece of visible Media.
3. Additional semantics related to transparency of the Component itself.
4. A mechanism for indicating the 4-way (up/down/left/right) navigation between Component objects.
A HAVI Component is associated automatically with the MediaComponentPresenter interface 215 when the select method of the ServiceContext object within the ServiceSelection API is processed.
FIG. 3 illustrates a user terminal in accordance with the present invention. A terminal 300 includes a demultiplexer 315 for receiving an input data signal, which may comprise a multiplex of media packets (e.g., audio, video, data, etc.). The data packets are routed appropriately to, e.g., an audio media player 305, a video media player 330, or a data media player 332.
The media players 305, 330 and 332 operate under a control 320 that implements an API in accordance with the invention. Code for the API can be stored in a memory 322. Generally, the code can be downloaded to the terminal 300 over the same network from which the input data is received, or via some other means, such as a local installation, smart card, etc. The control 320 may receive user inputs via a user interface 325.
The media player 305 provides an output audio signal to an audio output device 310, such as a speaker or storage device, while the media player 330 provides an output video signal to a video output device 335, such as a video monitor or storage device, and the media player 332 provides an output data signal to a data output device 337, such as a personal computer (PC).
2. Media Presentation Specification
In the following sections, we specify the media presentation API in accordance with the present invention.
2.1 MediaStreamControl
org.atsc.presentation
Interface MediaStreamControl 160.
This interface can be declared by the Java code:
"public abstract interface MediaStreamControl".
The MediaComponentPresenter interface 215 implements this MediaStreamControl interface 160, which enables control of the media streams specified. The interface provides methods that can operate on several streams simultaneously. The MediaStreamControl interface 160 controls any single stream interface, or subset of stream interfaces, identified by the Locator[
] that has been passed to the constructor of the media component presenter. All MediaStreamControl methods take a subset of the Locator[ ] as an argument. This argument identifies the media streams that the method operates on.
Method Summary
int checkpointMediaStream(javax.tv.locator.Locator[ ] l)
    If possible, a checkpoint is taken which enables a rewind operation.
javax.tv.locator.Locator[ ] getPresentingLocators( )
    Returns locators of all streams which have been started.
void pauseMediaStream(javax.tv.locator.Locator[ ] l)
    Pauses the presentation of the media component(s).
void resumeMediaStream(javax.tv.locator.Locator[ ] l, float rate, long mediaTime)
    Continues with the presentation from the PAUSE mode.
void rewindMediaStream(int cp, javax.tv.locator.Locator[ ] l)
    If possible, the presentation starts from the checkpoint specified (0 is the beginning).
void setLanguage(java.lang.String lang, javax.tv.locator.Locator[ ] l)
    Sets the ISO language of the presentation.
void startMediaStream(javax.tv.locator.Locator[ ] l, float rate, long mediaTime)
    Starts the presentation of the media stream component(s) specified.
void stopMediaStream(javax.tv.locator.Locator[ ] l)
    Makes the associated visual components invisible together with the media presentations and stops the presentation of the media stream component(s) specified, if possible.
Method Detail

startMediaStream
public void startMediaStream(javax.tv.locator.Locator[ ] l, float rate, long mediaTime) throws PresentationSynchronizationException, PresentationResourceException, PresentationComponentException, PresentationRateException
Starts the presentation of the media stream component(s) specified. For visual broadcast media, the meaning of this method is to make visible the associated visual components with the broadcast media presentations within them. For audio broadcast media, the meaning of the method is to unmute the audio components. The streams identified by the locators are best-effort synchronized. If the MediaComponentPresenter can not synchronize the streams, it starts them unsynchronized. If the Locator[ ] passed to the start method is empty, all media streams are started by default.
Moreover, there may be multiple start operations on the same media component presenter affecting different media streams. Each set of streams started together is best-effort synchronized. The mediaTime parameter of the notifyWhen method in the MediaTimeEventInterface refers to the last set of media streams started. For media where the presentation start is user-controllable, the start method additionally affects the start of the presentation.
Adding an additional stream and resynchronizing is accomplished by starting all of the streams that are to be played synchronized, without stopping any of those that were already started.
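A sketch of that pattern follows; the helper name and the way the locator union is built are illustrative assumptions, not part of the specification:

import javax.tv.locator.Locator;
import org.atsc.presentation.MediaStreamControl;
import org.atsc.presentation.PresentationException;

// Hypothetical helper: resynchronize an extra stream with streams that are already playing.
public final class ResyncHelper {

    // 'alreadyPlaying' are streams started earlier; 'additional' is the new stream.
    // Starting the union again, without any stop call, asks the presenter to
    // best-effort synchronize all of them while the original streams keep playing.
    public static void addAndResync(MediaStreamControl control,
                                    Locator[] alreadyPlaying,
                                    Locator additional,
                                    float rate,
                                    long mediaTime) throws PresentationException {
        Locator[] union = new Locator[alreadyPlaying.length + 1];
        System.arraycopy(alreadyPlaying, 0, union, 0, alreadyPlaying.length);
        union[alreadyPlaying.length] = additional;
        control.startMediaStream(union, rate, mediaTime);
    }
}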
stopMediaStream
public void stopMediaStream(javax.tv.locator.Locator[ ] l)
Makes the associated visual components invisible together with the media presentations and stops the presentation of the media stream component(s) specified, if possible.
pauseMediaStream
public void pauseMediaStream(javax.tv.locator.Locator[ ] l)
Pauses the presentation of the media component(s).
An implementation of this method for a visual component might be to freeze the display with the current "value"
of the visual component.
resumeMediaStream
public void resumeMediaStream(javax.tv.locator.Locator[ ] l, float rate, long mediaTime) throws PresentationSynchronizationException, PresentationResourceException, PresentationComponentException, PresentationRateException
Continues with the presentation from the PAUSE mode.
rewindMediaStream
public void rewindMediaStream(int cp, javax.tv.locator.Locator[ ] l) throws RewindNotPossibleException
If possible, the presentation starts from the checkpoint specified (0 is the beginning).
Presentations dealing with real-time streaming content do not have to support this method.
checkpointMediaStream
public int checkpointMediaStream(javax.tv.locator.Locator[ ] l) throws PresentationCheckpointException
If possible, a checkpoint is taken which enables a rewind operation. The method returns the integer value of the checkpoint taken. Presentations dealing with real-time streaming content do not have to support this method.
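The following sketch pairs checkpointMediaStream with rewindMediaStream; the helper name and the choice to report failures through return values rather than propagate the exceptions are illustrative assumptions:

import javax.tv.locator.Locator;
import org.atsc.presentation.MediaStreamControl;
import org.atsc.presentation.PresentationCheckpointException;
import org.atsc.presentation.RewindNotPossibleException;

// Hypothetical helper: mark a point in the playing streams and jump back to it later.
public final class CheckpointHelper {

    // Returns the checkpoint id, or -1 if the presentation does not support checkpoints
    // (real-time streaming content is allowed to reject this operation).
    public static int mark(MediaStreamControl control, Locator[] locators) {
        try {
            return control.checkpointMediaStream(locators);
        } catch (PresentationCheckpointException e) {
            return -1;
        }
    }

    // Restarts the presentation from checkpoint 'cp' (0 is the beginning), if possible.
    public static boolean jumpBack(MediaStreamControl control, int cp, Locator[] locators) {
        try {
            control.rewindMediaStream(cp, locators);
            return true;
        } catch (RewindNotPossibleException e) {
            return false;
        }
    }
}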
setLanguage
public void setLanguage(java.lang.String lang, javax.tv.locator.Locator[ ] l) throws PresentationComponentException
Sets the ISO language of the presentation.
getPresentingLocators
public javax.tv.locator.Locator[ ] getPresentingLocators( ) throws PresentationComponentException
Gets the locators of all streams which have been started.
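Collecting the signatures above, the MediaStreamControl interface can be written out as the following Java sketch; the parameter names and the grouping into a single declaration are a reconstruction from the method summary and detail, not a normative listing:

package org.atsc.presentation;

import javax.tv.locator.Locator;

// Reconstructed from the method summary and detail above; formatting is illustrative.
public abstract interface MediaStreamControl {

    void startMediaStream(Locator[] l, float rate, long mediaTime)
            throws PresentationSynchronizationException, PresentationResourceException,
                   PresentationComponentException, PresentationRateException;

    void stopMediaStream(Locator[] l);

    void pauseMediaStream(Locator[] l);

    void resumeMediaStream(Locator[] l, float rate, long mediaTime)
            throws PresentationSynchronizationException, PresentationResourceException,
                   PresentationComponentException, PresentationRateException;

    void rewindMediaStream(int cp, Locator[] l) throws RewindNotPossibleException;

    int checkpointMediaStream(Locator[] l) throws PresentationCheckpointException;

    void setLanguage(String lang, Locator[] l) throws PresentationComponentException;

    Locator[] getPresentingLocators() throws PresentationComponentException;
}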

2.2 VisualControl
org.atsc.presentation
Interface VisualControl 165
This interface can be declared by the Java code:
"public abstract interface VisualControl".
The MediaComponentPresenter 215 implements this VisualControl interface 165, which enables control of the visual portion of the media presentation. The VisualControl interface controls any single visual stream, or subset of visual streams identified by the Locator[ ] that has been passed to the constructor of the media component presenter. All VisualControl methods take a subset of the Locator[ ] as argument.
This argument identifies the visual streams that the method operates on.
Method Summary
int[ ][ ] getFrame(javax.tv.locator.Locator l)
    Returns an integer array containing the raw pixels of the visual presentation.
float[ ] getScale(javax.tv.locator.Locator l)
    The getScale method retrieves the width and height scale values.
java.awt.Component[ ] getVisualComponents(javax.tv.locator.Locator[ ] l)
    Returns the visual component attached to the class implementing this interface.
int getVisualFormat(javax.tv.locator.Locator l)
    Gets the video format to be one of the values specified.
int getVisualTransparency(javax.tv.locator.Locator l)
    Gets the transparency of the visual component.
float[ ] rescaleVisual(float wValue, float hValue, javax.tv.locator.Locator l)
    As a default, the presentation has the width and height values that are specific to the visual component.
int setVisualFormat(int format, javax.tv.locator.Locator l)
    Sets the video format to be the one specified.
void setVisualTransparency(int value, javax.tv.locator.Locator[ ] l)
    Sets the transparency of the visual component as an integer value between 0 and MAX_TRANSPARENCY_VALUE that is to be specified.
Method Detail

setVisualTransparency
public void setVisualTransparency(int value, javax.tv.locator.Locator[ ] l)
Sets the transparency of the visual component as an integer value between 0 and MAX_TRANSPARENCY_VALUE that is to be specified.

getVisualTransparency
public int getVisualTransparency(javax.tv.locator.Locator l)
Gets the transparency of the visual component.
rescaleVisual
public float[ ] rescaleVisual(float wValue, float hValue, javax.tv.locator.Locator l) throws PresentationRescaleException
As a default, the presentation has the width and height values that are specific to the visual component.
This method scales the visual presentation size by the specified scale values wValue and hValue for the width and height, respectively. The new width and height are calculated as width=width*wValue and height=height*hValue. The scale values are positive floats. The range of values is to be specified by the implementation. rescaleVisual returns the width and height scale values actually applied.
getScale
public float[ ] getScale(javax.tv.locator.Locator l)
The getScale method retrieves the width and height scale values.
setVisualFormat
public int setVisualFormat(int format, javax.tv.locator.Locator l) throws VideoFormatException
Sets the video format to be the one specified.
getVisualFormat
public int getVisualFormat(javax.tv.locator.Locator l) throws VideoFormatException
Gets the video format to be one of the values specified.
getVisualComponents
public java.awt.Component[ ] getVisualComponents(javax.tv.locator.Locator[ ] l)
Returns the visual component attached to the class implementing this interface.
getFrame
public int[ ][ ] getFrame(javax.tv.locator.Locator l) throws FrameCaptureFailedException
Returns an integer array containing the raw pixels of the visual presentation. The size of the two-dimensional array is determined by the scale factor as well as the dimensions set by the visual component.
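A short usage sketch for these visual controls follows; the helper class name, the half-size scale factors, and the assumed 720 x 480 source dimensions are illustrative only:

import javax.tv.locator.Locator;
import org.atsc.presentation.FrameCaptureFailedException;
import org.atsc.presentation.PresentationRescaleException;
import org.atsc.presentation.VisualControl;

// Hypothetical helper exercising the VisualControl methods described above.
public final class VisualAdjuster {

    // Shrink the visual presentation to half size and report the scales actually applied.
    public static float[] halfSize(VisualControl visual, Locator l)
            throws PresentationRescaleException {
        // For an assumed 720 x 480 source this requests 360 x 240 (width*0.5, height*0.5);
        // the implementation may clamp the values, so the returned array is authoritative.
        return visual.rescaleVisual(0.5f, 0.5f, l);
    }

    // Capture the currently presented frame as raw pixels, or return null if unsupported.
    public static int[][] snapshot(VisualControl visual, Locator l) {
        try {
            return visual.getFrame(l);
        } catch (FrameCaptureFailedException e) {
            return null;
        }
    }
}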
2.3 AudioControl
org.atsc.presentation
Interface AudioControl
public abstract interface AudioControl
The MediaComponentPresenter implements this AudioControl interface, which enables control of the audio portion of the media presentation. The AudioControl controls any single or subset of audio streams identified by the Locator[ ] that has been passed to the constructor of the media component presenter. All AudioControl methods take a subset of the Locator[ ] as argument. This argument identifies the audio streams that the method operates on.
Method Summary
java.awt.Component[ ] getAudioComponents(javax.tv.locator.Locator[ ] l)
    Returns the audio component attached to the class implementing this interface.
int getAudioVolume(javax.tv.locator.Locator l)
    Gets the volume of the audio presentation.
void mute(javax.tv.locator.Locator[ ] l)
    This is equivalent to setVolume(0).
void setAudioVolume(int value, javax.tv.locator.Locator[ ] l)
    Sets the volume of the audio presentation as an integer between 0 and MAX_AUDIO_VOLUME that is to be specified.
Method Detail

setAudioVolume
public void setAudioVolume(int value, javax.tv.locator.Locator[ ] l)
Sets the volume of the audio presentation as an integer between 0 and MAX_AUDIO_VOLUME that is to be specified.
mute
public void mute(javax.tv.locator.Locator[ ] l)
This is equivalent to setVolume(0).
getAudioComponents
public java.awt.Component[ ] getAudioComponents(javax.tv.locator.Locator[ ] l)
Returns the audio component attached to the class implementing this interface.
getAudioVolume
public int getAudioVolume(javax.tv.locator.Locator[ ] l)
Gets the volume of the audio presentation.
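A corresponding sketch for the audio controls; the helper class name and the treatment of MAX_AUDIO_VOLUME as an implementation-defined limit are illustrative assumptions:

import javax.tv.locator.Locator;
import org.atsc.presentation.AudioControl;

// Hypothetical helper exercising the AudioControl methods described above.
public final class AudioAdjuster {

    // Silence the identified audio streams; per the specification this equals setVolume(0).
    public static void silence(AudioControl audio, Locator[] l) {
        audio.mute(l);
    }

    // Set an absolute volume; 'value' must lie between 0 and the implementation-defined
    // maximum (MAX_AUDIO_VOLUME), which is not fixed by this specification.
    public static void setVolume(AudioControl audio, Locator[] l, int value) {
        audio.setAudioVolume(value, l);
    }
}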
2.4 Media Control Exceptions

2.4.1 PresentationException
org.atsc.presentation
Class PresentationException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
Direct Known Subclasses: FrameCaptureFailedException, PresentationCheckpointException, PresentationComponentException, PresentationRateException, PresentationRescaleException, PresentationResourceException, PresentationSynchronizationException, RewindNotPossibleException, VideoFormatException
public class PresentationException extends java.lang.Exception
The PresentationException is the base for all presentation exceptions.
See Also: Serialized Form

2.4.2 FrameCaptureFailedException
org.atsc.presentation
Class FrameCaptureFailedException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.FrameCaptureFailedException
public class FrameCaptureFailedException extends PresentationException
A FrameCaptureFailedException is thrown in case it is not possible to capture a frame.
See Also: Serialized Form

2.4.3 PresentationCheckpointException
org.atsc.presentation
Class PresentationCheckpointException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.PresentationCheckpointException
public class PresentationCheckpointException extends PresentationException
The PresentationCheckpointException is thrown in case a presentation checkpoint can not be made.
See Also: Serialized Form

2.4.4 PresentationComponentException
org.atsc.presentation
Class PresentationComponentException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.PresentationComponentException
public class PresentationComponentException extends PresentationException
The PresentationComponentException is thrown in case of problems with the content being presented, i.e., one of the channel components being presented.
See Also: Serialized Form

2.4.5 PresentationRateException
org.atsc.presentation
Class PresentationRateException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.PresentationRateException
public class PresentationRateException extends PresentationException
The PresentationRateException is thrown in case the presentation rate can not be set to the rate specified.
See Also: Serialized Form

2.4.6 PresentationRescaleException
org.atsc.presentation
Class PresentationRescaleException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.PresentationRescaleException
public class PresentationRescaleException extends PresentationException
A PresentationRescaleException is thrown in case it is not possible to set the video scale to the format desired.
See Also: Serialized Form

2.4.7 PresentationResourceException
org.atsc.presentation
Class PresentationResourceException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.PresentationResourceException
public class PresentationResourceException extends PresentationException
The PresentationResourceException is thrown in case the presentation can not be performed because of resource failures or resource allocation problems.
See Also: Serialized Form

2.4.8 PresentationSynchronizationException
org.atsc.presentation
Class PresentationSynchronizationException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.PresentationSynchronizationException
public class PresentationSynchronizationException extends PresentationException
The PresentationSynchronizationException is thrown in case the media streams identified by the locator parameters in a start or resume method can not be synchronized.
See Also: Serialized Form

2.4.9 RewindNotPossibleException
org.atsc.presentation
Class RewindNotPossibleException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.RewindNotPossibleException
public class RewindNotPossibleException extends PresentationException
A RewindNotPossibleException is thrown if it is not possible to rewind the presentation.
See Also: Serialized Form

2.4.10 VideoFormatException
org.atsc.presentation
Class VideoFormatException
java.lang.Object
  +--java.lang.Throwable
        +--java.lang.Exception
              +--org.atsc.presentation.PresentationException
                    +--org.atsc.presentation.VideoFormatException
public class VideoFormatException extends PresentationException
A VideoFormatException is thrown in case it is not possible to set the video format to the format desired.
See Also: Serialized Form
Accordingly, it can be seen that the present invention provides a mechanism and apparatus for presentation control of multi-media services in a terminal, including but not limited to the synchronization and control of audio, visual, and data streams. The API provides a simple alternative to the Java Media Framework for JavaTV and other television software environments, including ATSC DASE and DVB MHP. This API enables playing back media, and controlling the playback. The API includes a media presentation control package, "MediaComponentPresenter," and a service presenter hierarchy. The API controls stopping, starting, suspending and resuming the presentation of the streaming audio and/or video at a media player, e.g., at a television set-top receiver. The API also provides a start method that attempts to synchronize the presentation of audio and/or video streams according to associated locators of the streams. The API also can take checkpoints of the audio and/or video as it is playing, where the checkpoints indicate temporal locations of the audio and/or video for initiating a rewind operation.
Although the invention has been described in connection with various specific embodiments, those skilled in the art will appreciate that numerous adaptations and modifications may be made thereto without departing from the spirit and scope of the invention as set forth in the claims.
For example, while various syntax elements have been discussed herein, note that they are examples only, and any syntax may be used.
Moreover, the invention is suitable for use with virtually any type of network, including cable or satellite television broadband communication networks, local area networks (LANs), metropolitan area networks (MANs), wide area networks (WANs), internets, intranets, and the Internet, or combinations thereof.
Furthermore, while the invention is useful for television set-top terminals, it can be implemented in any type of multi-media terminal, including a personal computer.

Claims (20)

What is claimed is:
1. A multi-media terminal, comprising:
means for receiving control information from an external source;
a computer readable medium having computer program code means; and means for executing said computer program code means to implement an application program interface (API); wherein:
the API is responsive to said control information for controlling at least one presentation function of a media player associated with the terminal; and the at least one presentation function is related to the presentation of streaming audio and/or video.
2. The terminal of claim 1, wherein:
the terminal comprises a digital television receiver.
3. The terminal of claim 1 or 2, wherein the presentation function(s) comprise(s) at least one of:
stopping, starting, suspending and resuming the presentation of the streaming audio and/or video at the media player.
4. The terminal of any one of claims 1 to 3, wherein:
the external source comprises a user remote control.
5. The terminal of any one of claims 1 to 4, wherein:
the external source comprises an external applications program.
6. The terminal of any one of claims 1 to 5, wherein:
said API controls the media player based on said control information regardless of a specific implementation of the media player.
7. The terminal of any one of claims 1 to 6, wherein said API is compatible with at least one of the following API standards for digital television receivers:
Java TV, Advanced Television Systems Committee Digital TV Application Software Environment, and Digital Video Broadcast Multimedia Home Platform.
8. The terminal of any one of claims 1 to 7, wherein:
the API provides a start method that is adapted to attempt to synchronize the presentation of a plurality of audio and/or video streams according to associated locators of the streams.
9. The terminal of claim 8, wherein:
if the API cannot synchronize the streams within a period of time, it starts them unsynchronized.
10. The terminal of any one of claims 8 to 9, wherein:
if the associated locators are empty, the associated streams are started by default without the start method attempting to synchronize them.
11. The terminal of any one of claims 8 to 10, wherein:
when the plurality of audio and/or video streams are already synchronized with each other, and at least one additional audio and/or video stream is to be synchronized with the already-synchronized streams, the start method is adapted to attempt to synchronize the additional stream with the already-synchronized streams without stopping the playing of the already-synchronized streams.
12. The terminal of any one of claims 1 to 11, wherein:
the API takes checkpoints of the audio and/or video as it is playing; and said checkpoints indicate temporal locations of the audio and/or video for initiating a rewind operation.
13. The terminal of any one of claims 1 to 12, wherein:
the API controls the presentation function of a visual stream.
14. The terminal of any one of claims 1 to 13, wherein:
the API controls the presentation function of an audio stream.
15. The terminal of any one of claims 1 to 14, wherein:
the API is responsive to said control information for controlling the at least one presentation function of the media player in isolation from other functions of the media player.
16. A method for operating a multi-media terminal, comprising the steps of:
receiving control information from an external source;
providing a computer readable medium having computer program code means; and executing said computer program code means to implement an application program interface (API);
wherein:
the API is responsive to said control information for controlling at least one presentation function of a media player associated with the terminal; and the at least one presentation function is related to the presentation of streaming audio and/or video.
17. The method of claim 16, wherein the presentation function(s) comprise(s) at least one of:

stopping, starting, suspending and resuming the presentation of the streaming audio and/or video at the media player.
18. The method of any one of claims 16 to 17, wherein said API is compatible with at least one of the following API standards for digital television receivers:
Java TV, Advanced Television Systems Committee Digital TV Application Software Environment, and Digital Video Broadcast Multimedia Home Platform.
19. The method of any one of claims 16 to 18, wherein:
the API provides a start method that is adapted to attempt to synchronize the presentation of a plurality of audio and/or video streams according to associated locators of the streams.
20. The method of any one of claims 16 to 19, wherein:
the API takes checkpoints of the audio and/or video as it is playing; and said checkpoints indicate temporal locations of the audio and/or video for initiating a rewind operation.
CA002301935A 1999-03-23 2000-03-22 Streaming media control and synchronization application program interface (api) for a digital television receiver Abandoned CA2301935A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US12578699P 1999-03-23 1999-03-23
US60/125786 1999-03-23
US12775399P 1999-04-05 1999-04-05
US60/127753 1999-04-05

Publications (1)

Publication Number Publication Date
CA2301935A1 true CA2301935A1 (en) 2000-09-23

Family

ID=26823948

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002301935A Abandoned CA2301935A1 (en) 1999-03-23 2000-03-22 Streaming media control and synchronization application program interface (api) for a digital television receiver

Country Status (1)

Country Link
CA (1) CA2301935A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10789389B2 (en) 2002-06-10 2020-09-29 Tailstream Technologies, Llc Remote data viewer
WO2005055232A1 (en) * 2003-12-01 2005-06-16 Samsung Electronics Co., Ltd. Reproducing apparatus and method and computer readable recording medium storing program for executing the method
KR100760087B1 (en) 2006-02-13 2007-09-18 엘지전자 주식회사 Method for setting media player for playing service or component
US7720937B2 (en) 2006-02-13 2010-05-18 Lg Electronics Inc. Apparatus for playing media and method of setting the same
US7765330B2 (en) 2006-02-13 2010-07-27 Lg Electronics Inc. Apparatus for playing media and method of setting resources thereof
US8306391B2 (en) 2006-05-08 2012-11-06 Thomson Licensing Method for resuming content reproduction across devices
CN110740114A (en) * 2018-07-20 2020-01-31 视联动力信息技术股份有限公司 method, device and equipment for synchronizing streaming media data

Similar Documents

Publication Publication Date Title
KR100929474B1 (en) Contextual web page system and method
US20040117858A1 (en) Data enhanced multi-media system for an external device
CA2509578C (en) Data enhanced multi-media system for a set-top terminal
KR20050088414A (en) Interactive television system with partial character set generator
US8695034B2 (en) Delivering on screen display data to existing display devices
JP2001511629A (en) Digital transport stream processing
EP2784641A1 (en) User interface display method and device using same
CN112367543A (en) Display device, mobile terminal, screen projection method and screen projection system
US20050246758A1 (en) Authoring system and method for supplying tagged media content to portable devices receiving from plural disparate sources
CN111479145A (en) Display device and television program pushing method
KR100400002B1 (en) Apparatus and method for processing an adding information in the data broadcasting system
US20040098730A1 (en) Interpretation of DVD assembly language programs in Java TV-based interactive digital television environments
CA2301935A1 (en) Streaming media control and synchronization application program interface (api) for a digital television receiver
US20070200954A1 (en) Apparatus and method for controlling the screen size of real-time video
US8570440B2 (en) Method of controlling resolution of digital data broadcasting receiver, apparatus therefor, and digital data broadcasting receiver using the same
JP2009044401A (en) Receiver
JP4303884B2 (en) Modem control
CN112261463A (en) Display device and program recommendation method
KR101711840B1 (en) Image display apparatus and method for operating the same
JPH11317990A (en) Controller
KR20130033813A (en) Image display apparatus, and method for operating the same
WO2004055630A2 (en) Data enhanced multi-media system for a headend
CN113473175A (en) Content display method and display equipment
CN115119030A (en) Subtitle processing method and device
JPH11331958A (en) Controller

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued
FZDE Discontinued

Effective date: 20100907