US20050091691A1 - Client/server multimedia presentation system - Google Patents

Client/server multimedia presentation system

Info

Publication number
US20050091691A1
Authority
US
United States
Prior art keywords
quality
data
frame
data portion
quality supplement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/978,450
Inventor
Kensuke Maruya
Toshio Oka
Akino Inoue
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Priority to US10/978,450
Publication of US20050091691A1
Status: Abandoned

Classifications

    • H04N 19/61: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding
    • H04N 19/30: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • H04N 21/21: Server components or server architectures
    • H04N 21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/2387: Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • H04N 21/4333: Processing operations in response to a pause request

Definitions

  • the invention relates to a client/server system in which a client terminal, connected with a server through a transmission path of limited bandwidth, plays a multimedia program that is stored in the server and comprises multimedia objects such as moving pictures, still pictures, sounds and texts, while reading the objects in real time from the server.
  • a user of a client terminal is permitted to select one of a plurality of programs and to enter commands such as a play, a stop, a head search, a jump forward and a jump backward for the selected program.
  • the server reads data of a specified program stored in a storage device and transmits the read data to the client terminal. Then, the client terminal displays the received data.
  • when the stop, head search, jump forward or jump backward command is executed, no data is transmitted from the server to the client terminal until the play command is executed.
  • the transmission rate of the transmission path between the server and each terminal is limited, i.e., the quantity of data transmitted for a certain period of time is limited. For this reason, in order to enable each client terminal to play a program whose data is stored in the server while having the program data transmitted from the server, video data of each program is stored in the server such that the bit rate (or the quantity of data reproduced or played per second) of video data of each program does not exceed the transmission rate of the transmission paths.
  • a method of presenting, at a client terminal, a video program stored in a server linked with the client terminal via a transmission path of limited bandwidth is provided.
  • Each frame of the video program comprises a basic data portion and at least one level of quality supplement data portions.
  • the client terminal determines a start (or play) position in the video program according to the issued play control command.
  • the play control commands include a play, a stop, a head search, a jump forward and a jump backward command.
  • the terminal obtains and uses the basic data portions for playing the video program.
  • the terminal obtains and uses the at least one level of quality supplement data portions of a last displayed frame for displaying a quality-enhanced version of the last displayed frame.
  • a method of presenting, at a client terminal, a multimedia program which is stored in a server and includes a video object is provided.
  • Each frame of the video object comprises a basic data portion and at least one level of detailed data portions.
  • the terminal, in response to one of play control commands from a user, determines a time count in the multimedia program according to the issued play control command.
  • the play control commands include a play, a stop, a head search, a jump forward and a jump backward command.
  • the terminal determines whether there is a video object to be displayed at the time count in the multimedia program.
  • the terminal obtains at least one level of quality supplement data portions for a first frame to be displayed in a next play operation, for displaying a quality-enhanced version of the first frame to be displayed.
  • a test is responsively made to see if there are multimedia objects which are other than video objects, each comprise basic data and quality supplement data, and are to be displayed at said time count in said multimedia program. If so, then for each of said found multimedia objects, the terminal obtains the quality supplement data for displaying a quality-enhanced version of the object.
  • a test is made to see if there are multimedia objects which are other than video objects, each comprise basic data and quality supplement data, and are to be displayed later. If so, the terminal tries to obtain the basic data for as many of the found multimedia objects as possible in advance.
  • a terminal for presenting a video program stored in a remote server connected therewith via a band-limited transmission path is provided.
  • Each frame of the video program comprises a basic data portion and at least one level of quality supplement data portions.
  • the terminal comprises means, responsive to one of play control commands from a user, for determining a start position in the video program according to the issued play control command.
  • the play control commands include a play, a stop, a head search, a jump forward and a jump backward command.
  • the terminal includes means, responsive to the play command from the user, for obtaining and using said basic data portions for playing said video program; and means, responsive to the stop command, for obtaining and using at least one level of quality supplement data portions of a last displayed frame for displaying a quality-enhanced version of the last displayed frame.
  • FIG. 1 is a schematic block diagram showing an arrangement of a multimedia-on-demand system that can embody the present invention in various forms;
  • FIG. 2 is a schematic diagram showing the contents of the ROM 52 and the RAM 54 ;
  • FIG. 3 is a diagram showing exemplary presentation control buttons included in the control switches 70 of FIG. 1 ;
  • FIG. 4 is a table for describing how the scenario time manager 101 sets the value Ct of the scenario time register 501 in response to the executed presentation control command;
  • FIG. 5 is a diagram conceptually showing a data structure of a video object stored in the mass storage 20 of FIG. 1 ;
  • FIG. 6 is a flowchart showing an exemplary operation of an interrupt subroutine called from a main program in response to an interrupt caused by a pressing of one of the presentation control buttons of FIG. 3 after one of the available programs stored in the mass storage 20 is specified by the user at the client terminal 3 ;
  • FIGS. 7 and 8 are diagrams conceptually showing arrangements of a first and a second exemplary mass storage 20 a and 20 b using tape storage devices;
  • FIGS. 9 and 10 are diagrams showing a first storing scheme 20 c and a second storing scheme 20 d of storing a video object on a disc storage device;
  • FIG. 11 is a diagram showing an address table used in the second storing scheme 20 d of FIG. 10 ;
  • FIG. 12 is a diagram showing a third storing scheme 20 e of storing a video object on a disc storage device
  • FIG. 13 is a diagram showing a data structure obtained when a progressive JPEG video object is stored in the third storing scheme 20 e as shown in FIG. 12 ;
  • FIG. 14 is a diagram conceptually showing the H.263 video format
  • FIG. 15 is a schematic block diagram showing an exemplary arrangement of a video decoder 600 which is included in the video & audio decoder 60 of FIG. 1 and which is adapted to decode a video object of the H.263 format as shown in FIG. 14 ;
  • FIG. 16 is a diagram showing an exemplary scenario data table of a multimedia program available in a multimedia-on-demand system according to a second illustrative embodiment of the invention:
  • FIG. 17 is a diagram showing an exemplary active object table created from the scenario data table of FIG. 16 ;
  • FIG. 18 is a flowchart showing an operation of an interrupt subroutine called from a main program in response to an interrupt caused by the user at the client terminal 3 pressing one of the presentation control buttons of FIG. 3 after specifying one of the available multimedia programs stored in the mass storage 20 ;
  • FIG. 19 is a diagram showing various variable-quality objects used in an exemplary multimedia program available in the inventive multimedia-on-demand system
  • FIG. 20 is a diagram showing how the multimedia objects of FIG. 19 are presented in the exemplary multimedia program
  • FIG. 21 is a diagram showing a way of transmitting the multimedia objects to present the objects as shown in FIG. 20 ;
  • FIG. 22 is a flowchart showing an operation of an interrupt subroutine called in response to a pressing of one of the presentation control buttons of FIG. 3 according to the third embodiment of the invention.
  • FIG. 23 is a diagram showing an exemplary structure of a scenario data table according to the third embodiment of the invention.
  • FIG. 24 is a diagram showing an exemplary arrangement of a load flag storage location.
  • FIG. 1 is a schematic block diagram showing an arrangement of a multimedia-on-demand system that can embody the present invention in various forms.
  • double lines indicate bus lines.
  • the multimedia-on-demand system comprises a server 1 that stores and serves multimedia programs, at least one client terminal 3 that plays one of the multimedia programs, and a network 2 for connecting the server 1 and the client terminals 3 .
  • the server 1 comprises a controller 10 , a mass storage 20 for storing the multimedia programs, an object data transmitter 30 for transmitting object data constituting a multimedia program and a control data communication interface (IF) 40 for communicating control data with each client terminal 3 .
  • the client terminal 3 comprises an object data receiver 35 for receiving object data transmitted from the server 1 ; a control data communication IF 40 for communicating the control data with the server 1 ; a controller 50 for controlling the operation of the terminal 3 ; a video and audio decoder 60 for decoding video and audio object data into video and audio output signals; a video output portion 80 for providing a video output according to the video signal from the decoder 60 and the image data from the controller 50 ; an audio output portion 90 for providing an audio output according to the audio signal from the decoder 60 and the audio data from the controller 50 ; and control switches 70 that permit the user to specify a desired one of the multimedia programs stored in the mass storage 20 and to enter the play, stop, jump forward, jump backward and head search commands.
  • the controller 50
  • FIG. 2 is a schematic diagram showing the contents of the ROM 52 and the RAM 54 .
  • the ROM 52 stores programs 100 necessary for the operation of the controller 50 .
  • the RAM 54 stores various data 500 necessary for the operation of the controller 50 .
  • the programs 100 include a scenario time manager 101 , which sets the value (Ct) of a scenario time register 501 in the RAM 54 in a play operation of a multimedia program such that the current scenario time or the current position in the multimedia program is given by T*Ct, where T is a frame period of the video objects.
  • the control switches 70 include presentation control buttons for head search (HS), jump forward (JF), play, stop, jump backward (JB) operations as shown in FIG. 3 .
  • FIG. 4 shows a table for describing how the scenario time manager 101 sets the value Ct of the scenario time register 501 in response to the executed presentation control command.
  • the scenario time manager 101 increments the value Ct of the scenario time register 501 for every frame period T as long as the play command is active.
  • in response to the JF, JB or HS command, the scenario time manager 101 sets the scenario time register value Ct to Ct+Cj, Ct−Cj or 0, respectively.
  • Cj is a predetermined jump distance for use in the JF and JB operations.
  • in response to the stop command, the scenario time manager 101 does nothing, i.e., the value Ct remains unchanged.
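The FIG. 4 behavior of the scenario time manager 101 can be summarized in a minimal Python sketch. The class and method names are illustrative only, and the clamping of Ct at 0 on a jump backward past the beginning is an assumption not stated in the description:

```python
class ScenarioTimeManager:
    """Illustrative model of the scenario time manager 101 (FIG. 4)."""

    def __init__(self, jump_distance_cj: int):
        self.ct = 0                 # scenario time register 501 value
        self.cj = jump_distance_cj  # predetermined jump distance Cj

    def tick(self) -> int:
        # incremented every frame period T while the play command is active
        self.ct += 1
        return self.ct

    def on_command(self, command: str) -> int:
        if command == "JF":      # jump forward: Ct + Cj
            self.ct += self.cj
        elif command == "JB":    # jump backward: Ct - Cj (clamp at 0 is an assumption)
            self.ct = max(0, self.ct - self.cj)
        elif command == "HS":    # head search: back to the start of the program
            self.ct = 0
        elif command == "STOP":  # stop leaves Ct unchanged
            pass
        return self.ct
```

The current scenario time is then T*Ct, with T the frame period, as described above.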
  • Each of the multimedia programs generally comprises video objects, still picture objects, audio objects and/or text objects.
  • the multimedia-on-demand system of FIG. 1 is a video-on-demand system, i.e., each of the programs available at each client comprises a video object.
  • each frame data Ff comprises a basic image data portion F 0 f and at least one level (e.g., 4 levels in FIG. 5 ) of quality supplement data portions F 1 f , F 2 f , F 3 f and so on.
  • F is the number of frames of a video object.
  • the suffix f is the frame number of the frame.
  • Zero “0” following “F” in a label given to each data portion indicates that the data portion is the basic image data.
  • the client terminal 3 uses only the basic image data F 0 1 , F 0 2 , . . . , F 0 N (hereinafter referred to collectively as "F0") in a play operation.
  • when the terminal 3 detects a stop command in an arbitrary state of operation, or detects one of the head search (HS), jump forward (JF) and jump backward (JB) commands during a stop state, the terminal 3 performs an image quality enhancing operation on entering a stop state or just after the operation of the detected command, by obtaining the quality supplement data F 1 f , F 2 f , F 3 f and F 4 f for the last displayed frame from the server 1 .
  • FIG. 6 is a flowchart showing an operation executed by the controller 50 of the client terminal 3 under the control of an interrupt subroutine called from a main program in response to an interrupt caused by the user at the client terminal 3 pressing one of the presentation control buttons of FIG. 3 after specifying one of the available programs stored in the mass storage 20 .
  • the controller 50 first makes a test in step 122 to see if a play operation flag (not shown) is logical "1" or indicates that the terminal is in one of the play modes: i.e., a (normal) play, a jump forward (JF) play, a jump backward (JB) play, and a head search (HS) play (or a JB play to the beginning of the program). If so (which means that the terminal 3 is in a play mode), a test is made in step 124 to see which presentation control command has been issued.
  • the controller 50 resets the play operation flag, i.e., sets the flag logical "0", in step 136 ; ceases the play mode in step 137 ; and performs an image quality enhancing operation for the frame identified by the value Ct.
  • the controller 50 transmits an image quality enhancing instruction and the register 501 value f to the server 1 .
  • the controller 10 of the server 1 responsively reads the quality supplement data F 1 f , F 2 f , F 3 f and F 4 f for the frame identified by the value f from the mass storage 20 and transmits them to the requesting client terminal 3 .
  • the terminal 3 responsively adds the received quality supplement data F 1 f , F 2 f , F 3 f and F 4 f to the basic image data F 0 f to form high-quality frame data. By doing this, the quality of the currently displayed frame gradually becomes better with the receptions of the quality supplement data F 1 f , F 2 f , F 3 f and F 4 f .
  • the controller 50 ends the operation 120 .
  • if the play operation flag is not logical "1" (which means that the terminal 3 is in a stop mode), a test is made in step 125 to see which presentation control command has been issued.
  • the controller 50 sets the play operation flag logical “1” in step 132 ; and plays (or reproduces) the current program (the program the user has specified before the controller 50 has entered the operation 120 ) from the frame identified by the register 501 value Ct in steps 150 , 152 and 154 . Specifically, the controller 50 presents the frame of the register 501 value Ct in step 150 and checks the value Ct to see if the register value Ct has reached a preset end value in step 152 . If not, the controller 50 increments the value Ct in step 154 and goes back to step 150 . If the register 501 value Ct has reached the preset end value in step 152 , then the controller 50 returns to the main program that has invoked this subroutine 120 .
  • the controller 50 sets the register 501 value Ct to Ct−Cj, Ct+Cj or 0 in a JB step 140 , a JF step 142 or a HS step 144 , respectively, as shown in FIG. 4 .
  • the controller 50 performs an image quality enhancing operation for the frame identified by the scenario time register 501 value before the execution of step 140 , 142 or 144 in step 146 . This causes the quality of the currently displayed frame to get better gradually with the receptions of the quality supplement data F 1 f , F 2 f , F 3 f and F 4 f as described above.
  • the controller 50 ends the operation 120 after step 146 .
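The control flow of the FIG. 6 interrupt subroutine described above can be sketched as follows. This is not the patent's code: `enhance_quality` and `play_from` are hypothetical callbacks standing in for the image quality enhancing operation and the play loop of steps 150 through 154, and the clamp of Ct at 0 is an assumption:

```python
def on_button_interrupt(state, command, enhance_quality, play_from, cj):
    """Illustrative dispatch for the FIG. 6 interrupt subroutine 120."""
    if state["playing"]:                        # step 122: terminal in a play mode
        if command == "STOP":                   # steps 136-137, then enhancement
            state["playing"] = False            # reset the play operation flag
            enhance_quality(state["ct"])        # enhance the last displayed frame
        elif command in ("JB", "JF", "HS"):     # JB/JF/HS play, steps 126/128/130
            if command == "HS":
                state["ct"] = 0
            elif command == "JB":
                state["ct"] = max(0, state["ct"] - cj)
            else:
                state["ct"] += cj
            play_from(state["ct"])              # continue playing from the new Ct
    else:                                       # stop mode (step 125)
        if command == "PLAY":                   # step 132, then steps 150-154
            state["playing"] = True
            play_from(state["ct"])
        elif command in ("JB", "JF", "HS"):     # step 146 first, then 140/142/144
            enhance_quality(state["ct"])        # enhance the currently displayed frame
            if command == "HS":
                state["ct"] = 0
            elif command == "JB":
                state["ct"] = max(0, state["ct"] - cj)
            else:
                state["ct"] += cj
    return state
```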
  • FIGS. 7 and 8 are diagrams conceptually showing arrangements of a first and a second exemplary mass storage 20 a and 20 b using tape storage devices.
  • the storage 20 a comprises five tape storage devices 211 through 215 .
  • the tape device 211 stores the basic image data F 0 .
  • the four tape devices 212 through 215 store the four level quality supplement data F 1 through F 4 , respectively.
  • the basic image data F 0 f and the corresponding quality supplement data F 1 f , F 2 f , F 3 f and F 4 f for each frame are recorded on the same tape positions of the five tapes.
  • the five tape storage devices are so arranged that the five reels are independently rotated only in the case of an image quality enhancing operation and are synchronously rotated otherwise. In an image quality enhancing operation, the tape storage devices 212 through 215 for the quality supplement data F 1 through F 4 are sequentially read one by one.
  • the storage 20 b comprises two tape storage devices 217 and 218 .
  • the tape device 217 stores the basic image data F 0
  • the tape device 218 stores the four level quality supplement data F 1 through F 4 .
  • the quality supplement data F 1 f , F 2 f , F 3 f and F 4 f for each frame are recorded at the position on the tape 218 that corresponds, when the tapes 217 and 218 are rotated synchronously, to the position on the tape 217 at which the basic image data F 0 f for the frame is recorded.
  • the two tape devices 217 and 218 are so arranged that the two reels are independently rotated only in the case of an image quality enhancing operation and are synchronously rotated otherwise. In an image quality enhancing operation, the portions of the tape storage device 218 holding the quality supplement data F 1 f through F 4 f are sequentially read.
  • FIGS. 9 and 10 show a first storing scheme 20 c and a second storing scheme 20 d of storing a video object on the mass storage 20 .
  • the mass storage 20 may be any suitable disc storage device such as a hard disc, various optical discs, etc.
  • the basic image data F 0 and the quality supplement data (QSD) F 1 , F 2 , . . . are stored in two different areas, an F 0 area and a QSD area, on the storage medium.
  • assume the quantity of the quality supplement data (QSD) F 1 f , F 2 f , F 3 f and F 4 f for each frame is M times the data quantity of the basic image data F 0 f for the frame, where M is a positive constant and D is the data size of the basic image data for each frame. Then, if the first byte of the basic image data F 0 f is the (N+1)-th byte in the F 0 area, the controller 10 has only to read D*M bytes starting from the (N*M+1)-th byte in the QSD area in order to obtain the quality supplement data F 1 f , F 2 f , F 3 f and F 4 f .
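The fixed-ratio addressing of the first storing scheme can be sketched as follows (hypothetical helper names; D, M and N as defined above, with offsets counted from zero):

```python
def f0_offset(frame_number: int, d: int) -> int:
    """Zero-based byte offset N of frame f's basic data in the F0 area,
    assuming every frame's basic image data occupies exactly d bytes."""
    return frame_number * d

def qsd_offset_and_size(frame_number: int, d: int, m: int) -> tuple[int, int]:
    """Start offset in the QSD area and number of bytes to read in order to
    obtain all quality supplement data for the frame: each frame's QSD is
    m times its basic data, so QSD offsets scale by the same factor m."""
    n = f0_offset(frame_number, d)
    return n * m, d * m
```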
  • the basic image data F 0 f and the total quality supplement data F 1 f +F 2 f +F 3 f +F 4 f may have arbitrary sizes.
  • the start address of the total quality supplement data for an f-th frame in the QSD area is assumed to be Af.
  • the controller 10 uses an address table of FIG. 11 .
  • the address table of FIG. 11 comprises a frame number (f) field and a field of QSD address (Af) for the frame number (f).
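A minimal sketch of such an address table, assuming Af is the running sum of the per-frame QSD sizes (the patent shows only the table itself, not how it is built, so this construction is an assumption):

```python
def build_qsd_address_table(qsd_sizes: list[int]) -> dict[int, int]:
    """Map frame number f (1-based, as in FIG. 11) to the QSD start
    address Af, for frames whose data portions have arbitrary sizes."""
    table, addr = {}, 0
    for f, size in enumerate(qsd_sizes, start=1):
        table[f] = addr   # Af: where frame f's quality supplement data begins
        addr += size
    return table
```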
  • FIG. 12 shows a third storing scheme 20 e of storing a video object on the mass storage 20 .
  • the mass storage 20 preferably comprises a suitable disc storage device such as a hard disc, various optical discs, etc.
  • the controller 10 reads only the basic image data skipping the quality supplement data as shown by arrows above the strip area representative of the stored video data in FIG. 12 .
  • the controller 10 reads the quality supplement data QSD f for the frame identified by the register 501 value as shown by an arrow below the strip area representative of the stored video data in FIG. 12 .
  • FIG. 13 is a diagram showing a data structure obtained when a progressive JPEG video object is stored in the third storing scheme 20 e as shown in FIG. 12 .
  • the basic image data F 0 f for each frame comprises a header F 0 Hf and a basic image data portion F 0 Df.
  • the quality supplement data QSDf comprises a first level differential data F 1 f , a second level differential data F 2 f , . . . , and an L-th level differential data FL f.
  • the controller 10 reads the quality supplement data QSDf for the frame identified by the register 501 value.
  • the controller 50 passes the frame data to be displayed to the video & audio decoder 60 of FIG. 1 in playing operations of steps 126 , 128 , 130 and 150 .
  • the video & audio decoder 60 includes a JPEG decoder.
  • FIG. 14 is a diagram conceptually showing the H.263 video format.
  • an H.263 video data comprises basic image data 210 for use in a play operation and quality supplement data 220 for use in a quality enhancing operation.
  • the frame data for the basic image data are expressed as F 0 0 , F 0 1 , F 0 2 , . . . , F 0 g , . . . , and F 0 3N+2 (g is a frame number)
  • the basic image frames identified by F 0 3f are intra-coded frames that can be decoded alone without the need of data of any other frame.
  • the basic image frames identified by F 0 3f+1 and F 0 3f+2 are first and second differences, in the time direction, from the basic image data F 0 3f , and need the frame F 0 3f data for decoding.
  • the first and second differences are written as TIME DIRECTION DIF 1 and 2 , respectively in FIG. 14 .
  • the quality supplement data for the frame 3 f comprises first, second and third differences, in the quality direction, from the basic image data F 0 3f , which differences are referred to as “QUALITY DIFs 1, 2 and 3” and labeled “F1 3f ”, “F2 3f ” and “F3 3f ”, respectively.
  • the quality supplement data for the frame 3 f+ 1 comprises QUALITY DIFs 1 , 2 and 3 from the basic image data F 0 3f+1 which differences are labeled “F1 3f+1 ”, “F2 3f+1 ” and “F3 3f+1 ”, respectively.
  • the quality supplement data for the frame 3 f+2 comprises QUALITY DIFs 1 , 2 and 3 from the basic image data F 0 3f+2 which differences are labeled “F1 3f+2 ”, “F2 3f+2 ” and “F3 3f+2 ”, respectively.
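The FIG. 14 dependencies can be modeled abstractly as additive contributions. This is an illustrative numeric model, not H.263 bitstream decoding; the function names are hypothetical:

```python
def decode_basic(g: int, intra: dict[int, float], time_dif: dict[int, float]) -> float:
    """Frame g's basic image: the nearest preceding intra-coded frame F0[3f]
    plus, for g = 3f+1 or 3f+2, the TIME DIRECTION DIF for frame g."""
    f3 = (g // 3) * 3          # index of the governing intra-coded frame
    base = intra[f3]
    if g != f3:                # non-intra frames need the time-direction difference
        base += time_dif[g]
    return base

def enhance(basic: float, quality_difs: list[float]) -> float:
    """Quality enhancement: add QUALITY DIFs 1..3 (F1g, F2g, F3g) on top
    of the basic image, one level at a time."""
    return basic + sum(quality_difs)
```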
  • FIG. 15 is a schematic block diagram showing an exemplary arrangement of a video decoder 600 which is included in the video & audio decoder 60 of FIG. 1 and which is adapted to decode a video object of the H.263 format as shown in FIG. 14 .
  • the video decoder 600 comprises a local controller 602 for controlling the operation of the decoder 600 ; a time-based H.263 decoder 604 ; an adder 606 ; a frame memory 608 for storing an I-coded image data F 0 3f ; a memory interface 610 for the memory 608 ; a quality-enhancing H.263 decoder 612 for decoding quality supplement data F 1 g , F 2 g and F 3 g to provide a quality-enhanced frame data F 0 g +F 1 g +F 2 g +F 3 g ; and a previous frame memory 614 for quality enhancement.
  • the received video data is passed to the video & audio decoder 60 and to the video decoder 600 or the local controller 602 through the bus lines 51 . If the received video data is basic image data F 0 g , then the local controller 602 passes the data F 0 g to the time-based H.263 decoder 604 . If the received video data is quality supplement data F 1 g , F 2 g or F 3 g , then the local controller 602 passes the data F 1 g , F 2 g or F 3 g to the quality-enhancing H.263 decoder 612 .
  • the local controller 602 supplies a control signal 602 a to a memory interface 610 control input 610 c .
  • the decoded video data [F 0 3f ] is stored in the frame memory 608 , where [A] represents a decoded version of data A.
  • the control signal 602 a also controls the memory interface 610 such that the data stored in the frame memory 608 , i.e., the decoded video data [F 0 3f ] is read out to a memory interface 610 data output terminal 610 b if the received video data is not I-coded image data, i.e., g ≠ 3f .
  • the decoded video data [F 0 3f+1 ] or [F 0 3f+2 ] is added by the adder 606 to the decoded video data [F 0 3f ] read from the memory 608 to yield the added decoded video data [F 0 3f ]+[F 0 3f+1 ] or [F 0 3f ]+[F 0 3f+2 ], respectively, which is supplied to the video output portion 80 and the previous frame memory 614 .
  • the previous frame memory 614 also stores the decoded video data from the quality-enhancing H.263 decoder 612 .
  • the H.263 decoder 612 decodes the quality supplement data F 1 g , F 2 g or F 3 g from the local controller 602 , and adds the decoded data [F 1 g ], [F 2 g ] or [F 3 g ] to the data from the previous frame memory 614 to provide the quality enhanced frame data to the video output portion 80 .
  • since the video decoder 600 has respective previous frame memories 608 and 614 and respective H.263 decoders 604 and 612 for decoding in the time axis direction and decoding in the quality axis direction, it is possible to store data decoded in the time axis direction in both of the previous frame memories 608 and 614 and to store data decoded in the quality axis direction only in the memory 614 for the quality axis direction. For this reason, even if quality supplement data for a frame data F 0 g has been decoded, it is possible to resume the play of video data from the frame data F 0 g.
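The role of the two previous frame memories can be sketched with an illustrative model (hypothetical names, treating frames as numbers and differences as additive): time-axis decoding writes to both memories, quality-axis decoding writes only to memory 614, so memory 608 always holds a clean time-axis reference from which playback can resume.

```python
class DualMemoryDecoder:
    """Illustrative model of the two-memory arrangement of FIG. 15."""

    def __init__(self):
        self.mem_608 = None   # time-axis reference frame (frame memory 608)
        self.mem_614 = None   # display/enhancement frame (previous frame memory 614)

    def decode_time_axis(self, frame):
        # time-axis decoding updates both memories
        self.mem_608 = frame
        self.mem_614 = frame
        return frame

    def decode_quality_axis(self, dif):
        # quality-axis decoding enhances only the display frame in memory 614
        self.mem_614 = self.mem_614 + dif
        return self.mem_614

    def resume_play_reference(self):
        # memory 608 is untouched by enhancement, so play can resume from it
        return self.mem_608
```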
  • an equivalent video decoder may be implemented by using a single decoder.
  • a video decoder that decodes a video object of a format using a correlation between frames not only in the time axis direction but also in the quality axis direction has been described in conjunction with the H.263 video format.
  • a video decoder can be realized for other such video formats, such as the MPEG format, by replacing the H.263 decoder(s) with a corresponding video decoder such as an MPEG decoder.
  • a multimedia-on-demand system has a feature of enhancing the picture quality of the first frame to be displayed after the execution of a stop command or the execution of a JF, JB or HS command issued during a stop state by transmitting quality supplement data from the server 1 .
  • FIG. 16 is a diagram showing an exemplary scenario data table of a multimedia program available in a multimedia-on-demand system according to a second illustrative embodiment of the invention.
  • the scenario data table contains a record for each of the multimedia objects used in the multimedia program for which the scenario data table is intended.
  • Each record of the scenario data table comprises the fields of the object ID, the kind of the object, the display position on a screen, the display size, the presentation start time and the presentation end time.
  • the presentation start and end time fields contain corresponding values of the scenario time counter 501 , Ct.
  • the frame rate of the video objects is assumed to be 30 frames per second.
  • it is preferable to create an active object table as shown in FIG. 17 from the scenario data table.
  • all of the Ct values found in the presentation start and end time fields of the scenario data table are listed in ascending order in the first column (Ct fields) of the active object table.
  • the object IDs of multimedia objects the presentation of which is started or ongoing at the Ct value are listed in the second field.
  • each second field does not include an object the presentation of which ends at that Ct value.
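The construction of the active object table of FIG. 17 from the scenario data table can be sketched as follows. This is only an illustrative sketch in Python, not part of the specification; the record layout (object ID, presentation start Ct, presentation end Ct) and the function name are assumptions:

```python
def build_active_object_table(scenario):
    """scenario: list of (object_id, start_ct, end_ct) records from the
    scenario data table. Returns a list of (ct, [active object IDs]) rows,
    one per distinct Ct value found in the start/end fields, in ascending
    order. An object is listed if its presentation starts or is ongoing at
    the Ct value, but not if its presentation ends exactly at that value."""
    ct_values = sorted({ct for _, s, e in scenario for ct in (s, e)})
    table = []
    for ct in ct_values:
        active = [oid for oid, s, e in scenario if s <= ct < e]
        table.append((ct, active))
    return table
```

Note that the half-open test `s <= ct < e` realizes the rule that an object whose presentation ends at a Ct value is excluded from that row.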
  • FIG. 18 is a flowchart showing an operation executed by the controller 50 of the client terminal 3 under the control of an interrupt subroutine called from a main program in response to an interrupt caused by the user at the client terminal 3 pressing one of the presentation control buttons of FIG. 3 after specifying one of the available multimedia programs stored in the mass storage 20 . Since the operation 220 of FIG. 18 is very similar to that of FIG. 6 , only the differences between them will be described in the following.
  • If a JB, JF or HS command is detected in step 124, then instead of executing a JB play step 126, a JF play step 128 or a HS play step 130, the controller 50 sets the register 501 value Ct to Ct−Cj, Ct+Cj or 0 in a JB step 240, a JF step 242 or a HS step 244, respectively; and returns to the main program to resume the normal play operation of the current program from the position indicated by the register 501.
  • In step 250 of the normal play operation comprising steps 250, 152 and 154, the controller 50 presents the relevant object(s) by referring to the active object table of FIG. 17 . Specifically, if the current value Ct is found in any Ct field of the table, the controller 50 continues the presentation of the object(s) listed both in the current record, whose Ct field contains the current Ct value, and in the record just above it in the table; ceases the presentation of the object(s) found in the record just above but not in the current record; and starts the presentation of the object(s) that first appear(s) in the current record. If the current value Ct is not found in any Ct field of the active object table, the controller 50 has only to repeat the same operation as executed for the last Ct value.
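The decision made in step 250 at a record boundary — which presentations to start, to continue and to cease — amounts to a comparison of two consecutive records of the active object table, as the following sketch illustrates (Python; the function name and list-based interface are hypothetical):

```python
def presentation_changes(prev_active, curr_active):
    """Given the object lists of the record just above and of the current
    record of the active object table, return the objects whose
    presentations are to be started, continued and ceased."""
    prev_set, curr_set = set(prev_active), set(curr_active)
    started = [o for o in curr_active if o not in prev_set]    # first appear now
    continued = [o for o in curr_active if o in prev_set]      # in both records
    ceased = [o for o in prev_active if o not in curr_set]     # dropped now
    return started, continued, ceased
```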
  • the controller 50 makes a test in step 238 to see if a video object exists in the record whose Ct value field contains the largest value not exceeding the value of the scenario time register 501, Ct. If so, the controller 50 performs the image enhancing operation for the frame identified by the current register 501 value Ct minus the register 501 value of the presentation start time SCt of the video object in step 139 . This is because the current register value Ct equals the sum of the presentation start time value SCt and the frame number of the video object.
  • After step 139, the controller 50 ends the operation 220 . If the test result is NO in step 238, then the controller 50 likewise ends the operation 220 .
  • Step 139 enhances the picture quality of the frame to be displayed after the execution of a stop command or the execution of a JF, JB or HS command issued during a stop state.
  • the image enhancing operation may be performed for a plurality of frames beginning with the frame identified by the value of Ct−SCt.
  • a multimedia-on-demand system adds detailed information to (or enhances the quality of) each of variable-quality objects during a stop period in a manner as illustrated by a part labeled “QUALITY ENHANCING OPERATION” in FIG. 21 .
  • a variable-quality object is a multimedia object that comprises a plurality of detail levels of data and that permits an enhancement of the presentation quality by adding a higher detail level of data.
  • the above-mentioned progressive JPEG video is one of such variable-quality objects.
  • FIG. 19 is a diagram showing examples of variable-quality objects. In FIG. 19 , still pictures A, B, C and D are variable in display quality according to the levels of differential data used for presentation. Also, the text object of FIG. 19 is said to be a variable-quality object since the text object comprises a plurality of detail levels of data.
  • the client terminal of the multimedia-on-demand system tries to collect as much object data as possible in advance during a stop period so that a random access operation such as a JF operation can be promptly executed.
  • This collection operation is shown by a part labeled “PRELOAD OPERATION” again in FIG. 21 .
  • FIG. 22 is a flowchart showing an operation of an interrupt subroutine 230 called in response to a pressing of one of the presentation control buttons of FIG. 3 according to the third embodiment of the invention.
  • the interrupt subroutine 230 is identical to that of FIG. 18 except that after step 137 , the controller 50 executes steps 260 and 270 instead of proceeding to step 238 .
  • In step 260, the controller 50 performs an image quality enhancing operation for at least one frame beginning with the frame identified by the value of Ct−SCt for each of the active variable-quality objects.
  • the load flag for an object indicates whether the basic data of the object has been loaded or not.
  • the load flags are all reset in an initial operation.
  • In step 270, the controller 50 preferably tries to load the basic data of as many objects to be subsequently presented as possible in advance. In order to distinguish the loaded objects from not-yet-loaded ones, the controller 50 sets the load flag each time the load operation of the basic data of an object has been completed. This enables a quick response in a random access operation such as a fast forward operation.
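The preload operation of step 270, together with its load flags, might be sketched as follows. This is an illustration only; the flag storage (a dictionary here), the loader callback and the stop-period budget are assumptions not found in the specification:

```python
def preload(objects, load_flags, load_basic_data, budget):
    """Try to load the basic data of as many not-yet-loaded objects as the
    stop-period budget allows, in presentation order; set the load flag on
    each completed load so loaded objects are not fetched twice."""
    loaded = 0
    for obj in objects:
        if loaded >= budget:
            break                            # stop period over / budget spent
        if not load_flags.get(obj, False):   # flags are all reset initially
            load_basic_data(obj)             # fetch basic data from the server
            load_flags[obj] = True           # mark object as loaded
            loaded += 1
    return load_flags
```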


Abstract

A method of presenting, at a client terminal, a video program stored in a server linked with the client terminal via a transmission path of limited transmission bandwidth. Each frame of the video program comprises a basic data portion and at least one level of quality supplement data portions. In the method, in response to one of play control commands from a user, the client terminal determines a start position in the video program according to the issued play control command. The play control commands include a play, a stop, a head search, a jump forward and a jump backward command. In response to the issued play command, the terminal obtains and uses the basic data portions for playing the video program. In response to the stop command, the terminal obtains the quality supplement data portions for the last displayed frame and uses them for displaying a quality-enhanced version of the last displayed frame.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention relates to a client/server system in which a client terminal connected with a server through a transmission path of a limited bandwidth plays a multimedia program which is stored in the server and is comprised of multimedia objects such as moving pictures, still pictures, sounds and texts while reading the objects in real time from the server.
  • 2. Description of the Prior Art
  • In such a system, a user of a client terminal is permitted to select one of a plurality of programs and to enter commands such as a play, a stop, a head search, a jump forward and a jump backward for the selected program. During a play operation, the server reads data of a specified program stored in a storage device and transmits the read data to the client terminal. Then, the client terminal displays the received data. However, once one of the stop, the head search, the jump forward and the jump backward commands is executed, no data is transmitted from the server to the client terminal until the play command is executed.
  • The transmission rate of the transmission path between the server and each terminal is limited, i.e., the quantity of data transmitted for a certain period of time is limited. For this reason, in order to enable each client terminal to play a program whose data is stored in the server while having the program data transmitted from the server, video data of each program is stored in the server such that the bit rate (or the quantity of data reproduced or played per second) of video data of each program does not exceed the transmission rate of the transmission paths.
  • If the transmission rates of the transmission paths between the server and the client terminals are considerably low, as in the case of ordinary telephone lines, it is necessary to reduce the frame rate, the resolution and/or the frame size, which degrades the picture quality of video objects.
  • It is an object of the invention to provide a video-on-demand system that enhances the quality of image by using the period of a play stoppage.
  • SUMMARY OF THE INVENTION
  • According to an aspect of the invention, a method of presenting, at a client terminal, a video program stored in a server linked with the client terminal via a transmission path of limited bandwidth is provided. Each frame of the video program comprises a basic data portion and at least one level of quality supplement data portions. In the method, in response to one of play control commands from a user, the client terminal determines a start (or play) position in the video program according to the issued play control command. The play control commands include a play, a stop, a head search, a jump forward and a jump backward command. In response to the issued play command, the terminal obtains and uses the basic data portions for playing the video program. In response to the stop command, the terminal obtains and uses the at least one level of quality supplement data portions of a last displayed frame for displaying a quality-enhanced version of the last displayed frame.
  • According to another aspect of the invention, a method of presenting, at a client terminal, a multimedia program stored in a server is provided. The multimedia program includes a video object. Each frame of the video object comprises a basic data portion and at least one level of detailed data portions. In this method, in response to one of play control commands from a user, the terminal determines a time count in the multimedia program according to the issued play control command. The play control commands include a play, a stop, a head search, a jump forward and a jump backward command. In response to one of the head search, the jump forward and the jump backward commands issued during a stop period, the terminal determines whether there is a video object to be displayed at the time count in the multimedia program. In the event there is the video object to be displayed at the time count in the multimedia program, the terminal obtains at least one level of quality supplement data portions for a first frame to be displayed in a next play operation for displaying a quality-enhanced version of the first frame to be displayed.
  • If a stop command is issued, a test is responsively made to see if there are multimedia objects which are other than video objects, each comprise basic data and quality supplement data, and are to be displayed at said time count in said multimedia program. If so, then for each of said found multimedia objects, the terminal obtains the quality supplement data for displaying a quality-enhanced version of the object.
  • Alternatively, in response to the stop command, a test is made to see if there are multimedia objects which are other than video objects, each comprise basic data and quality supplement data, and are to be displayed later. If so, the terminal tries to obtain the basic data for as many of the found multimedia objects as possible in advance.
  • According to another aspect of the invention, a terminal for presenting a video program stored in a remote server connected therewith via a band-limited transmission path is provided. Each frame of the video program comprises a basic data portion and at least one level of quality supplement data portions. The terminal comprises means, responsive to one of play control commands from a user, for determining a start position in the video program according to the issued play control command. The play control commands include a play, a stop, a head search, a jump forward and a jump backward command. The terminal includes means, responsive to the play command from the user, for obtaining and using said basic data portions for playing said video program; and means, responsive to the stop command, for obtaining and using at least one level of quality supplement data portions of a last displayed frame for displaying a quality-enhanced version of the last displayed frame.
  • BRIEF DESCRIPTION OF THE DRAWING
  • The features and advantages of the present invention will be apparent from the following description of an exemplary embodiment of the invention and the accompanying drawing, in which:
  • FIG. 1 is a schematic block diagram showing an arrangement of a multimedia-on-demand system that can embody the present invention in various forms;
  • FIG. 2 is a schematic diagram showing the contents of the ROM 52 and the RAM 54;
  • FIG. 3 is a diagram showing exemplary presentation control buttons included in the control switches 70 of FIG. 1;
  • FIG. 4 is a table for describing how the scenario time manager 101 sets the value Ct of the scenario time register 501 in response to the executed presentation control command;
  • FIG. 5 is a diagram conceptually showing a data structure of a video object stored in the mass storage 20 of FIG. 1;
  • FIG. 6 is a flowchart showing an exemplary operation of an interrupt subroutine called from a main program in response to an interrupt caused by a pressing of one of the presentation control buttons of FIG. 3 after one of the available programs stored in the mass storage 20 is specified by the user at the client terminal 3;
  • FIGS. 7 and 8 are diagrams conceptually showing arrangements of a first and a second exemplary mass storage 20 a and 20 b using tape storage devices;
  • FIGS. 9 and 10 are diagrams showing a first storing scheme 20 c and a second storing scheme 20 d of storing a video object on a disc storage device;
  • FIG. 11 is a diagram showing an address table used in the second storing scheme 20 d of FIG. 10;
  • FIG. 12 is a diagram showing a third storing scheme 20 e of storing a video object on a disc storage device;
  • FIG. 13 is a diagram showing a data structure obtained when a progressive JPEG video object is stored in the third storing scheme 20 e as shown in FIG. 12;
  • FIG. 14 is a diagram conceptually showing the H.263 video format;
  • FIG. 15 is a schematic block diagram showing an exemplary arrangement of a video decoder 600 which is included in the video & audio decoder 60 of FIG. 1 and which is adapted to decode a video object of the H.263 format as shown in FIG. 14;
  • FIG. 16 is a diagram showing an exemplary scenario data table of a multimedia program available in a multimedia-on-demand system according to a second illustrative embodiment of the invention;
  • FIG. 17 is a diagram showing an exemplary active object table created from the scenario data table of FIG. 16;
  • FIG. 18 is a flowchart showing an operation of an interrupt subroutine called from a main program in response to an interrupt caused by the user at the client terminal 3 pressing one of the presentation control buttons of FIG. 3 after specifying one of the available multimedia programs stored in the mass storage 20;
  • FIG. 19 is a diagram showing various variable-quality objects used in an exemplary multimedia program available in the inventive multimedia-on-demand system;
  • FIG. 20 is a diagram showing how the multimedia objects of FIG. 19 are presented in the exemplary multimedia program;
  • FIG. 21 is a diagram showing a way of transmitting the multimedia objects to present the objects as shown in FIG. 20;
  • FIG. 22 is a flowchart showing an operation of an interrupt subroutine called in response to a pressing of one of the presentation control buttons of FIG. 3 according to the third embodiment of the invention;
  • FIG. 23 is a diagram showing an exemplary structure of a scenario data table according to the third embodiment of the invention; and
  • FIG. 24 is a diagram showing an exemplary arrangement of a load flag storage location.
  • Throughout the drawing, the same elements when shown in more than one figure are designated by the same reference numerals.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • FIG. 1 is a schematic block diagram showing an arrangement of a multimedia-on-demand system that can embody the present invention in various forms. In FIG. 1, double lines indicate bus lines. The multimedia-on-demand system comprises a server 1 that stores and serves multimedia programs, at least one client terminal 3 that plays one of the multimedia programs, and a network 2 for connecting the server 1 and the client terminals 3.
  • The server 1 comprises a controller 10, a mass storage 20 for storing the multimedia programs, an object data transmitter 30 for transmitting object data constituting a multimedia program and a control data communication interface (IF) 40 for communicating control data with each client terminal 3. The client terminal 3 comprises an object data receiver 35 for receiving object data transmitted from the server 1; a control data communication IF 40 for communicating the control data with the server 1; a controller 50 for controlling the operation of the terminal 3; a video and audio decoder 60 for decoding video and audio object data into video and audio output signals; a video output portion 80 for providing a video output according to the video signal from the decoder 60 and the image data from the controller 50; an audio output portion 90 for providing an audio output according to the audio signal from the decoder 60 and the audio data from the controller 50; and control switches 70 that permit the user to specify a desired one of the multimedia programs stored in the mass storage 20 and to enter a play, stop, jump forward, jump backward and head search commands. The controller 50 includes a read only memory (ROM) 52 and a random access memory (RAM) 54. The elements 35, 40, 50, 60, 70, 80 and 90 are interconnected by bus lines 51.
  • FIG. 2 is a schematic diagram showing the contents of the ROM 52 and the RAM 54. In FIG. 2, the ROM 52 stores programs 100 necessary for the operation of the controller 50. The RAM 54 stores various data 500 necessary for the operation of the controller 50. The programs 100 include a scenario time manager 101, which sets the value (Ct) of a scenario time register 501 in the RAM 54 in a play operation of a multimedia program such that the current scenario time or the current position in the multimedia program is given by T*Ct, where T is a frame period of the video objects.
  • The control switches 70 include presentation control buttons for head search (HS), jump forward (JF), play, stop, jump backward (JB) operations as shown in FIG. 3. FIG. 4 shows a table for describing how the scenario time manager 101 sets the value Ct of the scenario time register 501 in response to the executed presentation control command. In FIG. 4, if a play command is issued, the scenario time manager 101 increments the value Ct of the scenario time register 501 for every frame period T as long as the play command is active. If a jump forward, a jump backward or a head search command is issued, the scenario time manager 101 sets the scenario time register value Ct to Ct+Cj, Ct−Cj or 0, respectively. Cj is a predetermined jump distance for use in the JF and JB operations. In case of a stop command, the scenario time manager 101 does nothing, i.e., the value Ct remains unchanged.
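The register updates of FIG. 4 reduce to a simple rule per command, which the following sketch illustrates (Python; an illustration of the table, not part of the specification — command names are assumed string labels, and `cj` stands for the predetermined jump distance Cj):

```python
def update_scenario_time(command, ct, cj):
    """Set the scenario time register value Ct per the table of FIG. 4."""
    if command == "play":
        return ct + 1          # incremented every frame period T while active
    if command == "jump_forward":
        return ct + cj         # JF: Ct <- Ct + Cj
    if command == "jump_backward":
        return ct - cj         # JB: Ct <- Ct - Cj
    if command == "head_search":
        return 0               # HS: back to the beginning of the program
    return ct                  # stop: Ct remains unchanged
```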
  • Each of the multimedia programs generally comprises video objects, still picture objects, audio objects and/or text objects.
  • Embodiment 1
  • For the sake of simplicity, it is assumed in a first illustrative embodiment of the invention that the multimedia-on-demand system of FIG. 1 is a video-on-demand system, i.e., each of the programs available at each client comprises a video object.
  • It is also assumed that the data of the video object stored in the mass storage 20 has a structure as shown in FIG. 5. Data of each frame of the video object (hereinafter referred to as “each frame data Ff”) comprises a basic image data portion F0 f and at least one level (e.g., 4 levels in FIG. 5) of quality supplement data portions F1 f, F2 f, F3 f and so on. (f=1, 2, . . . , N, where N is the total number of frames of a video object) The suffix f is the frame number of the frame. Zero “0” following “F” in a label given to each data portion indicates that the data portion is the basic image data. A non-zero numeral (1, 2, 3 . . . ) following “F” in a label given to each data portion indicates that the data portion is quality supplement data of the level specified by the non-zero numeral. The image quality of the f-th frame improves, up to the best quality, by using not only the basic image data F0 f but also the quality supplement data F1 f, F2 f, F3 f and so on. In the following description, it is assumed that there are 4 levels of quality supplement data portions F1 f, F2 f, F3 f and F4 f for each frame.
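The layered frame structure of FIG. 5 can be modeled as a basic portion plus an ordered list of supplement levels; displayed quality improves with each level applied. The sketch below is purely illustrative: the class name is an assumption, and "combining" a supplement with the basic data is modeled as string concatenation rather than actual image decoding:

```python
class FrameData:
    """Frame f of a video object: basic image data portion F0f plus
    quality supplement data portions F1f..F4f (FIG. 5)."""

    def __init__(self, basic, supplements):
        self.basic = basic                 # F0f
        self.supplements = supplements     # [F1f, F2f, F3f, F4f]

    def image_at_level(self, level):
        """Combine the basic data with supplement levels 1..level;
        level 0 yields the basic image only, the maximum level the
        best quality."""
        return self.basic + "".join(self.supplements[:level])
```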
  • In the first illustrative embodiment, the client terminal 3 uses only the basic image data F0 1, F0 2, . . . , F0 N (hereinafter referred to, en bloc, as “F0”) in a play operation. However, if the terminal 3 detects a stop command in an arbitrary state of operations or detects one of head search (HS), jump forward (JF) and jump backward (JB) commands during a stop state, the terminal 3 performs an image quality enhancing operation on entering a stop state or just after the operation of the detected command by obtaining the quality supplement data F1 f, F2 f, F3 f and F4 f for the last displayed frame from the server 1.
  • Since the presented program is a video object in this example, the scenario time register 501 contains the frame number f, that is, Ct=f.
  • FIG. 6 is a flowchart showing an operation executed by the controller 50 of the client terminal 3 under the control of an interrupt subroutine called from a main program in response to an interrupt caused by the user at the client terminal 3 pressing one of the presentation control buttons of FIG. 3 after specifying one of the available programs stored in the mass storage 20. In FIG. 6, the controller 50 first makes a test in step 122 to see if a play operation flag (not shown) is logical “1” or indicates that the terminal is in one of the play modes: i.e., a (normal) play, a jump forward (JF) play, a jump backward (JB) play, and a head search (HS) play (or a JB play to the beginning of the program). If so (which means that the terminal 3 is in a play mode), a test is made in step 124 to see which presentation control command has been issued.
  • If JB, JF or HS command is detected in step 124, then the controller 50 executes a JB play step 126, a JF play step 128 or a HS play step 130, respectively, in a well known manner; and returns to the main program that has invoked this subroutine 120 to resume the play operation that was being executed when this routine 120 was invoked. If the stop command is detected in step 124, the controller 50 resets the play operation flag, i.e., sets the flag logical “0” in step 136; ceases the play mode in step 137; performs an image quality enhancing operation for the frame identified by the value Ct (=f in this example) of the scenario time register 501 in step 139; and ends the operation 120.
  • Specifically, in step 139, the controller 50 transmits an image quality enhancing instruction and the register 501 value f to the server 1. The controller 10 of the server 1 responsively reads the quality supplement data F1 f, F2 f, F3 f and F4 f for the frame identified by the value f from the mass storage 20 and transmits them to the requesting client terminal 3. The terminal 3 responsively adds the received quality supplement data F1 f, F2 f, F3 f and F4 f to the basic image data F0 f to form high quality frame data. By doing this, the quality of the currently displayed frame becomes better gradually with the receptions of the quality supplement data F1 f, F2 f, F3 f and F4 f. After step 139, the controller 50 ends the operation 120.
  • If the play operation flag is not logical “1” (which means that the terminal 3 is in a stop mode), a test is made in step 125 to see which presentation control command has been issued.
  • If a play command is detected in step 125, the controller 50 sets the play operation flag logical “1” in step 132; and plays (or reproduces) the current program (the program the user has specified before the controller 50 has entered the operation 120) from the frame identified by the register 501 value Ct in steps 150, 152 and 154. Specifically, the controller 50 presents the frame of the register 501 value Ct in step 150 and checks the value Ct to see if the register value Ct has reached a preset end value in step 152. If not, the controller 50 increments the value Ct in step 154 and goes back to step 150. If the register 501 value Ct has reached the preset end value in step 152, then the controller 50 returns to the main program that has invoked this subroutine 120.
  • If JB, JF or HS command is detected in step 125, then the controller 50 sets the register 501 value Ct to Ct−Cj, Ct+Cj or 0 in a JB step 140, a JF step 142 or a HS step 144, respectively, as shown in FIG. 4. After step 140, 142 or 144, the controller 50 performs an image quality enhancing operation for the frame identified by the scenario time register 501 value before the execution of step 140, 142 or 144 in step 146. This causes the quality of the currently displayed frame to get better gradually with the receptions of the quality supplement data F1 f, F2 f, F3 f and F4 f as described above. The controller 50 ends the operation 120 after step 146.
  • Some Examples of the Mass Storage 20
  • FIGS. 7 and 8 are diagrams conceptually showing arrangements of a first and a second exemplary mass storage 20 a and 20 b using tape storage devices. The storage 20 a comprises five tape storage devices 211 through 215. The tape device 211 stores the basic image data F0. The four tape devices 212 through 215 store the four level quality supplement data F1 through F4, respectively. The basic image data F0 f and the corresponding quality supplement data F1 f, F2 f, F3 f and F4 f for each frame are recorded on the same tape positions of the five tapes. The five tape storage devices are so arranged that the five reels are independently rotated only in case of image quality enhancing operation and are synchronously rotated otherwise. In an image quality enhancing operation, the tape storage devices 212 through 215 for the quality supplement data F1 through F4 are sequentially read one by one.
  • The storage 20 b comprises two tape storage devices 217 and 218. The tape device 217 stores the basic image data F0. The tape device 218 stores the four level quality supplement data F1 through F4. The quality supplement data F1 f, F2 f, F3 f and F4 f for each frame are recorded on the tape position of the tape 218 which corresponds to the position of the tape 217 on which the basic image data F0 f for the frame is recorded when the tapes 217 and 218 are rotated synchronously. The two tape devices 217 and 218 are so arranged that the two reels are independently rotated only in case of an image quality enhancing operation and are synchronously rotated otherwise. In an image quality enhancing operation, the portions of the tape storage device 218 for the quality supplement data F1 f through F4 f are sequentially read.
  • FIGS. 9 and 10 show a first storing scheme 20 c and a second storing scheme 20 d of storing a video object on the mass storage 20. The mass storage 20 may be any suitable disc storage device such as a hard disc, various optical discs, etc. The basic image data F0 and the quality supplement data (QSD) F1, F2, . . . are stored in two different areas: a F0 area and a QSD area on the disc storage media.
  • In the first storing scheme 20 c of FIG. 9, it is assumed that the quantity of the quality supplement data (QSD) F1 f, F2 f, F3 f and F4 f for each frame is M times the data quantity of the basic image data F0 f for the frame, where M is a positive constant. Then, if the first data of the basic image data F0 f is the (N+1)-th byte in the F0 area, then in order to obtain the quality supplement data F1 f, F2 f, F3 f and F4 f, the controller 10 has only to read the data of D*M bytes from the (N*M+1)-th byte in the QSD area. D is the data size of the basic image data for each frame.
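The address arithmetic of the first storing scheme can be checked with a short sketch (Python; an illustration only — byte positions are 0-based offsets here, whereas the specification speaks of the 1-based "(N+1)-th byte"):

```python
def qsd_location(f0_offset, m, d):
    """First storing scheme (FIG. 9): if the basic image data F0f begins
    at byte offset f0_offset in the F0 area, the corresponding quality
    supplement data begins at offset f0_offset * m in the QSD area and
    spans d * m bytes, where d is the per-frame basic data size and the
    per-frame QSD is m times larger than the basic data."""
    return f0_offset * m, d * m
```

For example, with d = 100 bytes per basic frame and m = 4, the third frame (offset 200 in the F0 area) has its supplement data at offset 800 in the QSD area, 400 bytes long.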
  • In the second storing scheme 20 d of FIG. 10, the basic image data F0 f and the total quality supplement data F1 f+F2 f+F3 f+F4 f may have arbitrary sizes. The start address of the total quality supplement data for an f-th frame in the QSD area is assumed to be Af. In order to know the quality supplement data address Af from the frame number f, the controller 10 uses an address table of FIG. 11. The address table of FIG. 11 comprises a frame number (f) field and a field of QSD address (Af) for the frame number (f).
  • FIG. 12 shows a third storing scheme 20 e of storing a video object on the mass storage 20. The mass storage 20 preferably comprises a suitable disc storage device such as a hard disc, various optical discs, etc. In this storing scheme 20 e, the basic image data F0 f and the quality supplement data QSDf (=F1 f+F2 f+ . . . ) are stored in a same area with the latter just following the former in a manner like F0 f−1, QSDf−1, F0 f, QSDf, F0 f+1, QSDf+1 and so on. In this case, in normal play step 150 of FIG. 6, the controller 10 reads only the basic image data skipping the quality supplement data as shown by arrows above the strip area representative of the stored video data in FIG. 12. In the image quality enhancing operation in step 146, the controller 10 reads the quality supplement data QSDf for the frame identified by the register 501 value as shown by an arrow below the strip area representative of the stored video data in FIG. 12.
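The read pattern of the third storing scheme 20 e — skip the quality supplement data during normal play, seek only the QSD of one frame during enhancement — can be sketched as follows. The interleaved layout is modeled here as a flat list of tagged records; the tags and function names are assumptions for illustration:

```python
def read_basic_only(records):
    """Normal play over the interleaved layout F0(f-1), QSD(f-1), F0(f),
    QSD(f), ...: read the basic portions, skipping the supplements
    (the arrows above the strip area in FIG. 12)."""
    return [data for tag, data in records if tag == "F0"]

def read_qsd_for_frame(records, frame):
    """Image quality enhancement: read only the QSD of the identified
    frame (the arrow below the strip area in FIG. 12); frames are
    numbered from 0 here."""
    qsds = [data for tag, data in records if tag == "QSD"]
    return qsds[frame]
```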
  • Progressive JPEG Format
  • The invention is applicable to video data of formats in which some of the frames are described by using differential data between frames: e.g., the progressive JPEG format, the H.263 format, MPEG-1 format and the MPEG-2 format. FIG. 13 is a diagram showing a data structure obtained when a progressive JPEG video object is stored in the third storing scheme 20 e as shown in FIG. 12. In FIG. 13, the basic image data F0 f for each frame comprises a header F0Hf and a basic image data portion F0Df. The quality supplement data QSDf comprises a first level differential data F1 f, a second level differential data F2 f, . . . , and an L-th level differential data FLf.
  • In case of the progressive JPEG format, in normal play step 150 of FIG. 6, the controller 10 reads only the header F0Hf and the basic image data F0Df skipping the quality supplement data QSDf (=F1 f, F2 f, . . . , and FLf) for each frame f. In the image quality enhancing operation in step 146, the controller 10 reads the quality supplement data QSDf for the frame identified by the register 501 value.
  • It is noted that the controller 50 passes the frame data to be displayed to the video & audio decoder 60 of FIG. 1 in playing operations of steps 126, 128, 130 and 150. In case of the progressive JPEG format, the video & audio decoder 60 includes a JPEG decoder.
  • H.263 Format
  • FIG. 14 is a diagram conceptually showing the H.263 video format. In FIG. 14, an H.263 video data comprises basic image data 210 for use in a play operation and quality supplement data 220 for use in a quality enhancing operation. If the frame data for the basic image data are expressed as F0 0, F0 1, F0 2, . . . , F0 g, . . . , and F0 3N+2 (g is a frame number), then the frame data can be expressed as {F0 3f, F0 3f+1, F0 3f+2: f=0, 1, . . . , N}. In this case, the basic image frames identified by F0 3f are intra-coded frames that can be decoded alone without the need of data of any other frame. On the other hand, the basic image frames identified by F0 3f+1 and F0 3f+2 are first and second differences, in the time direction, from the basic image data F0 3f and need the frame F0 3f data for decoding. The first and second differences are written as TIME DIRECTION DIF 1 and 2, respectively, in FIG. 14. The quality supplement data for the frame 3f comprises first, second and third differences, in the quality direction, from the basic image data F0 3f, which differences are referred to as “QUALITY DIFs 1, 2 and 3” and labeled “F13f”, “F23f” and “F33f”, respectively. In a similar manner, the quality supplement data for the frame 3f+1 comprises QUALITY DIFs 1, 2 and 3 from the basic image data F0 3f+1, which differences are labeled “F13f+1”, “F23f+1” and “F33f+1”, respectively. Also, the quality supplement data for the frame 3f+2 comprises QUALITY DIFs 1, 2 and 3 from the basic image data F0 3f+2, which differences are labeled “F13f+2”, “F23f+2” and “F33f+2”, respectively.
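The dependency structure of FIG. 14 implies the following reconstruction rule for a basic frame g and for its quality-enhanced version. In this illustrative sketch (not the actual H.263 decoding), decoded data are modeled as numbers and applying a difference is modeled as addition:

```python
def decode_basic(g, intra, time_diffs):
    """Basic frame g: intra-coded frames (g = 3f) decode alone; frames
    3f+1 and 3f+2 add a time-direction difference to frame 3f."""
    base = 3 * (g // 3)                # frame number of the intra frame 3f
    if g % 3 == 0:
        return intra[base]             # decodable without any other frame
    return intra[base] + time_diffs[g]

def decode_enhanced(g, intra, time_diffs, quality_diffs):
    """Quality-enhanced frame g: the basic frame plus the three
    quality-direction differences F1g, F2g and F3g."""
    return decode_basic(g, intra, time_diffs) + sum(quality_diffs[g])
```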
  • FIG. 15 is a schematic block diagram showing an exemplary arrangement of a video decoder 600 which is included in the video & audio decoder 60 of FIG. 1 and which is adapted to decode a video object of the H.263 format as shown in FIG. 14. In FIG. 15, the video decoder 600 comprises a local controller 602 for controlling the operation of the decoder 600; a time-based H.263 decoder 604; an adder 606; a frame memory 608 for storing I-coded image data F0 3f; a memory interface 610 for the memory 608; a quality-enhancing H.263 decoder 612 for decoding quality supplement data F1 g, F2 g and F3 g to provide quality-enhanced frame data F0 g+F1 g+F2 g+F3 g; and a previous frame memory 614 for quality enhancement.
  • The received video data is passed to the video & audio decoder 60 and to the video decoder 600 or the local controller 602 through the bus lines 51. If the received video data is basic image data F0 g, then the local controller 602 passes the data F0 g to the time-based H.263 decoder 604. If the received video data is quality supplement data F1 g, F2 g or F3 g, then the local controller 602 passes the data F1 g, F2 g or F3 g to the quality-enhancing H.263 decoder 612.
  • The local controller 602 supplies a control signal 602 a to a memory interface 610 control input 610 c. The control signal 602 a controls the memory interface 610 such that the data on the interface 610 data input terminal 610 a is stored in the memory 608 if the received video data is I-coded image data, i.e., g=3 f. Thus, if the received video data is F0 3f, the decoded video data [F0 3f] is stored in the frame memory 608, where [A] represents a decoded version of data A.
  • The control signal 602 a also controls the memory interface 610 such that the data stored in the frame memory 608, i.e., the decoded video data [F0 3f] is read out to a memory interface 610 data output terminal 610 b if the received video data is not I-coded image data, i.e., g≠3 f. Thus, if the received video data is F0 3f+1 or F0 3f+2, the decoded video data [F0 3f+1] or [F0 3f+2] is added by the adder 606 to the decoded video data [F0 3f] read from the memory 608 to yield the added decoded video data [F0 3f]+[F0 3f+1] or [F0 3f]+[F0 3f+2], respectively, which is supplied to the video output portion 80 and the previous frame memory 614.
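The time-direction data path just described (decoder 604, adder 606, frame memory 608 under control signal 602a) can be modeled as a toy sketch. Frames are modeled as flat lists of integers and "decoding" is the identity, since only the store-versus-add control flow from the text is being illustrated; all names are assumptions.

```python
class TimeDirectionPath:
    """Toy model of the time-direction path of the video decoder 600."""

    def __init__(self):
        self.frame_memory = None           # holds [F0_3f] (frame memory 608)

    def decode_basic(self, g, data):
        decoded = list(data)               # stands in for the H.263 decoder 604
        if g % 3 == 0:
            # I-coded frame: control signal 602a stores it in memory 608.
            self.frame_memory = decoded
            return decoded
        # Difference frame: memory 608 is read out and the adder 606
        # combines it with the decoded difference.
        return [a + b for a, b in zip(self.frame_memory, decoded)]
```

Note that only intra frames overwrite the frame memory; decoding a difference frame leaves the stored [F0 3f] untouched for the next difference.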
  • The previous frame memory 614 is also supplied with the decoded video data from the quality-enhancing H.263 decoder 612. The H.263 decoder 612 decodes the quality supplement data F1 g, F2 g or F3 g from the local controller 602, and adds the decoded data [F1 g], [F2 g] or [F3 g] to the data from the previous frame memory 614 to provide the quality-enhanced frame data to the video output portion 80.
  • It is noted that since the video decoder 600 has respective previous frame memories 608 and 614 and respective H.263 decoders 604 and 612 for decoding in the time axis direction and decoding in the quality axis direction, it is possible to store data decoded in the time axis direction in both of the previous frame memories 608 and 614 and to store data decoded in the quality axis direction only in the memory 614 for the quality axis direction. For this reason, even if quality supplement data for frame data F0 g has been decoded, it is possible to resume the play of video data from the frame data F0 g.
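The quality-direction path (decoder 612 accumulating into the previous frame memory 614 only) can be sketched in the same toy style. Because the sketch never touches the time-direction frame memory, it mirrors the property noted above that play can resume from the unenhanced frame F0 g; the class and method names are hypothetical.

```python
class QualityDirectionPath:
    """Toy model of the quality-direction path (decoder 612, memory 614)."""

    def __init__(self, displayed_frame):
        # Memory 614 starts from the frame produced by the time-direction path.
        self.prev = list(displayed_frame)

    def apply_quality_dif(self, dif):
        # Each decoded quality difference [F1g], [F2g], [F3g] is accumulated
        # into memory 614 only; the time-direction memory (608) is never
        # written here, so normal play can resume from the unenhanced frame.
        self.prev = [a + b for a, b in zip(self.prev, dif)]
        return self.prev
```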
  • Though the above-described video decoder 600 has used two H.263 decoders, an equivalent video decoder may be implemented by using a single decoder.
  • A video decoder that decodes a video object of a format using a correlation between frames not only in the time axis direction but also in the quality axis direction has been described in conjunction with the H.263 video format. However, such a video decoder can be realized for other such video formats, such as the MPEG format, by replacing the H.263 decoder(s) with a corresponding video decoder such as an MPEG decoder.
  • Though the above-described embodiments have dealt with a single media program, i.e., a video object, the following embodiment deals with a multimedia program.
  • Embodiment II
  • A multimedia-on-demand system according to a second illustrative embodiment of the invention has a feature of enhancing the picture quality of the first frame to be displayed after the execution of a stop command or the execution of a JF, JB or HS command issued during a stop state by transmitting quality supplement data from the server 1.
  • FIG. 16 is a diagram showing an exemplary scenario data table of a multimedia program available in a multimedia-on-demand system according to a second illustrative embodiment of the invention. In FIG. 16, the scenario data table contains a record for each of the multimedia objects used in the multimedia program for which the scenario data table is intended. Each record of the scenario data table comprises the fields of the object ID, the kind of the object, the display position on a screen, the display size, the presentation start time and the presentation end time. For the sake of better understanding, the presentation start and end time fields include the corresponding value of the scenario time counter 501, Ct. In this specific example, the frame rate of the video objects is assumed to be 30 frames per second.
  • In order to simplify the operation, it is preferable to create an active object table as shown in FIG. 17 from the scenario data table. In FIG. 17, all of the Ct values found in the presentation start and end time fields of the scenario data table are listed in ascending order in the first column or fields of the event list table. For each of the listed Ct values, there are listed, in the second field, the object IDs of the multimedia objects whose presentation starts or is ongoing at that Ct value. However, each second field does not include any object whose presentation ends at that Ct value.
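Construction of the active object table from the scenario data table, as just described, can be sketched as follows. The record layout (object ID, presentation start Ct, presentation end Ct) is an assumption distilled from FIG. 16; an object whose presentation ends at a Ct value is excluded from that value's entry.

```python
def build_active_object_table(records):
    """records: iterable of (object_id, start_ct, end_ct) rows.
    Returns [(Ct, [active object IDs])] with every start/end Ct value
    listed in ascending order."""
    cts = sorted({ct for _, s, e in records for ct in (s, e)})
    table = []
    for ct in cts:
        # Active = started at or before ct, and not yet ended at ct.
        active = [oid for oid, s, e in records if s <= ct < e]
        table.append((ct, active))
    return table
```

For example, a video spanning Ct 0 to 10 with a picture from 0 to 5 and a text object from 5 to 10 yields entries at Ct = 0, 5 and 10, with the picture dropped and the text added at Ct = 5.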
  • FIG. 18 is a flowchart showing an operation executed by the controller 50 of the client terminal 3 under the control of an interrupt subroutine called from a main program in response to an interrupt caused by the user at the client terminal 3 pressing one of the presentation control buttons of FIG. 3 after specifying one of the available multimedia programs stored in the mass storage 20. Since the operation 220 of FIG. 18 is very similar to that of FIG. 6, only the differences between them will be described in the following.
  • If a JB, JF or HS command is detected in step 124, then instead of executing a JB play step 126, a JF play step 128 or a HS play step 130, the controller 50 sets the register 501 value Ct to Ct−Cj, Ct+Cj or 0 in a JB step 240, a JF step 242 or a HS step 244, respectively, and returns to the main program to resume the normal play operation of the current program from the register 501 value.
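The register-501 updates of steps 240, 242 and 244 amount to simple arithmetic on Ct, sketched below. Clamping a jump-backward result at zero is a defensive assumption of this sketch, not something stated in the text.

```python
def on_command(cmd, ct, cj):
    """Update of the scenario time register 501 for a JB (step 240),
    JF (step 242) or HS (step 244) command."""
    if cmd == "JB":
        ct = max(0, ct - cj)   # jump backward; clamp at 0 (assumption)
    elif cmd == "JF":
        ct = ct + cj           # jump forward
    elif cmd == "HS":
        ct = 0                 # head search: restart from the beginning
    return ct
```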
  • In step 250 of the normal play operation comprising steps 250, 152 and 154, the controller 50 presents the relevant object(s) by referring to the active object table of FIG. 17. Specifically, if the current value Ct is found in any Ct field of the table, the controller 50 continues the presentation of the object(s) listed both in the current record, whose Ct field contains the current Ct value, and in the record just above it in the table; ceases the presentation of the object(s) found in the record just above but not in the current record; and starts the presentation of the object(s) which first appear(s) in the current record. If the current value Ct is not found in any Ct field of the active object table, the controller 50 has only to repeat the same operation as executed for the last Ct value, in a manner well known in the art. In this case, if a video frame is to be displayed, the controller 50 only uses the basic data for the frame.
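The continue/cease/start decision made in step 250 when a new Ct record is reached reduces to a set comparison between consecutive records of the active object table, as in this hypothetical sketch:

```python
def presentation_changes(prev_active, cur_active):
    """Split the object-ID lists of the previous and current active-object
    records: objects in both continue, objects only in the previous record
    cease, and objects new to the current record start."""
    prev_set, cur_set = set(prev_active), set(cur_active)
    return {
        "continue": sorted(prev_set & cur_set),
        "cease": sorted(prev_set - cur_set),
        "start": sorted(cur_set - prev_set),
    }
```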
  • After step 137, 140, 142 or 144, the controller 50 makes a test in step 238 to see if a video object exists in the record whose Ct value field contains the largest value not exceeding the value of the scenario time register 501, Ct. If so, the controller 50 performs, in step 139, the image enhancing operation for the frame identified by the current register 501 value Ct minus the presentation start time value SCt of the video object. This is because the current register value Ct equals the sum of the presentation start time value SCt and the frame number of the video object.
  • After step 139, the controller 50 ends the operation 220. If the test result is NO in step 238, then the controller 50 ends the operation 220.
  • As described above, the image enhancing operation of step 139 enhances the picture quality of the frame to be displayed after the execution of a stop command or the execution of a JF, JB or HS command issued during a stop state.
  • It should be noted that the image enhancing operation may be performed for a plurality of frames beginning with the frame identified by the value of Ct−SCt.
  • Embodiment III
  • According to a third illustrative embodiment of the invention, a multimedia-on-demand system adds detailed information to (or enhances the quality of) each of the variable-quality objects during a stop period in a manner as illustrated by the part labeled "QUALITY ENHANCING OPERATION" in FIG. 21. A variable-quality object is a multimedia object that comprises a plurality of detail levels of data and that permits an enhancement of the presentation quality by adding a higher detail level of data. The above-mentioned progressive JPEG video is one such variable-quality object. FIG. 19 is a diagram showing examples of variable-quality objects. In FIG. 19, still pictures A, B, C and D are variable in display quality according to the difference data levels used for presentation. Also, the text object of FIG. 19 is said to be a variable-quality object since the text object comprises a plurality of detail levels of data.
  • Also, the client terminal of the multimedia-on-demand system tries to collect as much object data as possible in advance during a stop period so that a random access operation such as a JF operation can be promptly executed. This collection operation is shown by a part labeled “PRELOAD OPERATION” again in FIG. 21.
  • FIG. 22 is a flowchart showing an operation of an interrupt subroutine 230 called in response to a pressing of one of the presentation control buttons of FIG. 3 according to the third embodiment of the invention. The interrupt subroutine 230 is identical to that of FIG. 18 except that after step 137, the controller 50 executes steps 260 and 270 instead of proceeding to step 238.
  • In step 260, the controller 50 performs an image quality enhancing operation for at least one frame beginning with the frame identified by the value of Ct−SCt for each of the active variable-quality objects. For this purpose, it is preferable to add a field 265 containing a variable-quality flag indicative of whether the object is variable in presentation quality, or a loading priority code indicative of the priority order of the object in a load operation, as shown in FIG. 23. If there are a plurality of active objects with an identical priority code, the controller 50 preferably processes the objects in order of presentation.
  • Also, it is preferable to keep a load flag for each object as shown in FIG. 24. The load flag for an object indicates whether the basic data of the object has been loaded or not. The load flags are all reset in an initial operation.
  • In step 270, the controller 50 preferably tries to load the basic data of as many objects to be subsequently presented as possible in advance. In order to distinguish loaded objects from not-yet-loaded ones, the controller 50 sets the load flag each time the load operation of the basic data of an object has been completed. This enables a quick response in a random access operation such as a fast forward operation.
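The preload ordering of step 270 (skip objects whose load flag is set, then order by loading priority code with ties broken by presentation order) can be sketched as follows; the tuple layout and names are assumptions for illustration.

```python
def preload_order(active_records, load_flags):
    """active_records: list of (object_id, priority_code, presentation_index).
    load_flags: set of object IDs whose basic data is already loaded.
    Returns the object IDs still to be loaded, in loading order."""
    # Skip objects already loaded (their load flag is set).
    pending = [r for r in active_records if r[0] not in load_flags]
    # Lower priority code loads first; ties broken by presentation order.
    return [oid for oid, _, _ in sorted(pending, key=lambda r: (r[1], r[2]))]
```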
  • Many widely different embodiments of the present invention may be constructed without departing from the spirit and scope of the present invention. It should be understood that the present invention is not limited to the specific embodiments described in the specification, except as defined in the appended claims.

Claims (13)

1. A method of presenting, at a client terminal, a video program stored in a server linked with the client terminal, the method comprising the steps of:
preparing a plurality of levels of quality supplement data in the server as each of a plurality of quality supplement data portions;
storing a plurality of basic data portions on a recording medium in the server;
storing each level of the quality supplement data on a recording medium in the server;
preparing one basic data portion and one quality supplement data portion for each frame of the video program in the server, a quality of the video program at each frame played based on a combination of the basic data portion and the quality supplement data portion being higher than that based on only the basic data portion;
in response to a play command in the client terminal, obtaining the basic data portions of the frames of the video program needed to play the video program by using the basic data portions of the frames; and
in response to a stop command in the client terminal, adding the quality supplement data portion of the last displayed frame to display a quality-enhanced version of the last displayed frame by using a combination of the basic data portion and the quality supplement data portion.
2. A method as defined in claim 1, further comprising the step of, in response to one of a head search command, a jump forward command and a jump backward command in the client terminal, adding the quality supplement data portion of the last displayed frame to display the quality-enhanced version of said last displayed frame by using the combination of the basic data portion and the quality supplement data portion.
3. (canceled)
4. A method as defined in claim 1, further comprising the steps, executed by said server, of:
moving the recording media synchronously in any of a play operation, a head search operation, a jump forward operation or a jump backward operation; and
in response to a quality supplement data request from said client terminal, sending at least one level of quality supplement data of the last displayed frame to the client terminal by reading the level of the quality supplement data while synchronously moving the recording media, wherein the quality supplement data request is prepared in response to the stop command.
5-9. (canceled)
10. A method as defined in claim 1, wherein each frame of the video program has been coded according to a coding standard, wherein the program comprises independent frames that can be decoded alone without the need of other frame data and difference frames that cannot be decoded without other frame data, and wherein the step of obtaining the basic data portions includes passing the basic data portions to a decoder, and the step of adding the quality supplement data portion includes passing the quality supplement data portion to the decoder.
11. A method as defined in claim 10, wherein said coding standard is an H.263 standard, the step of obtaining the basic data portions includes passing the basic data portions to an H.263 decoder, and the step of adding the quality supplement data portion includes passing the quality supplement data portion to the H.263 decoder.
12. A method as defined in claim 10, wherein the coding standard is an MPEG standard, the step of obtaining the basic data portions includes passing the basic data portions to an MPEG decoder, and the step of adding the quality supplement data portion includes passing the quality supplement data portion to the MPEG decoder.
13-33. (canceled)
34. A method as defined in claim 1, wherein the step of adding said quality supplement data portion comprises:
providing a plurality of levels of quality supplement data; and
applying each level of quality supplement data in sequence to gradually increase the quality of the video program at the last displayed frame played by a combination of the basic data portion and the level of quality supplement data.
35. A method as defined in claim 1 wherein said recording medium is a tape recording medium.
36. A method as claimed in claim 1 wherein said recording medium comprises a plurality of recording media.
37. A method as claimed in claim 35 wherein said recording media comprise a plurality of tape recording media.
US10/978,450 1998-06-26 2004-11-02 Client/server multimedia presentation system Abandoned US20050091691A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/978,450 US20050091691A1 (en) 1998-06-26 2004-11-02 Client/server multimedia presentation system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP18026998A JP2000013777A (en) 1998-06-26 1998-06-26 Video reproducing device and video storage device
JP10-180269 1998-06-26
US9336699A 1999-06-21 1999-06-21
US10/978,450 US20050091691A1 (en) 1998-06-26 2004-11-02 Client/server multimedia presentation system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/336,699 Division US6993786B1 (en) 1998-06-26 1999-06-21 Client/server multimedia presentation system

Publications (1)

Publication Number Publication Date
US20050091691A1 true US20050091691A1 (en) 2005-04-28

Family

ID=16080289

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/336,699 Expired - Fee Related US6993786B1 (en) 1998-06-26 1999-06-21 Client/server multimedia presentation system
US10/978,450 Abandoned US20050091691A1 (en) 1998-06-26 2004-11-02 Client/server multimedia presentation system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/336,699 Expired - Fee Related US6993786B1 (en) 1998-06-26 1999-06-21 Client/server multimedia presentation system

Country Status (4)

Country Link
US (2) US6993786B1 (en)
EP (1) EP0967796A3 (en)
JP (1) JP2000013777A (en)
SG (1) SG99283A1 (en)


Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002003690A1 (en) * 2000-07-03 2002-01-10 Fujitsu Limited Digital video information apparatus
JP3860957B2 (en) * 2000-08-31 2006-12-20 株式会社日立製作所 Multimedia data transmission device
JP2004523178A (en) * 2001-03-07 2004-07-29 アイ ピー ヴィー リミテッド How to process video into encoded bitstream
EP1543682A2 (en) * 2001-08-31 2005-06-22 Koninklijke Philips Electronics N.V. Output device with select means
JP4443833B2 (en) * 2002-02-27 2010-03-31 パナソニック株式会社 Information reproducing method, transmitting apparatus and receiving apparatus
US8468570B2 (en) * 2002-09-05 2013-06-18 Thomson Licensing Method and system for memory PVR functions in a broadcast environment
US8290349B2 (en) 2006-06-22 2012-10-16 Sony Corporation Playback apparatus, method, and program
US9124767B2 (en) * 2006-10-25 2015-09-01 Microsoft Technology Licensing, Llc Multi-DVR media content arbitration
JP5493531B2 (en) * 2009-07-17 2014-05-14 三菱電機株式会社 Video / audio recording / reproducing apparatus and video / audio recording / reproducing method
US10162936B2 (en) 2016-03-10 2018-12-25 Ricoh Company, Ltd. Secure real-time healthcare information streaming

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5450122A (en) * 1991-11-22 1995-09-12 A.C. Nielsen Company In-station television program encoding and monitoring system and method
US5477263A (en) * 1994-05-26 1995-12-19 Bell Atlantic Network Services, Inc. Method and apparatus for video on demand with fast forward, reverse and channel pause
US5594911A (en) * 1994-07-13 1997-01-14 Bell Communications Research, Inc. System and method for preprocessing and delivering multimedia presentations
US5646676A (en) * 1995-05-30 1997-07-08 International Business Machines Corporation Scalable interactive multimedia server system for providing on demand data
US5659539A (en) * 1995-07-14 1997-08-19 Oracle Corporation Method and apparatus for frame accurate access of digital audio-visual information
US5734589A (en) * 1995-01-31 1998-03-31 Bell Atlantic Network Services, Inc. Digital entertainment terminal with channel mapping
US6266817B1 (en) * 1995-04-18 2001-07-24 Sun Microsystems, Inc. Decoder for a software-implemented end-to-end scalable video delivery system

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5418713A (en) * 1993-08-05 1995-05-23 Allen; Richard Apparatus and method for an on demand data delivery system for the preview, selection, retrieval and reproduction at a remote location of previously recorded or programmed materials
US5583561A (en) * 1994-06-07 1996-12-10 Unisys Corporation Multi-cast digital video data server using synchronization groups
US5720037A (en) * 1994-06-16 1998-02-17 Lucent Technologies Inc. Multimedia on-demand server
KR0150702B1 (en) * 1995-01-25 1998-10-15 구자홍 Fast forward/reverse drive control method for vod system
US5621660A (en) * 1995-04-18 1997-04-15 Sun Microsystems, Inc. Software-based encoder for a software-implemented end-to-end scalable video delivery system
JPH09116511A (en) * 1995-10-16 1997-05-02 Sony Corp Program transmission reception system
US5852565A (en) * 1996-01-30 1998-12-22 Demografx Temporal and resolution layering in advanced television
US6065050A (en) * 1996-06-05 2000-05-16 Sun Microsystems, Inc. System and method for indexing between trick play and normal play video streams in a video delivery system
US6233017B1 (en) * 1996-09-16 2001-05-15 Microsoft Corporation Multimedia compression system with adaptive block sizes
US6263507B1 (en) * 1996-12-05 2001-07-17 Interval Research Corporation Browser for use in navigating a body of information, with particular application to browsing information represented by audiovisual data
US6002440A (en) * 1996-12-10 1999-12-14 British Telecommunications Public Limited Company Video coding
US6172672B1 (en) * 1996-12-18 2001-01-09 Seeltfirst.Com Method and system for providing snapshots from a compressed digital video stream
US6014694A (en) * 1997-06-26 2000-01-11 Citrix Systems, Inc. System for adaptive video/audio transport over a network
JP3479443B2 (en) * 1997-12-16 2003-12-15 株式会社日立製作所 Moving image data compression method and output method, moving image data reproduction method, and moving image data compression device, output device, and reproduction device
US6496980B1 (en) * 1998-12-07 2002-12-17 Intel Corporation Method of providing replay on demand for streaming digital multimedia


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090106093A1 (en) * 2006-01-13 2009-04-23 Yahoo! Inc. Method and system for publishing media content
US20070169158A1 (en) * 2006-01-13 2007-07-19 Yahoo! Inc. Method and system for creating and applying dynamic media specification creator and applicator
CN101395918B (en) * 2006-01-13 2012-02-29 雅虎公司 Method and system for creating and applying dynamic media specification creator and applicator
WO2007084869A3 (en) * 2006-01-13 2008-04-03 Yahoo Inc Method and system for creating and applying dynamic media specification creator and applicator
US20080215620A1 (en) * 2006-01-13 2008-09-04 Yahoo! Inc. Method and system for social remixing of media content
US20090103835A1 (en) * 2006-01-13 2009-04-23 Yahoo! Inc. Method and system for combining edit information with media content
US20070179979A1 (en) * 2006-01-13 2007-08-02 Yahoo! Inc. Method and system for online remixing of digital multimedia
KR100976887B1 (en) 2006-01-13 2010-08-18 야후! 인크. Method and system for creating and applying dynamic media specification creator and applicator
US8868465B2 (en) 2006-01-13 2014-10-21 Yahoo! Inc. Method and system for publishing media content
US8411758B2 (en) 2006-01-13 2013-04-02 Yahoo! Inc. Method and system for online remixing of digital multimedia
US20070277108A1 (en) * 2006-05-21 2007-11-29 Orgill Mark S Methods and apparatus for remote motion graphics authoring
US9601157B2 (en) 2006-05-21 2017-03-21 Mark S. Orgill Methods and apparatus for remote motion graphics authoring
US8984406B2 (en) 2009-04-30 2015-03-17 Yahoo! Inc. Method and system for annotating video content
US11240570B1 (en) * 2020-10-08 2022-02-01 International Business Machines Corporation Object-based video loading

Also Published As

Publication number Publication date
JP2000013777A (en) 2000-01-14
EP0967796A2 (en) 1999-12-29
SG99283A1 (en) 2003-10-27
US6993786B1 (en) 2006-01-31
EP0967796A3 (en) 2006-05-24

Similar Documents

Publication Publication Date Title
US6993786B1 (en) Client/server multimedia presentation system
US8649661B2 (en) Storage medium storing text-based subtitle data including style information, and apparatus and method of playing back the storage medium
RU2352982C2 (en) Data carrier for storage of interactive graphical data flow, activated in response to user command, and device for its reproduction
US20030067479A1 (en) Method of indexing image hierarchically and apparatus therefor
JP2004166253A (en) Time reference for multimedia object
US9380315B2 (en) Method of reproducing a still picture from a recording medium, method of decoding the still picture and the recording medium
US7657152B2 (en) Broadcast playback and/or recording apparatus
US10848835B2 (en) Video summary information playback device and method and video summary information providing server and method
US20060204224A1 (en) Recording medium containing thumbnail recorded thereon, recording apparatus and method therefor, and reproducing apparatus and method therefor
KR20060076192A (en) Content reproduce system, reproduce device, reproduce method, and distribution server
US20070201819A1 (en) Apparatus and method for variable speed playback of digital broadcasting stream
US6754437B1 (en) Receiver, recorder and player
US8305379B2 (en) Method for managing animation chunk data and its attribute information for use in an interactive disc
JP2009141435A (en) Content reproduction device and content distribution system
KR101033558B1 (en) Private Video Recorder and Method for Highlight Reproduction of Private Video Recorder
KR100965883B1 (en) Storage medium containing audio-visual data including mode information, display playback device and display playback method thereof
US20050094973A1 (en) Moving picture reproducing apparatus in which player mode information is set, reproducing method using the same, and storage medium
JP2006339980A (en) Image reproducer
JPH11162150A (en) Magnetic recording/reproducing device and method
US20100195980A1 (en) Information storage medium, reproducing apparatus, and reproducing method
US8750672B2 (en) Playback method and apparatus
JP3119374B2 (en) Image reverse playback method
JP2000092450A (en) Video server device
KR100965893B1 (en) Display playback method of storage medium containing audio-visual data including mode information
EP1876598A2 (en) Method and apparatus for managing animation data of an interactive DVD.

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION