WO2005029864A1 - Method and apparatus for generating graphical and media displays at a thin client - Google Patents


Info

Publication number
WO2005029864A1
WO2005029864A1 · PCT/US2004/029993
Authority
WO
WIPO (PCT)
Prior art keywords
client
data set
media
server
decompressed data
Prior art date
Application number
PCT/US2004/029993
Other languages
French (fr)
Other versions
WO2005029864A8 (en)
Inventor
David Robinson
Lee Laborczfalvi
Pierre Semaan
Anil Roychoudry
Martin Duursma
Anatoliy Panasyuk
Georgy Momtchilov
Original Assignee
Citrix Systems, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Citrix Systems, Inc. filed Critical Citrix Systems, Inc.
Priority to AU2004305808A priority Critical patent/AU2004305808A1/en
Priority to JP2006526396A priority patent/JP2007505580A/en
Priority to EP04784000A priority patent/EP1665798A1/en
Priority to CA002538340A priority patent/CA2538340A1/en
Publication of WO2005029864A1 publication Critical patent/WO2005029864A1/en
Priority to IL174245A priority patent/IL174245A0/en
Publication of WO2005029864A8 publication Critical patent/WO2005029864A8/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8166Monomedia components thereof involving executable data, e.g. software
    • H04N21/8193Monomedia components thereof involving executable data, e.g. software dedicated tools, e.g. video decoder software or IPMP tool
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/957Browsing optimisation, e.g. caching or content distillation
    • G06F16/9577Optimising the visualization of content, e.g. distillation of HTML documents
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/231Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
    • H04N21/23113Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion involving housekeeping operations for stored content, e.g. prioritizing content for deletion because of storage space restrictions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/231Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
    • H04N21/2312Data placement on disk arrays
    • H04N21/2318Data placement on disk arrays using striping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2347Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving video stream encryption
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/24Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
    • H04N21/2402Monitoring of the downstream path of the transmission network, e.g. bandwidth available
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25808Management of client data
    • H04N21/25858Management of client data involving client software characteristics, e.g. OS identifier
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/2662Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41415Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/4143Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a Personal Computer [PC]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • H04N21/42653Internal components of the client ; Characteristics thereof for processing graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4331Caching operations, e.g. of an advertisement for later insertion during playback
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44004Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving video buffer management, e.g. video decoder buffer or video display buffer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44209Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/4424Monitoring of the internal components or processes of the client device, e.g. CPU or memory load, processing speed, timer, counter or percentage of the hard disk space used
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/4425Monitoring of client processing errors or hardware failure
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4431OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB characterized by the use of Application Program Interface [API] libraries
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/61Network physical structure; Signal processing
    • H04N21/6106Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
    • H04N21/6143Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a satellite
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/61Network physical structure; Signal processing
    • H04N21/6156Network physical structure; Signal processing specially adapted to the upstream path of the transmission network
    • H04N21/6175Network physical structure; Signal processing specially adapted to the upstream path of the transmission network involving transmission via Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6373Control signals issued by the client directed to the server or network components for rate control, e.g. request to the server to modify its transmission rate
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6377Control signals issued by the client directed to the server or network components directed to server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/64Addressing
    • H04N21/6408Unicasting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/654Transmission by server directed to the client
    • H04N21/6547Transmission by server directed to the client comprising parameters, e.g. for client setup
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6582Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6583Acknowledgement
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6587Control parameters, e.g. trick play commands, viewpoint selection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/812Monomedia components thereof involving advertisement data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H04N21/8153Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/173Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309Transmission or handling of upstream communications
    • H04N7/17336Handling of requests in head-ends

Definitions

  • The invention generally relates to distributed processing and, more particularly, to generating a display having graphical and/or media components at a client.
  • A thin-client protocol can be used to display output, produced by an application running on a server, on a client running on a computer with limited processing capabilities.
  • Two exemplary thin-client protocols are ICA (Independent Computing Architecture) from Citrix Systems, Inc., Ft. Lauderdale, FL, and RDP (Remote Desktop Protocol) from Microsoft, Inc., Redmond, WA.
  • The client is also sometimes referred to as a remote terminal session.
  • One thin-client protocol intercepts commands by the application program to the server operating system (“OS”) to draw to a display screen. The intercepted commands are transmitted to the remote session using, for example, one or more presentation layer packets.
  • the remote session passes the received commands to the remote session OS.
  • the thin-client draws the application program output on its display using the received commands. In this manner, the application program appears to be executing on the thin-client.
  • bitmap format of an image is generally a very large data set.
  • the thin-client protocol must transmit over the network the bitmap representation of an image, which is a large amount of data, along with the applicable commands on how to display the bitmap representation.
  • this results in a large time delay before the complete image is received and displayed on the client. This can result in inconvenience and unhappiness for the user of the client.
  • transmission of these large bitmap formats results in large costs associated with each transmission.
  • a video file is rendered as a series of bitmaps and audio information is rendered using pulse code modulation.
  • the thin-client protocol transmits the series of bitmaps representing the video file and/or the pulse code modulated signal representing the audio information over the network.
  • This transmission is inefficient, requiring excessive bandwidth and significant CPU usage.
  • an unresponsive graphical user interface may result at the client.
  • Video playback, for example, is often of low quality, may appear "jerky," and may synchronize poorly with the audio presentation.
  • the invention lowers the time and cost of transmitting images and other non-textual elements, originally represented in large bitmap formats, by substituting, prior to transmission, available compressed formats for the bitmap formats.
  • images and other multimedia content are transmitted in an already-compressed format, such as JPEG, PNG, GIF, MPEG3 or MPEG4.
  • the time and cost of transmitting the media may be lowered by intercepting the already-compressed format and substituting another version of the media stream in a, typically, more compressed format. Transmitting the compressed formats can significantly reduce the bandwidth necessary to transmit the media stream.
  • the client decompresses the received data using available libraries.
  • the client substitutes the decompressed image for the original bitmap representations using, for example, modified thin-client protocol commands with other identifying data.
  • a compressed data set representing at least a portion of a media stream, is intercepted on a first computing device before it is decompressed.
  • the compressed data set is decompressed on the first computing device
  • the resulting decompressed data set is re-compressed on the first computing device.
  • a server transmits compressed data representative of a non-textual element or media stream
  • the processing burden on the server's central processing unit (CPU) is reduced.
  • the server's CPU would otherwise expend processing effort on decompressing the compressed data and transmitting the larger volume of uncompressed data, with the attendant processing burden of the increased amount of network communications.
  • delegating decompression and rendering operations to the clients also improves server performance.
  • Client performance is further improved, in that the processing effort at the client associated with the receipt of a larger volume of uncompressed data is reduced.
  • the invention relates to a method for generating a graphical display at a client.
  • the method includes transmitting output from an application program executing on a server to the client, identifying a bitmap representation within the application output, and determining a check value for the bitmap representation.
  • the method also includes retrieving a compressed data format of the bitmap representation using at least in part the check value and transmitting to the client the compressed data format in place of the bitmap representation.
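The check-value lookup described in this method can be sketched as follows. The `image_store` mapping, the `send` callback, and the use of CRC-32 as the check value are all assumptions for illustration; the document names CRC as only one of several possible check-value algorithms.

```python
import zlib

def substitute_compressed(bitmap_bytes, image_store, send):
    """Replace a bitmap with its original compressed format when available.

    `image_store` maps a check value to already-compressed data (e.g. the
    original JPEG bytes); `send` transmits a payload to the client. Both
    are hypothetical stand-ins for the patent's image store and server agent.
    """
    # The check value is computed over the decompressed bitmap,
    # as the method describes.
    check_value = zlib.crc32(bitmap_bytes)
    compressed = image_store.get(check_value)
    if compressed is not None:
        # Transmit the smaller compressed format in place of the bitmap.
        send("compressed", compressed)
    else:
        # Fall back to transmitting the bitmap representation itself.
        send("bitmap", bitmap_bytes)
```

The bitmap never crosses the network when a compressed original with a matching check value is found in the store.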
  • the invention in another aspect, relates to a method for generating a graphical display at a client.
  • the method includes transmitting output from an application program executing on a server to the client and identifying a non-textual element within the application output.
  • the method also includes retrieving a compressed data format associated with the non-textual element and transmitting to the client the compressed data format in place of the non-textual element.
  • the method includes identifying a textual element within the application output and transmitting to the client the textual element.
  • the method includes receiving the compressed data format, and optionally the textual element, at the client and generating a display at the client using the compressed data format, and optionally the textual element.
  • the method includes transmitting the compressed data format using at least one presentation layer protocol packet.
  • the method includes transmitting the at least one presentation layer protocol packet using a command for transmitting a file in its native format.
  • the method includes conforming the at least one presentation layer protocol packet to a remote access protocol, a thin-client protocol, and/or a presentation protocol.
  • the non-textual element is a bitmap representation and the method includes replacing the bitmap representation with the compressed data format.
  • the method includes determining the capability of the client to render the non-textual element using the compressed data format. The method further includes, upon determination that the client cannot render the non-textual element using the compressed data format, transmitting an image- rendering library capable of rendering the non-textual element using the compressed data format.
  • the method includes intercepting the application output and inspecting the intercepted output for a bitmap representation of the nontextual element.
  • the method includes calculating a first check value for a bitmap representation of the non-textual element and searching an image store for the compressed data format having a check value identical to the first check value.
  • the invention in another aspect, relates to a system for generating a graphical display at a client.
  • the system includes an output filter module and a server agent.
  • the output filter module is configured to intercept output produced by an application program, identify a non-textual element of the output, and retrieve a compressed data format associated with the non-textual element.
  • the server agent is configured to transmit to the client the compressed data format in place of the non-textual element.
  • the system includes a server node, which includes the server agent and the output filter module.
  • the system includes a client node.
  • the client node includes a client agent and a display.
  • the client agent is configured to receive the compressed data format and to generate a display of the non-textual element using the received compressed data format.
  • the system further includes a network.
  • the invention in another aspect relates to an article of manufacture having computer-readable program means embodied therein for generating a graphical display at a client.
  • the article includes computer-readable program means for performing any of the aforementioned methods.
  • the invention relates to a method for generating a media presentation at a client.
  • the method includes transmitting output from an application program executing on a server to the client, identifying a media stream within the application output, intercepting an original compressed data set representing at least a portion of the media stream before processing by the application program, and transmitting the original compressed data set to the client.
  • the invention in another aspect, relates to another method for generating a media presentation at a client.
  • This method includes transmitting output from an application program executing on a server to the client, identifying a media stream within the application output, intercepting a first decompressed data set representing at least a portion of the media stream, compressing the intercepted first decompressed data set, and transmitting the compressed data set to the client in place of the first decompressed data set.
  • the invention relates to still another method for generating a media presentation at a client.
  • This method includes informing a server of at least one media format supported by a client agent installed on the client, receiving a compressed data set of a media stream at the client, decompressing the compressed data set at the client to generate a decompressed data set, and generating the media presentation at the client using the decompressed data set.
  • the invention in a further aspect, relates to an article of manufacture that embodies computer-readable program means for generating a media presentation at a client.
  • the article includes computer-readable program means for transmitting output from an application program executing on a server to the client, computer-readable program means for identifying a media stream within the application output, computer- readable program means for intercepting an original compressed data set representing at least a portion of the media stream before processing by the application program, and computer-readable program means for transmitting the original compressed data set to the client.
  • the invention relates to another article of manufacture that embodies computer-readable means for generating a media presentation at a client.
  • This article includes computer-readable program means for transmitting output from an application program executing on a server to the client, computer-readable program means for identifying a media stream within the application output, computer-readable program means for intercepting a first decompressed data set representing at least a portion of the media stream, computer-readable program means for compressing the intercepted first decompressed data set, and computer-readable program means for transmitting the compressed data set to the client in place of the first decompressed data set.
  • the invention relates to yet another article of manufacture that embodies computer-readable means for generating a media presentation at a client.
  • This article includes computer-readable program means for informing a server of at least one media format supported by a client agent installed on the client, computer-readable program means for receiving a compressed data set of a media stream at the client, computer-readable program means for decompressing the compressed data set at the client to generate a decompressed data set, and computer- readable program means for generating the media presentation at the client using the decompressed data set.
  • the methods further include, and the articles of manufacture further include computer- readable program means for, capturing timing information associated with the media stream, transmitting the timing information to the client, receiving the compressed data set and, optionally, the timing information at the client, decompressing the compressed data set at the client to generate a decompressed data set, and generating the media presentation at the client using the decompressed data set and, optionally, the timing information.
  • the methods further include, and the articles of manufacture further include computer-readable program means for, transmitting non-media graphical information from the application output to the client, receiving the non-media graphical information at the client, and generating the media presentation at the client using the decompressed data set and the non-media graphical information.
  • the invention relates to a system for generating a media presentation at a client.
  • the system includes an application program and an output filter module.
  • the application program is configured to identify a media stream within output produced by the application program.
  • the output filter module is configured to intercept an original compressed data set representing at least a portion of the media stream before processing by the application program and transmit the original compressed data set to the client.
  • the invention relates to another system for generating a media presentation at a client.
  • This system includes an application program and an output filter module.
  • the application program is configured to identify a media stream within output produced by the application program.
  • the output filter module is configured to intercept a first decompressed data set representing at least a portion of the media stream, compress the intercepted first decompressed data set of the media stream, and transmit the compressed data set in place of the first decompressed data set to the client.
  • the invention in yet another aspect, relates to another system for generating a media presentation at a client.
  • This system includes a server and the client in communication with the server.
  • the client includes a client agent configured to inform the server of at least one media format supported by the client agent, receive a compressed data set of a media stream, decompress the compressed data set at the client to generate a decompressed data set, and generate the media presentation using the decompressed data set.
  • the output filter module of the systems is further configured to capture timing information associated with the media stream and to transmit the timing information to the client.
  • the system further includes a client agent configured to receive the compressed data set and the optional timing information, decompress the compressed data set to generate a decompressed data set, and generate the media presentation using the decompressed data set and the optional timing information.
  • the client agent is further configured to receive non-media graphical information and to generate the media presentation at the client using the decompressed data set and the non-media graphical information.
  • the invention in another aspect, relates to another system for generating a media presentation at a client.
  • This system includes a network, a server in communication with the network, and the client in communication with the network.
  • the server includes an application program and at least one output filter module.
  • the application program is configured to identify a media stream within output produced by the application program.
  • the output filter module is configured to intercept a compressed data set representing at least a portion of the media stream before processing by the application program, and transmit the compressed data set to the client.
  • the output filter module intercepts only a portion of a compressed data set representing the media stream. For example, the output filter module may intercept every other frame of a video media stream.
  • the output filter module discards a portion of the data associated with each frame of video data in order to reduce the bandwidth necessary to transmit the video stream.
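The frame-dropping behavior of the output filter module can be sketched as below. The function name and `keep_every` parameter are illustrative; a real filter would also preserve timing information so playback pacing at the client is not distorted.

```python
def thin_frames(frames, keep_every=2):
    """Yield only every Nth frame of a video stream to reduce the
    bandwidth needed to transmit it, e.g. every other frame when
    keep_every=2 (a minimal sketch of the filter described above)."""
    for i, frame in enumerate(frames):
        if i % keep_every == 0:
            yield frame
```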
  • the client includes a client agent.
  • the client agent is configured to inform the server of at least one media format supported by the client agent, receive the compressed data set, decompress the compressed data set at the client to generate a decompressed data set, and generate the media presentation at the client using the decompressed data set.
  • the invention relates to an article of manufacture that embodies computer-readable program means for generating a media presentation at a client.
  • the article includes computer-readable program means for intercepting an original compressed data set of a media stream, and computer-readable program means for transmitting the original compressed data set to the client using a thin client protocol such as ICA or RDP.
  • the invention in another aspect, relates to another article of manufacture that embodies computer-readable program means for generating a media presentation at a client.
  • the article includes computer-readable program means for intercepting a decompressed data set of a media stream, computer-readable program means for compressing the intercepted decompressed data set, and computer-readable program means for transmitting the compressed data set to the client using a thin client protocol such as ICA or RDP.
  • FIG. 1 is a block diagram of an illustrative embodiment of a system to generate a graphical display for a remote terminal session in accordance with the invention
  • FIG. 2 is a flow diagram of an illustrative embodiment of a process to generate a graphical display for a remote terminal session in accordance with the invention
  • FIG. 3 is a block diagram of an illustrative embodiment of a system for generating a media presentation at a client in accordance with the invention
  • FIGS. 4A, 4B, & 4C are a flow diagram of an illustrative embodiment of a method for generating a media presentation at a client in accordance with the invention.
  • FIGS. 5 A, 5B, & 5C are a flow diagram of another embodiment of a method for generating a media presentation at a client in accordance with the present invention.
  • FIG. 1 illustrates a system 100 to generate a display for a remote terminal session that includes a first computing system ("client node") 105 in communication with a second computing system ("server node") 110 over a network 115.
  • the network 115 can be a local-area network (LAN), such as a company Intranet, or a wide area network (WAN), such as the Internet or the World Wide Web.
  • a user of the client node 105 can be connected to the network 115 through a variety of connections including standard telephone lines, LAN or WAN links (e.g., T1, T3, 56 kb, X.25), broadband connections (ISDN, Frame Relay, ATM), and wireless connections.
  • the client node 105 includes a client transceiver 130 to establish communication with the network 115.
  • the server node 110 includes a server transceiver 135 to establish communication with the network 115.
  • the connections can be established using a variety of communication protocols (e.g., ICA, RDP, HTTP, TCP/IP, IPX, SPX, NetBIOS, Ethernet, RS232, and direct asynchronous connections).
  • the server node 110 can be any computing device capable of providing the requested services of the client node 105. Particularly, this includes generating and transmitting commands and data to the client node 105 that represent the output being produced by an application program 140 executing on the server 110.
  • the server node 110 includes the server transceiver 135, the executing application program 140, a server agent 150, an output filter module 155 and an image store 160.
  • the server agent 150 includes a module that interfaces with a client agent 175 and other components of the server node 110 to support the remote display and operability of the application program 140.
  • the server agent module 150 and all modules mentioned throughout the specification are implemented as a software program and/or a hardware device (e.g., ASICs or FPGAs).
  • For clarity, all of these components are shown on server node 110. It is to be understood that the server node 110 can represent a single server or can represent several servers in communication with each other over the network 115 or another network (not shown). In multiple-server embodiments, the functionality of the components can be distributed over the available servers. For example, in one embodiment with multiple servers, the transceiver 135, the application program 140, the server agent 150 and the output filter module 155 are on an application server and the image store 160 is on a storage device, such as a disk in a RAID system.
  • the client node 105 can be any computing device (e.g., a personal computer, set top box, wireless mobile phone, handheld device, personal digital assistant, kiosk, etc.) used to provide a user interface to the application program 140 executing on the server node 110.
  • the client node 105 includes the client transceiver 130, a display 145, a client agent 175 and a graphics library 180 (also referred to as an image-rendering library).
  • the client agent 175 includes a module, implemented as a software program and/or a hardware device (e.g., an ASIC or an FPGA) that receives commands and data from the server node 110 and from a user (not shown) of the client node 105.
  • the client agent 175 uses the received information when interacting with other components of the client node 105 (e.g., when directing the operating system to output data onto the display 145). The client agent 175 also transmits requests and data to the server node 110 in response to server-issued commands or user actions at the client node 105.
  • the server node 110 hosts one or more application programs 140 that can be accessed by the client nodes 105.
  • applications include word processing programs such as Microsoft Word® and spreadsheet programs such as Microsoft Excel®, both manufactured by Microsoft Corporation of Redmond, Washington.
  • Other examples include financial reporting programs, customer registration programs, programs providing technical support information, customer database applications, and application set managers.
  • Another example of an application program is Internet Explorer®, manufactured by Microsoft Corporation of Redmond, Washington, and this program will be used as an exemplary application program 140 in the following discussion. It is understood that other application programs can be used.
  • the server node 110 communicates with the client node 105 over a transport mechanism.
  • the transport mechanism provides multiple virtual channels 185 through the network 115 so the server agent 150 can communicate with the client agent 175.
  • One of the virtual channels 185 provides a protocol for transmitting graphical screen data from the server node 110 to the client node 105.
  • the server 110 executes a protocol driver, in one embodiment as part of the server agent 150, that intercepts graphical display interface commands generated by the application program 140 and targeted at the server's operating system.
  • the protocol driver translates the commands into a protocol packet suitable for transmission over the transport mechanism.
  • the application program 140 in this example Internet Explorer®, executing on the server 110, retrieves a web page. As explained above, the application program 140 generates graphical display commands to the server operating system, as if it was going to display the output at the server node 110. The server agent 150 intercepts these commands and transmits them to the client agent 175. The client agent 175 issues the same or similar commands to the client operating system to generate output for the display 145 of the client node 105.
  • these graphical display commands may take the form of, for example, invocations of DirectShow®, DirectX®, or Windows graphic device interface (GDI) functionality.
  • a web page has both textual elements (e.g., titles, text, and ASCII characters) and non-textual elements (e.g., images, photos, icons, and splash screens) incorporated therein.
  • the non-textual elements are sometimes transmitted to the Internet Explorer® application program 140 from a web server (not shown) in a compressed data format (e.g., a file or a data stream), also referred to as the non-textual element's native format. Examples of compressed formats are JPEG, GIF, and PNG.
  • the non-textual element represented in a compressed data format may be, for example, 20 kilobytes in size. That same non-textual element decompressed into its bitmap representation is, for example, 300 kilobytes in size.
  • the application program 140 when generating the display of the web page, retrieves, for example, a JPEG data format of a non-textual element and decompresses the JPEG data format into a bitmap for display.
  • the output filter module 155 determines that the bitmap representation is from a compressed format and obtains the corresponding compressed format of the non-textual element from the image store 160, as explained in more detail below.
  • the image store 160 is persistent storage. In other embodiments, the image store 160 is temporary storage, cache, volatile memory and/or a combination of temporary and persistent storage.
  • the server agent 150 replaces the bitmap representation of the non-textual element with the compressed non-textual element that the output filter module 155 retrieved from the image store 160.
  • the server agent 150 transmits the non-textual element in the compressed format, along with the graphical display interface commands associated with the bitmap representation, to the client node 105.
  • the server agent 150 uses a unique protocol command that identifies a transmission of a non-textual element that is not in bitmap representation, even though the associated commands are applicable to a bitmap representation of a non-textual element.
  • the protocol command can have a modifier comment, or a command switch.
  • the command can also use a change of context or a combination of multiple commands.
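One way such a unique protocol command might be framed is sketched below. The command codes and field layout are purely hypothetical, since the document does not define the actual ICA or RDP wire format; the point is only that a distinct command (or command modifier) lets the client agent recognize a non-textual element transmitted in its native compressed format.

```python
import struct

# Hypothetical command codes -- the real protocol's numbering
# is not specified in this document.
CMD_BITMAP = 0x01
CMD_COMPRESSED_IMAGE = 0x02   # "native format" variant of the bitmap command

def build_packet(command, format_tag, payload):
    """Frame a presentation-layer packet: a 1-byte command, a 4-byte
    format tag (e.g. b'JPEG'), a 4-byte payload length in network
    byte order, then the payload itself."""
    return struct.pack("!B4sI", command, format_tag, len(payload)) + payload
```

On receipt, the client agent would dispatch on the command byte: `CMD_COMPRESSED_IMAGE` routes the payload through a decompression library before the associated display commands are applied.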
  • the server agent 150 may retrieve the uncompressed representation of the non-textual element and compress it, for example using 2DRLE compression, so as to provide a compressed version of the non-textual element.
  • the server agent 150 may also review the data in the uncompressed representation, select an appropriate compression algorithm for application to the uncompressed representation, apply the selected algorithm to the uncompressed representation, and provide the compressed representation to the client agent 175 for display. If the non-textual element was originally compressed, the server agent 150 may choose to recompress the image using the same or a different compression technique.
  • the algorithms available to the server agent 150 for the compression of the uncompressed representation include lossless compression algorithms and lossy compression algorithms.
  • Lossless compression algorithms reduce the size of the uncompressed representation without the loss of information contained in the representation, e.g., 2DRLE compression.
  • Lossy compression algorithms reduce the size of the uncompressed representation in such a way that information contained in the uncompressed representation is discarded, e.g., JPEG or JPEG2000 compression.
  • the server agent 150 selects a compression algorithm that is appropriate for compressing the uncompressed representation using, at least in part, the contents of the uncompressed representation. In one embodiment, the server agent 150 determines that the uncompressed representation is a continuous tone image, such as a photographic image, and applies a lossy compression algorithm to the uncompressed representation. In another embodiment, the server agent 150 determines that the uncompressed representation contains large areas of the same color, e.g., a computer-generated image, and applies a lossless compression algorithm.
  • the number of colors contained in the pixels of the uncompressed representation is enumerated and when the counted number of colors exceeds a predetermined threshold value (e.g., 256 colors), a lossy compression algorithm is applied to the uncompressed representation.
  • the server agent 150 compresses the uncompressed representation using a lossless compression algorithm and compares the size of the compressed result to a predetermined value. When the size of the compressed result exceeds the predetermined value, the uncompressed representation is compressed using a lossy compression algorithm. If the size of the result of the lossy compression is less than a predetermined percentage of the size of the result of the lossless compression, then the lossy compression algorithm is selected.
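The color-counting heuristic can be sketched as follows. The function name and the flat sequence of (R, G, B) tuples are assumptions; the 256-color threshold is the illustrative value from the embodiments above, and the size-comparison fallback described in the last embodiment is omitted for brevity.

```python
def choose_compression(pixels, color_threshold=256):
    """Pick a compression strategy from image content.

    `pixels` is a flat sequence of (R, G, B) tuples. Many distinct
    colours suggests a continuous-tone (photographic) image, which
    tolerates lossy compression; few colours suggests a computer-
    generated image better served by lossless compression.
    """
    distinct = len(set(pixels))
    if distinct > color_threshold:
        return "lossy"      # e.g. JPEG or JPEG2000
    return "lossless"       # e.g. 2DRLE run-length coding
```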
  • the client agent 175 receives the transmission of the non-textual element file in the compressed data format, along with the graphical display interface commands associated with the bitmap representation of the non-textual element.
  • the client agent 175 determines that the non-textual element is in the compressed data format and not the bitmap representation. In one embodiment, the client agent 175 makes this determination because the non-textual element in compressed format is transmitted using a unique protocol command. In another embodiment, the size of the non-textual element data and/or other characteristics about the non-textual element included with the associated graphical display interface commands are enough to enable the client agent 175 to make the determination.
  • the client agent 175 determines whether the client node 105 contains the necessary library 180 to decompress the compressed format of the non-textual element. If the client node 105 has the appropriate graphics library (or libraries) 180 installed to perform the decompression algorithms, the client agent 175 uses the library 180 to decompress the compressed format of the non-textual element into its bitmap representation. The client agent 175 performs the received associated graphical display interface commands on the bitmap representation to generate the non-textual element of the output of the application program 140 on the client display 145.
  • In one embodiment, the client agent 175 does not contain all the decompression algorithms to decompress the non-textual element from a compressed format into a bitmap representation.
  • the client agent 175 requests the needed graphics library from the server node 110. In another embodiment, the client agent 175 determines if a predetermined set of the most widely used graphics libraries 180 are installed on the client node 105 prior to receiving any non-textual elements from the server node 110. If the most widely used graphics libraries 180 are not installed on the client node 105, the client agent 175 requests the missing libraries from the server node 110 prior to receiving any non-textual elements from the server node 110.
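The library-negotiation step can be sketched like this, with `ensure_decompressors` and the `request_library` callback as hypothetical stand-ins for the client agent's request to the server node.

```python
def ensure_decompressors(installed, widely_used, request_library):
    """Check a predetermined set of widely used graphics libraries and
    request any that are missing from the server, as the client agent
    does before accepting compressed non-textual elements."""
    for fmt in widely_used:
        if fmt not in installed:
            request_library(fmt)   # fetch the missing library from the server
            installed.add(fmt)
    return installed
```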
  • the client agent 175 determines which graphics libraries 180 the client node 105 includes and transmits that library information to the server agent 150.
  • the server agent 150 determines, using the transmitted library information, whether the client node 105 can render the compressed data format. If the server agent 150 determines that the client node 105 has the applicable library, the server agent 150 substitutes the compressed data format for the bitmap representation of the non-textual element. If the server agent 150 determines that the client node 105 does not have the applicable library, the server agent 150 does not substitute the compressed data format for the bitmap representation of the non-textual element and instead transmits the bitmap representation to the client 105.
  • the output filter module 155 determines that the bitmap representation is from a compressed format contained in the image store 160.
  • the output filter module 155 may make this determination if the source of the media stream is exposed in some manner by the document (such as, for example, by a file type identifier or by an application program specifically configured to expose this information).
  • the output filter module 155 may be provided with information regarding the source application associated with the media stream by the multimedia subsystem. In still other embodiments, the output filter module 155 calculates one or more check values for a bitmap representation.
  • the output filter module 155 can calculate a single check value for the entire bitmap representation and/or the output filter module 155 can calculate four check values, one for each quadrant of the bitmap representation. In another example, the output filter module 155 can calculate N check values, one for each of the N lines in the bitmap representation.
  • a check value is the result of an algorithm that generates a substantially unique value for different arrangements of data.
  • the check value is, for example, a checktag, a Cyclic Redundancy Code ("CRC"), a checksum, or a result of a hashing function.
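The whole-image, per-quadrant, and per-line check value schemes described above can be sketched as follows. This is a minimal illustration using CRC32 as the check algorithm; the function name and the bitmap representation (a list of equal-length byte rows) are assumptions, not the patent's implementation:

```python
import zlib

def check_values(bitmap, granularity="full"):
    """Compute check values for a bitmap held as a list of equal-length
    byte rows, at one of the granularities described above."""
    rows = [bytes(r) for r in bitmap]
    if granularity == "full":
        # A single check value for the entire bitmap representation.
        return [zlib.crc32(b"".join(rows))]
    if granularity == "line":
        # N check values, one for each of the N lines.
        return [zlib.crc32(r) for r in rows]
    if granularity == "quadrant":
        # Four check values, one for each quadrant of the bitmap.
        h, w = len(rows), len(rows[0])
        vals = []
        for top, bot in [(0, h // 2), (h // 2, h)]:
            for left, right in [(0, w // 2), (w // 2, w)]:
                quad = b"".join(r[left:right] for r in rows[top:bot])
                vals.append(zlib.crc32(quad))
        return vals
    raise ValueError(granularity)
```

Identical bitmap contents always produce identical check values, which is what makes the image-store lookup described below possible.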
  • the check value is based on the bitmap representation and not the data as arranged in a compressed data format. However, when the compressed data format is stored in the image store 160, it is stored with a check value attribute that corresponds to the one or more check values of the bitmap representation of the compressed data when decompressed.
  • the check value is a checktag that includes a fixed identifier and a unique identifier.
  • the fixed identifier and the unique identifier are combined together and concealed within an image.
  • the fixed identifier is used to identify the checktag as such; the unique identifier is used to identify a specific image.
  • the fixed identifier is, for example, a globally unique identifier that is statistically unlikely to be found within an image.
  • the fixed identifier is a byte sequence that is easily recognizable during debugging and that has a balanced number of zero and one bits.
  • the unique identifier is a sequential identifier uniquely allocated for each image in the cache. The sequential unique identifier is XOR masked with another value so that image identifiers with small values (the most likely values) are more likely to have a balanced number of zero and one bits.
  • the checktag is encoded into RGB color components, independently of whether the RGB components are part of the image or part of the color palette. More specifically, the checktag is treated as a stream of 160 bits (i.e., 20 separate bytes, each of which starts at bit 0, the least significant, and finishes at bit 7, the most significant bit). The least significant bit of each byte is overwritten by the next bit of the checktag. The other 7 bits of each byte remain unaltered.
  • a checktag is decoded by simply reversing the encoding procedure. After the checktag is decoded, the fixed identifier and the unique identifier are retrieved from the checktag. The retrieved fixed identifier is validated against a previously stored fixed identifier to identify the checktag as such. Where a match is found, the unique identifier is then used to retrieve information that is relevant to the identified image, such as the bitmap data associated with the image.
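The encoding and decoding just described amount to least-significant-bit concealment over 160 consecutive color bytes. The sketch below illustrates the scheme; the particular fixed identifier value, the XOR mask, and the byte layout (a 16-byte fixed identifier followed by a 4-byte unique identifier, totaling the 20 bytes described above) are illustrative assumptions, not values from the patent:

```python
# Hypothetical 16-byte fixed identifier; a real implementation would use a
# globally unique value statistically unlikely to occur within an image.
FIXED_ID = b"\xa5" * 16
XOR_MASK = 0x5A5A5A5A  # assumed mask to balance zero/one bits of small ids

def encode_checktag(color_bytes, unique_id):
    """Conceal a 160-bit checktag in the LSBs of 160 color bytes."""
    tag = FIXED_ID + ((unique_id ^ XOR_MASK) & 0xFFFFFFFF).to_bytes(4, "little")
    out = bytearray(color_bytes)
    for i in range(160):
        # Each checktag byte is read starting at bit 0 (least significant)
        # and finishing at bit 7 (most significant).
        bit = (tag[i // 8] >> (i % 8)) & 1
        out[i] = (out[i] & 0xFE) | bit  # overwrite LSB; other 7 bits unaltered
    return bytes(out)

def decode_checktag(color_bytes):
    """Reverse the encoding; return the unique identifier, or None if the
    retrieved fixed identifier does not validate."""
    tag = bytearray(20)
    for i in range(160):
        tag[i // 8] |= (color_bytes[i] & 1) << (i % 8)
    if bytes(tag[:16]) != FIXED_ID:
        return None  # no checktag concealed in this image
    return int.from_bytes(tag[16:], "little") ^ XOR_MASK
```

Because only the least significant bit of each color byte is touched, the visual impact on the image is minimal.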
  • the output filter module 155 searches the image store 160 for a non-textual element in compressed data format that has a check value attribute that is the same as one or more check values the output filter module 155 calculates for the bitmap representation.
  • the output filter module 155 retrieves the compressed format of the non-textual element with the same check value attribute as the one or more check values and sends the compressed format of the non-textual element to the server agent 150 for transmittal to the client agent 175 in place of the bitmap representation.
  • the server node 110 stores compressed formats of non-textual elements in the image store 160 the first time the application program 140 calls a graphics library (not shown) to create a bitmap representation from a compressed format file.
  • the output filter module 155 calculates the associated check value of the bitmap representation as the application program 140 decompresses the compressed format and generates the bitmap representation. As described above, the output filter module 155 can calculate the check value when the bitmap representation is complete, when a quadrant of the bitmap representation is complete, or when a line of the bitmap representation is complete.
  • the server 110 stores the compressed format file and the associated check value attribute in the image store 160 and retrieves the compressed format file the first and any subsequent times the application program 140 generates the associated nontextual element.
  • Whether the server 110 stores the compressed format file and its associated check value attribute(s) in the image store 160 in a temporary portion (e.g., RAM memory buffer or cache) or a persistent portion (e.g., disk or non-volatile memory buffer) is based at least in part on design and hardware limitations (e.g., the size of the persistent storage).
  • One exemplary criterion used to make that determination is the number of times the application program 140 generates the non-textual element. For example, if the application program 140 generates a particular non-textual element more than a predetermined number of times, the server 110 stores the compressed format file and its associated check value attribute(s) corresponding to that particular non-textual element persistently in the image store 160.
  • the image store 160 may apply eviction algorithms (e.g., a least-recently-used [LRU] algorithm) to remove data from the image store 160.
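The interaction between the temporary portion (with LRU eviction) and the persistent portion (promotion once a generation-count criterion is met) can be sketched as follows. The class, its parameters, and the promotion threshold are hypothetical stand-ins for the image store 160 design choices discussed above:

```python
from collections import OrderedDict

class ImageStore:
    """Sketch of a two-part image store: a bounded temporary portion with
    least-recently-used eviction, and a persistent portion to which an
    entry is promoted once it has been generated enough times."""

    def __init__(self, temp_capacity=4, promote_after=3):
        self.temp = OrderedDict()   # check value -> (compressed data, hits)
        self.persistent = {}
        self.temp_capacity = temp_capacity
        self.promote_after = promote_after

    def put(self, check_value, compressed):
        if check_value in self.persistent:
            return
        data, hits = self.temp.pop(check_value, (compressed, 0))
        hits += 1
        if hits >= self.promote_after:
            # Criterion met: store persistently (e.g., on disk).
            self.persistent[check_value] = data
        else:
            self.temp[check_value] = (data, hits)   # most recent at the end
            if len(self.temp) > self.temp_capacity:
                self.temp.popitem(last=False)       # evict least recently used

    def get(self, check_value):
        if check_value in self.persistent:
            return self.persistent[check_value]
        if check_value in self.temp:
            data, hits = self.temp.pop(check_value)
            self.temp[check_value] = (data, hits)   # refresh LRU position
            return data
        return None
```

Entries in the persistent portion are never subject to the LRU eviction applied to the temporary portion.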
  • the server 110 stores the non-textual element if it is static or complex. For example, if the application program 140 always generates a splash screen at initialization, the server 110 stores the compressed format file corresponding to that splash screen in the persistent portion of the image store 160. In another embodiment, if the non-textual element is complex, static and/or generated repeatedly but does not have a corresponding compressed format file, the output filter module 155 generates a compressed format file for that non-textual element, in a standards-based or proprietary-based format. In any subsequent transmissions, the server agent 150 transmits the generated compressed format file in place of the bitmap representation.
  • the server agent 150 determines whether the client node 105 includes the applicable proprietary-based graphics library to decompress the compressed format file into a bitmap representation. If not included on the client node 105, the server agent 150 transmits the applicable library to the client node 105 for installation.
  • although the illustrated embodiment depicts the image store 160 on the server node 110, in other embodiments at least a portion of the image store is on the client node 105.
  • the output filter module 155 calculates the one or more check values of the bitmap representation and transmits the one or more check values to the server agent 150.
  • the server agent 150 transmits these one or more check values to the client agent 175.
  • the client agent 175 searches the portion of the image store on the client node 105 for a compressed data format stored with an identical one or more check values attribute.
  • the client agent 175 transmits the results of this search to the server agent 150.
  • the server-side and client-side image stores 160 may persist after the termination of a session between a client and a server.
  • the client agent 175 communicates to the server agent 150 information about the contents of its image store 160 and, after comparison with the server's image store 160, the server agent 150 may use the non-textual elements in the image stores 160, or their strips (as discussed below), to reduce the amount of bandwidth required to create graphical and media displays at the client.
  • the server agent 150 does not have to send either the compressed data format or the bitmap representation over the network 115.
  • the server agent 150 only transmits the graphical display interface commands associated with the bitmap representation of the non-textual element. If the compressed data format for the non-textual element does not exist on the client node 105, the output filter module 155 obtains the corresponding compressed format of the non-textual element from the image store 160.
  • the server agent 150 replaces the bitmap representation of the non-textual element with the nontextual element in the compressed data format that the output filter module 155 retrieved from the image store 160.
  • the server agent 150 transmits the non-textual element in the compressed format, along with the graphical display interface commands associated with the bitmap representation, to the client node 105.
  • the presence of the image store 160 on the server 110 and/or the client node 105 permits the caching of non-textual elements, reducing the bandwidth required to generate graphical and media displays at a client.
  • these non-textual elements may be subdivided into sub-regions, i.e., "strips," and the image store 160 will subsequently provide non-textual element caching, as discussed above, with strip-level granularity.
  • a web page may comprise multiple images, only some of which are visible, either fully or partially. For images that are only partially shown, it is possible to send the full image to the client, including the part that is obscured by other windows.
  • the image store 160 contains the graphical data forming the strip and, optionally, strip-related metadata such as the image strip height, the image strip width, and an image strip identifier, such as a cyclic redundancy check (CRC) checksum.
  • each non-textual element used in a graphical rendering operation is divided into the aforementioned strips.
  • these strips have the same width as the non-textual element and a height that is less than or equal to the height of the non-textual element.
  • the server agent 150 maintains a database of strips that have previously been transmitted to the client for rendering. If subsequent rendering operations incorporate regions contained in a stored strip, the presence of the stored strip is identified in the server-side image store 160 and an appropriate identifier associated with the stored strip is retrieved. A message instructing the client agent 175 to render the stored strip associated with the identifier is transmitted to the client agent 175, requiring less bandwidth than a message including the contents of the strip itself. The client agent 175 retrieves its copy of the stored strip associated with the transmitted identifier from its image store 160' and displays the contents of the stored strip.
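The strip-level caching flow above can be sketched as follows. The function names and message tuples are illustrative assumptions; CRC32 stands in for the strip identifier, and a plain dictionary stands in for the server's record of strips already held by the client:

```python
import zlib

def make_strips(rows, strip_height):
    """Divide a bitmap (a list of equal-width byte rows) into strips that
    span the full width, each at most strip_height rows tall."""
    strips = []
    for top in range(0, len(rows), strip_height):
        data = b"".join(rows[top:top + strip_height])
        strips.append((zlib.crc32(data), data))  # (identifier, contents)
    return strips

def messages_for(rows, strip_height, client_store):
    """Build the per-strip messages the server agent would send: the bare
    identifier when the client already caches the strip, otherwise the
    strip contents (which the client then adds to its image store)."""
    msgs = []
    for ident, data in make_strips(rows, strip_height):
        if ident in client_store:
            msgs.append(("render_cached", ident))       # small message
        else:
            client_store[ident] = data
            msgs.append(("render_new", ident, data))    # full contents
    return msgs
```

A second rendering of the same non-textual element produces only identifier messages, which is the bandwidth saving the strip-level granularity provides.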
  • the server agent 150 identifies the common strips and instructs the client agent 175 to display the strip from the client's own image store 160, avoiding the retransmission of the strip for display.
  • the server agent 150 will detect the modification to the strip and retransmit the strip to the client agent 175 for rendering and/or incorporation in the client-side image store 160'.
  • FIG. 2 illustrates an exemplary embodiment of a process 200 to generate a display for a remote terminal session, using the exemplary embodiment of FIG. 1.
  • the output filter module 155 monitors the output of the application program 140 by monitoring calls made to the operating system of the server node 110. When the output filter module 155 detects (step 205) a display command from the application program 140, the output module 155 determines (step 210) whether the application program 140 is generating a bitmap representation of a non-textual element.
  • the output filter module 155 transmits (step 215) the display command to the server agent 150, which transmits that command, or a representative command defined in the protocol, to the client agent 175. If the application program 140 is generating a bitmap representation of a non-textual element, the output filter module 155 calculates (step 220) one or more check values corresponding to the bitmap representation of the non-textual image.
  • using the one or more calculated check value(s), the output filter module 155 searches the image store 160 to determine (step 225) whether a compressed data format with identical check value attribute(s) exists. If there is a compressed data format in the image store 160 with check value attribute(s) identical to the one or more check values the output filter module 155 calculates, the output module 155 replaces (step 230) the bitmap representation of the non-textual element with the compressed data format. The output module 155 transmits (step 230) the compressed data format to the server agent 150 for eventual transmission to the client agent 175. The output module 155 also transmits all of the commands associated with the replaced bitmap representation along with the compressed data format.
  • the output module 155 determines (step 235) whether the bitmap representation of a non-textual element corresponding to the compressed data format meets a predetermined criterion for persistent storage (e.g., any of the criteria described above). If the output module 155 determines (step 235) that the predetermined criterion is met, the output module 155 stores (step 240) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the persistent portion of the image store 160.
  • if the output module 155 determines (step 235) that the predetermined criterion is not met, the output module 155 stores (step 245) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the temporary portion of the image store 160.
  • after the output module 155 stores (step 240 or 245) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the image store 160, the output module 155 replaces (step 230) the bitmap representation of the non-textual element with the compressed data format.
  • the output module 155 transmits (step 230) the compressed data format to the server agent 150 for eventual transmission to the client agent 175.
  • the output module 155 continues monitoring the output generated by the application program 140 until the output module 155 detects (step 205) another display command from the application program 140.
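The loop of process 200 can be condensed into the following sketch. The dictionaries and callback names are hypothetical stand-ins for the modules in FIG. 1, and CRC32 stands in for the check value calculation of step 220:

```python
import zlib

def on_display_command(cmd, temp_store, persist_store, send, persist_ok):
    """One pass of process 200: cmd stands in for a display command from
    the application program; send forwards data toward the client agent."""
    if not cmd.get("is_bitmap"):                       # step 210
        send(cmd)                                      # step 215
        return
    check = zlib.crc32(cmd["bitmap"])                  # step 220
    compressed = persist_store.get(check) or temp_store.get(check)  # step 225
    if compressed is None:
        compressed = cmd["compressed"]
        if persist_ok(cmd):                            # step 235
            persist_store[check] = compressed          # step 240
        else:
            temp_store[check] = compressed             # step 245
    # Step 230: transmit the compressed data format together with the
    # commands associated with the replaced bitmap representation.
    send({"commands": cmd["commands"], "compressed": compressed})
```

On a second occurrence of the same bitmap, the lookup at step 225 succeeds and the compressed format is substituted without re-storing it.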
  • the invention pertains to methods, systems, and articles of manufacture for generating a media presentation.
  • a compressed data set representing at least a portion of a media stream
  • a decompressed data set representing at least a portion of a media stream
  • a decompressed data set is intercepted and compressed on the first computing device and then transmitted, as above, to the second computing device, where it is decompressed and presented to the user.
  • FIG. 3 illustrates one embodiment of a system 300 that generates a media presentation according to this aspect of the invention.
  • the system 300 includes a first computing device, e.g., a server 310, in communication with a second computing device, e.g., a client 305, over a network 315.
  • the client 305, the server 310, and the network 315 have the same capabilities as the client 105, the server 110, and the network 115, respectively, described above.
  • the client 305 includes at least a client transceiver 330, a client agent 375, and a presentation interface 345.
  • the client agent 375 may be implemented as a software program and/or as a hardware device, such as, for example, an ASIC or an FPGA.
  • the client agent 375 uses the client transceiver 330 to communicate over the network 315 and generates a presentation having media and non-media components at the presentation interface 345.
  • the server 310 is an application server. As illustrated, the server 310 includes at least a server transceiver 335, an application program 340, a server agent 350, a first output filter module 355A, and a second output filter module 355B.
  • the server agent 350, the first output filter module 355A, and the second output filter module 355B may be implemented as a software program and/or as a hardware device, such as, for example, an ASIC or an FPGA.
  • the server agent 350, the first output filter module 355A, and the second output filter module 355B use the server transceiver 335 to communicate over the network 315.
  • the aforementioned components 335, 340, 350, 355A, and 355B are distributed and/or duplicated over several servers in communication with each other over the network 315, or over another network (not shown). This permits, for example, the transmission of media-related capability information, media streams, and control information to multiple clients and through multiple hops.
  • two or more of the aforementioned components 335, 350, 355A, and 355B may be combined into a single component, such that the functions, as described below, performed by two or more of the components 335, 350, 355A, and 355B are performed by the single component.
  • the aforementioned components 340, 355A, and 355B are duplicated at the same server 310. This permits, for example, the simultaneous transmission of media-related capability information, media streams, and control information, pertaining to different/multiple applications 340, and different/multiple media streams within each application.
  • the different applications 340 may be identified as major stream contexts, and the different media streams as minor contexts within a specific major context.
  • the application program 340 illustrated in FIG. 3 is any application program 340 that renders, as part of its output, a media stream.
  • the media stream may be a video stream, an audio stream, or, alternatively, a combination of any number of instances thereof.
  • the application program 340 may output non-media graphical information.
  • non-media graphical information refers generally to all graphical information outputted by the application program 340 without the use of a codec or the equivalent, such as, for example, static graphical information, including, but not limited to, toolbars and drop-down menus.
  • Non-media graphical information also includes, for example, information for locating the static graphical information on a display screen.
  • the application program 340 may be, for example, the MICROSOFT ENCARTA application program manufactured by the Microsoft Corporation of Redmond, Washington. In one embodiment, the application program 340 uses external codecs, such as, for example, codecs installed in the operating system of the server 310, to decompress a compressed data set representing at least a portion of a media stream. In another embodiment, the codecs used by the application program 340 are embedded in the application program 340 itself.
  • the server 310 may include any number of executing application programs 340, some of which use external codecs, others of which use embedded codecs.
  • when the application program 340 uses external codecs and desires to output a media stream, it requests that the operating system of the server 310 use the external codecs to decompress the compressed data set representing at least a portion of the media stream for subsequent display.
  • where the codecs used by the application program 340 are embedded in the application program 340 itself, the application program 340, when desiring to output a media stream, uses the embedded codecs to decompress the compressed data set itself for subsequent display. Additionally, the application program 340 may generate and transmit graphical display commands, associated with the non-media graphical information, to the operating system of the server 310.
  • these graphical display commands may take the form of, for example, invocations of DirectShow®, DirectX®, or GDI functionality.
  • the application program 340 performs these tasks as if the application program 340 was going to generate a presentation having media and non-media components at the server 310.
  • the first output filter module 355A, the second output filter module 355B, and the server agent 350 intercept the compressed data set being passed to the external codecs, the decompressed data set generated by the embedded codecs, and the graphical display commands associated with the non-media graphical information, respectively, and (after first compressing the decompressed data set generated by the embedded codecs) transmit them, over the network 315, to the client agent 375.
  • the client agent 375 decompresses the received compressed data sets and issues the same or similar graphical display commands, associated with the non-media graphical information, to the operating system of the client 305 to generate a presentation having media and non-media components at the presentation interface 345 of the client 305.
  • the communications between the server agent 350 and the client agent 375 include information concerning media-related capabilities (e.g., buffer and media type properties, media stream priority, etc.) or control information (e.g., play, pause, flush, end-of-stream, and stop commands, parent window assignment, window positioning and clipping commands, scrolling, audio adjustment commands, volume, balance, etc.).
  • the first output filter module 355A and the second output filter module 355B are invoked as an application program 340 using external codecs attempts to invoke an external codec to output a media stream.
  • the first output filter module 355A intercepts an original compressed data set representing at least a portion of the media stream. Instead of decompressing the data set, as an external codec would, the first output filter module 355A transmits the original compressed data set over the network 315 to the client agent 375.
  • the second output filter module 355B, acting as an OS-level renderer, captures stream-independent (generic) control commands (e.g., play, pause, stop, flush, end-of-stream) and transmits the information over the network 315 to the client agent 375.
  • the second output filter module 355B also captures information for locating images of the video stream (e.g., parent window assignment, positioning and clipping commands) on a display screen and transmits the information over the network 315 to the client agent 375.
  • the second output filter module 355B may also capture information for audio adjustment (e.g., volume and balance adjustment commands) and transmit the information over the network 315 to the client agent 375.
  • when an application program 340 that uses embedded codecs attempts to invoke, for example, an OS-level renderer to output a media stream, the second output filter module 355B is invoked.
  • the second output filter module 355B intercepts a first decompressed data set representing at least a portion of the media stream from the output of the application program 340.
  • the second output filter module 355B then compresses, as explained below, the intercepted first decompressed data set and transmits the resulting compressed data set, over the network 315, to the client agent 375.
  • the second output filter module 355B, as above, also captures, where the media stream includes a video stream, information for locating images of the video stream on a display screen and transmits the information over the network 315 to the client agent 375.
  • the server agent 350 may implement this interception functionality by providing a DirectShow® Transform filter with a filter merit level that exceeds that of the other filters installed on the server 310.
  • the WINDOWS family of operating systems allows multiple filters to be installed that are capable of handling particular types of media encoding.
  • the merit level assigned to a filter determines the priority given to a filter; filters having a higher merit level, i.e., priority, are selected to handle media types before filters with lower merit levels.
  • Support for various media types may be implemented by associating the Transform filter with the desired media types through changes to the system registry.
  • the Transform filters may be associated with specific Renderer filters.
  • this framework is extensible and permits future support for new types of media programming.
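The merit-based selection described above, in which the filter with the highest merit registered for a media type is chosen first, can be sketched as follows. The data shapes (dictionaries with `merit` and `media_types` fields) are illustrative assumptions rather than the actual registry layout:

```python
def select_filter(filters, media_type):
    """Pick the filter that handles media_type with the highest merit,
    mirroring how the operating system chooses among installed filters."""
    candidates = [f for f in filters if media_type in f["media_types"]]
    return max(candidates, key=lambda f: f["merit"], default=None)
```

By registering its Transform filter with a merit level exceeding that of the natively installed filters, the server agent 350 ensures its filter is selected to handle the media stream before any other.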
  • the server agent 350 may implement its client communications functionality using modified DirectShow® Transform and Renderer filters. Instead of transforming (decompressing) received media data, the modified Transform filters transmit the data to the client agent 375. Instead of rendering transformed (decompressed) media data, the Renderer filters transmit both stream-generic (e.g., play, pause, stop, flush, end-of-stream) and stream-specific (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment) commands.
  • unmodified Transform and rendering filters at the client 305 transform (decompress) and render the media data for viewing, hearing, etc., by a user.
  • the Renderer filters also receive stream-specific control data.
  • the server agent 350 maintains a separate media queue for each media stream.
  • each media stream may be prioritized, permitting, e.g., the transmission of real-time video conferencing information while transparently slowing or halting the transmission of still images from a web browser.
  • Stream prioritization may be accomplished by providing separate queues for different types of media, each stream having an assigned priority.
  • Network bandwidth may be determined in any one of a number of ways known in the art such as, for example, "pinging" the expected target server and measuring response time. For embodiments in which the media stream includes embedded timing information, that information may be used to determine if media data is not transmitted fast enough, i.e., that the bandwidth of the channel cannot support the transmission.
  • the media queues corresponding to the same application program 340 might have different priorities based on the media type (video, audio, MIDI, text, etc.). If network bandwidth between the server agent 350 and the client agent 375 is insufficient to accommodate all media streams, samples may be dropped, that is, deleted, from lower priority queues so that higher priority queues may be serviced without interruption. For example, video transmission might be interrupted and appear in a "slide-mode," while audio is still uninterrupted.
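The per-stream priority queues and the sample-dropping behavior just described can be sketched as follows. The class and method names are hypothetical; a larger priority number here means a more important stream:

```python
from collections import deque

class MediaQueues:
    """Sketch of per-stream queues with priorities: when queued samples
    exceed what the link can carry, samples are dropped from the
    lowest-priority queues first."""

    def __init__(self):
        self.queues = {}  # stream id -> (priority, deque of samples)

    def add_stream(self, stream_id, priority):
        self.queues[stream_id] = (priority, deque())

    def enqueue(self, stream_id, sample):
        self.queues[stream_id][1].append(sample)

    def shed(self, budget):
        """Drop samples from lowest-priority queues until the total
        queued sample count fits within the given budget."""
        total = sum(len(q) for _, q in self.queues.values())
        for prio, q in sorted(self.queues.values(), key=lambda t: t[0]):
            while total > budget and q:
                q.popleft()  # dropped; e.g., video falls into "slide-mode"
                total -= 1

    def next_sample(self):
        """Service the highest-priority non-empty queue first."""
        for prio, q in sorted(self.queues.values(),
                              key=lambda t: t[0], reverse=True):
            if q:
                return q.popleft()
        return None
```

With audio queued at a higher priority than video, shedding under bandwidth pressure interrupts the video while the audio continues uninterrupted, matching the "slide-mode" example above.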
  • control information is associated with its own control queue, permitting the prioritization of the transmission of control data ahead of any media data.
  • the transmission of control information from the server agent 350 to the client agent 375 permits control of the client rendering filters' scheduler. Latency in initial playback may be reduced by pre-filling the media queue (not shown) of the client agent 375 as fast as possible before playback commences. Similarly, subsequent variations in network latency may be addressed by readjusting the client agent's 375 media queue as necessary. Should the client agent 375 detect that one or more of its media queues have fallen below or have exceeded certain resource thresholds, then the client agent 375 may send a burst or a stop request respectively to the server agent 350 for the desired amount of media queue adjustment.
  • use of DirectShow® functionality leverages functionality provided by the operating system and reduces or eliminates the reimplementation of duplicative functionality.
  • the graphical or media display provided by the client agent 375 leverages operating system functionality to provide complex clipping and audio control functionality.
  • the client agent 375 may also utilize the DirectDraw® and DirectSound® capabilities provided by the operating system, reducing required CPU resources and improving overall performance.
  • the server agent 350 intercepts and transmits to the client agent 375, over the network 315, the graphical display commands associated with the non-media graphical information that are output from the application program 340.
  • the first output filter module 355A, the second output filter module 355B, or both (where the application program 340 uses external codecs), or the second output filter module 355B (where the application program 340 uses embedded codecs), captures timing information associated with the media stream and transmits the timing information, over the network 315, to the client agent 375. More specifically, the output filter module 355A, 355B captures, and transmits to the client agent 375, presentation times for each frame of the media stream, thereby enabling the client agent 375 to synchronize video and audio streams and to maintain the correct frame rate.
  • the server agent 350 interfaces with the server transceiver 335 and the application program 340.
  • the server agent 350 receives from the client agent 375, over the network 315, a list of media formats supported by the client agent 375.
  • the server agent 350 registers the output filter modules 355A, 355B by manipulating the configuration of the server 310.
  • the server agent 350 registers the output filter modules 355A, 355B by editing the registry of the server 310.
  • the server agent 350 then informs the client agent 375 that the server 310 can handle all such media formats.
  • the client agent 375 and server agent 350 negotiate supported formats and capabilities on an as-needed basis. For example, the client agent 375 would defer informing server agent 350 of its support for JPEG2000 format media until server agent 350 specifically requests the creation of a JPEG2000 display by the client agent 375.
  • when an application program 340 loads a first output filter module 355A while attempting to render a media stream, the filter module 355A communicates with the server agent 350, which in turn sends a request to the client agent 375 to create the media stream by providing the media stream properties: allocator, media type, media stream priority, etc.
  • the client agent 375 responds to the server agent 350's request to create the media stream by indicating the success or failure of the request. The success or failure of the request depends on the availability of the respective filter modules capable of decompressing the media stream, sufficient memory, etc., at the client 305.
  • the application program 340 loads the second output filter module 355B and connects it to the first output filter module 355A, thereby providing efficient streaming of compressed media.
  • the application program 340 unloads the first output filter module 355A and loads the native Microsoft or third party-supplied output filter modules, which decompress and render the stream at the server 310.
  • the first output filter module 355A and the second output filter module 355B communicate directly with the server agent 350.
  • the client agent 375 interfaces with the client transceiver 330 and the presentation interface 345.
  • the client agent 375 initially informs the server agent 350 of the media formats supported by the client agent 375.
  • the client agent 375 also receives, from the second output filter module 355B over the network 315, the compressed data set and any associated timing information.
  • the client agent 375 receives over the network 315, from the second output filter module 355B, both stream-generic (e.g., play, pause, stop, flush, end-of-stream) and stream-specific (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment) commands, and receives from the server agent 350 the graphical display commands associated with the non-media graphical information.
  • using either external or embedded codecs, the client agent 375 decompresses the compressed data set and, together with the graphical display commands associated with the non-media graphical information, any information for locating images of a video stream on a display screen, and any timing information, generates a media presentation at the presentation interface 345.
  • the presentation interface 345 has, in one embodiment, a display screen that renders a graphical display, such as, for example, a video presentation. In another embodiment, the presentation interface 345 includes a speaker that renders an audio presentation.
  • the client 305 may include any number of presentation interfaces 345.
  • the information provided in specifying the media formats supported by the client agent 375 may determine the mode of operation at the server 310. If the compressed data set is in a format that is not supported by the client agent 375, the second output filter module 355B may recompress the decompressed data set into a supported format. In another embodiment, when the client agent 375 does not support the format of a stream, the application program 340 unloads the first output filter module 355A and loads the native Microsoft or third party-supplied output filter modules, which decompress and render the stream at the server 310.
  • referring to FIGS. 4A, 4B, and 4C, one embodiment of a method 400 that generates a media presentation at the client 305, using the exemplary embodiment of FIG. 3, is illustrated.
  • the client agent 375 informs the server agent 350 of all the media formats supported by the client agent 375.
  • the list of supported media formats is created by enumerating the external codecs installed on the client 305. For example, the codecs installed in the operating system of the client 305 are enumerated by the client agent 375, over the network 315, to the server agent 350.
  • the list of supported media formats is created by enumerating the codecs embedded in the client agent 375.
  • the codecs embedded in the software program are enumerated by the client agent 375, over the network 315, to the server agent 350.
  • the client agent 375 creates the list of supported media formats, and informs the server agent 350 of those supported media formats, by enumerating both the external codecs installed on the client 305 and the codecs embedded in the client agent 375.
  • the client agent 375 and server agent 350 negotiate supported formats and capabilities on an as-needed basis. For example, the client agent 375 would defer informing server agent 350 of its support for JPEG2000 format media until server agent 350 specifically requests the creation of a JPEG2000 display by the client agent 375.
  • capabilities are negotiated on a per-stream basis when the stream is created.
  • the client agent 375 responds to the server agent 350's request to create a media presentation by indicating the success or failure of the request.
  • the success or failure of the request depends on the availability of the respective codecs capable of decompressing the data, sufficient memory, etc., at the client 305.
  • the application program 340 loads the codec, thereby providing efficient display of graphical data.
  • the application program 340 loads the native Microsoft or third party-supplied codecs, which decompress and render the data at the server 310.
  • the client agent 375 generates globally unique identifiers ("GUIDs") and associates each GUID with a particular codec. The client agent 375 then transmits the list of generated GUIDs to the server agent 350 to inform the server agent 350 of the media formats supported by the client agent 375. In another embodiment, the client agent 375 transmits a list of four character codes, each four character code being associated with a particular codec, to the server agent 350 to inform the server agent 350 of the media formats supported by the client agent 375.
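The capability exchange described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the client advertises its supported media formats as four-character codes (FOURCCs), and the server records which formats it may stream. All class and function names here are assumptions made for the sketch.

```python
# Sketch of the capability negotiation between client agent and server agent:
# the client enumerates external (OS-installed) and embedded codecs, merges
# them into one FOURCC list, and the server registers the result.

def enumerate_client_codecs(installed, embedded):
    """Merge external (OS-installed) and embedded codec FOURCCs into one list."""
    return sorted(set(installed) | set(embedded))

class ServerAgent:
    def __init__(self):
        self.client_formats = set()

    def register_formats(self, fourcc_list):
        # Step 408 analogue: remember what the client can decode, so output
        # filter modules are only attached for these formats.
        self.client_formats = set(fourcc_list)
        return True  # step 412 analogue: report that all formats are handled

    def can_stream(self, fourcc):
        return fourcc in self.client_formats

client_list = enumerate_client_codecs(["MJPG", "MP42"], ["MJPG", "mpg1"])
server = ServerAgent()
acknowledged = server.register_formats(client_list)
```

In the deferred-negotiation variant described above, `register_formats` would instead be called per stream, at creation time, with only the format the server is about to request.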
  • upon receiving the list of supported media formats from the client agent 375, the server agent 350 registers, at step 408, the first output filter module 355A and/or the second output filter module 355B on the server 310, as associated with the supported media formats. The server agent 350, at step 412, then reports back to the client agent 375 that the server 310 can handle all of the enumerated media formats.
  • an application program 340 starts executing on the server 310.
  • when the application program 340 identifies within its output, at step 420, the presence of media content, such as, for example, a media stream, the first output filter module 355A, the second output filter module 355B, or both are invoked. If, at step 424, the application program 340 uses external codecs, both the first output filter module 355A and the second output filter module 355B are invoked at step 428 as the application program 340 attempts to invoke an external codec.
  • the first output filter module 355A then intercepts, at step 432, an original compressed data set representing at least a portion of the media stream and transmits, at step 436, the original compressed data set to the client agent 375, without decompressing the data set.
  • the client agent 375 receives the original compressed data set and decompresses, at step 444, the original compressed data set to generate a decompressed data set.
  • the client agent 375 uses either external codecs installed on the client 305 or codecs embedded in the client agent 375 itself to decompress the original compressed data set.
  • the second output filter module 355B is, at step 448, invoked as the application program 340 attempts to invoke an OS-level renderer to display the decompressed data set.
  • the second output filter module 355B then intercepts, at step 452, a first decompressed data set, representing at least a portion of the media stream, from the output of the application program 340 and compresses, at step 456, the intercepted first decompressed data set.
  • a variety of compression techniques, including both lossy compression techniques and lossless compression techniques, may be used by the second output filter module 355B, at step 456, to compress the media stream.
  • the intercepted first decompressed data set may be compressed, at step 456, by the second output filter module 355B using, for example, a lightweight lossy video encoding algorithm, such as, for example, MJPEG compression.
  • the second output filter module 355B may choose the desired compression ratio or it may use a predetermined compression ratio.
  • the degree of quality loss chosen by the second output filter module 355B will, typically, depend on the available bandwidth of the network connection. For example, where a user of the client 305 uses a slow modem to connect to the network 315, the second output filter module 355B may choose to use low quality video. Where, on the other hand, a user of the client 305 uses a LAN link or a broadband connection to connect to the network 315, the second output filter module 355B may choose to use a higher quality video.
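The bandwidth-dependent quality choice described above can be illustrated with a small sketch. The thresholds and quality scale are invented for this example; the patent only states that a slow modem warrants lower quality video and a LAN or broadband link warrants higher quality.

```python
# Illustrative quality selection for the second output filter module: pick a
# lossy-compression quality level (MJPEG-style, 1-100) from the estimated
# bandwidth of the client's network connection. Thresholds are assumptions.

def choose_video_quality(bandwidth_kbps):
    """Map estimated link bandwidth to a video compression quality factor."""
    if bandwidth_kbps < 56:        # slow modem connection
        return 20                  # low quality video
    elif bandwidth_kbps < 1000:    # mid-range broadband
        return 60
    else:                          # LAN link or fast broadband
        return 90                  # higher quality video
```

A predetermined compression ratio, also contemplated above, would simply replace this function with a constant.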
  • the second output filter module 355B transmits, at step 460, the compressed data set to the client agent 375 in place of the first decompressed data set.
  • the client agent 375 receives the compressed data set and decompresses, at step 468, the compressed data set to generate a second decompressed data set.
  • the client agent 375 uses either external codecs installed on the client 305 or codecs embedded in the client agent 375 itself to decompress the compressed data set.
  • the second output filter module 355B captures information for locating images of the video stream on a display screen and transmits the captured information over the network 315 to the client agent 375.
  • the client agent 375 receives the information for locating the images of the video stream on the display screen.
  • the server agent 350 intercepts and transmits, over the network 315, graphical display commands, associated with the non-media graphical information outputted by the application program 340, to the client agent 375.
  • the client agent 375 receives the graphical display commands associated with the non-media graphical information.
  • the output filter module 355A, 355B captures timing information associated with the media stream
  • the output filter module 355A, 355B transmits, at step 488, the timing information to the client agent 375.
  • the client agent 375 receives, at step 492, the timing information and generates, at step 496, the media presentation at the presentation interface 345.
  • the client agent 375 uses the timing information, the graphical display commands associated with the non-media graphical information, and, where the media stream includes a video stream, the information for locating the images of the video stream on a display screen to seamlessly combine the decompressed data set (or, more specifically, where the application program 340 uses embedded codecs, the second decompressed data set) with the non-media graphical information.
  • the client agent 375 generates, at step 496, the media presentation at the presentation interface 345 using only the decompressed data set (or, more specifically, where the application program 340 uses embedded codecs, the second decompressed data set), the graphical display commands associated with the non-media graphical information, and, where the media stream includes a video stream, the information for locating the images of the video stream on a display screen.
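The combination step at the client, described in the two bullets above, amounts to merging media samples and graphical display commands into one time-ordered presentation. The data structures below are illustrative only; the patent does not prescribe a representation.

```python
# Minimal sketch of the client-side combination step: media samples and
# non-media graphical display commands are merged into a single timeline by
# their timing information, so the media stream appears seamlessly inside
# the surrounding application graphics.

def build_presentation_timeline(media_samples, display_commands):
    """Merge (timestamp, payload) events from both sources in time order."""
    events = [(t, "media", p) for t, p in media_samples]
    events += [(t, "gdi", c) for t, c in display_commands]
    events.sort(key=lambda e: e[0])   # present strictly in timestamp order
    return events

timeline = build_presentation_timeline(
    media_samples=[(0, "frame0"), (40, "frame1")],      # e.g., video frames
    display_commands=[(10, "draw-window-border")],      # non-media graphics
)
```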
  • FIGS. 5A-5C present another embodiment of the present invention implemented on a server using an operating system selected from the MICROSOFT WINDOWS family of operating systems. The operating system at the client need not be the same operating system as the operating system employed by the server.
  • the appropriate first output filter modules 355A and second output filter modules 355B are registered with the server 310 (Step 504).
  • the identified media types are exemplary and, utilizing this architecture, future media types may be added for operation in accord with the present invention.
  • An application program 340 executes at the server (Step 508) and a media stream is identified within the output of the application program 340 (Step 512).
  • the media stream is identified when the application makes a system call to the WINDOWS media subsystem to locate suitable codecs for handling the media stream.
  • the application program 340 loads the first output filter module 355A corresponding with the major media type of the media stream identified within the application output and having the highest merit of any filter registered with the system as capable of handling this major media type (Step 516).
  • the first output filter module 355A communicates with the server agent 350 to request that the client agent 375 create a media stream as specified by transmitted properties: allocator (buffer) properties, media type (major: video, audio, etc.; minor: MPEG-1, MPEG-2, etc.; format, etc.), media stream priority for on-demand quality control, etc.
  • the server agent 350 organizes media streams in different contexts. Media streams originating from different instances of an application program 340 have different major contexts. Media streams originating from the same instance of an application program 340 have different minor contexts within the same major context.
  • the server agent 350 creates a command queue for control information for each major context, and a media samples queue for each minor context.
  • the media samples queues help the server agent 350 accommodate variations in bandwidth and network latency by, for example, allowing certain frames of video data to be dropped to maintain playback speed and to select which data from various streams to drop. For example, video data may be discarded before audio data is discarded.
  • the queues of different major contexts are serviced in a round-robin fashion.
  • control queue typically has the highest priority
  • media queues corresponding to different minor contexts might have different priorities based on the nature of the stream.
  • audio streams will have higher priority than video streams.
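The queue servicing policy described in the preceding bullets can be sketched as follows. This is a simplification: the per-context command queue is drained first, then media queues in priority order (audio above video); ties standing in for round-robin are resolved by iteration order here.

```python
# Sketch of per-context queue servicing: one command queue (highest priority)
# plus prioritized media sample queues, as described for the server agent.

from collections import deque

class ContextScheduler:
    def __init__(self):
        self.control = deque()   # command queue: always serviced first
        self.media = {}          # queue name -> (priority, deque of samples)

    def add_media_queue(self, name, priority):
        self.media[name] = (priority, deque())

    def next_item(self):
        if self.control:
            return self.control.popleft()
        # Service media queues from highest priority down; equal priorities
        # would be served round-robin in a fuller implementation.
        for name, (prio, q) in sorted(self.media.items(),
                                      key=lambda kv: -kv[1][0]):
            if q:
                return q.popleft()
        return None   # nothing pending in this context

sched = ContextScheduler()
sched.add_media_queue("audio", priority=2)   # audio above video
sched.add_media_queue("video", priority=1)
sched.control.append("play")
sched.media["video"][1].append("v0")
sched.media["audio"][1].append("a0")
```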
  • the client agent 375 attempts to create a media stream conforming to the media properties specified in the transmitted request (Step 520).
  • the client agent 375 attempts to load a generic source filter module and connect it with native or third party provided output filter modules that are capable of decompressing and rendering the media stream.
  • Client agents 375 utilizing different operating systems may undertake different actions to achieve the same result. For example, the client might load and use the services of a multimedia application capable of decompressing and rendering a media stream, such as an MPEG-1 video stream.
  • the client agent 375 sends a reply to the server agent 350, indicating whether the request to create the specified media stream was successful (Step 524).
  • the server agent 350 may, in turn, provide this information to the application program 340. If the client succeeds in creating the stream, then, using DIRECTSHOW intelligent connect logic, the application program 340 loads the second output filter module 355B and connects it to the first output filter module 355A (Step 528), thereby enabling the processing of compressed media and control information transmitted to the client agent 375.
  • the first output filter module 355A intercepts the original compressed data set representing at least a portion of the media stream and timing information, and further detects dynamic changes in the media type, such as format changes (Step 532).
  • the second output filter module 355B intercepts media stream-generic commands (e.g., play, pause, stop, flush, end-of-stream) and stream-specific commands (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment commands) (Step 536).
  • the server agent 350 transmits the original compressed data set, optionally including timing information, to the client agent 375 (Step 540).
  • the server agent 350 may drop samples from lower priority media queues to make bandwidth available for higher priority media queues. Media queues having equal priority are serviced in round robin fashion.
  • the transmission rate of the second output filter module 355B to the server agent 350 and, therefore, to the client agent 375 is controlled so that the client's media queue is prefilled as quickly as allowed by the network throughput before playback commences, reducing latency in the initial playback. Thereafter, the media queue of the client agent 375 is used to accommodate variations in network latency, as discussed below.
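The prefill behaviour described in the bullet above can be illustrated with a short sketch: the client buffers incoming samples as fast as the network delivers them, and playback commences only once a target queue depth is reached. The target depth is an assumption made for the example.

```python
# Sketch of media-queue prefill before playback: buffer samples until the
# queue reaches a target depth, then report that playback may commence,
# reducing latency in the initial playback.

def prefill_then_play(incoming_samples, target_depth):
    """Buffer samples until the queue reaches target_depth, then report ready."""
    queue = []
    for sample in incoming_samples:
        queue.append(sample)
        if len(queue) >= target_depth:
            return queue, True    # playback may commence
    return queue, False           # stream ended before prefill completed

queue, ready = prefill_then_play(range(10), target_depth=4)
```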
  • the server agent 350 also transmits control information to the client agent 375 (Step 544). Typically, control information is never discarded despite constraints on the availability of network bandwidth.
  • the client agent 375 receives the original compressed data set and optional timing information (Step 548).
  • the client agent 375 also receives the control information and any notifications of dynamic changes in the media type, e.g., format changes (Step 552).
  • the client agent 375 creates a single command queue for control information for each major context and a media samples queue for each minor context.
  • the media samples queues let the client agent 375 accommodate variations in bandwidth and network latency.
  • the control queues of the different major contexts are serviced in a round-robin fashion.
  • the media samples in the queues are utilized by the associated source filter module(s) based on the timing information.
  • the client agent sends status notifications to the server agent (Step 556). These notifications include indications of the client's success or failure in creating a requested media stream. Variations in bandwidth or network latency may be addressed by adjusting the media queue of the client agent 375 as necessary. Should the client agent 375 detect that one or more of its media queues have fallen below or have exceeded certain resource thresholds, then the client agent 375 may respectively send a burst or a stop request to the server agent 350 for the desired amount of media queue adjustment. The client agent 375 may also send status information concerning unexpected errors encountered in the generation of the media presentation.
  • the client agent 375 may pause to re-buffer the queue or drop media samples, as required (Step 560).
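The burst/stop flow control described above (Step 556) reduces to a watermark check on queue depth. The threshold values and message names below are assumptions for the sketch; the patent only specifies that the client requests "the desired amount of media queue adjustment."

```python
# Illustrative client-side flow control: below a low-water mark the client
# asks the server for a burst of samples; above a high-water mark it asks
# the server to stop sending until the queue drains.

def flow_control_request(queue_depth, low_water, high_water):
    """Return the adjustment request the client agent would send, if any."""
    if queue_depth < low_water:
        return ("burst", low_water - queue_depth)   # samples needed
    if queue_depth > high_water:
        return ("stop", queue_depth - high_water)   # excess to drain
    return None   # queue depth within thresholds: no request
```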
  • a native or third party supplied output filter module decompresses the received compressed data set to generate a decompressed data set (Step 564).
  • the client agent 375 applies control information, timing information, and dynamic changes in the media type to the collection of filter modules (e.g., an instance of the generic source filter module, the connected instances of native or third party output filter modules, etc.) (Step 568).
  • the application program 340 running on the server 310 unloads the first output filter module 355A and subsequently loads the native or third party-supplied output filter modules, which decompress and render the data stream at the server 310 (Step 570).
  • a decompressed data set representing at least a portion of the media stream is intercepted (Step 574), compressed (Step 578), and transmitted to the client agent 375 (Step 582), where it is received by the client agent 375 (Step 586).
  • the server agent 350 intercepts and transmits graphical display commands, associated with non-media graphical information, to the client agent 375 (Step 590).
  • the client agent 375 receives the graphical display commands (Step 594).
  • the client agent 375 generates the media presentation using the native or third party provided output filter module (renderer filter) to render the decompressed data set onto the presentation interface 345 (Step 598).
  • the present invention may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture.
  • the article of manufacture may be a floppy disk, a hard disk, a CD ROM, a flash memory card, a PROM, a RAM, a ROM, or a magnetic tape. In general, the computer-readable programs may be implemented in any programming language. Some examples of languages that can be used include C, C++, or JAVA.
  • the software programs may be stored on or in one or more articles of manufacture as object code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Astronomy & Astrophysics (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention relates to generating at a client a display having graphical and/or media components. A method for generating a media presentation at a client includes transmitting output from an application program executing on a server to the client, identifying a media stream within the application output, receiving a compressed data set representing at least a portion of the media stream, and transmitting the compressed data set to the client. At least one of timing and control information associated with the media stream is captured and transmitted to the client.

Description

METHOD AND APPARATUS FOR GENERATING GRAPHICAL AND MEDIA DISPLAYS AT A THIN CLIENT
FIELD OF THE INVENTION [0001] The invention generally relates to distributed processing, and, more particularly, to generating a display having graphical and/or media components at a client.
BACKGROUND OF THE INVENTION [0002] A thin-client protocol can be used for displaying output, produced by an application running on a server, on a client running on a computer with limited processing capabilities. Two exemplary thin-client protocols are ICA, Independent Computing Architecture from Citrix Systems, Inc., Ft. Lauderdale, FL, and RDP, Remote Desktop Protocol from Microsoft, Inc., Redmond, WA. The client is also sometimes referred to as a remote terminal session. One thin-client protocol intercepts commands by the application program to the server operating system ("OS") to draw to a display screen. The intercepted commands are transmitted to the remote session using, for example, one or more presentation layer packets. When the remote session (e.g., thin-client) receives the commands, the remote session passes the received commands to the remote session OS. The thin-client draws the application program output on its display using the received commands. In this manner, the application program appears to be executing on the thin-client.
[0003] Typically, when the application program draws an image to the display screen, the image is represented as a bitmap. A bitmap format of an image is generally a very large data set. Thus, the thin-client protocol must transmit over the network the bitmap representation of an image, which is a large amount of data, along with the applicable commands on how to display the bitmap representation. For networks of low bandwidth, this results in a large time delay before the complete image is received and displayed on the client, an inconvenience for the user of the client. Also, if the user is paying directly for bandwidth used, for example in a wireless network, transmission of these large bitmap formats results in large costs associated with each transmission.
[0004] A similar problem exists when the application program renders a media presentation. Typically, a video file is rendered as a series of bitmaps and audio information is rendered using pulse code modulation. Accordingly, the thin-client protocol transmits the series of bitmaps representing the video file and/or the pulse code modulated signal representing the audio information over the network. This transmission is inefficient, requiring excessive bandwidth and significant CPU usage. Moreover, even where sufficient bandwidth is available, an unresponsive graphical user interface may result at the client. Video playback, for example, is often of low quality, may appear "jerky," and may synchronize poorly with the audio presentation.
[0005] There is, therefore, a need for an improved approach to rendering images and media presentations in a remote terminal session.
SUMMARY OF THE INVENTION
[0006] The invention, according to one advantage, lowers the time and cost of transmitting images and other non-textual elements, originally represented in large bitmap formats, by substituting, prior to transmission, available compressed formats for the bitmap formats. In some systems, images and other multimedia content are transmitted in an already-compressed format, such as JPEG, PNG, GIF, MPEG3 or MPEG4. In these systems, the time and cost of transmitting the media may be lowered by intercepting the already-compressed format and substituting another version of the media stream in a, typically, more compressed format. Transmitting the compressed formats can significantly reduce the bandwidth necessary to transmit the media stream. The client decompresses the received data using available libraries. The client then substitutes the decompressed image for the original bitmap representations using, for example, modified thin-client protocol commands with other identifying data.
[0007] According to another advantage of the invention, a compressed data set, representing at least a portion of a media stream, is intercepted on a first computing device before it is decompressed. Alternatively, where the compressed data set is decompressed on the first computing device, the resulting decompressed data set is re-compressed on the first computing device. By transmitting the compressed data set, rather than the decompressed data set, over a network, which may have limited bandwidth, the time and cost of transmitting the data set is consequently reduced.
[0008] In addition to the time and cost advantages resulting from reducing the bandwidth required for communications, clients and servers utilizing the present invention enjoy improved performance. When, in accord with the present invention, a server transmits compressed data representative of a non-textual element or media stream, the processing burden on the server's central processing unit (CPU) is reduced. The server's CPU would otherwise expend processing effort on decompressing the compressed data and transmitting the larger volume of uncompressed data, with the attendant processing burden of the increased amount of network communications. Given that one server may interact with several clients, delegating decompression and rendering operations to the clients also improves server performance. Client performance is further improved, in that the processing effort at the client associated with the receipt of a larger volume of uncompressed data is reduced.
[0009] In one aspect, the invention relates to a method for generating a graphical display at a client. The method includes transmitting output from an application program executing on a server to the client, identifying a bitmap representation within the application output, and determining a check value for the bitmap representation. The method also includes retrieving a compressed data format of the bitmap representation using at least in part the check value and transmitting to the client the compressed data format in place of the bitmap representation.
[0010] In another aspect, the invention relates to a method for generating a graphical display at a client. The method includes transmitting output from an application program executing on a server to the client and identifying a non-textual element within the application output. The method also includes retrieving a compressed data format associated with the non-textual element and transmitting to the client the compressed data format in place of the non-textual element.
[0011] In one embodiment of this aspect of the invention, the method includes identifying a textual element within the application output and transmitting to the client the textual element. In another embodiment, the method includes receiving the compressed data format, and optionally the textual element, at the client and generating a display at the client using the compressed data format, and optionally the textual element. In another embodiment, the method includes transmitting the compressed data format using at least one presentation layer protocol packet. In yet another embodiment, the method includes transmitting the at least one presentation layer protocol packet using a command for transmitting a file in its native format.
[0012] In another embodiment, the method includes conforming the at least one presentation layer protocol packet to a remote access protocol, a thin-client protocol, and/or a presentation protocol. In still another embodiment, the non-textual element is a bitmap representation and the method includes replacing the bitmap representation with the compressed data format. In another embodiment, the method includes determining the capability of the client to render the non-textual element using the compressed data format. The method further includes, upon determination that the client cannot render the non-textual element using the compressed data format, transmitting an image-rendering library capable of rendering the non-textual element using the compressed data format.
[0013] In another embodiment, the method includes intercepting the application output and inspecting the intercepted output for a bitmap representation of the non-textual element. In yet another embodiment, the method includes calculating a first check value for a bitmap representation of the non-textual element and searching an image store for the compressed data format having a check value identical to the first check value.
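The check-value lookup of paragraphs [0009] and [0013] can be sketched as a hash-keyed image store. MD5 is an illustrative choice of check value, not one mandated by the patent, and the store contents are invented for the example.

```python
# Sketch of the check-value substitution: hash the intercepted bitmap, then
# look the hash up in an image store of already-compressed versions; if a
# match is found, transmit the compressed format in place of the bitmap.

import hashlib

def check_value(bitmap_bytes):
    """Compute a check value for a bitmap representation (MD5 for the sketch)."""
    return hashlib.md5(bitmap_bytes).hexdigest()

def substitute_compressed(bitmap_bytes, image_store):
    """Return the compressed format if the store holds one, else the bitmap."""
    compressed = image_store.get(check_value(bitmap_bytes))
    return compressed if compressed is not None else bitmap_bytes

bitmap = b"\x00\x01" * 512                      # stand-in bitmap data
store = {check_value(bitmap): b"JPEG-bytes"}    # pre-populated image store
payload = substitute_compressed(bitmap, store)  # compressed format is sent
```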
[0014] In another aspect, the invention relates to a system for generating a graphical display at a client. The system includes an output filter module and a server agent. The output filter module is configured to intercept output produced by an application program, identify a non-textual element of the output, and retrieve a compressed data format associated with the non-textual element. The server agent is configured to transmit to the client the compressed data format in place of the non-textual element.
[0015] In one embodiment of this aspect of the invention, the system includes a server node, which includes the server agent and the output filter module. In another embodiment, the system includes a client node. The client node includes a client agent and a display. The client agent is configured to receive the compressed data format and to generate a display of the non-textual element using the received compressed data format. In another aspect, the system further includes a network.
[0016] In another aspect the invention relates to an article of manufacture having computer-readable program means embodied therein for generating a graphical display at a client. The article includes computer-readable program means for performing any of the aforementioned methods.
[0017] In an additional aspect, the invention relates to a method for generating a media presentation at a client. The method includes transmitting output from an application program executing on a server to the client, identifying a media stream within the application output, intercepting an original compressed data set representing at least a portion of the media stream before processing by the application program, and transmitting the original compressed data set to the client.
[0018] In another aspect, the invention relates to another method for generating a media presentation at a client. This method includes transmitting output from an application program executing on a server to the client, identifying a media stream within the application output, intercepting a first decompressed data set representing at least a portion of the media stream, compressing the intercepted first decompressed data set, and transmitting the compressed data set to the client in place of the first decompressed data set.
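The second method ([0018]) can be sketched end to end. Here `zlib` stands in for the lightweight lossy video encoder the patent contemplates (e.g., MJPEG); the function names are assumptions for the sketch.

```python
# Sketch of the intercept-and-recompress path: a decompressed data set is
# intercepted at the server, recompressed, and transmitted in place of the
# raw data; the client then regenerates a (second) decompressed data set.

import zlib

def intercept_and_recompress(decompressed_frame):
    """Server side: compress an intercepted frame before it crosses the network."""
    return zlib.compress(decompressed_frame)

def client_decompress(compressed_frame):
    """Client side: regenerate the second decompressed data set."""
    return zlib.decompress(compressed_frame)

frame = b"\x10\x20\x30" * 1000          # stand-in for a raw video frame
wire = intercept_and_recompress(frame)  # what actually crosses the network
restored = client_decompress(wire)
```

With a lossless codec, as here, the restored frame is identical to the original; the lossy codecs discussed in the detailed description would trade some fidelity for a smaller `wire` payload.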
[0019] In yet another aspect, the invention relates to still another method for generating a media presentation at a client. This method includes informing a server of at least one media format supported by a client agent installed on the client, receiving a compressed data set of a media stream at the client, decompressing the compressed data set at the client to generate a decompressed data set, and generating the media presentation at the client using the decompressed data set.
[0020] In a further aspect, the invention relates to an article of manufacture that embodies computer-readable program means for generating a media presentation at a client. The article includes computer-readable program means for transmitting output from an application program executing on a server to the client, computer-readable program means for identifying a media stream within the application output, computer-readable program means for intercepting an original compressed data set representing at least a portion of the media stream before processing by the application program, and computer-readable program means for transmitting the original compressed data set to the client.
[0021] In still another aspect, the invention relates to another article of manufacture that embodies computer-readable means for generating a media presentation at a client. This article includes computer-readable program means for transmitting output from an application program executing on a server to the client, computer-readable program means for identifying a media stream within the application output, computer-readable program means for intercepting a first decompressed data set representing at least a portion of the media stream, computer-readable program means for compressing the intercepted first decompressed data set, and computer-readable program means for transmitting the compressed data set to the client in place of the first decompressed data set.
[0022] In yet another aspect, the invention relates to yet another article of manufacture that embodies computer-readable means for generating a media presentation at a client. This article includes computer-readable program means for informing a server of at least one media format supported by a client agent installed on the client, computer-readable program means for receiving a compressed data set of a media stream at the client, computer-readable program means for decompressing the compressed data set at the client to generate a decompressed data set, and computer-readable program means for generating the media presentation at the client using the decompressed data set.
[0023] In various embodiments of these last six aspects of the invention, the methods further include, and the articles of manufacture further include computer-readable program means for, capturing timing information associated with the media stream, transmitting the timing information to the client, receiving the compressed data set and, optionally, the timing information at the client, decompressing the compressed data set at the client to generate a decompressed data set, and generating the media presentation at the client using the decompressed data set and, optionally, the timing information. In other embodiments of the last six aspects of the invention, the methods further include, and the articles of manufacture further include computer-readable program means for, transmitting non-media graphical information from the application output to the client, receiving the non-media graphical information at the client, and generating the media presentation at the client using the decompressed data set and the non-media graphical information.
[0024] In an additional aspect, the invention relates to a system for generating a media presentation at a client. The system includes an application program and an output filter module. The application program is configured to identify a media stream within output produced by the application program. The output filter module is configured to intercept an original compressed data set representing at least a portion of the media stream before processing by the application program and transmit the original compressed data set to the client.
[0025] In another aspect, the invention relates to another system for generating a media presentation at a client. This system includes an application program and an output filter module. The application program is configured to identify a media stream within output produced by the application program. The output filter module is configured to intercept a first decompressed data set representing at least a portion of the media stream, compress the intercepted first decompressed data set of the media stream, and transmit the compressed data set in place of the first decompressed data set to the client.
[0026] In yet another aspect, the invention relates to another system for generating a media presentation at a client. This system includes a server and the client in communication with the server. The client includes a client agent configured to inform the server of at least one media format supported by the client agent, receive a compressed data set of a media stream, decompress the compressed data set at the client to generate a decompressed data set, and generate the media presentation using the decompressed data set.
[0027] In various embodiments of these last three aspects of the invention, the output filter module of the systems is further configured to capture timing information associated with the media stream and to transmit the timing information to the client. In various other embodiments of the last three aspects of the invention, the system further includes a client agent configured to receive the compressed data set and the optional timing information, decompress the compressed data set to generate a decompressed data set, and generate the media presentation using the decompressed data set and the optional timing information. In still other embodiments of the last three aspects of the invention, the client agent is further configured to receive non-media graphical information and to generate the media presentation at the client using the decompressed data set and the non-media graphical information.
[0028] In another aspect, the invention relates to another system for generating a media presentation at a client. This system includes a network, a server in communication with the network, and the client in communication with the network. The server includes an application program and at least one output filter module. The application program is configured to identify a media stream within output produced by the application program. The output filter module is configured to intercept a compressed data set representing at least a portion of the media stream before processing by the application program, and transmit the compressed data set to the client. In some embodiments, the output filter module intercepts only a portion of a compressed data set representing the media stream. For example, the output filter module may intercept every other frame of a video media stream. In still other embodiments, the output filter module discards a portion of the data associated with each frame of video data in order to reduce the bandwidth necessary to transmit the video stream. The client includes a client agent. The client agent is configured to inform the server of at least one media format supported by the client agent, receive the compressed data set, decompress the compressed data set at the client to generate a decompressed data set, and generate the media presentation at the client using the decompressed data set.
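The frame-level bandwidth reductions mentioned above (intercepting every other frame of a video stream, or discarding part of each frame's data) can be sketched as follows. This is an illustrative sketch only, not part of the specification; the function names and the representation of frames as a list of byte strings are assumptions:

```python
def thin_frames(frames, keep_every=2):
    """Keep only every `keep_every`-th frame of a video media stream,
    e.g. every other frame, to reduce transmission bandwidth."""
    return frames[::keep_every]

def trim_frame(frame, keep_bytes):
    """Discard a portion of the data associated with one frame,
    shrinking each transmitted frame at the cost of fidelity."""
    return frame[:keep_bytes]
```

Either reduction (or both) could be applied by the output filter module before the compressed data set is handed to the transport.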
[0029] In a further aspect, the invention relates to an article of manufacture that embodies computer-readable program means for generating a media presentation at a client. The article includes computer-readable program means for intercepting an original compressed data set of a media stream, and computer-readable program means for transmitting the original compressed data set to the client using a thin client protocol such as ICA or RDP.
[0030] In another aspect, the invention relates to another article of manufacture that embodies computer-readable program means for generating a media presentation at a client. The article includes computer-readable program means for intercepting a decompressed data set of a media stream, computer-readable program means for compressing the intercepted decompressed data set, and computer-readable program means for transmitting the compressed data set to the client using a thin client protocol such as ICA or RDP.
BRIEF DESCRIPTION OF DRAWINGS
[0031] The above and further advantages of the invention may be better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:
[0032] FIG. 1 is a block diagram of an illustrative embodiment of a system to generate a graphical display for a remote terminal session in accordance with the invention;
[0033] FIG. 2 is a flow diagram of an illustrative embodiment of a process to generate a graphical display for a remote terminal session in accordance with the invention;
[0034] FIG. 3 is a block diagram of an illustrative embodiment of a system for generating a media presentation at a client in accordance with the invention;
[0035] FIGS. 4A, 4B, & 4C are a flow diagram of an illustrative embodiment of a method for generating a media presentation at a client in accordance with the invention; and
[0036] FIGS. 5A, 5B, & 5C are a flow diagram of another embodiment of a method for generating a media presentation at a client in accordance with the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0037] In one aspect, the invention pertains to methods, systems, and articles of manufacture for generating a graphical display. A compressed data format representation, associated with a non-textual element or a bitmap representation of an image, is transmitted over a network from a server to a client, in place of the non-textual element or the bitmap representation, for subsequent display. [0038] In broad overview, FIG. 1 illustrates a system 100 to generate a display for a remote terminal session that includes a first computing system ("client node") 105 in communication with a second computing system ("server node") 110 over a network 115. For example, the network 115 can be a local-area network (LAN), such as a company Intranet, or a wide area network (WAN), such as the Internet or the World Wide Web. A user of the client node 105 can be connected to the network 115 through a variety of connections including standard telephone lines, LAN or WAN links (e.g., T1, T3, 56kb, X.25), broadband connections (ISDN, Frame Relay, ATM), and wireless connections. The client node 105 includes a client transceiver 130 to establish communication with the network 115. The server node 110 includes a server transceiver 135 to establish communication with the network 115. The connections can be established using a variety of communication protocols (e.g., ICA, RDP, HTTP, TCP/IP, IPX, SPX, NetBIOS, Ethernet, RS232, and direct asynchronous connections).
[0039] The server node 110 can be any computing device capable of providing the requested services of the client node 105. Particularly, this includes generating and transmitting commands and data to the client node 105 that represent the output being produced by an application program 140 executing on the server 110. The server node 110 includes the server transceiver 135, the executing application program 140, a server agent 150, an output filter module 155, and an image store 160. The server agent 150 includes a module that interfaces with a client agent 175 and other components of the server node 110 to support the remote display and operability of the application program 140. The server agent module 150 and all modules mentioned throughout the specification are implemented as a software program and/or a hardware device (e.g., ASICs or FPGAs).
[0040] For clarity, all of these components are shown on server node 110. It is to be understood that the server node 110 can represent a single server or can represent several servers in communication with each other over the network 115 or another network (not shown). In multiple server embodiments, the functionality of the components can be distributed over the available servers. For example, in one embodiment with multiple servers, the transceiver 135, the application program 140, the server agent 150 and the output filter module 155 are on an application server and the image store 160 is on a storage device, such as a disk in a RAID system. [0041] The client node 105 can be any computing device (e.g., a personal computer, set top box, wireless mobile phone, handheld device, personal digital assistant, kiosk, etc.) used to provide a user interface to the application program 140 executing on the server node 110. The client node 105 includes the client transceiver 130, a display 145, a client agent 175 and a graphics library 180 (also referred to as an image-rendering library). The client agent 175 includes a module, implemented as a software program and/or a hardware device (e.g., an ASIC or an FPGA) that receives commands and data from the server node 110 and from a user (not shown) of the client node 105. The client agent 175 uses the received information when interacting with other components of the client node 105 (e.g., when directing the operating system to output data onto the display 145). The client agent 175 also transmits requests and data to the server node 110 in response to server-issued commands or user actions at the client node 105.
[0042] The server node 110 hosts one or more application programs 140 that can be accessed by the client nodes 105. Examples of such applications include word processing programs such as Microsoft Word® and spreadsheet programs such as Microsoft Excel®, both manufactured by Microsoft Corporation of Redmond, Washington. Other examples include financial reporting programs, customer registration programs, programs providing technical support information, customer database applications, and application set managers. Another example of an application program is Internet Explorer®, manufactured by Microsoft Corporation of Redmond, Washington, and this program will be used as an exemplary application program 140 in the following discussion. It is understood that other application programs can be used.
[0043] During execution of the application program 140, for example Internet Explorer®, the server node 110 communicates with the client node 105 over a transport mechanism. In one embodiment, the transport mechanism provides multiple virtual channels 185 through the network 115 so the server agent 150 can communicate with the client agent 175. One of the virtual channels 185 provides a protocol for transmitting graphical screen data from the server node 110 to the client node 105. The server 110 executes a protocol driver, in one embodiment as part of the server agent 150, that intercepts graphical display interface commands generated by the application program 140 and targeted at the server's operating system. The protocol driver translates the commands into a protocol packet suitable for transmission over the transport mechanism.
[0044] The application program 140, in this example Internet Explorer®, executing on the server 110, retrieves a web page. As explained above, the application program 140 generates graphical display commands to the server operating system, as if it was going to display the output at the server node 110. The server agent 150 intercepts these commands and transmits them to the client agent 175. The client agent 175 issues the same or similar commands to the client operating system to generate output for the display 145 of the client node 105. When the operating system for the server agent 150 or the client agent 175 is a member of the Windows® family of operating systems, manufactured by Microsoft Corporation of Redmond, Washington, these graphical display commands may take the form of, for example, invocations of DirectShow®, DirectX®, or Windows graphic device interface (GDI) functionality.
[0045] In one embodiment, a web page has both textual elements (e.g., titles, text, and ASCII characters) and non-textual elements (e.g., images, photos, icons, and splash screens) incorporated therein. The non-textual elements are sometimes transmitted to the Internet Explorer® application program 140 from a web server (not shown) in a compressed data format (e.g., a file or a data stream), also referred to as the non-textual element's native format. Examples of compressed formats are JPEG, GIF, and PNG. The non-textual element represented in a compressed data format may be, for example, 20 kilobytes in size. That same non-textual element decompressed into its bitmap representation is, for example, 300 kilobytes in size.
[0046] The application program 140, when generating the display of the web page, retrieves, for example, a JPEG data format of a non-textual element and decompresses the JPEG data format into a bitmap for display. The output filter module 155 determines that the bitmap representation is from a compressed format and obtains the corresponding compressed format of the non-textual element from the image store 160, as explained in more detail below. In one embodiment, the image store 160 is persistent storage. In other embodiments, the image store 160 is temporary storage, cache, volatile memory and/or a combination of temporary and persistent storage. [0047] The server agent 150 replaces the bitmap representation of the non-textual element with the compressed non-textual element that the output filter module 155 retrieved from the image store 160. The server agent 150 transmits the non-textual element in the compressed format, along with the graphical display interface commands associated with the bitmap representation, to the client node 105. In one embodiment the server agent 150 uses a unique protocol command that identifies a transmission of a non-textual element that is not in bitmap representation, even though the associated commands are applicable to a bitmap representation of a non-textual element. In other embodiments other identifying techniques can be used. For example, the protocol command can have a modifier comment, or a command switch. The command can also use a change of context or a combination of multiple commands.
[0048] When the non-textual element is stored in an uncompressed format (e.g., Windows® bitmap (.bmp) format), the server agent 150 may retrieve the uncompressed representation of the non-textual element and compress it, for example using 2DRLE compression, so as to provide a compressed version of the non-textual element. Optionally, the server agent 150 may also review the data in the uncompressed representation, select an appropriate compression algorithm for application to the uncompressed representation, apply the selected algorithm to the uncompressed representation, and provide the compressed representation to the client agent 175 for display. If the non-textual element was originally compressed, the server agent 150 may choose to recompress the image using the same or a different compression technique.
[0049] The algorithms available to the server agent 150 for the compression of the uncompressed representation include lossless compression algorithms and lossy compression algorithms. Lossless compression algorithms reduce the size of the uncompressed representation without the loss of information contained in the representation, e.g., 2DRLE compression. Lossy compression algorithms reduce the size of the uncompressed representation in such a way that information contained in the uncompressed representation is discarded, e.g., JPEG or JPEG2000 compression.
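To illustrate the lossless category, a minimal byte-oriented run-length encoder is sketched below. This is an illustrative stand-in for the 2DRLE compression named in the text (whose exact format is not specified here), not part of the specification; the input is assumed to be a flat byte string of pixel data:

```python
def rle_encode(data):
    """Run-length encode a byte string as (count, value) pairs.
    Lossless: rle_decode recovers the input exactly, bit for bit."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        # Extend the run while the next byte matches (count fits in one byte).
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def rle_decode(encoded):
    """Invert rle_encode, expanding each (count, value) pair."""
    out = bytearray()
    for i in range(0, len(encoded), 2):
        out += bytes([encoded[i + 1]]) * encoded[i]
    return bytes(out)
```

Run-length coding wins on images with large flat areas and loses on continuous-tone content, which is exactly the distinction the selection heuristics in the following paragraphs exploit.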
[0050] The server agent 150 selects a compression algorithm that is appropriate for compressing the uncompressed representation using, at least in part, the contents of the uncompressed representation. In one embodiment, the server agent 150 determines that the uncompressed representation is a continuous tone image, such as a photographic image, and applies a lossy compression algorithm to the uncompressed representation. In another embodiment, the server agent 150 determines that the uncompressed representation contains large areas of the same color, e.g., a computer-generated image, and applies a lossless compression algorithm. In yet another embodiment, the number of colors contained in the pixels of the uncompressed representation is enumerated and when the counted number of colors exceeds a predetermined threshold value (e.g., 256 colors), a lossy compression algorithm is applied to the uncompressed representation.
[0051] In still another embodiment, the server agent 150 compresses the uncompressed representation using a lossless compression algorithm and compares the size of the compressed result to a predetermined value. When the size of the compressed result exceeds the predetermined value, the uncompressed representation is compressed using a lossy compression algorithm. If the size of the result of the lossy compression is less than a predetermined percentage of the size of the result of the lossless compression, then the lossy compression algorithm is selected.
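The two selection heuristics above can be condensed into short routines. This is an illustrative sketch, not part of the specification: the 256-color threshold comes from the text, but the function names are assumptions, `zlib` stands in for the server's actual lossless codec, and the lossy result size is passed in rather than computed:

```python
import zlib

COLOR_THRESHOLD = 256  # per the description: more distinct colors suggests a photo

def choose_by_colors(pixels):
    """Heuristic from [0050]: count distinct colors and pick 'lossy' for
    continuous-tone images, 'lossless' for images with large flat areas.
    `pixels` is an iterable of (r, g, b) tuples."""
    return "lossy" if len(set(pixels)) > COLOR_THRESHOLD else "lossless"

def choose_by_size(raw, lossy_size, ratio=0.5):
    """Heuristic from [0051]: compress losslessly first, then select the
    lossy result only if it is smaller than `ratio` times the lossless
    size. `raw` is the uncompressed pixel data as bytes."""
    lossless_size = len(zlib.compress(raw))
    return "lossy" if lossy_size < ratio * lossless_size else "lossless"
```

A real implementation would also fold in the predetermined absolute size threshold described in [0051] before attempting lossy compression at all.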
[0052] The client agent 175 receives the transmission of the non-textual element file in the compressed data format, along with the graphical display interface commands associated with the bitmap representation of the non-textual element. The client agent 175 determines that the non-textual element is in the compressed data format and not the bitmap representation. In one embodiment, the client agent 175 makes this determination because the non-textual element in compressed format is transmitted using a unique protocol command. In another embodiment, the size of the non-textual element data and/or other characteristics about the non-textual element included with the associated graphical display interface commands are enough to enable the client agent 175 to make the determination.
[0053] The client agent 175 determines whether the client node 105 contains the necessary library 180 to decompress the compressed format of the non-textual element. If the client node 105 has the appropriate graphics library(ies) 180 installed to perform the decompression algorithms, the client agent 175 uses the library 180 to decompress the compressed format of the non-textual element into its bitmap representation. The client agent 175 performs the received associated graphical display interface commands on the bitmap representation to generate the non-textual element of the output of the application program 140 on the client display 145. [0054] In one embodiment, the client agent 175 does not contain all the decompression algorithms to decompress the non-textual element from a compressed format into a bitmap representation. If the client node 105 does not have the appropriate graphics library(ies) 180 installed to perform the decompression algorithms, the client agent 175 requests the needed graphics library from the server node 110. In another embodiment, the client agent 175 determines if a predetermined set of the most widely used graphics libraries 180 are installed on the client node 105 prior to receiving any non-textual elements from the server node 110. If the most widely used graphics libraries 180 are not installed on the client node 105, the client agent 175 requests the missing libraries from the server node 110 prior to receiving any non-textual elements from the server node 110.
[0055] In yet another embodiment, the client agent 175 determines which graphics libraries 180 the client node 105 includes and transmits that library information to the server agent 150. In this embodiment, when the server agent 150 receives the compressed data format of a bitmap representation from the output filter module 155, the server agent 150 determines, using the transmitted library information, whether the client node 105 can render the compressed data format. If the server agent 150 determines that the client node 105 has the applicable library, the server agent 150 substitutes the compressed data format for the bitmap representation of the non-textual element. If the server agent 150 determines that the client node 105 does not have the applicable library, the server agent 150 does not substitute the compressed data format for the bitmap representation of the non-textual element and instead transmits the bitmap representation to the client 105.
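The server-side substitution decision in the preceding paragraph reduces to a lookup against the library information the client transmitted. A sketch, with all names illustrative and not drawn from the specification:

```python
def select_payload(codec, compressed_data, bitmap_data, client_libraries):
    """Substitute the compressed data format for the bitmap representation
    only when the client has reported a graphics library matching `codec`;
    otherwise fall back to transmitting the bitmap itself."""
    if codec in client_libraries:
        return ("compressed", compressed_data)
    return ("bitmap", bitmap_data)
```

The returned tag would correspond to the unique protocol command (or command modifier) described in [0047] that tells the client agent which representation is on the wire.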
[0056] For the server agent 150 to replace the bitmap representation of the non-textual element with the non-textual element in the compressed format, the output filter module 155 determines that the bitmap representation is from a compressed format contained in the image store 160. The output filter module 155 may make this determination if the source of the media stream is exposed in some manner by the document (such as, for example, by a file type identifier or by an application program specifically configured to expose this information). For embodiments in which the media stream is a multimedia stream, the output filter module 155 may be provided with information regarding the source application associated with the media stream by the multimedia subsystem. In still other embodiments, the output filter module 155 calculates one or more check values for a bitmap representation. For example, the output filter module 155 can calculate a single check value for the entire bitmap representation and/or the output filter module 155 can calculate four check values, one for each quadrant of the bitmap representation. In another example, the output filter module 155 can calculate N check values, one for each of the N lines in the bitmap representation. A check value is the result of an algorithm that generates a substantially unique value for different arrangements of data. The check value is, for example, a checktag, a Cyclic Redundancy Code ("CRC"), a checksum, or a result of a hashing function. The check value is based on the bitmap representation and not the data as arranged in a compressed data format. However, when the compressed data format is stored in the image store 160, it is stored with a check value attribute that corresponds to the one or more check values of the bitmap representation of the compressed data when decompressed.
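As an illustration (not part of the specification), CRC-32 can serve as the check value algorithm; the per-line and whole-bitmap variants described above might look like this, with the bitmap assumed to be a list of scan-line byte strings:

```python
import zlib

def line_check_values(rows):
    """N check values, one per scan line of the bitmap (the per-line
    variant described in [0056])."""
    return [zlib.crc32(row) & 0xFFFFFFFF for row in rows]

def full_check_value(rows):
    """A single check value over the entire bitmap representation,
    computed incrementally as each line becomes available, so the
    output filter need not buffer the whole bitmap first."""
    crc = 0
    for row in rows:
        crc = zlib.crc32(row, crc)  # chain the CRC across lines
    return crc & 0xFFFFFFFF
```

The incremental form matches the behavior described in [0061], where check values are calculated while the application program is still generating the bitmap.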
[0057] In one embodiment, the check value is a checktag that includes a fixed identifier and a unique identifier. The fixed identifier and the unique identifier are combined together and concealed within an image. The fixed identifier is used to identify the checktag as such; the unique identifier is used to identify a specific image. The fixed identifier is, for example, a globally unique identifier that is statistically unlikely to be found within an image. For example, the fixed identifier is a byte sequence that is easily recognizable during debugging and that has a balanced number of zero and one bits. The unique identifier is a sequential identifier uniquely allocated for each image in the cache. The sequential unique identifier is XOR masked with another value so that the image identifiers with a small value (the most likely value) will be more likely to have a balanced number of zero and one bits.
[0058] The checktag is encoded into RGB color components, independently of whether the RGB components are part of the image or part of the color palette. More specifically, the checktag is treated as a stream of 160 bits (i.e., 20 separate bytes, each of which starts at bit 0, the least significant, and finishes at bit 7, the most significant bit). The least significant bit of each byte is overwritten by the next bit of the checktag. The other 7 bits of each byte remain unaltered. [0059] A checktag is decoded by simply reversing the encoding procedure. After the checktag is decoded, the fixed identifier and the unique identifier are retrieved from the checktag. The retrieved fixed identifier is validated against a previously stored fixed identifier to identify the checktag as such. Where a match is found, the unique identifier is then used to retrieve information that is relevant to the identified image, such as the bitmap data associated with the image.
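The bit-level scheme of paragraphs [0058] and [0059] can be sketched directly: the 160-bit (20-byte) checktag, read LSB-first within each byte, overwrites the least significant bit of 160 consecutive color-component bytes, and decoding collects those bits back. This is an illustrative sketch, not the specification's implementation; the image is assumed to be a flat sequence of RGB component bytes:

```python
def encode_checktag(image_bytes, checktag):
    """Conceal a 20-byte checktag in the low bits of 160 image bytes.
    Only the least significant bit of each byte changes (at most one
    intensity level), so the image remains visually intact."""
    assert len(checktag) == 20 and len(image_bytes) >= 160
    out = bytearray(image_bytes)
    for i in range(160):
        bit = (checktag[i // 8] >> (i % 8)) & 1  # bit 0 first, bit 7 last
        out[i] = (out[i] & 0xFE) | bit           # upper 7 bits unaltered
    return bytes(out)

def decode_checktag(image_bytes):
    """Reverse the encoding: read the low bit of each of the first
    160 bytes back into a 20-byte checktag."""
    tag = bytearray(20)
    for i in range(160):
        tag[i // 8] |= (image_bytes[i] & 1) << (i % 8)
    return bytes(tag)
```

After decoding, the fixed identifier would be validated against the stored value and the XOR mask removed from the unique identifier, as described in [0057] and [0059].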
[0060] The output filter module 155 searches the image store 160 for a non-textual element in compressed data format that has a check value attribute that is the same as one or more check values the output filter module 155 calculates for the bitmap representation. The output filter module 155 retrieves the compressed format of the non-textual element with the same check value attribute as the one or more check values and sends the compressed format of the non-textual element to the server agent 150 for transmittal to the client agent 175 in place of the bitmap representation.
[0061] The server node 110 stores compressed formats of non-textual elements in the image store 160 the first time the application program 140 calls a graphics library (not shown) to create a bitmap representation from a compressed format file. The output filter module 155 calculates the associated check value of the bitmap representation as the application program 140 decompresses the compressed format and generates the bitmap representation. As described above, the output filter module 155 can calculate the check value when the bitmap representation is complete, when a quadrant of the bitmap representation is complete, or when a line of the bitmap representation is complete. The server 110 stores the compressed format file and the associated check value attribute in the image store 160 and retrieves the compressed format file the first and any subsequent times the application program 140 generates the associated non-textual element.
[0062] Whether the server 110 stores the compressed format file and its associated check value attribute(s) in the image store 160 in a temporary portion (e.g., RAM memory buffer or cache) or a persistent portion (e.g., disk or non-volatile memory buffer) is based at least in part on design and hardware limitations (e.g., the size of the persistent storage). One exemplary criterion used to make that determination is the number of times the application program 140 generates the non-textual element. For example, if the application program 140 generates a particular non-textual element more than a predetermined number of times, the server 110 stores the compressed format file and its associated check value attribute(s) corresponding to that particular non-textual element persistently in the image store 160. The image store 160 may apply eviction algorithms (e.g., a least-recently-used [LRU] algorithm) to remove data from the image store 160. Eviction permits the server agent 150 and client agent 175 to observe resource or storage limitations imposed on the image stores 160.
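A check-value-keyed store with least-recently-used eviction, as described above, can be sketched as follows. This is an illustrative sketch only; the entry layout and capacity policy are assumptions, and a real image store would also track the persistent/temporary split and generation counts discussed in [0062]:

```python
from collections import OrderedDict

class ImageStore:
    """Maps check value attributes to compressed data formats, evicting
    the least-recently-used entry when a storage limit is reached."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._entries = OrderedDict()  # check value -> compressed bytes

    def put(self, check_value, compressed):
        self._entries[check_value] = compressed
        self._entries.move_to_end(check_value)       # mark most recent
        while len(self._entries) > self.capacity:
            self._entries.popitem(last=False)        # evict LRU entry

    def get(self, check_value):
        if check_value not in self._entries:
            return None                              # cache miss: keep bitmap
        self._entries.move_to_end(check_value)       # refresh recency
        return self._entries[check_value]
```

On a hit, the server agent substitutes the returned compressed format for the bitmap representation; on a miss, the bitmap (or a freshly generated compressed format, per [0063]) is used instead.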
[0063] In other embodiments, the server 110 stores the non-textual element if it is static or complex. For example, if the application program 140 always generates a splash screen at initialization, the server 110 stores the compressed format file corresponding to that splash screen in the persistent portion of the image store 160. In another embodiment, if the non-textual element is complex, static and/or generated repeatedly but does not have a corresponding compressed format file, the output filter module 155 generates a compressed format file for that non-textual element, in a standards-based or proprietary-based format. In any subsequent transmissions, the server agent 150 transmits the generated compressed format file in place of the bitmap representation. If the compressed format is a proprietary-based format, the server agent 150 determines whether the client node 105 includes the applicable proprietary-based graphics library to decompress the compressed format file into a bitmap representation. If not included on the client node 105, the server agent 150 transmits the applicable library to the client node 105 for installation.
[0064] Although the illustrated embodiment depicts the image store 160 on the server node 110, in an alternate embodiment, at least a portion of the image store (not shown) is on the client node 105. In this alternate embodiment, the output filter module 155 calculates the one or more check values of the bitmap representation and transmits the one or more check values to the server agent 150. The server agent 150 transmits these one or more check values to the client agent 175. The client agent 175 searches the portion of the image store on the client node 105 for a compressed data format stored with an identical one or more check values attribute. The client agent 175 transmits the results of this search to the server agent 150. The server-side and client-side image stores 160 may persist after the termination of a session between a client and a server. Subsequently, when a new session is initiated, the client agent 175 communicates to the server agent 150 information about the contents of its image store 160 and, after comparison with the server's image store 160, the server agent 150 may use the non-textual elements in the image stores 160, or their strips, as discussed below, to reduce the amount of bandwidth required to create graphical and media displays at the client.
[0065] If the compressed data format for the non-textual element exists on the client node 105, the server agent 150 does not have to send either the compressed data format or the bitmap representation over the network 115. The server agent 150 only transmits the graphical display interface commands associated with the bitmap representation of the non-textual element. If the compressed data format for the non-textual element does not exist on the client node 105, the output filter module 155 obtains the corresponding compressed format of the non-textual element from the image store 160. The server agent 150 replaces the bitmap representation of the non-textual element with the non-textual element in the compressed data format that the output filter module 155 retrieved from the image store 160. The server agent 150 transmits the non-textual element in the compressed format, along with the graphical display interface commands associated with the bitmap representation, to the client node 105.
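The server-side decision just described can be sketched as a small function. The message shape and parameter names are assumptions; only the branching logic comes from the text: commands alone when the client already holds the compressed data format, commands plus the compressed format otherwise.

```python
def prepare_transmission(check_values, client_has_element, server_image_store, commands):
    """Illustrative server-side decision: if the client-side image store
    already holds the compressed data format (per the client's search
    results), only the graphical display interface commands cross the
    network; otherwise the compressed format is retrieved from the
    server-side image store and sent alongside them."""
    if client_has_element:
        return {"commands": commands}
    return {"commands": commands,
            "compressed_data": server_image_store[check_values]}
```

Either way, the bitmap representation itself is never sent, which is the bandwidth saving the text describes.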
[0066] As discussed, the presence of the image store 160 on the server 110 and/or the client node 105 permits the caching of non-textual elements, reducing the bandwidth required to generate graphical and media displays at a client. In accord with the present invention, these non-textual elements may be subdivided into sub-regions, i.e., "strips," and the image store 160 will subsequently provide non-textual element caching, as discussed above, with strip-level granularity. For example, a web page may be comprised of multiple images, only some of which are visible, either fully or partially. For the images that are only partially shown, it is possible to send the full image to the client including the part that is obscured by other windows. However, doing so increases bandwidth consumption because more data is sent than is necessary. The portion of the image that is visible is identified as a "strip," and processed as discussed below. The image store 160 contains the graphical data forming the strip and, optionally, strip-related metadata such as the image strip height, the image strip width, and an image strip identifier, such as a cyclic redundancy check (CRC) checksum.
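A strip record of the kind described above might be represented as follows. The field names are assumptions; the metadata fields (width, height, and a CRC checksum serving as the strip identifier) are the ones named in the text.

```python
import zlib

def make_strip(pixels, width, height):
    """Illustrative strip record: the graphical data forming the visible
    sub-region of an image plus the optional strip-related metadata named
    above, with a CRC checksum as the strip identifier."""
    return {"data": pixels, "width": width, "height": height,
            "crc": zlib.crc32(pixels) & 0xFFFFFFFF}
```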
[0067] In this embodiment, each non-textual element used in a graphical rendering operation is divided into the aforementioned strips. In one embodiment, these strips have the same width as the non-textual element and a height that is less than or equal to the height of the non-textual element. When the client agent 175 is required to display a non-textual element composed of one or more strips that are not present in the image store 160, the required strips are transmitted to the client for rendering and storage in a client-side image store 160'.
[0068] The server agent 150 maintains a database of strips that have previously been transmitted to the client for rendering. If subsequent rendering operations incorporate regions contained in a stored strip, the presence of the stored strip is identified in the server-side image store 160 and an appropriate identifier associated with the stored strip is retrieved. A message instructing the client agent 175 to render the stored strip associated with the identifier is transmitted to the client agent 175, requiring less bandwidth than a message including the contents of the strip itself. The client agent 175 retrieves its copy of the stored strip associated with the transmitted identifier from its image store 160' and displays the contents of the stored strip. Similarly, when individual strips are shared among a plurality of non-textual elements, the server agent 150 identifies the common strips and instructs the client agent 175 to display the strip from the client's own image store 160, avoiding the retransmission of the strip for display.
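The render-or-reference decision against the server's strip database can be sketched as follows. The message fields are assumptions; the logic follows the paragraph above: a strip already present in the server-side store is referenced by its identifier alone, while an unknown strip is recorded and transmitted in full.

```python
def draw_message(strip_db, strip_id, strip_data):
    """Illustrative server-side strip-database lookup: known strips yield
    a small identifier-only message instructing the client agent to render
    its cached copy; unknown strips are stored and sent with their
    contents."""
    if strip_id in strip_db:
        return {"type": "draw_cached", "id": strip_id}
    strip_db[strip_id] = strip_data
    return {"type": "draw_new", "id": strip_id, "data": strip_data}
```

The identifier-only message is what makes the second and subsequent renderings of a shared strip cheap in bandwidth terms.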
[0069] If a strip has been modified between successive rendering operations, e.g., when the source bitmap in a previous rendering operation becomes the destination bitmap in a subsequent rendering operation, then the server agent 150 will detect the modification to the strip and retransmit the strip to the client agent 175 for rendering and/or incorporation in the client-side image store 160'.
[0070] FIG. 2 illustrates an exemplary embodiment of a process 200 to generate a display for a remote terminal session, using the exemplary embodiment of FIG. 1. The output filter module 155 monitors the output of the application program 140 by monitoring calls made to the operating system of the server node 110. When the output filter module 155 detects (step 205) a display command from the application program 140, the output module 155 determines (step 210) whether the application program 140 is generating a bitmap representation of a non-textual element.
[0071] If the application program 140 is not generating a bitmap representation of a non-textual element, the output filter module 155 transmits (step 215) the display command to the server agent 150, which transmits that command, or a representative command defined in the protocol, to the client agent 175. If the application program 140 is generating a bitmap representation of a non-textual element, the output filter module 155 calculates (step 220) one or more check values corresponding to the bitmap representation of the non-textual element.
[0072] Using the one or more calculated check value(s), the output filter module 155 searches the image store 160 to determine (step 225) whether a compressed data format with identical check value attribute(s) exists. If there is a compressed data format in the image store 160 with check value attribute(s) identical to the one or more check values the output filter module 155 calculates, the output module 155 replaces (step 230) the bitmap representation of the non-textual element with the compressed data format. The output module 155 transmits (step 230) the compressed data format to the server agent 150 for eventual transmission to the client agent 175. The output module 155 also transmits all of the commands associated with the replaced bitmap representation along with the compressed data format.
[0073] If there is not a compressed data format with identical one or more check value attributes in the image store 160, the output module 155 determines (step 235) whether the bitmap representation of a non-textual element corresponding to the compressed data format meets a predetermined criterion for persistent storage (e.g., any of the criteria described above). If the output module 155 determines (step 235) that the predetermined criterion is met, the output module 155 stores (step 240) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the persistent portion of the image store 160. If the output module 155 determines (step 235) that the predetermined criterion is not met, the output module 155 stores (step 245) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the temporary portion of the image store 160.
[0074] Once the output module 155 stores (step 240 or 245) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the image store 160, the output module 155 replaces (step 230) the bitmap representation of the non-textual element with the compressed data format. The output module 155 transmits (step 230) the compressed data format to the server agent 150 for eventual transmission to the client agent 175. The output module 155 continues monitoring the output generated by the application program 140 until the output module 155 detects (step 205) another display command from the application program 140.
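The flow of steps 205 through 245 can be condensed into one function. This is a sketch, not the claimed method: the dict-based command, the `compress` callable, the `persist_criterion` predicate, and the use of CRC-32 as the check value are all assumptions standing in for details the text leaves unspecified.

```python
import zlib

def check_value(bitmap):
    # Stand-in check value; the text does not fix a particular algorithm.
    return zlib.crc32(bitmap) & 0xFFFFFFFF

def handle_display_command(cmd, image_store, persist_criterion, compress):
    """Illustrative walk through steps 205-245 of FIG. 2."""
    bitmap = cmd.get("bitmap")
    if bitmap is None:
        # Steps 210/215: not a non-textual element; forward the command as-is.
        return cmd
    check = check_value(bitmap)                        # step 220
    if check not in image_store:                       # step 225
        # Steps 235-245: compress and store persistently or temporarily.
        tier = "persistent" if persist_criterion(bitmap) else "temporary"
        image_store[check] = (compress(bitmap), tier)
    compressed, _ = image_store[check]
    # Step 230: replace the bitmap with its compressed data format.
    return {**{k: v for k, v in cmd.items() if k != "bitmap"},
            "compressed": compressed}
```

On a repeat of the same bitmap, the lookup at step 225 hits and only the replacement at step 230 runs, which is where the caching pays off.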
[0075] In another aspect, the invention pertains to methods, systems, and articles of manufacture for generating a media presentation. In one embodiment, a compressed data set, representing at least a portion of a media stream, is intercepted on a first computing device and transmitted, over a network, to a second computing device, where it is decompressed and presented to a user. In another embodiment, a decompressed data set, representing at least a portion of a media stream, is intercepted and compressed on the first computing device and then transmitted, as above, to the second computing device, where it is decompressed and presented to the user.
[0076] FIG. 3 illustrates one embodiment of a system 300 that generates a media presentation according to this aspect of the invention. The system 300 includes a first computing device, e.g., a server 310, in communication with a second computing device, e.g., a client 305, over a network 315. Generally speaking, except as set forth below, the client 305, the server 310, and the network 315 have the same capabilities as the client 105, the server 110, and the network 115, respectively, described above.
[0077] As shown, the client 305 includes at least a client transceiver 330, a client agent 375, and a presentation interface 345. The client agent 375 may be implemented as a software program and/or as a hardware device, such as, for example, an ASIC or an FPGA. The client agent 375 uses the client transceiver 330 to communicate over the network 315 and generates a presentation having media and non-media components at the presentation interface 345.
[0078] In one embodiment, the server 310 is an application server. As illustrated, the server 310 includes at least a server transceiver 335, an application program 340, a server agent 350, a first output filter module 355A, and a second output filter module 355B. The server agent 350, the first output filter module 355A, and the second output filter module 355B may be implemented as a software program and/or as a hardware device, such as, for example, an ASIC or an FPGA. The server agent 350, the first output filter module 355A, and the second output filter module 355B use the server transceiver 335 to communicate over the network 315.
[0079] In another embodiment, the aforementioned components 335, 340, 350, 355A, and 355B are distributed and/or duplicated over several servers in communication with each other over the network 315, or over another network (not shown). This permits, for example, the transmission of media-related capability information, media streams, and control information to multiple clients and through multiple hops. Alternatively, in yet another embodiment, two or more of the aforementioned components 335, 350, 355A, and 355B may be combined into a single component, such that the functions, as described below, performed by two or more of the components 335, 350, 355A, and 355B are performed by the single component.
[0080] In still another embodiment, the aforementioned components 340, 355A, and 355B are duplicated at the same server 310. This permits, for example, the simultaneous transmission of media-related capability information, media streams, and control information, pertaining to different/multiple applications 340, and different/multiple media streams within each application. The different applications 340 may be identified as major stream contexts, and the different media streams as minor contexts within a specific major context.
[0081] The application program 340 illustrated in FIG. 3 is any application program 340 that renders, as part of its output, a media stream. The media stream may be a video stream, an audio stream, or, alternatively, a combination of any number of instances thereof. In addition, the application program 340 may output non-media graphical information. In this context, non-media graphical information refers generally to all graphical information outputted by the application program 340 without the use of a codec or the equivalent, such as, for example, static graphical information, including, but not limited to, toolbars and drop-down menus. Non-media graphical information also includes, for example, information for locating the static graphical information on a display screen. The application program 340 may be, for example, the MICROSOFT ENCARTA application program manufactured by the Microsoft Corporation of Redmond, Washington.
[0082] In one embodiment, the application program 340 uses external codecs, such as, for example, codecs installed in the operating system of the server 310, to decompress a compressed data set representing at least a portion of a media stream. In another embodiment, the codecs used by the application program 340 are embedded in the application program 340 itself. The server 310 may include any number of executing application programs 340, some of which use external codecs, others of which use embedded codecs.
[0083] Where the application program 340 uses external codecs and desires to output a media stream, it requests that the operating system of the server 310 use the external codecs to decompress the compressed data set representing at least a portion of the media stream for subsequent display. Where the codecs used by the application program 340 are embedded in the application program 340 itself, the application program 340, when desiring to output a media stream, uses the embedded codecs to decompress the compressed data set itself for subsequent display. Additionally, the application program 340 may generate and transmit graphical display commands, associated with the non-media graphical information, to the operating system of the server 310. When the operating system for the server agent 350 or the client agent 375 is a member of the Windows® family of operating systems, manufactured by Microsoft Corporation of Redmond, Washington, these graphical display commands may take the form of, for example, invocations of DirectShow®, DirectX®, or GDI functionality.
[0084] In accordance with the present invention, the application program 340 performs these tasks as if the application program 340 was going to generate a presentation having media and non-media components at the server 310. As explained below, the first output filter module 355A, the second output filter module 355B, and the server agent 350 intercept the compressed data set being passed to the external codecs, the decompressed data set generated by the embedded codecs, and the graphical display commands associated with the non-media graphical information, respectively, and (after first compressing the decompressed data set generated by the embedded codecs) transmit them, over the network 315, to the client agent 375. The client agent 375, as explained below, then decompresses the received compressed data sets and issues the same or similar graphical display commands, associated with the non-media graphical information, to the operating system of the client 305 to generate a presentation having media and non-media components at the presentation interface 345 of the client 305. In some embodiments, e.g., the DirectShow® embodiment discussed below, the communications between the server agent 350 and the client agent 375 include information concerning media-related capabilities (e.g., buffer and media type properties, media stream priority, etc.) or control information (e.g., play, pause, flush, end-of-stream, and stop commands, parent window assignment, window positioning and clipping commands, scrolling, audio adjustment commands, volume, balance, etc.).
[0085] The first output filter module 355A and the second output filter module 355B are invoked as an application program 340 using external codecs attempts to invoke an external codec to output a media stream. The first output filter module 355A intercepts an original compressed data set representing at least a portion of the media stream. Instead of decompressing the data set, as an external codec would, the first output filter module 355A transmits the original compressed data set over the network 315 to the client agent 375. The second output filter module 355B, acting as an OS-level renderer, captures stream-independent (generic) control commands (e.g., play, pause, stop, flush, end-of-stream) and transmits the information over the network 315 to the client agent 375. Where the media stream includes a video stream, the second output filter module 355B also captures information for locating images of the video stream (e.g., parent window assignment, positioning and clipping commands) on a display screen and transmits the information over the network 315 to the client agent 375. Where the media stream includes an audio stream, the second output filter module 355B may also capture information for audio adjustment (e.g., volume and balance adjustment commands) and transmit the information over the network 315 to the client agent 375.
[0086] In another embodiment, when an application program 340 that uses embedded codecs attempts to invoke, for example, an OS-level renderer to output a media stream, the second output filter module 355B is invoked. The second output filter module 355B intercepts a first decompressed data set representing at least a portion of the media stream from the output of the application program 340. The second output filter module 355B then compresses, as explained below, the intercepted first decompressed data set and transmits the resulting compressed data set, over the network 315, to the client agent 375. The second output filter module 355B, as above, also captures, where the media stream includes a video stream, information for locating images of the video stream on a display screen and transmits the information over the network 315 to the client agent 375.
[0087] When the operating system for the server agent 350 or the client agent 375 is a member of the Windows® family of operating systems, manufactured by Microsoft Corporation of Redmond, Washington, the server agent 350 may implement this interception functionality by providing a DirectShow® Transform filter with a filter merit level that exceeds that of the other filters installed on the server 310. As is known, the WINDOWS family of operating systems allows multiple filters to be installed that are capable of handling particular types of media encoding. The merit level assigned to a filter determines the priority given to a filter; filters having a higher merit level, i.e., priority, are selected to handle media types before filters with lower merit levels. Support for various media types, e.g., video, audio, MIDI, text, etc., may be implemented by associating the Transform filter with the desired media types through changes to the system registry. Furthermore, through changes to the registry, the Transform filters may be associated with specific Renderer filters. Notably, this framework is extensible and permits future support for new types of media programming.
[0088] In a Windows® environment, the server agent 350 may implement its client communications functionality using modified DirectShow® Transform and Renderer filters. Instead of transforming (decompressing) received media data, the modified Transform filters transmit the data to the client agent 375. Instead of rendering transformed (decompressed) media data, the Renderer filters transmit both stream-generic (e.g., play, pause, stop, flush, end-of-stream) and stream-specific (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment) commands. In turn, if the client 305 utilizes a Windows® operating system, then a modified DirectShow® source filter at the client 305 may act to receive the transmitted media and stream-generic control data. Eventually, unmodified Transform and rendering filters at the client 305 transform (decompress) and render the media data for viewing, hearing, etc. by a user. The Renderer filters also receive stream-specific control data.
[0089] In one embodiment, the server agent 350 maintains a separate media queue for each media stream. When network bandwidth between the server agent 350 and the client agent 375 is scarce, each media stream may be prioritized, permitting the, e.g., transmission of real-time video conferencing information while transparently slowing or halting the transmission of still images from a web browser. Stream prioritization may be accomplished by providing separate queues for different types of media, each stream having an assigned priority. Network bandwidth may be determined in any one of a number of ways known in the art such as, for example, "pinging" the expected target server and measuring response time. For embodiments in which the media stream includes embedded timing information, that information may be used to determine if media data is not transmitted fast enough, i.e., that the bandwidth of the channel cannot support the transmission.
[0090] In another embodiment, the media queues corresponding to the same application program 340 might have different priorities based on the media type (video, audio, MIDI, text, etc.). If network bandwidth between the server agent 350 and the client agent 375 is insufficient to accommodate all media streams, samples may be dropped, that is, deleted, from lower priority queues so that higher priority queues may be serviced without interruption. For example, video transmission might be interrupted and appear in a "slide-mode," while audio is still uninterrupted. In another embodiment, control information is associated with its own control queue, permitting the prioritization of the transmission of control data ahead of any media data.
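The prioritized queuing and sample-dropping described in the last two paragraphs can be sketched as follows. The specific priority ordering (control above audio above video) and the use of a capacity limit to stand in for insufficient bandwidth are assumptions for illustration.

```python
def enqueue_sample(queues, media_type, sample, capacity):
    """Illustrative per-type media queues: when total queued samples exceed
    capacity (standing in for insufficient network bandwidth), samples are
    dropped from the lowest-priority non-empty queue so that higher-priority
    queues may be serviced without interruption."""
    queues.setdefault(media_type, []).append(sample)
    # Assumed ordering: control ahead of any media, audio ahead of video,
    # so video degrades to "slide-mode" while audio stays uninterrupted.
    priority = {"control": 3, "audio": 2, "video": 1}
    total = sum(len(q) for q in queues.values())
    while total > capacity:
        victim = min((t for t, q in queues.items() if q),
                     key=lambda t: priority.get(t, 0))
        queues[victim].pop(0)  # drop (delete) the oldest low-priority sample
        total -= 1
```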
[0091] In this environment, the transmission of control information from the server agent 350 to the client agent 375 permits control of the client rendering filters' scheduler. Latency in initial playback may be reduced by pre-filling the media queue (not shown) of the client agent 375 as fast as possible before playback commences. Similarly, subsequent variations in network latency may be addressed by readjusting the client agent's 375 media queue as necessary. Should the client agent 375 detect that one or more of its media queues have fallen below or have exceeded certain resource thresholds, then the client agent 375 may send a burst or a stop request respectively to the server agent 350 for the desired amount of media queue adjustment.
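The client-side queue check described above can be sketched as a simple watermark test. The threshold values, request names, and adjustment amount are assumptions; only the burst-below-low / stop-above-high behavior comes from the text.

```python
def queue_adjustment_request(depth, low_threshold, high_threshold, adjust_by):
    """Illustrative client agent check: a media queue that falls below its
    resource threshold triggers a burst request to the server agent, and
    one that exceeds its threshold triggers a stop request, each carrying
    the desired amount of media queue adjustment."""
    if depth < low_threshold:
        return {"request": "burst", "amount": adjust_by}
    if depth > high_threshold:
        return {"request": "stop", "amount": adjust_by}
    return None  # queue depth is within bounds; no request needed
```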
[0092] The use of DirectShow® functionality in a Windows® environment leverages functionality provided by the operating system and reduces or eliminates the reimplementation of duplicative functionality. For example, the graphical or media display provided by the client agent 375 leverages operating system functionality to provide complex clipping and audio control functionality. The client agent 375 may also utilize the DirectDraw® and DirectSound® capabilities provided by the operating system, reducing required CPU resources and improving overall performance.
[0093] In another embodiment, the server agent 350 intercepts and transmits to the client agent 375, over the network 315, the graphical display commands associated with the non-media graphical information that are output from the application program 340.
[0094] Additionally, in one embodiment, the first output filter module 355A, the second output filter module 355B or both (where the application program 340 uses external codecs), or the second output filter module 355B (where the application program 340 uses embedded codecs), captures timing information associated with the media stream and transmits the timing information, over the network 315, to the client agent 375. More specifically, the output filter module 355A, 355B captures, and transmits to the client agent 375, presentation times for each frame of the media stream, thereby enabling the client agent 375 to synchronize video and audio streams and to maintain the correct frame rate.
[0095] As shown, the server agent 350 interfaces with the server transceiver 335 and the application program 340. In one embodiment, as explained below, the server agent 350 receives from the client agent 375, over the network 315, a list of media formats supported by the client agent 375. Upon receiving the list of supported media formats, the server agent 350 registers the output filter modules 355A, 355B by manipulating the configuration of the server 310. In one embodiment, for example, the server agent 350 registers the output filter modules 355A, 355B by editing the registry of the server 310. The server agent 350 then informs the client agent 375 that the server 310 can handle all such media formats. In another embodiment, the client agent 375 and server agent 350 negotiate supported formats and capabilities on an as-needed basis. For example, the client agent 375 would defer informing server agent 350 of its support for JPEG2000 format media until server agent 350 specifically requests the creation of a JPEG2000 display by the client agent 375.
[0096] In still another embodiment, when an application program 340 loads a first output filter module 355A while attempting to render a media stream, the filter module 355A communicates with the server agent 350, which in turn sends a request to the client agent 375 to create the media stream by providing the media stream properties: allocator, media type, media stream priority, etc. The client agent 375 responds to the server agent 350's request to create the media stream by indicating the success or failure of the request. The success or failure of the request depends on the availability of the respective filter modules capable of decompressing the media stream, sufficient memory, etc., at the client 305. When successful, the application program 340 loads the second output filter module 355B and connects it to the first output filter module 355A, thereby providing efficient streaming of compressed media. Upon failure, the application program 340 unloads the first output filter module 355A and loads the native Microsoft or third party-supplied output filter modules, which decompress and render the stream at the server 310. For example, when the underlying operating system at the client 305 is a member of the MICROSOFT WINDOWS family, implementing the DIRECTX and DIRECTSHOW technologies, the first output filter module 355A and the second output filter module 355B communicate directly with the server agent 350.
[0097] At the client 305, the client agent 375 interfaces with the client transceiver 330 and the presentation interface 345. The client agent 375, as described below, initially informs the server agent 350 of the media formats supported by the client agent 375. The client agent 375 also receives, from the output filter modules 355A, 355B over the network 315, the compressed data set and any associated timing information. Moreover, the client agent 375 receives over the network 315, from the second output filter module 355B, both stream-generic (e.g., play, pause, stop, flush, end-of-stream) and stream-specific (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment) commands and, from the server agent 350, the graphical display commands associated with the non-media graphical information.
[0098] The client agent 375, using either external or embedded codecs, decompresses the compressed data set and, together with the graphical display commands associated with the non-media graphical information, any information for locating images of a video stream on a display screen, and any timing information, generates a media presentation at the presentation interface 345. The presentation interface 345 has, in one embodiment, a display screen that renders a graphical display, such as, for example, a video presentation. In another embodiment, the presentation interface 345 includes a speaker that renders an audio presentation. The client 305 may include any number of presentation interfaces 345.
[0099] The information provided in specifying the media formats supported by the client agent 375 may determine the mode of operation at the server 310. If the compressed data set is in a format that is not supported by the client agent 375, the second output filter module 355B may recompress the decompressed data set into a supported format. In another embodiment, when the client agent 375 does not support the format of a stream, the application program 340 unloads the first output filter module 355A and loads the native Microsoft or third party-supplied output filter modules, which decompress and render the stream at the server 310.
[00100] Referring now to FIGS. 4A, 4B, and 4C, one embodiment of a method 400 that generates a media presentation at the client 305, using the exemplary embodiment of FIG. 3, is illustrated. When the client 305 and the server 310 are both connected to the network 315, the client agent 375, at step 404, informs the server agent 350 of all the media formats supported by the client agent 375. In one embodiment, the list of supported media formats is created by enumerating the external codecs installed on the client 305. For example, the codecs installed in the operating system of the client 305 are enumerated by the client agent 375, over the network 315, to the server agent 350. In another embodiment, the list of supported media formats is created by enumerating the codecs embedded in the client agent 375. For example, where the client agent 375 is implemented as a software program, the codecs embedded in the software program are enumerated by the client agent 375, over the network 315, to the server agent 350. Alternatively, the client agent 375 creates the list of supported media formats, and informs the server agent 350 of those supported media formats, by enumerating both the external codecs installed on the client 305 and the codecs embedded in the client agent 375. In still another embodiment, the client agent 375 and server agent 350 negotiate supported formats and capabilities on an as-needed basis. For example, the client agent 375 would defer informing server agent 350 of its support for JPEG2000 format media until server agent 350 specifically requests the creation of a JPEG2000 display by the client agent 375.
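The enumeration at step 404 can be sketched as follows. The function name and list-merging approach are assumptions; the text only requires that the advertised list may draw on external codecs installed on the client, codecs embedded in the client agent, or both.

```python
def supported_media_formats(external_codecs, embedded_codecs):
    """Illustrative step-404 enumeration: the client agent's list of
    supported media formats may combine the external codecs installed in
    the client's operating system with the codecs embedded in the client
    agent itself, with duplicates removed."""
    seen, formats = set(), []
    for codec in list(external_codecs) + list(embedded_codecs):
        if codec not in seen:
            seen.add(codec)
            formats.append(codec)
    return formats
```

The resulting list is what the client agent transmits over the network to the server agent before any media stream is created (except in the as-needed negotiation embodiment, where advertisement is deferred).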
[0100] In another embodiment, capabilities are negotiated on a per-stream basis when the stream is created. The client agent 375 responds to the server agent 350's request to create a media presentation by indicating the success or failure of the request. The success or failure of the request depends on the availability of the respective codecs capable of decompressing the data, sufficient memory, etc., at the client 305. When successful, the application program 340 loads the codec, thereby providing efficient display of graphical data. Upon failure, the application program 340 loads the native Microsoft or third party-supplied codecs, which decompress and render the data at the server 310.
[0101] In one embodiment, the client agent 375 generates globally unique identifiers ("GUIDs") and associates each GUID with a particular codec. The client agent 375 then transmits the list of generated GUIDs to the server agent 350 to inform the server agent 350 of the media formats supported by the client agent 375. In another embodiment, the client agent 375 transmits a list of four-character codes, each four-character code being associated with a particular codec, to the server agent 350 to inform the server agent 375 of the media formats supported by the client agent 375.
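The four-character-code capability announcement of paragraph [0101] may be sketched, purely for illustration, as follows. The codec names and FourCC values below are assumptions chosen for the example, not part of the specification.

```python
# Illustrative sketch: the client agent maps each locally available codec
# to its four-character code (FourCC) and announces the resulting list to
# the server agent, which checks requested formats against that list.
# The installed-codec table is a hypothetical example.

INSTALLED_CODECS = {
    "MP42": "Microsoft MPEG-4 v2 decoder",
    "DIV3": "DivX 3 decoder",
    "MJPG": "Motion JPEG decoder",
}

def enumerate_supported_formats(installed=INSTALLED_CODECS):
    """Return the sorted list of FourCC codes the client can decompress."""
    return sorted(installed.keys())

def server_can_satisfy(requested_fourcc, client_formats):
    """Server-side check: can the client handle this media format?"""
    return requested_fourcc in client_formats
```

In the GUID-based variant described above, the dictionary keys would simply be GUID strings rather than FourCC codes; the announcement logic is otherwise the same.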
[0102] Upon receiving the list of supported media formats from the client agent 375, the server agent 350 registers, at step 408, the first output filter module 355A and/or the second output filter module 355B on the server 310, as associated with the supported media formats. The server agent 350, at step 412, then reports back to the client agent 375 that the server 310 can handle all of the enumerated media formats.
[0103] At step 416, an application program 340 starts executing on the server 310. When the application program 340 identifies within its output, at step 420, the presence of media content, such as, for example, a media stream, the first output filter module 355A, the second output filter module 355B, or both are invoked. If, at step 424, the application program 340 uses external codecs, both the first output filter module 355A and the second output filter module 355B are invoked at step 428 as the application program 340 attempts to invoke an external codec. The first output filter module 355A then intercepts, at step 432, an original compressed data set representing at least a portion of the media stream and transmits, at step 436, the original compressed data set to the client agent 375, without decompressing the data set. The client agent 375, at step 440, receives the original compressed data set and decompresses, at step 444, the original compressed data set to generate a decompressed data set. The client agent 375 uses either external codecs installed on the client 305 or codecs embedded in the client agent 375 itself to decompress the original compressed data set.
[0104] If, instead, at step 424, the application program 340 uses codecs embedded in the application program 340 itself, the second output filter module 355B is, at step 448, invoked as the application program 340 attempts to invoke an OS-level renderer to display the decompressed data set. The second output filter module 355B then intercepts, at step 452, a first decompressed data set, representing at least a portion of the media stream, from the output of the application program 340 and compresses, at step 456, the intercepted first decompressed data set. A variety of compression techniques, including both lossy compression techniques and lossless compression techniques, may be used by the second output filter module 355B, at step 456, to compress the media stream.
[0105] Where the media stream is a video stream, the intercepted first decompressed data set may be compressed, at step 456, by the second output filter module 355B using, for example, a lightweight lossy video encoding algorithm, such as, for example, MJPEG compression. In using the lightweight lossy video encoding algorithm, the second output filter module 355B may choose the desired compression ratio or it may use a predetermined compression ratio. The degree of quality loss chosen by the second output filter module 355B will, typically, depend on the available bandwidth of the network connection. For example, where a user of the client 305 uses a slow modem to connect to the network 315, the second output filter module 355B may choose to use low quality video. Where, on the other hand, a user of the client 305 uses a LAN link or a broadband connection to connect to the network 315, the second output filter module 355B may choose to use a higher quality video.
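The bandwidth-dependent quality choice of paragraph [0105] can be illustrated with a minimal sketch. The bandwidth thresholds and MJPEG quality levels here are assumptions for the example; the specification does not prescribe particular values.

```python
# Hypothetical mapping from estimated link speed to an MJPEG quality level
# (1-100), reflecting the modem / broadband distinction described above.
# Threshold and quality values are illustrative assumptions.

def choose_mjpeg_quality(bandwidth_kbps):
    """Pick a compression quality level from the estimated link speed."""
    if bandwidth_kbps < 56:      # slow modem: accept heavy quality loss
        return 30
    if bandwidth_kbps < 1000:    # mid-speed link: moderate quality
        return 60
    return 85                    # LAN or broadband: near-full quality
```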
[0106] Following compression of the intercepted first decompressed data set at step 456, the second output filter module 355B transmits, at step 460, the compressed data set to the client agent 375 in place of the first decompressed data set. The client agent 375, at step 464, receives the compressed data set and decompresses, at step 468, the compressed data set to generate a second decompressed data set. Again, the client agent 375 uses either external codecs installed on the client 305 or codecs embedded in the client agent 375 itself to decompress the compressed data set.
[0107] Regardless of whether the application program 340 uses external or embedded codecs, where the media stream, at step 470, includes a video stream, the second output filter module 355B, at step 472, captures information for locating images of the video stream on a display screen and transmits the captured information over the network 315 to the client agent 375. The client agent 375, at step 474, receives the information for locating the images of the video stream on the display screen.
[0108] Regardless, again, of whether the application program 340 uses external or embedded codecs and regardless, moreover, of whether the media stream, at step 470, includes a video stream, the server agent 350, at step 476, intercepts and transmits, over the network 315, graphical display commands, associated with the non-media graphical information outputted by the application program 340, to the client agent 375. The client agent 375, at step 480, receives the graphical display commands associated with the non-media graphical information.
[0109] Where, at step 484, the output filter module 355A, 355B captures timing information associated with the media stream, the output filter module 355A, 355B transmits, at step 488, the timing information to the client agent 375. The client agent 375 receives, at step 492, the timing information and generates, at step 496, the media presentation at the presentation interface 345. To generate the media presentation at the presentation interface 345, the client agent 375 uses the timing information, the graphical display commands associated with the non-media graphical information, and, where the media stream includes a video stream, the information for locating the images of the video stream on a display screen to seamlessly combine the decompressed data set (or, more specifically, where the application program 340 uses embedded codecs, the second decompressed data set) with the non-media graphical information.
[0110] Where the output filter module 355A, 355B does not capture timing information associated with the media stream, the client agent 375 generates, at step 496, the media presentation at the presentation interface 345 using only the decompressed data set (or, more specifically, where the application program 340 uses embedded codecs, the second decompressed data set), the graphical display commands associated with the non-media graphical information, and, where the media stream includes a video stream, the information for locating the images of the video stream on a display screen.
[0111] FIGS. 5A-5C present another embodiment of the present invention implemented on a server using an operating system selected from the MICROSOFT WINDOWS family of operating systems. The operating system at the client need not be the same operating system as the operating system employed by the server.
[0112] For each major media type supported (e.g., video, audio, MIDI, text), the appropriate first output filter modules 355A and second output filter modules 355B are registered with the server 310 (Step 504). As discussed above, the identified media types are exemplary and, utilizing this architecture, future media types may be added for operation in accord with the present invention.
[0113] An application program 340 executes at the server (Step 508) and a media stream is identified within the output of the application program 340 (Step 512). In one embodiment, the media stream is identified when the application program 340 makes a system call to the WINDOWS media subsystem to locate suitable codecs for handling the media stream. The application program 340 loads the first output filter module 355A corresponding with the major media type of the media stream identified within the application output and having the highest merit of any filter registered with the system as capable of handling this major media type (Step 516). The first output filter module 355A communicates with the server agent 350 to request that the client agent 375 create a media stream as specified by transmitted properties: allocator (buffer) properties, media type (major: video, audio, etc.; minor: MPEG-1, MPEG-2, etc.; format, etc.), media stream priority for on-demand quality control, etc.
[0114] For each terminal services session, the server agent 350 organizes media streams in different contexts. Media streams originating from different instances of an application program 340 have different major contexts. Media streams originating from the same instance of an application program 340 have different minor contexts within the same major context. The server agent 350 creates a command queue for control information for each major context, and a media samples queue for each minor context. The media samples queues help the server agent 350 accommodate variations in bandwidth and network latency by, for example, allowing certain frames of video data to be dropped to maintain playback speed and to select which data from various streams to drop. For example, video data may be discarded before audio data is discarded.
[0115] The queues of different major contexts are serviced in a round-robin fashion. Within a major context, the control queue typically has the highest priority, while the media queues corresponding to different minor contexts might have different priorities based on the nature of the stream. Generally, audio streams will have higher priority than video streams.
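The per-context queueing scheme of paragraphs [0114] and [0115] can be sketched as follows: one command queue per major context, one media samples queue per minor context, with the command queue drained first and media queues serviced in priority order. The class name, priority values, and the audio-over-video ordering are illustrative assumptions drawn from the text's example.

```python
from collections import deque

# Minimal sketch of one major context: a never-dropped command queue plus
# prioritized media queues keyed by minor context. Servicing drains the
# command queue before any media queue, and higher-priority media queues
# (e.g., audio) before lower-priority ones (e.g., video).

class MajorContext:
    def __init__(self):
        self.command_queue = deque()   # control information: never dropped
        self.media_queues = {}         # minor context id -> (priority, deque)

    def add_media_queue(self, minor_id, priority):
        self.media_queues[minor_id] = (priority, deque())

    def next_item(self):
        """Return the next item to transmit, or None when all queues are empty."""
        if self.command_queue:
            return self.command_queue.popleft()
        # Service media queues in descending priority order.
        for _, (priority, q) in sorted(self.media_queues.items(),
                                       key=lambda kv: -kv[1][0]):
            if q:
                return q.popleft()
        return None
```

Multiple `MajorContext` instances would then be serviced round-robin, as the specification describes for queues of different major contexts.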
[0116] The client agent 375 attempts to create a media stream conforming to the media properties specified in the transmitted request (Step 520). When the operating system for the client agent 375 is a member of the MICROSOFT WINDOWS family of operating systems, the client agent 375 attempts to load a generic source filter module and connect it with native or third party provided output filter modules that are capable of decompressing and rendering the media stream. Client agents 375 utilizing different operating systems may undertake different actions to achieve the same result. For example, the client might load and use the services of a multimedia application capable of decompressing and rendering a media stream, such as an MPEG-1 video stream.
[0117] The client agent 375 sends a reply to the server agent 350, indicating whether the request to create the specified media stream was successful (Step 524). The server agent 350 may, in turn, provide this information to the application program 340. If the client succeeds in creating the stream, then using DIRECTSHOW intelligent connect logic, the application program 340 loads the second output filter module 355B and connects it to the first output filter module 355A (Step 528), thereby enabling the processing of compressed media and control information transmitted to the client agent 375.
[0118] The first output filter module 355A intercepts the original compressed data set representing at least a portion of the media stream and timing information, and further detects dynamic changes in the media type, such as format changes (Step 532). The second output filter module 355B intercepts media stream-generic commands (e.g., play, pause, stop, flush, end-of-stream) and stream-specific commands (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment commands) (Step 536).
[0119] The server agent 350 transmits the original compressed data set, optionally including timing information, to the client agent 375 (Step 540). When bandwidth is limited, the server agent 350 may drop samples from lower priority media queues to make bandwidth available for higher priority media queues. Media queues having equal priority are serviced in round-robin fashion. Additionally, the transmission rate of the second output filter module 355B to the server agent 350 and, therefore, to the client agent 375, is controlled so that the client's media queue is prefilled as quickly as allowed by the network throughput before playback commences, reducing latency in the initial playback. Thereafter, the media queue of the client agent 375 is used to accommodate variations in network latency, as discussed below. The server agent 350 also transmits control information to the client agent 375 (Step 544). Typically, control information is never discarded despite constraints on the availability of network bandwidth.
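The prefill behavior of paragraph [0119] reduces startup latency by distinguishing two transmission regimes. A hedged sketch follows; the function name, mode labels, and the idea of a fixed prefill target expressed in milliseconds of buffered media are assumptions for illustration.

```python
# Illustrative sketch: before playback starts, the server feeds the
# client's media queue as fast as the network allows; once the prefill
# target is reached (or playback has begun), it paces transmission at the
# stream's natural rate. Names and units are hypothetical.

def transmission_mode(queued_ms, prefill_target_ms, playback_started):
    """Decide how fast to feed the client's media queue."""
    if not playback_started and queued_ms < prefill_target_ms:
        return "max-throughput"   # burst-fill the queue before playback
    return "stream-rate"          # steady-state pacing thereafter
```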
[0120] The client agent 375 receives the original compressed data set and optional timing information (Step 548). The client agent 375 also receives the control information and any notifications of dynamic changes in the media type, e.g., format changes (Step 552). Like the server agent 350, the client agent 375 creates a single command queue for control information for each major context and a media samples queue for each minor context. The media samples queues let the client agent 375 accommodate variations in bandwidth and network latency. The control queues of the different major contexts are serviced in a round-robin fashion. In this embodiment, using DIRECTSHOW technologies, the media samples in the queues are utilized by the associated source filter module(s) based on the timing information.
[0121] The client agent sends status notifications to the server agent (Step 556). These notifications include indications of the client's success or failure in creating a requested media stream. Variations in bandwidth or network latency may be addressed by adjusting the media queue of the client agent 375 as necessary. Should the client agent 375 detect that one or more of its media queues have fallen below or have exceeded certain resource thresholds, then the client agent 375 may respectively send a burst or a stop request to the server agent 350 for the desired amount of media queue adjustment. The client agent 375 may also send status information concerning unexpected errors encountered in the generation of the media presentation. If the client's flow-adjustment notifications do not have a timely effect and, consequently, a media queue is depleted or overflows, then the client agent 375 may pause to re-buffer the queue or drop media samples, as required (Step 560).
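The threshold-driven burst/stop requests of paragraph [0121] amount to watermark-based flow control. A minimal sketch follows; the specific watermark fractions are assumptions, as the specification only speaks of "certain resource thresholds".

```python
# Hypothetical watermark flow control for one client media queue: below
# the low watermark the client asks the server to burst more data; above
# the high watermark it asks the server to stop; otherwise no request.
# Watermark values are illustrative assumptions.

LOW_WATERMARK = 0.25    # fraction of queue capacity
HIGH_WATERMARK = 0.90

def flow_request(depth, capacity, low=LOW_WATERMARK, high=HIGH_WATERMARK):
    """Return the flow-adjustment request to send to the server, or None."""
    fill = depth / capacity
    if fill < low:
        return "burst"   # queue nearly depleted: request more data
    if fill > high:
        return "stop"    # queue nearly full: request a pause
    return None
```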
[0122] A native or third party supplied output filter module (transform filter) decompresses the received compressed data set to generate a decompressed data set (Step 564). The client agent 375 applies control information, timing information, and dynamic changes in the media type to the collection of filter modules (e.g., an instance of the generic source filter module, the connected instances of native or third party output filter modules, etc.) (Step 568).
[0123] If the client fails to create the requested media stream, e.g., because it lacks appropriate filter modules, processing resources, or memory, or for some other reason, then using DIRECTSHOW intelligent connect logic, the application program 340 running on the server 310 unloads the first output filter module 355A and subsequently loads the native or third party-supplied output filter modules, which decompress and render the data stream at the server 310 (Step 570).
[0124] With the native or third party supplied output filter modules loaded and operating, a decompressed data set representing at least a portion of the media stream is intercepted (Step 574), compressed (Step 578), and transmitted to the client agent 375 (Step 582), where it is received by the client agent 375 (Step 586).
[0125] Regardless of whether the client succeeds or fails in the creation of a media stream, the server agent 350 intercepts and transmits graphical display commands, associated with non-media graphical information, to the client agent 375 (Step 590). The client agent 375 receives the graphical display commands (Step 594). The client agent 375 generates the media presentation using the native or third party provided output filter module (renderer filter) to render the decompressed data set onto the presentation interface 345 (Step 598).
[0126] The present invention may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture. The article of manufacture may be a floppy disk, a hard disk, a CD ROM, a flash memory card, a PROM, a RAM, a ROM, or a magnetic tape. In general, the computer-readable programs may be implemented in any programming language. Some examples of languages that can be used include C, C++, or JAVA. The software programs may be stored on or in one or more articles of manufacture as object code.
Equivalents
[0127] The invention can be embodied in other specific forms without departing from the spirit or essential characteristics thereof. For example, it should be understood that a single server may implement both the invention of FIGS. 1 & 2 and the invention of FIGS. 3, 4A-4C, and 5A-5C. The foregoing embodiments are therefore to be considered in all respects illustrative rather than limiting on the invention described herein. Scope of the invention is thus indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims

What is claimed is:
1. A method for generating a media presentation at a client, the method comprising: (a) transmitting output from an application program executing on a server to the client; (b) identifying a media stream within the application output; (c) intercepting an original compressed data set representing at least a portion of the media stream before processing by the application program; (d) transmitting the original compressed data set to the client; (e) capturing at least one of timing information and control information associated with the media stream; and (f) transmitting the captured information to the client.
2. The method of claim 1 further comprising: receiving the original compressed data set and the captured information at the client; decompressing the original compressed data set at the client to generate a decompressed data set; and generating the media presentation at the client using the decompressed data set and the captured information.
3. The method of claim 1 further comprising transmitting the original compressed data set to a second client substantially simultaneously with the transmission of (d).
4. The method of claim 1 wherein the transmission to the client occurs through at least one intermediary node.
5. A method for generating a media presentation at a client, the method comprising: (a) transmitting output from an application program executing on a server to the client; (b) identifying a media stream within the application output; (c) intercepting a first decompressed data set representing at least a portion of the media stream; (d) compressing the intercepted first decompressed data set; and (e) transmitting the compressed data set to the client in place of the first decompressed data set; (f) capturing at least one of timing information and control information associated with the media stream; and (g) transmitting the captured information to the client.
6. The method of claim 5 further comprising: receiving the compressed data set and the captured information at the client; decompressing the compressed data set at the client to generate a second decompressed data set; and generating the media presentation at the client using the second decompressed data set and the captured information.
7. The method of claim 5 further comprising transmitting the compressed data set to a second client in place of the first decompressed data set substantially simultaneously with the transmission of (e).
8. The method of claim 5 wherein the transmission to the client occurs through at least one intermediary node.
9. A method for generating a media presentation at a client, the method comprising: (a) informing a server of at least one media format supported by a client agent installed on the client; (b) receiving a compressed data set representing at least a portion of a media stream at the client; (c) decompressing the compressed data set at the client to generate a decompressed data set; (d) generating the media presentation at the client using the decompressed data set; (e) receiving at least one of timing information and control information associated with the media stream; and (f) using the received information to generate the media presentation.
10. A system for generating a media presentation at a client, the system comprising: an application program configured to identify a media stream within output produced by the application program; and an output filter module configured to intercept an original compressed data set representing at least a portion of the media stream before processing by the application program and transmit the original compressed data set to the client wherein the output filter module is further configured to capture at least one of timing information and control information associated with the media stream and to transmit the captured information to the client.
11. The system of claim 10 further comprising a client agent configured to receive the original compressed data set and the captured information, decompress the original compressed data set to generate a decompressed data set, and generate the media presentation using the decompressed data set and the captured information.
12. The system of claim 10 wherein the output filter module substantially simultaneously transmits the original compressed data to a plurality of clients.
13. The system of claim 10 wherein the output filter module transmits the original compressed data to the client through at least one intermediary node.
14. The system of claim 10 wherein the filter merit of the output filter module exceeds the filter merit of the other filter modules on the system.
15. A system for generating a media presentation at a client, the system comprising: an application program configured to identify a media stream within output produced by the application program; and an output filter module configured to intercept a first decompressed data set representing at least a portion of the media stream, compress the intercepted first decompressed data set of the media stream, and transmit the compressed data set in place of the first decompressed data set to the client, wherein the output filter module is further configured to capture at least one of timing information and control information associated with the media stream and to transmit the captured information to the client.
16. The system of claim 15 further comprising a client agent configured to receive the compressed data set and the captured information, decompress the compressed data set to generate a second decompressed data set, and generate the media presentation using the second decompressed data set and the captured information.
17. The system of claim 15 wherein the output filter module substantially simultaneously transmits the compressed data set in place of the first decompressed data set to a plurality of clients.
18. The system of claim 15 wherein the output filter module transmits the compressed data set in place of the first decompressed data set to the client through at least one intermediary node.
19. The system of claim 15 wherein the filter merit of the output filter module exceeds the filter merit of the other filter modules on the system.
20. A system for generating a media presentation at a client, the system comprising: a server; and the client in communication with the server, the client comprising a client agent configured to inform the server of at least one media format supported by the client agent, receive a compressed data set representing at least a portion of a media stream, decompress the compressed data set at the client to generate a decompressed data set, and generate the media presentation using the decompressed data set, wherein the client agent is further configured to receive at least one of timing information and control information associated with the media stream and to use the received information to generate the media presentation.
21. A method for the display of graphical output at a client from an application program executing on a server, the method comprising: identifying a non-textual element within the application output; retrieving a decompressed data set associated with the non-textual element; identifying a compression technique for use with the decompressed data set in response to the contents of the decompressed data set; compressing the decompressed data set with the identified compression technique to form a recompressed data set; and transmitting the recompressed data set to the client in place of the decompressed data set.
22. The method of claim 21 wherein the identified compression technique is a lossless compression technique.
23. The method of claim 22 wherein the non-textual element is an image having large areas of the same color.
24. The method of claim 22 wherein the non-textual element is a computer-generated image.
25. The method of claim 22 wherein the lossless compression technique is 2DRLE compression.
26. The method of claim 21 wherein the identified compression technique is a lossy compression technique.
27. The method of claim 26 wherein the non-textual element is a continuous tone image.
28. The method of claim 26 wherein the non-textual element is a photographic image.
29. The method of claim 26 wherein the lossy compression technique is JPEG compression.
30. The method of claim 21 wherein identifying a compression technique comprises: compressing the decompressed data set using a lossless compression technique to form a first test data set; comparing the size of the first test data set to a predetermined value; compressing the decompressed data set using a lossy compression technique to form a second test data set when the size of the first test data set exceeds the predetermined value; and selecting a lossy compression technique when the size of the second test data set is less than a predetermined percentage of the size of the first test data set.
31. The method of claim 21 wherein identifying a compression technique comprises applying image processing algorithms to the decompressed data set to determine if the non-textual element is photographic and selecting a lossy compression technique when the non-textual element is photographic.
32. The method of claim 21 wherein identifying a compression technique comprises applying image processing algorithms to the decompressed data set to determine if the non-textual element is continuous tone and selecting a lossy compression technique when the non-textual element is continuous tone.
33. The method of claim 21 wherein identifying a compression technique comprises enumerating the number of pixel colors in the decompressed data set and selecting a lossy compression technique when the number of enumerated pixel colors exceeds a predetermined value.
34. A system for the display of graphical output at a client from an application program executing on a server, the system comprising: an application program; and an output filter module configured to identify a non-textual element within output produced by the application program, retrieve a decompressed data set associated with the non-textual element, identify a compression technique for use with the decompressed data set in response to the contents of the decompressed data set, compress the decompressed data with the identified compression technique to form a recompressed data set, and transmit the recompressed data set to the client in place of the decompressed data set.
35. The system of claim 34 wherein the identified compression technique is a lossless compression technique.
36. The system of claim 35 wherein the non-textual element is an image having large areas of the same color.
37. The system of claim 35 wherein the non-textual element is a computer-generated image.
38. The system of claim 35 wherein the lossless compression technique is 2DRLE compression.
39. The system of claim 34 wherein the identified compression technique is a lossy compression technique.
40. The system of claim 39 wherein the non-textual element is a continuous tone image.
41. The system of claim 39 wherein the non-textual element is a photographic image.
42. The system of claim 39 wherein the lossy compression technique is JPEG compression.
43. The system of claim 34 wherein the output filter module identifies a compression technique by compressing the decompressed data set using a lossless compression technique to form a first test data set, comparing the size of the first test data set to a predetermined value, compressing the decompressed data set using a lossy compression technique to form a second test data set when the size of the first test data set exceeds the predetermined value, and selecting a lossy compression technique when the size of the second test data set is less than a predetermined percentage of the size of the first test data set.
44. The system of claim 34 wherein the output filter module identifies a compression technique by applying image processing algorithms to the decompressed data set to determine if the non-textual element is photographic and selecting a lossy compression technique when the non-textual element is photographic.
45. The system of claim 34 wherein the output filter module identifies a compression technique by applying image processing algorithms to the decompressed data set to determine if the non-textual element is continuous tone and selecting a lossy compression technique when the non-textual element is continuous tone.
46. The system of claim 34 wherein the output filter module identifies a compression technique by enumerating the number of pixel colors in the decompressed data set and selecting a lossy compression technique when the number of enumerated pixel colors exceeds a predetermined value.
47. A method for generating a media presentation at a client, the method comprising: (a) transmitting output from an application program executing on a server to the client; (b) identifying a plurality of media streams within the application output; (c) associating each media stream with a server media queue, each server media queue having an assigned priority; (d) assigning a priority to a server command queue; (e) transmitting the contents of the server queues to the client; and (f) when network bandwidth between the server and the client is insufficient to carry the transmissions of all of the server queues, dropping samples from server queues having lower assigned priorities and transmitting samples from server queues having higher assigned priorities.
48. The method of claim 47 wherein the priorities of the media queues are based on the media type stored in each queue.
49. The method of claim 47 further comprising: (g) receiving the transmissions at a client having a plurality of client media queues, each client media queue associated with a server media queue; and (h) when a client media queue is full, discarding transmissions associated with the full client media queue.
50. The method of claim 47 wherein the command queue has the highest priority.
51. A method for generating a media presentation at a client, the method comprising: (a) transmitting output from an application program executing on a server to the client; (b) identifying a media stream within the application output; and (c) transmitting data representative of the media stream to the client, wherein a client media queue is filled by the transmissions as quickly as permitted by network throughput before client playback begins.
52. The method of claim 51 further comprising receiving, at the server, flow control commands from the client issued depending on the data level in a corresponding media queue at the client.
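Claims 51 and 52 describe filling a client media queue as quickly as network throughput permits before playback begins, with the client issuing flow-control commands based on the queue's data level. A sketch of the client side, where the watermark values and command names are illustrative assumptions:

```python
class ClientMediaQueue:
    """Client-side buffering with level-based flow control.

    The queue is filled as fast as the network allows; playback starts
    only once the data level reaches the high watermark (the prebuffer
    of claim 51). Flow-control commands are returned to be sent back to
    the server when the level crosses a watermark (claim 52).
    """

    def __init__(self, capacity=100, low=20, high=80):
        self.buffer = []
        self.capacity, self.low, self.high = capacity, low, high
        self.playing = False

    def receive(self, sample):
        if len(self.buffer) >= self.capacity:
            return "PAUSE"       # queue full: ask the server to stop
        self.buffer.append(sample)
        if not self.playing and len(self.buffer) >= self.high:
            self.playing = True  # prebuffer filled: begin playback
        return "OK"

    def consume(self):
        sample = self.buffer.pop(0) if self.buffer else None
        if self.playing and len(self.buffer) <= self.low:
            return sample, "RESUME"  # level low: ask for more data
        return sample, None
```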
53. A method for the display of graphical output at a client from an application program executing on a server, the method comprising: identifying a non-textual element within the application output; retrieving a decompressed data set associated with the non-textual element; apportioning the decompressed data set into at least one strip; and transmitting the at least one strip to the client.
54. The method of claim 53 wherein the at least one strip is present in a second decompressed data set representing another non-textual element.
55. The method of claim 53 further comprising storing the at least one strip in a database at the server.
56. The method of claim 55 further comprising storing the height, width, and cyclic redundancy check (CRC) information of the at least one strip in the server database.
57. The method of claim 55 further comprising storing the at least one strip in a database at the client.
58. The method of claim 57 further comprising storing the height, width, and cyclic redundancy check (CRC) information of the at least one strip in the client database.
59. The method of claim 57 further comprising: intercepting a second decompressed data set representing another non-textual element; apportioning the second decompressed data set into a second strip; identifying the second strip as previously stored in the server database; retrieving an identifier associated with the previously stored second strip; and transmitting the identifier to the client instead of transmitting the second strip to the client.
60. The method of claim 59 further comprising: receiving the identifier at the client; identifying the strip stored in the client database associated with the received identifier; and displaying the stored strip associated with the received identifier at the client.
61. The method of claim 57 further comprising retaining the contents of the client database after the disconnection of the client from the server.
62. The method of claim 61 further comprising: reconnecting the client to the server; and transmitting a list of the contents of the client database to the server.
63. The method of claim 62 wherein the list of contents includes the height, width, and CRC information for the strips stored in the client database.
64. The method of claim 57 further comprising transmitting an instruction from the server to the client to delete a strip stored in the client database from the client database.
65. The method of claim 64 wherein the strip selected for removal is the strip having the least probability of being used in the future.
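Claims 53 through 65 describe apportioning a decompressed data set into strips, keying each strip by its height, width, and CRC in server and client databases, and transmitting only an identifier when a matching strip was previously sent. A sketch of the server-side lookup, where the dictionary database, identifier assignment, and `send` callback are illustrative assumptions:

```python
import zlib


def strip_key(strip):
    """Key a strip by (height, width, CRC), as in claims 56 and 58."""
    return (strip["height"], strip["width"], zlib.crc32(strip["pixels"]))


def transmit_strip(strip, server_db, send):
    """Send a strip in full once, then send only its identifier.

    If a strip with the same height/width/CRC is already in the server
    database, only the stored identifier is transmitted (claim 59);
    otherwise the strip is stored under a new identifier and sent in
    full. `send` is a hypothetical transport callback.
    """
    key = strip_key(strip)
    if key in server_db:
        send(("id", server_db[key]))
    else:
        server_db[key] = len(server_db)  # assign a new identifier
        send(("strip", strip))
```

The client would maintain the mirror-image database (claims 57-60), looking up received identifiers and displaying the cached strip; retaining that database across disconnects (claims 61-63) lets a reconnecting client advertise its cached height/width/CRC entries to the server.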
66. A method for transmitting a plurality of media streams to a client, the method comprising the steps of: identifying a first media stream and a second media stream within an application program output; associating the first media stream with a first queue having an assigned priority and the second media stream with a second queue having an assigned priority lower than that of the first queue; intercepting a first compressed data set representing at least a portion of the first media stream, the first compressed data set comprising a first plurality of packets; intercepting a second compressed data set representing at least a portion of the second media stream, the second compressed data set comprising a second plurality of packets; storing the first plurality of packets in the first queue associated with the first media stream; storing the second plurality of packets in the second queue associated with the second media stream; transmitting the first plurality of packets from the first queue; transmitting at least some of the second plurality of packets from the second queue; and dropping others of the second plurality of packets from the second queue.
67. The method of claim 66 further comprising a step of executing the application program.
68. The method of claim 66 further comprising a step of creating a control queue storing a control command.
69. The method of claim 68, wherein the control command comprises one of stream- generic and stream-specific commands.
70. The method of claim 68, wherein the control queue is assigned the highest priority of all the queues.
71. The method of claim 66, wherein the priorities of the media queues are based on corresponding media types stored in each queue.
72. The method of claim 71 further comprising a step of detecting dynamic changes in the media type.
73. The method of claim 71, further comprising a step of invoking a first output filter module.
74. The method of claim 73 further comprising a step of invoking a second output filter in communication with the first output module.
75. The method of claim 74, wherein the first output module intercepts compressed data sets and the second output module intercepts control information.
76. The method of claim 73, wherein the first output filter module requests the client to create a media stream.
77. The method of claim 76 further comprising the steps of: (a) receiving a failure notice of creating the media stream from the client; (b) unloading the first output filter module; and (c) loading one of a native and a third party output filter module.
78. The method of claim 66 further comprising a step of intercepting optional timing information.
PCT/US2004/029993 2003-09-12 2004-09-13 Method and apparatus for generating graphical and media displays at a thin client WO2005029864A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
AU2004305808A AU2004305808A1 (en) 2003-09-12 2004-09-13 Method and apparatus for generating graphical and media displays at a thin client
JP2006526396A JP2007505580A (en) 2003-09-12 2004-09-13 Method and apparatus for generating graphical and media displays in a thin client
EP04784000A EP1665798A1 (en) 2003-09-12 2004-09-13 Method and apparatus for generating graphical and media displays at a thin client
CA002538340A CA2538340A1 (en) 2003-09-12 2004-09-13 Method and apparatus for generating graphical and media displays at a thin client
IL174245A IL174245A0 (en) 2003-09-12 2006-03-12 Method and apparatus for generating graphical and media displays at a thin client

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US50257803P 2003-09-12 2003-09-12
US60/502,578 2003-09-12
US51046103P 2003-10-10 2003-10-10
US60/510,461 2003-10-10

Publications (2)

Publication Number Publication Date
WO2005029864A1 true WO2005029864A1 (en) 2005-03-31
WO2005029864A8 WO2005029864A8 (en) 2006-11-09

Family

ID=34381050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/029993 WO2005029864A1 (en) 2003-09-12 2004-09-13 Method and apparatus for generating graphical and media displays at a thin client

Country Status (6)

Country Link
EP (1) EP1665798A1 (en)
JP (1) JP2007505580A (en)
KR (1) KR20060110267A (en)
AU (1) AU2004305808A1 (en)
CA (1) CA2538340A1 (en)
WO (1) WO2005029864A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009006564A2 (en) * 2007-07-05 2009-01-08 Mediaport Entertainment, Inc. Systems and methods for monitoring devices, systems, users, and users activity at remote locations
WO2009146938A2 (en) * 2008-06-06 2009-12-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable user interface having access to a host computer
EP2403251A1 (en) 2010-07-01 2012-01-04 Fujitsu Limited Transmission of image updates from a server to a thin client
US8411972B2 (en) 2010-12-03 2013-04-02 Fujitsu Limited Information processing device, method, and program
GB2514777A (en) * 2013-06-03 2014-12-10 Displaylink Uk Ltd Management of memory for storing display data
US8953676B2 (en) 2010-07-01 2015-02-10 Fujitsu Limited Information processing apparatus, computer-readable storage medium storing image transmission program, and computer-readable non transitory storage medium storing image display program
US8982135B2 (en) 2011-01-31 2015-03-17 Fujitsu Limited Information processing apparatus and image display method
EP3160156A1 (en) * 2015-10-21 2017-04-26 Nagravision S.A. System, device and method to enhance audio-video content using application images

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9198084B2 (en) 2006-05-26 2015-11-24 Qualcomm Incorporated Wireless architecture for a traditional wire-based protocol
JP2009054097A (en) * 2007-08-29 2009-03-12 Casio Comput Co Ltd Drawing data processing device and drawing data processing program
KR101026759B1 (en) * 2008-08-26 2011-04-08 최백준 Apparatus amd method for providing a distributed processing of moving picture in server based computing system of terminal environment
US9398089B2 (en) 2008-12-11 2016-07-19 Qualcomm Incorporated Dynamic resource sharing among multiple wireless devices
JP5476734B2 (en) * 2009-02-19 2014-04-23 日本電気株式会社 Server, remote operation system, transmission method selection method, program, and recording medium
US9264248B2 (en) 2009-07-02 2016-02-16 Qualcomm Incorporated System and method for avoiding and resolving conflicts in a wireless mobile display digital interface multicast environment
US9582238B2 (en) 2009-12-14 2017-02-28 Qualcomm Incorporated Decomposed multi-stream (DMS) techniques for video display systems
JP5471794B2 (en) 2010-05-10 2014-04-16 富士通株式会社 Information processing apparatus, image transmission program, and image display method
KR101312268B1 (en) 2010-12-24 2013-09-25 주식회사 케이티 Method, cloud computing server, and cloud computing system for providing game service in cloud computing environment
US9413803B2 (en) 2011-01-21 2016-08-09 Qualcomm Incorporated User input back channel for wireless displays
US9582239B2 (en) 2011-01-21 2017-02-28 Qualcomm Incorporated User input back channel for wireless displays
US9787725B2 (en) 2011-01-21 2017-10-10 Qualcomm Incorporated User input back channel for wireless displays
US8964783B2 (en) 2011-01-21 2015-02-24 Qualcomm Incorporated User input back channel for wireless displays
US10135900B2 (en) 2011-01-21 2018-11-20 Qualcomm Incorporated User input back channel for wireless displays
US9065876B2 (en) 2011-01-21 2015-06-23 Qualcomm Incorporated User input back channel from a wireless sink device to a wireless source device for multi-touch gesture wireless displays
US10108386B2 (en) 2011-02-04 2018-10-23 Qualcomm Incorporated Content provisioning for wireless back channel
US9503771B2 (en) 2011-02-04 2016-11-22 Qualcomm Incorporated Low latency wireless display for graphics
TW201251429A (en) * 2011-06-08 2012-12-16 Hon Hai Prec Ind Co Ltd System and method for sending streaming of desktop sharing
US9525998B2 (en) 2012-01-06 2016-12-20 Qualcomm Incorporated Wireless display with multiscreen service
TW201419868A (en) * 2012-09-11 2014-05-16 Nec Corp Communication system and method, and server device and terminal
WO2020061797A1 (en) * 2018-09-26 2020-04-02 华为技术有限公司 Method and apparatus for compressing and decompressing 3d graphic data

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000033217A1 (en) * 1998-11-30 2000-06-08 Siebel Systems, Inc. Client server system with thin client architecture
WO2001075610A1 (en) * 2000-03-31 2001-10-11 Siebel Systems, Inc. Thin client method and system for generating page delivery language output from applets, views, and screen definitions
WO2001092973A2 (en) * 2000-05-26 2001-12-06 Citrix Systems, Inc. Method and system for efficiently reducing graphical display data for transmission over a low bandwidth transport protocol mechanism
US20030014476A1 (en) * 2001-01-03 2003-01-16 Peterson David Allen Thin client computer operating system
US20030055889A1 (en) * 2001-08-27 2003-03-20 Meng-Cheng Chen Cache method
US20030065715A1 (en) * 2001-08-20 2003-04-03 Burdick William R. System and method of a wireless thin-client, server-centric framework
EP1320240A2 (en) * 2000-05-26 2003-06-18 Citrix Systems, Inc. Method and system for efficiently reducing graphical display data for transmission over a low bandwidth transport protocol mechanism
US20040103438A1 (en) * 2002-11-27 2004-05-27 Yong Yan Methods and systems for transferring events including multimedia data

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4120711B2 (en) * 1996-11-15 2008-07-16 株式会社日立製作所 Video display system
JPH11341027A (en) * 1998-05-26 1999-12-10 Canon Inc Method and device for bus management
JP4600875B2 (en) * 2000-08-28 2010-12-22 ソニー株式会社 Multimedia information processing apparatus and method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHEN J ET AL: "Multimedia over ip for thin clients : building a collaborative resource-sharing prototype", MULTIMEDIA AND EXPO, 2001. ICME 2001. IEEE INTERNATIONAL CONFERENCE ON 22-25 AUG. 2001, PISCATAWAY, NJ, USA,IEEE, 22 August 2001 (2001-08-22), pages 431 - 434, XP010661867, ISBN: 0-7695-1198-8 *
CHIA-CHEN KUO ET AL: "Design and implementation of a network application architecture for thin clients", PROCEEDINGS OF THE 26TH. ANNUAL INTERNATIONAL COMPUTER SOFTWARE AND APPLICATIONS CONFERENCE. COMPSAC 2002. OXFORD, ENGLAND, AUG. 26 - 29, 2002, ANNUAL INTERNATIONAL COMPUTER SOFTWARE AND APPLICATIONS CONFERENCE, LOS ALAMITOS, CA : IEEE COMP. SOC, US, vol. CONF. 26, 26 August 2002 (2002-08-26), pages 193 - 198, XP010611116, ISBN: 0-7695-1727-7 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009006564A2 (en) * 2007-07-05 2009-01-08 Mediaport Entertainment, Inc. Systems and methods for monitoring devices, systems, users, and users activity at remote locations
WO2009006564A3 (en) * 2007-07-05 2009-02-26 Mediaport Entertainment Inc Systems and methods for monitoring devices, systems, users, and users activity at remote locations
WO2009146938A2 (en) * 2008-06-06 2009-12-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable user interface having access to a host computer
WO2009146938A3 (en) * 2008-06-06 2010-05-14 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable user interface having access to a host computer
EP2403251A1 (en) 2010-07-01 2012-01-04 Fujitsu Limited Transmission of image updates from a server to a thin client
US8819270B2 (en) 2010-07-01 2014-08-26 Fujitsu Limited Information processing apparatus, computer-readable non transitory storage medium storing image transmission program, and computer-readable storage medium storing image display program
US8953676B2 (en) 2010-07-01 2015-02-10 Fujitsu Limited Information processing apparatus, computer-readable storage medium storing image transmission program, and computer-readable non transitory storage medium storing image display program
US8411972B2 (en) 2010-12-03 2013-04-02 Fujitsu Limited Information processing device, method, and program
US8982135B2 (en) 2011-01-31 2015-03-17 Fujitsu Limited Information processing apparatus and image display method
GB2514777A (en) * 2013-06-03 2014-12-10 Displaylink Uk Ltd Management of memory for storing display data
GB2514777B (en) * 2013-06-03 2018-12-19 Displaylink Uk Ltd Management of memory for storing display data
EP3160156A1 (en) * 2015-10-21 2017-04-26 Nagravision S.A. System, device and method to enhance audio-video content using application images

Also Published As

Publication number Publication date
EP1665798A1 (en) 2006-06-07
WO2005029864A8 (en) 2006-11-09
KR20060110267A (en) 2006-10-24
JP2007505580A (en) 2007-03-08
AU2004305808A1 (en) 2005-03-31
CA2538340A1 (en) 2005-03-31

Similar Documents

Publication Publication Date Title
EP1665798A1 (en) Method and apparatus for generating graphical and media displays at a thin client
AU2009251123B2 (en) Methods and apparatus for generating graphical and media displays at a client
US5838927A (en) Method and apparatus for compressing a continuous, indistinct data stream
JP4716645B2 (en) Document viewing method
US7853648B1 (en) System and method for providing interactive images
US7653749B2 (en) Remote protocol support for communication of large objects in arbitrary format
US20100005187A1 (en) Enhanced Streaming Operations in Distributed Communication Systems
US20040080533A1 (en) Accessing rendered graphics over the internet
US9325759B2 (en) Methods and apparatus for generating graphical and media displays at a client
US7769900B1 (en) System and method for providing interframe compression in a graphics session
EP1821490A1 (en) Method for transmitting graphical data to a thin client
US7659907B1 (en) System and method for providing dynamic control of a graphics session
WO2023040825A1 (en) Media information transmission method, computing device and storage medium

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MK MN MW MX MZ NA NI NO NZ PG PH PL PT RO RU SC SD SE SG SK SY TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IT MC NL PL PT RO SE SI SK TR BF CF CG CI CM GA GN GQ GW ML MR SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2538340

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2004784000

Country of ref document: EP

Ref document number: 2004305808

Country of ref document: AU

Ref document number: 2006526396

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 174245

Country of ref document: IL

WWE Wipo information: entry into national phase

Ref document number: 1020067005104

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 649/KOLNP/2006

Country of ref document: IN

ENP Entry into the national phase

Ref document number: 2004305808

Country of ref document: AU

Date of ref document: 20040913

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 2004305808

Country of ref document: AU

WWP Wipo information: published in national office

Ref document number: 2004784000

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1020067005104

Country of ref document: KR