WO2005029864A1 - Method and apparatus for generating graphical and multimedia displays at a client - Google Patents

Method and apparatus for generating graphical and multimedia displays at a client

Info

Publication number
WO2005029864A1
Authority
WO
WIPO (PCT)
Prior art keywords
client
data set
media
server
decompressed data
Prior art date
Application number
PCT/US2004/029993
Other languages
English (en)
Other versions
WO2005029864A8 (fr)
Inventor
David Robinson
Lee Laborczfalvi
Pierre Semaan
Anil Roychoudry
Martin Duursma
Anatoliy Panasyuk
Georgy Momtchilov
Original Assignee
Citrix Systems, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Citrix Systems, Inc. filed Critical Citrix Systems, Inc.
Priority to AU2004305808A priority Critical patent/AU2004305808A1/en
Priority to CA002538340A priority patent/CA2538340A1/fr
Priority to EP04784000A priority patent/EP1665798A1/fr
Priority to JP2006526396A priority patent/JP2007505580A/ja
Publication of WO2005029864A1 publication Critical patent/WO2005029864A1/fr
Priority to IL174245A priority patent/IL174245A0/en
Publication of WO2005029864A8 publication Critical patent/WO2005029864A8/fr


Classifications

    • H04N21/8193 Monomedia components thereof involving executable data, e.g. software dedicated tools, e.g. video decoder software or IPMP tool
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • G06F16/9577 Optimising the visualization of content, e.g. distillation of HTML documents
    • H04N21/2187 Live feed
    • H04N21/23113 Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion involving housekeeping operations for stored content, e.g. prioritizing content for deletion because of storage space restrictions
    • H04N21/2318 Data placement on disk arrays using striping
    • H04N21/2343 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/2347 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving video stream encryption
    • H04N21/2402 Monitoring of the downstream path of the transmission network, e.g. bandwidth available
    • H04N21/25858 Management of client data involving client software characteristics, e.g. OS identifier
    • H04N21/2662 Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • H04N21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N21/41415 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
    • H04N21/4143 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a Personal Computer [PC]
    • H04N21/42653 Internal components of the client; Characteristics thereof for processing graphics
    • H04N21/4331 Caching operations, e.g. of an advertisement for later insertion during playback
    • H04N21/44004 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving video buffer management, e.g. video decoder buffer or video display buffer
    • H04N21/44209 Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network
    • H04N21/4424 Monitoring of the internal components or processes of the client device, e.g. CPU or memory load, processing speed, timer, counter or percentage of the hard disk space used
    • H04N21/4425 Monitoring of client processing errors or hardware failure
    • H04N21/4431 OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB characterized by the use of Application Program Interface [API] libraries
    • H04N21/47202 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
    • H04N21/6143 Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a satellite
    • H04N21/6175 Network physical structure; Signal processing specially adapted to the upstream path of the transmission network involving transmission via Internet
    • H04N21/6373 Control signals issued by the client directed to the server or network components for rate control, e.g. request to the server to modify its transmission rate
    • H04N21/6377 Control signals issued by the client directed to the server or network components directed to server
    • H04N21/6408 Unicasting
    • H04N21/643 Communication protocols
    • H04N21/6547 Transmission by server directed to the client comprising parameters, e.g. for client setup
    • H04N21/658 Transmission by the client directed to the server
    • H04N21/6582 Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number
    • H04N21/6583 Acknowledgement
    • H04N21/6587 Control parameters, e.g. trick play commands, viewpoint selection
    • H04N21/812 Monomedia components thereof involving advertisement data
    • H04N21/8153 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
    • H04N7/16 Analogue secrecy systems; Analogue subscription systems
    • H04N7/17336 Handling of requests in head-ends

Definitions

  • the invention generally relates to distributed processing, and, more particularly, to generating a display having graphical and/or media components at a client.
  • a thin-client protocol can be used for displaying output, produced by an application running on a server, on a client running on a computer with limited processing capabilities.
  • Two exemplary thin-client protocols are ICA (Independent Computing Architecture) from Citrix Systems, Inc., Ft. Lauderdale, FL, and RDP (Remote Desktop Protocol) from Microsoft, Inc., Redmond, WA.
  • the client is also sometimes referred to as a remote terminal session.
  • One thin-client protocol intercepts commands by the application program to the server operating system (“OS”) to draw to a display screen. The intercepted commands are transmitted to the remote session using, for example, one or more presentation layer packets.
  • the remote session passes the received commands to the remote session OS.
  • the thin-client draws the application program output on its display using the received commands. In this manner, the application program appears to be executing on the thin-client.
  • the bitmap format of an image is generally a very large data set.
  • the thin-client protocol must transmit over the network the bitmap representation of an image, which is a large amount of data, along with the applicable commands on how to display the bitmap representation.
  • this results in a long delay before the complete image is received and displayed at the client, which is inconvenient and frustrating for the user of the client.
  • transmission of these large bitmap formats results in large costs associated with each transmission.
  • a video file is rendered as a series of bitmaps and audio information is rendered using pulse code modulation.
  • the thin-client protocol transmits the series of bitmaps representing the video file and/or the pulse code modulated signal representing the audio information over the network.
  • This transmission is inefficient, requiring excessive bandwidth and significant CPU usage.
  • an unresponsive graphical user interface may result at the client.
  • Video playback, for example, is often of low quality, may appear “jerky,” and may synchronize poorly with the audio presentation.
  • the invention lowers the time and cost of transmitting images and other non-textual elements, originally represented in large bitmap formats, by substituting, prior to transmission, available compressed formats for the bitmap formats.
  • images and other multimedia content are transmitted in an already-compressed format, such as JPEG, PNG, GIF, MPEG3 or MPEG4.
  • the time and cost of transmitting the media may be lowered by intercepting the already-compressed format and substituting another version of the media stream in a typically more compressed format. Transmitting the compressed formats can significantly reduce the bandwidth necessary to transmit the media stream.
  • the client decompresses the received data using available libraries.
  • the client substitutes the decompressed image for the original bitmap representations using, for example, modified thin-client protocol commands with other identifying data.
  • a compressed data set representing at least a portion of a media stream is intercepted on a first computing device before it is decompressed.
  • the compressed data set is decompressed on the first computing device.
  • the resulting decompressed data set is re-compressed on the first computing device (a sketch of this intercept-and-recompress path appears at the end of this section).
  • when a server transmits compressed data representative of a non-textual element or media stream, the processing burden on the server's central processing unit (CPU) is reduced.
  • the server's CPU would otherwise expend processing effort on decompressing the compressed data and transmitting the larger volume of uncompressed data, with the attendant processing burden of the increased amount of network communications.
  • delegating decompression and rendering operations to the clients also improves server performance.
  • Client performance is further improved, in that the processing effort at the client associated with the receipt of a larger volume of uncompressed data is reduced.
  • the invention relates to a method for generating a graphical display at a client.
  • the method includes transmitting output from an application program executing on a server to the client, identifying a bitmap representation within the application output, and determining a check value for the bitmap representation.
  • the method also includes retrieving a compressed data format of the bitmap representation using at least in part the check value and transmitting to the client the compressed data format in place of the bitmap representation.
  • the invention in another aspect, relates to a method for generating a graphical display at a client.
  • the method includes transmitting output from an application program executing on a server to the client and identifying a non-textual element within the application output.
  • the method also includes retrieving a compressed data format associated with the non-textual element and transmitting to the client the compressed data format in place of the non-textual element.
  • the method includes identifying a textual element within the application output and transmitting to the client the textual element.
  • the method includes receiving the compressed data format, and optionally the textual element, at the client and generating a display at the client using the compressed data format, and optionally the textual element.
  • the method includes transmitting the compressed data format using at least one presentation layer protocol packet.
  • the method includes transmitting the at least one presentation layer protocol packet using a command for transmitting a file in its native format.
  • the method includes conforming the at least one presentation layer protocol packet to a remote access protocol, a thin-client protocol, and/or a presentation protocol.
  • the non-textual element is a bitmap representation and the method includes replacing the bitmap representation with the compressed data format.
  • the method includes determining the capability of the client to render the non-textual element using the compressed data format. The method further includes, upon determination that the client cannot render the non-textual element using the compressed data format, transmitting an image-rendering library capable of rendering the non-textual element using the compressed data format.
  • the method includes intercepting the application output and inspecting the intercepted output for a bitmap representation of the non-textual element.
  • the method includes calculating a first check value for a bitmap representation of the non-textual element and searching an image store for the compressed data format having a check value identical to the first check value (a sketch of such a check-value lookup appears at the end of this section).
  • the invention in another aspect, relates to a system for generating a graphical display at a client.
  • the system includes an output filter module and a server agent.
  • the output filter module is configured to intercept output produced by an application program, identify a non-textual element of the output, and retrieve a compressed data format associated with the non-textual element.
  • the server agent is configured to transmit to the client the compressed data format in place of the non-textual element.
  • the system includes a server node, which includes the server agent and the output filter module.
  • the system includes a client node.
  • the client node includes a client agent and a display.
  • the client agent is configured to receive the compressed data format and to generate a display of the non-textual element using the received compressed data format.
  • the system further includes a network.
  • the invention in another aspect relates to an article of manufacture having computer-readable program means embodied therein for generating a graphical display at a client.
  • the article includes computer-readable program means for performing any of the aforementioned methods.
  • the invention relates to a method for generating a media presentation at a client.
  • the method includes transmitting output from an application program executing on a server to the client, identifying a media stream within the application output, intercepting an original compressed data set representing at least a portion of the media stream before processing by the application program, and transmitting the original compressed data set to the client.
  • the invention in another aspect, relates to another method for generating a media presentation at a client.
  • This method includes transmitting output from an application program executing on a server to the client, identifying a media stream within the application output, intercepting a first decompressed data set representing at least a portion of the media stream, compressing the intercepted first decompressed data set, and transmitting the compressed data set to the client in place of the first decompressed data set.
  • the invention relates to still another method for generating a media presentation at a client.
  • This method includes informing a server of at least one media format supported by a client agent installed on the client, receiving a compressed data set of a media stream at the client, decompressing the compressed data set at the client to generate a decompressed data set, and generating the media presentation at the client using the decompressed data set.
  • the invention in a further aspect, relates to an article of manufacture that embodies computer-readable program means for generating a media presentation at a client.
  • the article includes computer-readable program means for transmitting output from an application program executing on a server to the client, computer-readable program means for identifying a media stream within the application output, computer-readable program means for intercepting an original compressed data set representing at least a portion of the media stream before processing by the application program, and computer-readable program means for transmitting the original compressed data set to the client.
  • the invention relates to another article of manufacture that embodies computer-readable means for generating a media presentation at a client.
  • This article includes computer-readable program means for transmitting output from an application program executing on a server to the client, computer-readable program means for identifying a media stream within the application output, computer-readable program means for intercepting a first decompressed data set representing at least a portion of the media stream, computer-readable program means for compressing the intercepted first decompressed data set, and computer-readable program means for transmitting the compressed data set to the client in place of the first decompressed data set.
  • the invention relates to yet another article of manufacture that embodies computer-readable means for generating a media presentation at a client.
  • This article includes computer-readable program means for informing a server of at least one media format supported by a client agent installed on the client, computer-readable program means for receiving a compressed data set of a media stream at the client, computer-readable program means for decompressing the compressed data set at the client to generate a decompressed data set, and computer-readable program means for generating the media presentation at the client using the decompressed data set.
  • the methods further include, and the articles of manufacture further include computer-readable program means for, capturing timing information associated with the media stream, transmitting the timing information to the client, receiving the compressed data set and, optionally, the timing information at the client, decompressing the compressed data set at the client to generate a decompressed data set, and generating the media presentation at the client using the decompressed data set and, optionally, the timing information.
  • the methods further include, and the articles of manufacture further include computer-readable program means for, transmitting non-media graphical information from the application output to the client, receiving the non-media graphical information at the client, and generating the media presentation at the client using the decompressed data set and the non-media graphical information.
  • the invention relates to a system for generating a media presentation at a client.
  • the system includes an application program and an output filter module.
  • the application program is configured to identify a media stream within output produced by the application program.
  • the output filter module is configured to intercept an original compressed data set representing at least a portion of the media stream before processing by the application program and transmit the original compressed data set to the client.
  • the invention relates to another system for generating a media presentation at a client.
  • This system includes an application program and an output filter module.
  • the application program is configured to identify a media stream within output produced by the application program.
  • the output filter module is configured to intercept a first decompressed data set representing at least a portion of the media stream, compress the intercepted first decompressed data set of the media stream, and transmit the compressed data set in place of the first decompressed data set to the client.
  • the invention in yet another aspect, relates to another system for generating a media presentation at a client.
  • This system includes a server and the client in communication with the server.
  • the client includes a client agent configured to inform the server of at least one media format supported by the client agent, receive a compressed data set of a media stream, decompress the compressed data set at the client to generate a decompressed data set, and generate the media presentation using the decompressed data set.
  • the output filter module of the systems is further configured to capture timing information associated with the media stream and to transmit the timing information to the client.
  • the system further includes a client agent configured to receive the compressed data set and the optional timing information, decompress the compressed data set to generate a decompressed data set, and generate the media presentation using the decompressed data set and the optional timing information.
  • the client agent is further configured to receive non-media graphical information and to generate the media presentation at the client using the decompressed data set and the non-media graphical information.
  • the invention in another aspect, relates to another system for generating a media presentation at a client.
  • This system includes a network, a server in communication with the network, and the client in communication with the network.
  • the server includes an application program and at least one output filter module.
  • the application program is configured to identify a media stream within output produced by the application program.
  • the output filter module is configured to intercept a compressed data set representing at least a portion of the media stream before processing by the application program, and transmit the compressed data set to the client.
  • the output filter module intercepts only a portion of a compressed data set representing the media stream. For example, the output filter module may intercept every other frame of a video media stream.
  • the output filter module discards a portion of the data associated with each frame of video data in order to reduce the bandwidth necessary to transmit the video stream (a frame-thinning sketch appears at the end of this section).
  • the client includes a client agent.
  • the client agent is configured to inform the server of at least one media format supported by the client agent, receive the compressed data set, decompress the compressed data set at the client to generate a decompressed data set, and generate the media presentation at the client using the decompressed data set (a client-agent sketch appears at the end of this section).
  • the invention relates to an article of manufacture that embodies computer-readable program means for generating a media presentation at a client.
  • the article includes computer-readable program means for intercepting an original compressed data set of a media stream, and computer-readable program means for transmitting the original compressed data set to the client using a thin client protocol such as ICA or RDP.
  • the invention in another aspect, relates to another article of manufacture that embodies computer-readable program means for generating a media presentation at a client.
  • the article includes computer-readable program means for intercepting a decompressed data set of a media stream, computer-readable program means for compressing the intercepted decompressed data set, and computer-readable program means for transmitting the compressed data set to the client using a thin client protocol such as ICA or RDP.
  • FIG. 1 is a block diagram of an illustrative embodiment of a system to generate a graphical display for a remote terminal session in accordance with the invention.
  • FIG. 2 is a flow diagram of an illustrative embodiment of a process to generate a graphical display for a remote terminal session in accordance with the invention.
  • FIG. 3 is a block diagram of an illustrative embodiment of a system for generating a media presentation at a client in accordance with the invention.
  • FIGS. 4A, 4B, & 4C are a flow diagram of an illustrative embodiment of a method for generating a media presentation at a client in accordance with the invention.
  • FIGS. 5A, 5B, & 5C are a flow diagram of another embodiment of a method for generating a media presentation at a client in accordance with the present invention.
  • FIG. 1 illustrates a system 100 to generate a display for a remote terminal session that includes a first computing system (“client node”) 105 in communication with a second computing system (“server node”) 110 over a network 115.
  • the network 115 can be a local-area network (LAN), such as a company Intranet, or a wide area network (WAN), such as the Internet or the World Wide Web.
  • a user of the client node 105 can be connected to the network 115 through a variety of connections including standard telephone lines, LAN or WAN links (e.g., T1, T3, 56 kb, X.25), broadband connections (ISDN, Frame Relay, ATM), and wireless connections.
  • the client node 105 includes a client transceiver 130 to establish communication with the network 115.
  • the server node 110 includes a server transceiver 135 to establish communication with the network 115.
  • the connections can be established using a variety of communication protocols (e.g., ICA, RDP, HTTP, TCP/IP, IPX, SPX, NetBIOS, Ethernet, RS232, and direct asynchronous connections).
  • the server node 110 can be any computing device capable of providing the requested services of the client node 105. Particularly, this includes generating and transmitting commands and data to the client node 105 that represent the output being produced by an application program 140 executing on the server 110.
  • the server node 110 includes the server transceiver 135, the executing application program 140, a server agent 150, an output filter module 155 and an image store 160.
  • the server agent 150 includes a module that interfaces with a client agent 175 and other components of the server node 110 to support the remote display and operability of the application program 140.
  • the server agent module 150 and all modules mentioned throughout the specification are implemented as a software program and/or a hardware device (e.g., ASICs or FPGAs).
  • For clarity, all of these components are shown on server node 110. It is to be understood that the server node 110 can represent a single server or can represent several servers in communication with each other over the network 115 or another network (not shown). In multiple-server embodiments, the functionality of the components can be distributed over the available servers. For example, in one embodiment with multiple servers, the transceiver 135, the application program 140, the server agent 150 and the output filter module 155 are on an application server and the image store 160 is on a storage device, such as a disk in a RAID system.
  • the client node 105 can be any computing device (e.g., a personal computer, set top box, wireless mobile phone, handheld device, personal digital assistant, kiosk, etc.) used to provide a user interface to the application program 140 executing on the server node 110.
  • the client node 105 includes the client transceiver 130, a display 145, a client agent 175 and a graphics library 180 (also referred to as an image-rendering library).
  • the client agent 175 includes a module, implemented as a software program and/or a hardware device (e.g., an ASIC or an FPGA) that receives commands and data from the server node 110 and from a user (not shown) of the client node 105.
  • the client agent 175 uses the received information when interacting with other components of the client node 105 (e.g., when directing the operating system to output data onto the display 145). The client agent 175 also transmits requests and data to the server node 110 in response to server-issued commands or user actions at the client node 105.
  • the server node 110 hosts one or more application programs 140 that can be accessed by the client nodes 105.
  • applications include word processing programs such as Microsoft Word® and spreadsheet programs such as Microsoft Excel®, both manufactured by Microsoft Corporation of Redmond, Washington.
  • Other examples include financial reporting programs, customer registration programs, programs providing technical support information, customer database applications, and application set managers.
  • Another example of an application program is Internet Explorer®, manufactured by Microsoft Corporation of Redmond, Washington, and this program will be used as an exemplary application program 140 in the following discussion. It is understood that other application programs can be used.
  • the server node 110 communicates with the client node 105 over a transport mechanism.
  • the transport mechanism provides multiple virtual channels 185 through the network 115 so the server agent 150 can communicate with the client agent 175.
  • One of the virtual channels 185 provides a protocol for transmitting graphical screen data from the server node 110 to the client node 105.
  • the server 110 executes a protocol driver, in one embodiment as part of the server agent 150, that intercepts graphical display interface commands generated by the application program 140 and targeted at the server's operating system.
  • the protocol driver translates the commands into a protocol packet suitable for transmission over the transport mechanism.
  • the application program 140 in this example Internet Explorer®, executing on the server 110, retrieves a web page. As explained above, the application program 140 generates graphical display commands to the server operating system, as if it was going to display the output at the server node 110. The server agent 150 intercepts these commands and transmits them to the client agent 175. The client agent 175 issues the same or similar commands to the client operating system to generate output for the display 145 of the client node 105.
  • these graphical display commands may take the form of, for example, invocations of DirectShow®, DirectX®, or Windows graphic device interface (GDI) functionality.
  • a web page has both textual elements (e.g., titles, text, and ASCII characters) and non-textual elements (e.g., images, photos, icons, and splash screens) incorporated therein.
  • the non-textual elements are sometimes transmitted to the Internet Explorer® application program 140 from a web server (not shown) in a compressed data format (e.g., a file or a data stream), also referred to as the non-textual element's native format. Examples of compressed formats are JPEG, GIF, and PNG.
  • the non-textual element represented in a compressed data format may be, for example, 20 kilobytes in size. That same non-textual element decompressed into its bitmap representation is, for example, 300 kilobytes in size.
  • the application program 140 when generating the display of the web page, retrieves, for example, a JPEG data format of a non-textual element and decompresses the JPEG data format into a bitmap for display.
  • the output filter module 155 determines that the bitmap representation is from a compressed format and obtains the corresponding compressed format of the non-textual element from the image store 160, as explained in more detail below.
  • the image store 160 is persistent storage. In other embodiments, the image store 160 is temporary storage, cache, volatile memory and/or a combination of temporary and persistent storage.
  • the server agent 150 replaces the bitmap representation of the non-textual element with the compressed non-textual element that the output filter module 155 retrieved from the image store 160.
  • the server agent 150 transmits the non-textual element in the compressed format, along with the graphical display interface commands associated with the bitmap representation, to the client node 105.
  • the server agent 150 uses a unique protocol command that identifies a transmission of a non-textual element that is not in bitmap representation, even though the associated commands are applicable to a bitmap representation of a non-textual element.
  • the protocol command can have a modifier comment, or a command switch.
  • the command can also use a change of context or a combination of multiple commands.
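  • As an illustration only (the actual protocol constants and packet layout are not given above), the following Python sketch shows how a distinct command code, acting as the described command switch, might mark a payload that is still in its compressed native format rather than a bitmap:

      import struct

      # Hypothetical command identifiers; the real protocol commands are not specified here.
      CMD_DRAW_BITMAP = 0x01       # payload is an uncompressed bitmap
      CMD_DRAW_COMPRESSED = 0x02   # payload is a native compressed format (e.g., JPEG)

      def build_draw_packet(element_id, payload, compressed_format=None):
          """Build a display packet; a distinct command code marks payloads that
          remain in their compressed native format."""
          if compressed_format is None:
              command, fmt_tag = CMD_DRAW_BITMAP, b"RAW "
          else:
              command = CMD_DRAW_COMPRESSED
              fmt_tag = compressed_format.encode("ascii").ljust(4)[:4]
          header = struct.pack("!BI4sI", command, element_id, fmt_tag, len(payload))
          return header + payload

      def parse_draw_packet(packet):
          command, element_id, fmt_tag, length = struct.unpack("!BI4sI", packet[:13])
          return command, element_id, fmt_tag.decode("ascii").strip(), packet[13:13 + length]

  For example, build_draw_packet(7, jpeg_bytes, compressed_format="JPEG") (with jpeg_bytes being hypothetical image data) would signal the client agent 175 to decompress the payload before applying the associated display commands.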
  • the server agent 150 may retrieve the uncompressed representation of the non-textual element and compress it, for example using 2DRLE compression, so as to provide a compressed version of the non-textual element.
  • the server agent 150 may also review the data in the uncompressed representation, select an appropriate compression algorithm for application to the uncompressed representation, apply the selected algorithm to the uncompressed representation, and provide the compressed representation to the client agent 175 for display. If the non-textual element was originally compressed, the server agent 150 may choose to recompress the image using the same or a different compression technique.
  • the algorithms available to the server agent 150 for the compression of the uncompressed representation include lossless compression algorithms and lossy compression algorithms.
  • Lossless compression algorithms reduce the size of the uncompressed representation without the loss of information contained in the representation, e.g., 2DRLE compression.
  • Lossy compression algorithms reduce the size of the uncompressed representation in such a way that information contained in the uncompressed representation is discarded, e.g., JPEG or JPEG2000 compression.
  • the server agent 150 selects a compression algorithm that is appropriate for compressing the uncompressed representation based, at least in part, on the contents of the uncompressed representation. In one embodiment, the server agent 150 determines that the uncompressed representation is a continuous tone image, such as a photographic image, and applies a lossy compression algorithm to the uncompressed representation. In another embodiment, the server agent 150 determines that the uncompressed representation contains large areas of the same color, e.g., a computer-generated image, and applies a lossless compression algorithm.
  • the number of colors contained in the pixels of the uncompressed representation is enumerated and when the counted number of colors exceeds a predetermined threshold value (e.g., 256 colors), a lossy compression algorithm is applied to the uncompressed representation.
  • the server agent 150 compresses the uncompressed representation using a lossless compression algorithm and compares the size of the compressed result to a predetermined value. When the size of the compressed result exceeds the predetermined value, the uncompressed representation is compressed using a lossy compression algorithm. If the size of the result of the lossy compression is less than a predetermined percentage of the size of the result of the lossless compression, then the lossy compression algorithm is selected.
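  • A minimal sketch of these selection heuristics, assuming the uncompressed representation is available as raw pixel data; zlib stands in for a lossless codec such as 2DRLE, the lossy codec is supplied by the caller, the 256-color threshold is the example value given above, and the size-ratio cutoff is invented:

      import zlib

      COLOR_THRESHOLD = 256      # example threshold from the text above
      LOSSY_RATIO_CUTOFF = 0.75  # hypothetical "predetermined percentage"

      def count_colors(pixels):
          """pixels: iterable of (r, g, b) tuples from the uncompressed representation."""
          return len(set(pixels))

      def choose_compression(pixels, raw_bytes, lossy_compress):
          """Return ('lossless' or 'lossy', compressed_bytes) using the heuristics above."""
          if count_colors(pixels) > COLOR_THRESHOLD:
              return "lossy", lossy_compress(raw_bytes)
          lossless_result = zlib.compress(raw_bytes)
          lossy_result = lossy_compress(raw_bytes)
          # Prefer the lossy output only when it is substantially smaller than the lossless result.
          if len(lossy_result) < LOSSY_RATIO_CUTOFF * len(lossless_result):
              return "lossy", lossy_result
          return "lossless", lossless_result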
  • the client agent 175 receives the transmission of the non-textual element file in the compressed data format, along with the graphical display interface commands associated with the bitmap representation of the non-textual element.
  • the client agent 175 determines that the non-textual element is in the compressed data format and not the bitmap representation. In one embodiment, the client agent 175 makes this determination because the non-textual element in compressed format is transmitted using a unique protocol command. In another embodiment, the size of the non-textual element data and/or other characteristics about the non-textual element included with the associated graphical display interface commands are enough to enable the client agent 175 to make the determination.
  • the client agent 175 determines whether the client node 105 contains the necessary library 180 to decompress the compressed format of the non-textual element. If the client node 105 has the appropriate graphics library (or libraries) 180 installed to perform the decompression algorithms, the client agent 175 uses the library 180 to decompress the compressed format of the non-textual element into its bitmap representation. The client agent 175 performs the received associated graphical display interface commands on the bitmap representation to generate the non-textual element of the output of the application program 140 on the client display 145. In one embodiment, the client agent 175 does not contain all the decompression algorithms to decompress the non-textual element from a compressed format into a bitmap representation.
  • the client agent 175 requests the needed graphics library from the server node 110. In another embodiment, the client agent 175 determines if a predetermined set of the most widely used graphics libraries 180 are installed on the client node 105 prior to receiving any non-textual elements from the server node 110. If the most widely used graphics libraries 180 are not installed on the client node 105, the client agent 175 requests the missing libraries from the server node 110 prior to receiving any non-textual elements from the server node 110.
  • the client agent 175 determines which graphics libraries 180 the client node 105 includes and transmits that library information to the server agent 150.
  • the server agent 150 determines, using the transmitted library information, whether the client node 105 can render the compressed data format. If the server agent 150 determines that the client node 105 has the applicable library, the server agent 150 substitutes the compressed data format for the bitmap representation of the non-textual element. If the server agent 150 determines that the client node 105 does not have the applicable library, the server agent 150 does not substitute the compressed data format for the bitmap representation of the non-textual element and instead transmits the bitmap representation to the client 105.
  • the output filter module 155 determines that the bitmap representation is from a compressed format contained in the image store 160.
  • the output filter module 155 may make this determination if the source of the media stream is exposed in some manner by the document (such as, for example, by a file type identifier or by an application program specifically configured to expose this information).
  • the output filter module 155 may be provided with information regarding the source application associated with the media stream by the multimedia subsystem. In still other embodiments, the output filter module 155 calculates one or more check values for a bitmap representation.
  • the output filter module 155 can calculate a single check value for the entire bitmap representation and/or the output filter module 155 can calculate four check values, one for each quadrant of the bitmap representation. In another example, the output filter module 155 can calculate N check values, one for each of the N lines in the bitmap representation.
  • a check value is the result of an algorithm that generates a substantially unique value for different arrangements of data.
  • the check value is, for example, a checktag, a Cyclic Redundancy Code ("CRC"), a check sum, or a result of a hashing function.
  • the check value is based on the bitmap representation and not the data as arranged in a compressed data format. However, when the compressed data format is stored in the image store 160, it is stored with a check value attribute that corresponds to the one or more check values of the bitmap representation of the compressed data when decompressed.
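  • For illustration, the check values might be CRC-32 values computed over the whole bitmap, over its quadrants, or line by line; this sketch assumes the bitmap representation is available as a list of per-line byte strings:

      import zlib

      def whole_image_check_value(bitmap_rows):
          """A single CRC-32 over the entire bitmap representation."""
          crc = 0
          for row in bitmap_rows:
              crc = zlib.crc32(row, crc)
          return crc & 0xFFFFFFFF

      def line_check_values(bitmap_rows):
          """One CRC-32 per scan line (N check values for N lines)."""
          return [zlib.crc32(row) & 0xFFFFFFFF for row in bitmap_rows]

      def quadrant_check_values(bitmap_rows, width_bytes):
          """Four CRC-32 values, one per quadrant; each row is width_bytes long."""
          half_h, half_w = len(bitmap_rows) // 2, width_bytes // 2
          quads = [0, 0, 0, 0]
          for y, row in enumerate(bitmap_rows):
              base = 0 if y < half_h else 2
              quads[base] = zlib.crc32(row[:half_w], quads[base])
              quads[base + 1] = zlib.crc32(row[half_w:], quads[base + 1])
          return [q & 0xFFFFFFFF for q in quads]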
  • the check value is a checktag that includes a fixed identifier and a unique identifier.
  • the fixed identifier and the unique identifier are combined together and concealed within an image.
  • the fixed identifier is used to identify the checktag as such; the unique identifier is used to identify a specific image.
  • the fixed identifier is, for example, a globally unique identifier that is statistically unlikely to be found within an image.
  • the fixed identifier is a byte sequence that is easily recognizable during debugging and that has a balanced number of zero and one bits.
  • the unique identifier is a sequential identifier uniquely allocated for each image in the cache. The sequential unique identifier is XOR masked with another value so that the image identifiers with a small value (the most likely value) will be more likely to have a balanced number of zero and one bits.
  • the checktag is encoded into RGB color components, independently of whether the RGB components are part of the image or part of the color palette. More specifically, the checktag is treated as a stream of 160 bits (i.e., 20 separate bytes, each of which starts at bit 0, the least significant, and finishes at bit 7, the most significant bit). The least significant bit of each byte is overwritten by the next bit of the checktag. The other 7 bits of each byte remain unaltered.
  • a checktag is decoded by simply reversing the encoding procedure. After the checktag is decoded, the fixed identifier and the unique identifier are retrieved from the checktag. The retrieved fixed identifier is validated against a previously stored fixed identifier to identify the checktag as such. Where a match is found, the unique identifier is then used to retrieve information that is relevant to the identified image, such as the bitmap data associated with the image.
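  • A rough sketch of the checktag encoding and decoding described above; only the 160-bit total, the least-significant-bit placement, and the XOR masking are stated in the text, so the 16-byte/4-byte split between fixed and unique identifiers and the particular mask value here are assumptions:

      def encode_checktag(component_bytes, fixed_id, unique_id, xor_mask=0x5A5A5A5A):
          """Hide a 20-byte checktag in the least significant bit of 160 consecutive
          color-component bytes. fixed_id is assumed to be 16 bytes; unique_id is a
          small integer that is XOR masked before embedding."""
          tag = fixed_id + (unique_id ^ xor_mask).to_bytes(4, "little")
          assert len(tag) == 20 and len(component_bytes) >= 160
          out = bytearray(component_bytes)
          i = 0
          for byte in tag:
              for bit in range(8):                      # bit 0 (LSB) first, bit 7 last
                  out[i] = (out[i] & 0xFE) | ((byte >> bit) & 1)
                  i += 1
          return bytes(out)

      def decode_checktag(component_bytes, fixed_id, xor_mask=0x5A5A5A5A):
          """Reverse the encoding; return the unique identifier, or None when the
          fixed identifier does not validate."""
          tag = bytearray(20)
          for i in range(160):
              tag[i // 8] |= (component_bytes[i] & 1) << (i % 8)
          if bytes(tag[:16]) != fixed_id:
              return None
          return int.from_bytes(tag[16:], "little") ^ xor_mask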
  • the output filter module 155 searches the image store 160 for a non-textual element in compressed data format that has a check value attribute that is the same as one or more check values the output filter module 155 calculates for the bitmap representation.
  • the output filter module 155 retrieves the compressed format of the non-textual element with the same check value attribute as the one or more check values and sends the compressed format of the non-textual element to the server agent 150 for transmittal to the client agent 175 in place of the bitmap representation.
  • the server node 110 stores compressed formats of non-textual elements in the image store 160 the first time the application program 140 calls a graphics library (not shown) to create a bitmap representation from a compressed format file.
  • the output filter module 155 calculates the associated check value of the bitmap representation as the application program 140 decompresses the compressed format and generates the bitmap representation. As described above, the output filter module 155 can calculate the check value when the bitmap representation is complete, when a quadrant of the bitmap representation is complete, or when a line of the bitmap representation is complete.
  • the server 110 stores the compressed format file and the associated check value attribute in the image store 160 and retrieves the compressed format file the first and any subsequent times the application program 140 generates the associated non-textual element.
  • Whether the server 110 stores the compressed format file and its associated check value attribute(s) in the image store 160 in a temporary portion (e.g., RAM memory buffer or cache) or a persistent portion (e.g., disk or non-volatile memory buffer) is based at least in part on design and hardware limitations (e.g., the size of the persistent storage).
  • One exemplary criterion used to make that determination is the number of times the application program 140 generates the non-textual element. For example, if the application program 140 generates a particular non-textual element more than a predetermined number of times, the server 110 stores the compressed format file and its associated check value attribute(s) corresponding to that particular non-textual element persistently in the image store 160.
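  • One way to picture the storage policy described here and in the following items is sketched below: entries keyed by check value start in a bounded, LRU-ordered temporary portion and are promoted to the persistent portion once the element has been generated more than a threshold number of times. The capacity and threshold values are illustrative only:

      from collections import OrderedDict

      class ImageStore:
          def __init__(self, temp_capacity=128, persist_threshold=3):
              self.temporary = OrderedDict()   # check_value -> (compressed_bytes, hits)
              self.persistent = {}             # check_value -> compressed_bytes
              self.temp_capacity = temp_capacity
              self.persist_threshold = persist_threshold

          def store(self, check_value, compressed):
              if check_value in self.persistent or check_value in self.temporary:
                  return
              self.temporary[check_value] = (compressed, 1)
              if len(self.temporary) > self.temp_capacity:
                  self.temporary.popitem(last=False)       # evict the least recently used entry

          def lookup(self, check_value):
              if check_value in self.persistent:
                  return self.persistent[check_value]
              entry = self.temporary.get(check_value)
              if entry is None:
                  return None
              compressed, hits = entry[0], entry[1] + 1
              if hits > self.persist_threshold:            # promote a frequently generated element
                  del self.temporary[check_value]
                  self.persistent[check_value] = compressed
              else:
                  self.temporary[check_value] = (compressed, hits)
                  self.temporary.move_to_end(check_value)  # refresh its LRU position
              return compressed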
  • the image store 160 may apply eviction algorithms (e.g., a least-recently-used [LRU] algorithm) to remove data from the image store 160.
  • the server 110 stores the non-textual element if it is static or complex. For example, if the application program 140 always generates a splash screen at initialization, the server 110 stores the compressed format file corresponding to that splash screen in the persistent portion of the image store 160. In another embodiment, if the non-textual element is complex, static and/or generated repeatedly but does not have a corresponding compressed format file, the output filter module 155 generates a compressed format file for that non-textual element, in a standards-based or proprietary-based format. In any subsequent transmissions, the server agent 150 transmits the generated compressed format file in place of the bitmap representation.
  • the server agent 150 determines whether the client node 105 includes the applicable proprietary-based graphics library to decompress the compressed format file into a bitmap representation. If not included on the client node 105, the server agent 150 transmits the applicable library to the client node 105 for installation.
  • although the illustrated embodiment depicts the image store 160 on the server node 110, in other embodiments at least a portion of the image store is on the client node 105.
  • the output filter module 155 calculates the one or more check values of the bitmap representation and transmits the one or more check values to the server agent 150.
  • the server agent 150 transmits these one or more check values to the client agent 175.
  • the client agent 175 searches the portion of the image store on the client node 105 for a compressed data format stored with an identical one or more check values attribute.
  • the client agent 175 transmits the results of this search to the server agent 150.
  • the server-side and client-side image stores 160 may persist after the termination of a session between a client and a server.
  • the client agent 175 communicates to the server agent 150 information about the contents of its image store 160 and, after comparison with the server's image store 160, the server agent 150 may use the non-textual elements in the image stores 160 (or their strips, as discussed below) to reduce the amount of bandwidth required to create graphical and media displays at the client.
  • the server agent 150 does not have to send either the compressed data format or the bitmap representation over the network 115.
  • the server agent 150 only transmits the graphical display interface commands associated with the bitmap representation of the non-textual element. If the compressed data format for the non-textual element does not exist on the client node 105, the output filter module 155 obtains the corresponding compressed format of the non-textual element from the image store 160.
  • the server agent 150 replaces the bitmap representation of the non-textual element with the non-textual element in the compressed data format that the output filter module 155 retrieved from the image store 160.
  • the server agent 150 transmits the non-textual element in the compressed format, along with the graphical display interface commands associated with the bitmap representation, to the client node 105.
  • the presence of the image store 160 on the server 110 and/or the client node 105 permits the caching of non-textual elements, reducing the bandwidth required to generate graphical and media displays at a client.
  • these non-textual elements may be subdivided into sub-regions, i.e., "strips," and the image store 160 will subsequently provide non-textual element caching, as discussed above, with strip-level granularity.
  • a web page may be composed of multiple images, only some of which are visible, either fully or partially. For the images that are only partially shown, it is possible to send the full image to the client, including the part that is obscured by other windows.
  • the image store 160 contains the graphical data forming the strip and, optionally, strip-related metadata such as the image strip height, the image strip width, and an image strip identifier, such as a cyclic redundancy check (CRC) checksum.
  • each non-textual element used in a graphical rendering operation is divided into the aforementioned strips.
  • these strips have the same width as the non-textual element and a height that is less than or equal to the height of the non-textual element.
  • the server agent 150 maintains a database of strips that have previously been transmitted to the client for rendering. If subsequent rendering operations incorporate regions contained in a stored strip, the presence of the stored strip is identified in the server-side image store 160 and an appropriate identifier associated with the stored strip is retrieved. A message instructing the client agent 175 to render the stored strip associated with the identifier is transmitted to the client agent 175, requiring less bandwidth than a message including the contents of the strip itself. The client agent 175 retrieves its copy of the stored strip associated with the transmitted identifier from its image store 160' and displays the contents of the stored strip.
  • the server agent 150 identifies the common strips and instructs the client agent 175 to display the strip from the client's own image store 160', avoiding the retransmission of the strip for display.
  • the server agent 150 will detect the modification to the strip and retransmit the strip to the client agent 175 for rendering and/or incorporation in the client-side image store 160'.
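  • A sketch of strip-level caching along these lines: the bitmap is cut into full-width strips, each strip is identified by a CRC-32 checksum, and a strip the client is already known to hold is referenced by its identifier instead of being retransmitted. The message shapes are invented for illustration:

      import zlib

      def split_into_strips(bitmap_rows, strip_height):
          """Yield (strip_bytes, top_row) pairs of full-width strips."""
          for top in range(0, len(bitmap_rows), strip_height):
              yield b"".join(bitmap_rows[top:top + strip_height]), top

      def encode_render_operation(bitmap_rows, strip_height, strips_sent_to_client):
          """Emit ('cached', strip_id, top) for strips already sent to the client,
          and ('data', strip_id, top, strip_bytes) otherwise."""
          messages = []
          for strip_bytes, top in split_into_strips(bitmap_rows, strip_height):
              strip_id = zlib.crc32(strip_bytes) & 0xFFFFFFFF
              if strip_id in strips_sent_to_client:
                  messages.append(("cached", strip_id, top))
              else:
                  strips_sent_to_client.add(strip_id)
                  messages.append(("data", strip_id, top, strip_bytes))
          return messages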
  • FIG. 2 illustrates an exemplary embodiment of a process 200 to generate a display for a remote terminal session, using the exemplary embodiment of FIG. 1.
  • the output filter module 155 monitors the output of the application program 140 by monitoring calls made to the operating system of the server node 110. When the output filter module 155 detects (step 205) a display command from the application program 140, the output module 155 determines (step 210) whether the application program 140 is generating a bitmap representation of a non-textual element.
  • the output filter module 155 transmits (step 215) the display command to the server agent 150, which transmits that command, or a representative command defined in the protocol, to the client agent 175. If the application program 140 is generating a bitmap representation of a non-textual element, the output filter module 155 calculates (step 220) one or more check values corresponding to the bitmap representation of the non-textual image.
  • using the one or more calculated check value(s), the output filter module 155 searches the image store 160 to determine (step 225) whether a compressed data format with identical check value attribute(s) exists. If there is a compressed data format in the image store 160 with check value attribute(s) identical to the one or more check values the output filter module 155 calculates, the output module 155 replaces (step 230) the bitmap representation of the non-textual element with the compressed data format. The output module 155 transmits (step 230) the compressed data format to the server agent 150 for eventual transmission to the client agent 175. The output module 155 also transmits all of the commands associated with the replaced bitmap representation along with the compressed data format.
  • the output module 155 determines (step 235) whether the bitmap representation of a non-textual element corresponding to the compressed data format meets a predetermined criterion for persistent storage (e.g., any of the criteria described above). If the output module 155 determines (step 235) that the predetermined criterion is met, the output module 155 stores (step 240) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the persistent portion of the image store 160.
  • if the output module 155 determines (step 235) that the predetermined criterion is not met, the output module 155 stores (step 245) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the temporary portion of the image store 160.
  • after the output module 155 stores (step 240 or 245) the compressed data format and the corresponding check value attribute, identical to the one or more calculated check values, in the image store 160, the output module 155 replaces (step 230) the bitmap representation of the non-textual element with the compressed data format.
  • the output module 155 transmits (step 230) the compressed data format to the server agent 150 for eventual transmission to the client agent 175.
  • the output module 155 continues monitoring the output generated by the application program 140 until the output module 155 detects (step 205) another display command from the application program 140.
  • the invention pertains to methods, systems, and articles of manufacture for generating a media presentation.
  • in one aspect, a compressed data set representing at least a portion of a media stream is intercepted on the first computing device and transmitted to the second computing device, where it is decompressed and presented to the user.
  • in another aspect, a decompressed data set representing at least a portion of a media stream is intercepted and compressed on the first computing device and then transmitted, as above, to the second computing device, where it is decompressed and presented to the user.
  • FIG. 3 illustrates one embodiment of a system 300 that generates a media presentation according to this aspect of the invention.
  • the system 300 includes a first computing device, e.g., a server 310, in communication with a second computing device, e.g., a client 305, over a network 315.
  • the client 305, the server 310, and the network 315 have the same capabilities as the client 105, the server 110, and the network 115, respectively, described above.
  • the client 305 includes at least a client transceiver 330, a client agent 375, and a presentation interface 345.
  • the client agent 375 may be implemented as a software program and/or as a hardware device, such as, for example, an ASIC or an FPGA.
  • the client agent 375 uses the client transceiver 330 to communicate over the network 315 and generates a presentation having media and non-media components at the presentation interface 345.
  • the server 310 is an application server. As illustrated, the server 310 includes at least a server transceiver 335, an application program 340, a server agent 350, a first output filter module 355A, and a second output filter module 355B.
  • the server agent 350, the first output filter module 355A, and the second output filter module 355B may be implemented as a software program and/or as a hardware device, such as, for example, an ASIC or an FPGA.
  • the server agent 350, the first output filter module 355A, and the second output filter module 355B use the server transceiver 335 to communicate over the network 315.
  • the aforementioned components 335, 340, 350, 355A, and 355B are distributed and/or duplicated over several servers in communication with each other over the network 315, or over another network (not shown). This permits, for example, the transmission of media-related capability information, media streams, and control information to multiple clients and through multiple hops.
  • two or more of the aforementioned components 335, 350, 355A, and 355B may be combined into a single component, such that the functions, as described below, performed by two or more of the components 335, 350, 355A, and 355B are performed by the single component.
  • the aforementioned components 340, 355A, and 355B are duplicated at the same server 310. This permits, for example, the simultaneous transmission of media-related capability information, media streams, and control information pertaining to different/multiple applications 340, and different/multiple media streams within each application.
  • the different applications 340 may be identified as major stream contexts, and the different media streams as minor contexts within a specific major context.
  • the application program 340 illustrated in FIG. 3 is any application program 340 that renders, as part of its output, a media stream.
  • the media stream may be a video stream, an audio stream, or, alternatively, a combination of any number of instances thereof.
  • the application program 340 may output non-media graphical information.
  • non-media graphical information refers generally to all graphical information outputted by the application program 340 without the use of a codec or the equivalent, such as, for example, static graphical information, including, but not limited to, toolbars and drop-down menus.
  • Non-media graphical information also includes, for example, information for locating the static graphical information on a display screen.
  • the application program 340 may be, for example, the MICROSOFT ENCARTA application program manufactured by the Microsoft Corporation of Redmond, Washington. In one embodiment, the application program 340 uses external codecs, such as, for example, codecs installed in the operating system of the server 310, to decompress a compressed data set representing at least a portion of a media stream. In another embodiment, the codecs used by the application program 340 are embedded in the application program 340 itself.
  • the server 310 may include any number of executing application programs 340, some of which use external codecs, others of which use embedded codecs.
  • when the application program 340 uses external codecs and desires to output a media stream, it requests that the operating system of the server 310 use the external codecs to decompress the compressed data set representing at least a portion of the media stream for subsequent display.
  • when the codecs used by the application program 340 are embedded in the application program 340 itself, the application program 340, when desiring to output a media stream, uses the embedded codecs to decompress the compressed data set itself for subsequent display. Additionally, the application program 340 may generate and transmit graphical display commands, associated with the non-media graphical information, to the operating system of the server 310.
  • these graphical display commands may take the form of, for example, invocations of DirectShow®, DirectX®, or GDI functionality.
  • the application program 340 performs these tasks as if the application program 340 was going to generate a presentation having media and non-media components at the server 310.
  • the first output filter module 355A, the second output filter module 355B, and the server agent 350 intercept the compressed data set being passed to the external codecs, the decompressed data set generated by the embedded codecs, and the graphical display commands associated with the non-media graphical information, respectively, and (after first compressing the decompressed data set generated by the embedded codecs) transmit them, over the network 315, to the client agent 375.
  • the client agent 375 decompresses the received compressed data sets and issues the same or similar graphical display commands, associated with the non-media graphical information, to the operating system of the client 305 to generate a presentation having media and non-media components at the presentation interface 345 of the client 305.
  • the communications between the server agent 350 and the client agent 375 include information concerning media-related capabilities (e.g., buffer and media type properties, media stream priority, etc.) or control information (e.g., play, pause, flush, end-of-stream, and stop commands, parent window assignment, window positioning and clipping commands, scrolling, audio adjustment commands, volume, balance, etc.).
  • the first output filter module 355A and the second output filter module 355B are invoked as an application program 340 using external codecs attempts to invoke an external codec to output a media stream.
  • the first output filter module 355A intercepts an original compressed data set representing at least a portion of the media stream. Instead of decompressing the data set, as an external codec would, the first output filter module 355A transmits the original compressed data set over the network 315 to the client agent 375.
  • the second output filter module 355B, acting as an OS-level renderer, captures stream-independent (generic) control commands (e.g., play, pause, stop, flush, end-of-stream) and transmits the information over the network 315 to the client agent 375.
  • the second output filter module 355B also captures information for locating images of the video stream (e.g., parent window assignment, positioning and clipping commands) on a display screen and transmits the information over the network 315 to the client agent 375.
  • the second output filter module 355B may also capture information for audio adjustment (e.g., volume and balance adjustment commands) and transmit the information over the network 315 to the client agent 375.
  • when an application program 340 that uses embedded codecs attempts to invoke, for example, an OS-level renderer to output a media stream, the second output filter module 355B is invoked.
  • the second output filter module 355B intercepts a first decompressed data set representing at least a portion of the media stream from the output of the application program 340.
  • the second output filter module 355B then compresses, as explained below, the intercepted first decompressed data set and transmits the resulting compressed data set, over the network 315, to the client agent 375.
  • the second output filter module 355B as above, also captures, where the media stream includes a video stream, information for locating images of the video stream on a display screen and transmits the information over the network 315 to the client agent 375.
  • the server agent 350 may implement this interception functionality by providing a DirectShow® Transform filter with a filter merit level that exceeds that of the other filters installed on the server 310.
  • the WINDOWS family of operating systems allows multiple filters to be installed that are capable of handling particular types of media encoding.
  • the merit level assigned to a filter determines the priority given to a filter; filters having a higher merit level, i.e., priority, are selected to handle media types before filters with lower merit levels.
  • Support for various media types may be implemented by associating the Transform filter with the desired media types through changes to the system registry.
  • the Transform filters may be associated with specific Renderer filters.
  • this framework is extensible and permits future support for new types of media programming.
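  • The merit mechanism can be pictured with the toy selection routine below; this is not the DirectShow® API, and the filter names and merit numbers are invented, but it shows why a Transform filter registered with a higher merit is chosen ahead of a native decoder for the same media type:

      REGISTERED_FILTERS = [
          {"name": "NativeMpeg1Decoder", "media_types": {"video/mpeg1"}, "merit": 0x600000},
          {"name": "RemotingTransformFilter", "media_types": {"video/mpeg1", "video/mpeg2"},
           "merit": 0x800000},   # higher merit, so it is selected first
      ]

      def select_filter(media_type):
          """Pick the registered filter with the highest merit for a given media type."""
          candidates = [f for f in REGISTERED_FILTERS if media_type in f["media_types"]]
          return max(candidates, key=lambda f: f["merit"]) if candidates else None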
  • the server agent 350 may implement its client communications functionality using modified DirectShow® Transform and Renderer filters. Instead of transforming (decompressing) received media data, the modified Transform filters transmit the data to the client agent 375. Instead of rendering transformed (decompressed) media data, the Renderer filters transmit both stream-generic (e.g., play, pause, stop, flush, end-of-stream) and stream-specific (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment) commands.
  • unmodified Transform and rendering filters at the client 305 transform (decompress) and render the media data for viewing, hearing, etc. by a user.
  • the Renderer filters also receive stream-specific control data.
  • the server agent 350 maintains a separate media queue for each media stream.
  • each media stream may be prioritized, permitting, e.g., the transmission of real-time video conferencing information while transparently slowing or halting the transmission of still images from a web browser.
  • Stream prioritization may be accomplished by providing separate queues for different types of media, each stream having an assigned priority.
  • Network bandwidth may be determined in any one of a number of ways known in the art such as, for example, "pinging" the expected target server and measuring response time. For embodiments in which the media stream includes embedded timing information, that information may be used to determine if media data is not transmitted fast enough, i.e., that the bandwidth of the channel cannot support the transmission.
  • the media queues corresponding to the same application program 340 might have different priorities based on the media type (video, audio, MIDI, text, etc.). If network bandwidth between the server agent 350 and the client agent 375 is insufficient to accommodate all media streams, samples may be dropped, that is, deleted, from lower priority queues so that higher priority queues may be serviced without interruption. For example, video transmission might be interrupted and appear in a "slide-mode," while audio is still uninterrupted.
  • control information is associated with its own control queue, permitting the prioritization of the transmission of control data ahead of any media data.
  • the transmission of control information from the server agent 350 to the client agent 375 permits control of the client rendering filters' scheduler. Latency in initial playback may be reduced by pre-filling the media queue (not shown) of the client agent 375 as fast as possible before playback commences. Similarly, subsequent variations in network latency may be addressed by readjusting the client agent's 375 media queue as necessary. Should the client agent 375 detect that one or more of its media queues have fallen below or have exceeded certain resource thresholds, then the client agent 375 may send a burst or a stop request respectively to the server agent 350 for the desired amount of media queue adjustment.
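  • A simplified model of this queueing and flow-control behavior is sketched below; the control queue is exempt from dropping, lower-priority media samples are discarded first when bandwidth is scarce, and low/high watermarks trigger burst or stop requests. The watermark values are illustrative:

      import collections

      class MediaQueues:
          def __init__(self, low_watermark=8, high_watermark=64):
              self.control = collections.deque()    # control data: never dropped, serviced first
              self.media = {}                       # stream_id -> (priority, deque of samples)
              self.low_watermark = low_watermark
              self.high_watermark = high_watermark

          def add_stream(self, stream_id, priority):
              self.media[stream_id] = (priority, collections.deque())

          def enqueue_sample(self, stream_id, sample):
              self.media[stream_id][1].append(sample)

          def drop_for_bandwidth(self, samples_to_drop):
              """Delete samples from the lowest-priority queues first."""
              for _, queue in sorted(self.media.values(), key=lambda entry: entry[0]):
                  while samples_to_drop and queue:
                      queue.popleft()
                      samples_to_drop -= 1

          def flow_control_request(self, stream_id):
              """Return 'burst' when a queue runs low and 'stop' when it overfills."""
              depth = len(self.media[stream_id][1])
              if depth < self.low_watermark:
                  return "burst"
              if depth > self.high_watermark:
                  return "stop"
              return None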
  • DirectShow® functionality leverages functionality provided by the operating system and reduces or eliminates the reimplementation of duplicative functionality.
  • the graphical or media display provided by the client agent 375 leverages operating system functionality to provide complex clipping and audio control functionality.
  • the client agent 375 may also utilize the DirectDraw® and DirectSound® capabilities provided by the operating system, reducing required CPU resources and improving overall performance.
  • the server agent 350 intercepts and transmits to the client agent 375, over the network 315, the graphical display commands associated with the non-media graphical information that are output from the application program 340.
  • the first output filter module 355A, the second output filter module 355B, or both (where the application program 340 uses external codecs), or the second output filter module 355B (where the application program 340 uses embedded codecs), captures timing information associated with the media stream and transmits the timing information, over the network 315, to the client agent 375. More specifically, the output filter module 355A, 355B captures, and transmits to the client agent 375, presentation times for each frame of the media stream, thereby enabling the client agent 375 to synchronize video and audio streams and to maintain the correct frame rate.
  • the server agent 350 interfaces with the server transceiver 335 and the application program 340.
  • the server agent 350 receives from the client agent 375, over the network 315, a list of media formats supported by the client agent 375.
  • the server agent 350 registers the output filter modules 355A, 355B by manipulating the configuration of the server 310.
  • the server agent 350 registers the output filter modules 355A, 355B by editing the registry of the server 310.
  • the server agent 350 then informs the client agent 375 that the server 310 can handle all such media formats.
  • the client agent 375 and server agent 350 negotiate supported formats and capabilities on an as-needed basis. For example, the client agent 375 would defer informing server agent 350 of its support for JPEG2000 format media until server agent 350 specifically requests the creation of a JPEG2000 display by the client agent 375.
  • when an application program 340 loads a first output filter module 355A while attempting to render a media stream, the filter module 355A communicates with the server agent 350, which in turn sends a request to the client agent 375 to create the media stream by providing the media stream properties: allocator, media type, media stream priority, etc.
  • the client agent 375 responds to the server agent 350's request to create the media stream by indicating the success or failure of the request. The success or failure of the request depends on the availability of the respective filter modules capable of decompressing the media stream, sufficient memory, etc., at the client 305.
  • the application program 340 loads the second output filter module 355B and connects it to the first output filter module 355A, thereby providing efficient streaming of compressed media.
  • the application program 340 unloads the first output filter module 355A and loads the native Microsoft or third party-supplied output filter modules, which decompress and render the stream at the server 310.
  • the first output filter module 355A and the second output filter module 355B communicate directly with the server agent 350.
  • the client agent 375 interfaces with the client transceiver 330 and the presentation interface 345.
  • the client agent 375 initially informs the server agent 350 of the media formats supported by the client agent 375.
  • the client agent 375 also receives, over the network 315, from the output filter modules 355A, 355B, the compressed data set and any associated timing information.
  • the client agent 375 receives, over the network 315, from the second output filter module 355B, both stream-generic (e.g., play, pause, stop, flush, end-of-stream) and stream-specific (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment) commands and, from the server agent 350, the graphical display commands associated with the non-media graphical information.
  • the client agent 375, using either external or embedded codecs, decompresses the compressed data set and, together with the graphical display commands associated with the non-media graphical information, any information for locating images of a video stream on a display screen, and any timing information, generates a media presentation at the presentation interface 345.
  • the presentation interface 345 has, in one embodiment, a display screen that renders a graphical display, such as, for example, a video presentation. In another embodiment, the presentation interface 345 includes a speaker that renders an audio presentation.
  • the client 305 may include any number of presentation interfaces 345.
  • the information provided in specifying the media formats supported by the client agent 375 may determine the mode of operation at the server 310. If the compressed data set is in a format that is not supported by the client agent 375, the second output filter module 355B may recompress the decompressed data set into a supported format. In another embodiment, when the client agent 375 does not support the format of a stream, the application program 340 unloads the first output filter module 355A and loads the native Microsoft or third party-supplied output filter modules, which decompress and render the stream at the server 310.
  • FIGS. 4A, 4B, and 4C one embodiment of a method 400 that generates a media presentation at the client 305, using the exemplary embodiment of FIG. 3, is illustrated.
  • the client agent 375 informs the server agent 350 of all the media formats supported by the client agent 375.
  • the list of supported media formats is created by enumerating the external codecs installed on the client 305. For example, the codecs installed in the operating system of the client 305 are enumerated by the client agent 375, over the network 315, to the server agent 350.
  • the list of supported media formats is created by enumerating the codecs embedded in the client agent 375.
  • the codecs embedded in the software program are enumerated by the client agent 375, over the network 315, to the server agent 350.
  • the client agent 375 creates the list of supported media formats, and informs the server agent 350 of those supported media formats, by enumerating both the external codecs installed on the client 305 and the codecs embedded in the client agent 375.
  • the client agent 375 and server agent 350 negotiate supported formats and capabilities on an as-needed basis. For example, the client agent 375 would defer informing server agent 350 of its support for JPEG2000 format media until server agent 350 specifically requests the creation of a JPEG2000 display by the client agent 375.
  • capabilities are negotiated on a per-stream basis when the stream is created.
  • the client agent 375 responds to the server agent 350's request to create a media presentation by indicating the success or failure of the request.
  • the success or failure of the request depends on the availability of the respective codecs capable of decompressing the data, sufficient memory, etc., at the client 305.
  • the application program 340 loads the codec, thereby providing efficient display of graphical data.
  • the application program 340 loads the native Microsoft or third party-supplied codecs, which decompress and render the data at the server 310.
  • the client agent 375 generates globally unique identifiers ("GUIDs") and associates each GUID with a particular codec. The client agent 375 then transmits the list of generated GUIDs to the server agent 350 to inform the server agent 350 of the media formats supported by the client agent 375. In another embodiment, the client agent 375 transmits a list of four character codes, each four character code being associated with a particular codec, to the server agent 350 to inform the server agent 350 of the media formats supported by the client agent 375.
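  • For illustration only, a capability list in either form might be assembled as below; the codec names and the use of uuid5 to derive stable GUIDs are assumptions, since the text only states that each supported codec is paired with a GUID or a four-character code:

      import uuid

      def enumerate_capabilities(codec_names):
          """Pair each supported codec with a GUID and a four-character code."""
          guid_list = {name: str(uuid.uuid5(uuid.NAMESPACE_URL, "codec:" + name))
                       for name in codec_names}
          fourcc_list = {name: name.upper().ljust(4)[:4] for name in codec_names}
          return guid_list, fourcc_list

      # either list could then be transmitted to the server agent 350
      guids, fourccs = enumerate_capabilities(["mpg1", "mjpg", "mp3"])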
  • upon receiving the list of supported media formats from the client agent 375, the server agent 350 registers, at step 408, the first output filter module 355A and/or the second output filter module 355B on the server 310, as associated with the supported media formats. The server agent 350, at step 412, then reports back to the client agent 375 that the server 310 can handle all of the enumerated media formats.
  • an application program 340 starts executing on the server 310.
  • if, at step 420, the application program 340 identifies within its output the presence of media content, such as, for example, a media stream, the first output filter module 355A, the second output filter module 355B, or both are invoked. If, at step 424, the application program 340 uses external codecs, both the first output filter module 355A and the second output filter module 355B are invoked at step 428 as the application program 340 attempts to invoke an external codec.
  • the first output filter module 355A then intercepts, at step 432, an original compressed data set representing at least a portion of the media stream and transmits, at step 436, the original compressed data set to the client agent 375, without decompressing the data set.
  • the client agent 375 receives the original compressed data set and decompresses, at step 444, the original compressed data set to generate a decompressed data set.
  • the client agent 375 uses either external codecs installed on the client 305 or codecs embedded in the client agent 375 itself to decompress the original compressed data set.
  • the second output filter module 355B is, at step 448, invoked as the application program 340 attempts to invoke an OS-level renderer to display the decompressed data set.
  • the second output filter module 355B then intercepts, at step 452, a first decompressed data set, representing at least a portion of the media stream, from the output of the application program 340 and compresses, at step 456, the intercepted first decompressed data set.
  • a variety of compression techniques, including both lossy compression techniques and lossless compression techniques, may be used by the second output filter module 355B, at step 456, to compress the media stream.
  • the intercepted first decompressed data set may be compressed, at step 456, by the second output filter module 355B using, for example, a lightweight lossy video encoding algorithm, such as, for example, MJPEG compression.
  • the second output filter module 355B may choose the desired compression ratio or it may use a predetermined compression ratio.
  • the degree of quality loss chosen by the second output filter module 355B will, typically, depend on the available bandwidth of the network connection. For example, where a user of the client 305 uses a slow modem to connect to the network 315, the second output filter module 355B may choose to use low quality video. Where, on the other hand, a user of the client 305 uses a LAN link or a broadband connection to connect to the network 315, the second output filter module 355B may choose to use a higher quality video.
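  • A sketch of how the quality choice might follow from the available bandwidth; the probe mechanism and the tier boundaries below are invented for illustration, and send() stands in for the transport channel:

      import time

      def estimate_bandwidth_kbps(send, probe_bytes=32 * 1024):
          """Crude probe: time the transmission of a block of data through send()."""
          start = time.monotonic()
          send(b"\x00" * probe_bytes)
          elapsed = max(time.monotonic() - start, 1e-6)
          return (probe_bytes * 8 / 1000) / elapsed

      def choose_video_quality(bandwidth_kbps):
          """Map estimated link speed to a lossy-encoder quality setting."""
          if bandwidth_kbps < 128:       # slow modem connection
              return 30                  # low quality video
          if bandwidth_kbps < 2000:      # typical broadband connection
              return 60
          return 85                      # LAN link or other broadband connection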
  • the second output filter module 355B transmits, at step 460, the compressed data set to the client agent 375 in place of the first decompressed data set.
  • the client agent 375 receives the compressed data set and decompresses, at step 468, the compressed data set to generate a second decompressed data set.
  • the client agent 375 uses either external codecs installed on the client 305 or codecs embedded in the client agent 375 itself to decompress the compressed data set.
  • the second output filter module 355B captures information for locating images of the video stream on a display screen and transmits the captured information over the network 315 to the client agent 375.
  • the client agent 375 receives the information for locating the images of the video stream on the display screen.
  • the server agent 350 intercepts and transmits, over the network 315, graphical display commands, associated with the non-media graphical information outputted by the application program 340, to the client agent 375.
  • the client agent 375 receives the graphical display commands associated with the non-media graphical information.
  • the output filter module 355A, 355B captures timing information associated with the media stream
  • the output filter module 355A, 355B transmits, at step 488, the timing information to the client agent 375.
  • the client agent 375 receives, at step 492, the timing information and generates, at step 496, the media presentation at the presentation interface 345.
  • the client agent 375 uses the timing information, the graphical display commands associated with the non-media graphical information, and, where the media stream includes a video stream, the information for locating the images of the video stream on a display screen to seamlessly combine the decompressed data set (or, more specifically, where the application program 340 uses embedded codecs, the second decompressed data set) with the non-media graphical information.
  • the client agent 375 generates, at step 496, the media presentation at the presentation interface 345 using only the decompressed data set (or, more specifically, where the application program 340 uses embedded codecs, the second decompressed data set), the graphical display commands associated with the non-media graphical information, and, where the media stream includes a video stream, the information for locating the images of the video stream on a display screen.
  • FIGS. 5A-5C present another embodiment of the present invention implemented on a server using an operating system selected from the MICROSOFT WINDOWS family of operating systems. The operating system at the client need not be the same operating system as the operating system employed by the server.
  • the appropriate first output filter modules 355A and second output filter modules 355B are registered with the server 310 (Step 504).
  • the identified media types are exemplary and, utilizing this architecture, future media types may be added for operation in accord with the present invention.
  • An application program 340 executes at the server (Step 508) and a media stream is identified within the output of the application program 340 (Step 512).
  • the media stream is identified when the application makes a system call to the WINDOWS media subsystem to locate suitable codecs for handling the media stream.
  • the application program 340 loads the first output filter module 355A corresponding with the major media type of the media stream identified within the application output and having the highest merit of any filter registered with the system as capable of handling this major media type (Step 516).
  • the first output filter module 355A communicates with the server agent 350 to request that the client agent 375 create a media stream as specified by transmitted properties: allocator (buffer) properties, media type (major: video, audio, etc.; minor: MPEG-1, MPEG-2, etc.; format, etc.), media stream priority for on-demand quality control, etc.
  • the server agent 350 organizes media streams in different contexts. Media streams originating from different instances of an application program 340 have different major contexts. Media streams originating from the same instance of an application program 340 have different minor contexts within the same major context.
  • the server agent 350 creates a command queue for control information for each major context, and a media samples queue for each minor context.
  • the media samples queues help the server agent 350 accommodate variations in bandwidth and network latency by, for example, allowing certain frames of video data to be dropped to maintain playback speed and to select which data from various streams to drop. For example, video data may be discarded before audio data is discarded.
  • the queues of different major contexts are serviced in a round-robin fashion.
  • the control queue typically has the highest priority
  • media queues corresponding to different minor contexts might have different priorities based on the nature of the stream.
  • audio streams will have higher priority than video streams.
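  • The context organization just described might be modeled as follows; the round-robin visiting of major contexts, the control-first servicing, and the audio-over-video priority come from the text, while the data structures themselves are an illustrative sketch:

      import collections

      class ContextQueues:
          def __init__(self):
              # major_id -> {"control": deque, "media": {minor_id: (priority, deque)}}
              self.contexts = collections.OrderedDict()

          def create_major(self, major_id):
              self.contexts[major_id] = {"control": collections.deque(), "media": {}}

          def create_minor(self, major_id, minor_id, priority):
              self.contexts[major_id]["media"][minor_id] = (priority, collections.deque())

          def next_item(self):
              """Visit major contexts round-robin; within a context, drain the control
              queue first, then the highest-priority non-empty media queue."""
              for major_id in list(self.contexts):
                  ctx = self.contexts.pop(major_id)
                  self.contexts[major_id] = ctx          # rotate this context to the back
                  if ctx["control"]:
                      return major_id, "control", ctx["control"].popleft()
                  for _, (priority, queue) in sorted(ctx["media"].items(),
                                                     key=lambda kv: -kv[1][0]):
                      if queue:
                          return major_id, "media", queue.popleft()
              return None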
  • the client agent 375 attempts to create a media stream conforming to the media properties specified in the transmitted request (Step 520).
  • the client agent 375 attempts to load a generic source filter module and connect it with native or third party provided output filter modules that are capable of decompressing and rendering the media stream.
  • Client agents 375 utilizing different operating systems may undertake different actions to achieve the same result. For example, the client might load and use the services of a multimedia application capable of decompressing and rendering a media stream, such as a MPEG-1 video stream.
  • the client agent 375 sends a reply to the server agent 350, indicating whether the request to create the specified media stream was successful (Step 524).
  • the server agent 350 may, in turn, provide this information to the application program 340. If the client succeeds in creating the stream, then using DIRECTSHOW intelligent connect logic, the application program 340 loads the second output filter module 355B and connects it to the first output filter module 355A (Step 528), thereby enabling the processing of compressed media and control information transmitted to the client agent 375.
  • the first output filter module 355A intercepts the original compressed data set representing at least a portion of the media stream and timing information, and further detects dynamic changes in the media type, such as format changes (Step 532).
  • the second output filter module 355B intercepts media stream-generic commands (e.g., play, pause, stop, flush, end-of-stream) and stream-specific commands (e.g., video parent window assignment, positioning and clipping, audio volume and balance adjustment commands) (Step 536).
  • the server agent 350 transmits the original compressed data set, optionally including timing information, to the client agent 375 (Step 540).
  • the server agent 350 may drop samples from lower priority media queues to make bandwidth available for higher priority media queues. Media queues having equal priority are serviced in round robin fashion.
  • the transmission rate of the second output filter module 355B to the server agent 350 and, therefore, to the client agent 375 is controlled so that the client's media queue is prefilled as quickly as allowed by the network throughput before playback commences, reducing latency in the initial playback. Thereafter, the media queue of the client agent 375 is used to accommodate variations in network latency, as discussed below.
  • the server agent 350 also transmits control information to the client agent 375 (Step 544). Typically, control information is never discarded despite constraints on the availability of network bandwidth.
  • the client agent 375 receives the original compressed data set and optional timing information (Step 548).
  • the client agent 375 also receives the control information and any notifications of dynamic changes in the media type, e.g., format changes (Step 552).
  • the client agent 375 creates a single command queue for control information for each major context and a media samples queue for each minor context.
  • the media samples queues let the client agent 375 accommodate variations in bandwidth and network latency.
  • the control queues of the different major contexts are serviced in a round-robin fashion.
  • the media samples in the queues are utilized by the associated source filter module(s) based on the timing information.
  • the client agent sends status notifications to the server agent (Step 556). These notifications include indications of the client's success or failure creating a requested media stream. Variations in bandwidth or network latency may be addressed by adjusting the media queue of the client agent 375 as necessary. Should the client agent 375 detect that one or more of its media queues have fallen below or have exceeded certain resource thresholds, then the client agent 375 may respectively send a burst or a stop request to the server agent 350 for the desired amount of media queue adjustment. The client agent 375 may also send status information concerning unexpected errors encountered in the generation of the media presentation.
  • the client agent 375 may pause to re-buffer the queue or drop media samples, as required (Step 560).
  • a native or third party supplied output filter module decompresses the received compressed data set to generate a decompressed data set (Step 564).
  • the client agent 375 applies control information, timing information, and dynamic changes in the media type to the collection of filter modules (e.g., an instance of the generic source filter module, the connected instances of native or third party output filter modules, etc.) (Step 568).
  • the application program 340 running on the server 310 unloads the first output filter module 355A and subsequently loads the native or third party-supplied output filter modules, which decompress and render the data stream at the server 310 (Step 570).
  • a decompressed data set representing at least a portion of the media stream is intercepted (Step 574), compressed (Step 578), and transmitted (Step 582) to the client agent 375, which receives it (Step 586); see the recompression sketch following this list.
  • the server agent 350 intercepts and transmits graphical display commands, associated with non-media graphical information, to the client agent 375 (Step 590).
  • the client agent 375 receives the graphical display commands (Step 594).
  • the client agent 375 generates the media presentation using the native or third party-provided output filter module (renderer filter) to render the decompressed data set onto the presentation interface 245 (Step 598).
  • the present invention may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture.
  • the article of manufacture may be a floppy disk, a hard disk, a CD-ROM, a flash memory card, a PROM, a RAM, a ROM, or a magnetic tape. In general, the computer-readable programs may be implemented in any programming language. Some examples of languages that can be used include C, C++, and JAVA.
  • the software programs may be stored on or in one or more articles of manufacture as object code.
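
The sketches that follow are illustrative only; they are not taken from the patent and do not use the DIRECTSHOW APIs it refers to. The first, the command-encoding sketch, models the split between stream-generic and stream-specific commands described above; the enum members, the JSON wire format, and the major/minor identifiers are assumptions made purely for illustration.

    from enum import Enum, auto
    import json

    class GenericCommand(Enum):
        # Commands that apply to any media stream.
        PLAY = auto()
        PAUSE = auto()
        STOP = auto()
        FLUSH = auto()
        END_OF_STREAM = auto()

    class StreamSpecificCommand(Enum):
        # Commands tied to a particular video or audio stream.
        SET_PARENT_WINDOW = auto()
        SET_POSITION = auto()
        SET_CLIP = auto()
        SET_VOLUME = auto()
        SET_BALANCE = auto()

    def encode_command(major, minor, command, **params):
        """Serialize an intercepted control command for transmission to the client agent."""
        return json.dumps({
            "major": major,            # major context, e.g. the filter graph
            "minor": minor,            # minor context, e.g. the stream; None for generic commands
            "command": command.name,
            "params": params,
        }).encode("utf-8")

    # Example: a generic pause and a stream-specific volume adjustment.
    pause_msg = encode_command(1, None, GenericCommand.PAUSE)
    volume_msg = encode_command(1, 2, StreamSpecificCommand.SET_VOLUME, level=0.8)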
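
The queue-servicing sketch models the bandwidth management described above: higher-priority media queues are drained first within an available bandwidth budget, queues of equal priority are serviced round-robin, and samples that no longer fit in the budget are dropped. The scheduler class, numeric priorities, and send callback are assumptions, not part of the described system.

    from collections import deque, defaultdict

    class MediaQueueScheduler:
        """Illustrative priority-based servicing of server-side media queues."""

        def __init__(self):
            # priority (larger = higher) -> list of (name, deque of (sample, size_in_bytes))
            self.queues = defaultdict(list)

        def add_queue(self, name, priority):
            queue = deque()
            self.queues[priority].append((name, queue))
            return queue

        def service(self, bandwidth_budget, send):
            dropped = 0
            for priority in sorted(self.queues, reverse=True):   # highest priority first
                group = self.queues[priority]
                while any(queue for _, queue in group):
                    for name, queue in group:                    # round-robin over equal priority
                        if not queue:
                            continue
                        sample, size = queue[0]
                        if size <= bandwidth_budget:
                            queue.popleft()
                            send(name, sample)
                            bandwidth_budget -= size
                        else:
                            dropped += len(queue)                # budget exhausted: drop the rest
                            queue.clear()
            return dropped

    # Usage: with a 1200-byte budget, the video sample is sent and the
    # lower-priority audio sample is dropped.
    scheduler = MediaQueueScheduler()
    video = scheduler.add_queue("video", priority=2)
    audio = scheduler.add_queue("audio", priority=1)
    video.append((b"frame-1", 1000))
    audio.append((b"chunk-1", 400))
    dropped = scheduler.service(bandwidth_budget=1200, send=lambda name, sample: None)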
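
The context-queue sketch models the client-side queue layout described above: one control-command queue per major context and one media-samples queue per minor context, with the control queues serviced round-robin. The identifier scheme and dictionary layout are illustrative assumptions.

    from collections import deque

    class ClientQueues:
        """Illustrative client-side layout: one command queue per major context
        (e.g. a filter graph) and one media-samples queue per minor context
        (e.g. an individual audio or video stream within that graph)."""

        def __init__(self):
            self.control = {}   # major_context_id -> deque of commands
            self.media = {}     # (major_context_id, minor_context_id) -> deque of samples

        def enqueue_command(self, major, command):
            self.control.setdefault(major, deque()).append(command)

        def enqueue_sample(self, major, minor, sample):
            self.media.setdefault((major, minor), deque()).append(sample)

        def service_control(self, apply_command):
            """Service the control queues of the different major contexts in
            round-robin fashion until every control queue is empty."""
            while any(self.control.values()):
                for major, queue in list(self.control.items()):
                    if queue:
                        apply_command(major, queue.popleft())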
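
The flow-control sketch models the queue prefill before playback and the burst/stop requests driven by resource thresholds. The watermark parameters and the request_burst/request_stop callbacks are assumptions made for illustration.

    from collections import deque

    class MediaQueueMonitor:
        """Illustrative client-side queue monitoring with burst/stop requests."""

        def __init__(self, low_watermark, high_watermark, request_burst, request_stop):
            self.queue = deque()
            self.low = low_watermark
            self.high = high_watermark
            self.request_burst = request_burst   # callback asking the server for more samples
            self.request_stop = request_stop     # callback asking the server to pause sending
            self.prefilled = False

        def on_sample_received(self, sample):
            self.queue.append(sample)
            if not self.prefilled:
                if len(self.queue) >= self.high:
                    self.prefilled = True        # queue prefilled; playback may begin
            elif len(self.queue) > self.high:
                self.request_stop(len(self.queue) - self.high)

        def on_sample_rendered(self):
            if self.queue:
                self.queue.popleft()
            if self.prefilled and len(self.queue) < self.low:
                self.request_burst(self.high - len(self.queue))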
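
The recompression sketch models the fall-back path in which the server intercepts the decompressed data set, recompresses it, and forwards it to the client agent, which decompresses it for rendering. The use of zlib, the length-prefixed frame format, and the channel object are assumptions; the patent does not specify a particular codec or wire format.

    import zlib

    FRAME_HEADER_BYTES = 4   # length prefix for each recompressed frame

    def forward_decompressed_frames(frames, channel, level=6):
        """Server side: recompress each intercepted decompressed frame and
        transmit it to the client agent over the virtual channel."""
        for frame in frames:                                  # frame: bytes of decoded media
            payload = zlib.compress(frame, level)
            channel.send(len(payload).to_bytes(FRAME_HEADER_BYTES, "big") + payload)

    def receive_and_render(channel, render):
        """Client side: receive, decompress, and hand each frame to the renderer.
        The channel is assumed to return exactly the requested number of bytes."""
        while True:
            header = channel.recv(FRAME_HEADER_BYTES)
            if not header:
                break
            length = int.from_bytes(header, "big")
            render(zlib.decompress(channel.recv(length)))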

Abstract

The invention relates to the generation, at a client, of a display that includes graphical and/or media components. A method of generating a media presentation at a client includes transmitting output to the client from an application program executing on a server, identifying a media stream in the application output, receiving a compressed data set representing at least a portion of the media stream, and transmitting that compressed data set to the client. Timing and/or control information associated with the media stream is captured and transmitted to the client.
PCT/US2004/029993 2003-09-12 2004-09-13 Procede et dispositif destines a generer des affichages graphiques et multimedia au niveau d'un client WO2005029864A1 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
AU2004305808A AU2004305808A1 (en) 2003-09-12 2004-09-13 Method and apparatus for generating graphical and media displays at a thin client
CA002538340A CA2538340A1 (fr) 2003-09-12 2004-09-13 Procede et dispositif destines a generer des affichages graphiques et multimedia au niveau d'un client
EP04784000A EP1665798A1 (fr) 2003-09-12 2004-09-13 Procede et dispositif destines a generer des affichages graphiques et multimedia au niveau d'un client
JP2006526396A JP2007505580A (ja) 2003-09-12 2004-09-13 シンクライアントにおいてグラフィカルおよびメディア表示を生成するための方法および装置
IL174245A IL174245A0 (en) 2003-09-12 2006-03-12 Method and apparatus for generating graphical and media displays at a thin client

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US50257803P 2003-09-12 2003-09-12
US60/502,578 2003-09-12
US51046103P 2003-10-10 2003-10-10
US60/510,461 2003-10-10

Publications (2)

Publication Number Publication Date
WO2005029864A1 true WO2005029864A1 (fr) 2005-03-31
WO2005029864A8 WO2005029864A8 (fr) 2006-11-09

Family

ID=34381050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/029993 WO2005029864A1 (fr) 2003-09-12 2004-09-13 Procede et dispositif destines a generer des affichages graphiques et multimedia au niveau d'un client

Country Status (6)

Country Link
EP (1) EP1665798A1 (fr)
JP (1) JP2007505580A (fr)
KR (1) KR20060110267A (fr)
AU (1) AU2004305808A1 (fr)
CA (1) CA2538340A1 (fr)
WO (1) WO2005029864A1 (fr)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9198084B2 (en) 2006-05-26 2015-11-24 Qualcomm Incorporated Wireless architecture for a traditional wire-based protocol
JP2009054097A (ja) * 2007-08-29 2009-03-12 Casio Comput Co Ltd 描画データ処理装置および描画データ処理プログラム
KR101026759B1 (ko) * 2008-08-26 2011-04-08 최백준 터미널 환경의 서버 기반 컴퓨팅 시스템에서 영상 밀림, 영상 손실 및 지연 없이 동영상 재생을 그 파일 형태 및 크기와 무관하게 분산 처리하기 위한 동영상 재생 분산 처리 시스템 및 동영상 재생 분산 처리 방법
US9398089B2 (en) 2008-12-11 2016-07-19 Qualcomm Incorporated Dynamic resource sharing among multiple wireless devices
JP5476734B2 (ja) * 2009-02-19 2014-04-23 日本電気株式会社 サーバ、リモート操作システム、伝送方式選択方法、プログラム及び記録媒体
US9264248B2 (en) 2009-07-02 2016-02-16 Qualcomm Incorporated System and method for avoiding and resolving conflicts in a wireless mobile display digital interface multicast environment
US9582238B2 (en) 2009-12-14 2017-02-28 Qualcomm Incorporated Decomposed multi-stream (DMS) techniques for video display systems
JP5471794B2 (ja) 2010-05-10 2014-04-16 富士通株式会社 情報処理装置、画像送信プログラム及び画像表示方法
KR101312268B1 (ko) 2010-12-24 2013-09-25 주식회사 케이티 클라우드 컴퓨팅 환경에서 게임 서비스 제공 방법, 클라우드 컴퓨팅 서버, 및 클라우드 컴퓨팅 시스템
US9065876B2 (en) 2011-01-21 2015-06-23 Qualcomm Incorporated User input back channel from a wireless sink device to a wireless source device for multi-touch gesture wireless displays
US20130013318A1 (en) 2011-01-21 2013-01-10 Qualcomm Incorporated User input back channel for wireless displays
US10135900B2 (en) 2011-01-21 2018-11-20 Qualcomm Incorporated User input back channel for wireless displays
US8964783B2 (en) 2011-01-21 2015-02-24 Qualcomm Incorporated User input back channel for wireless displays
US9413803B2 (en) 2011-01-21 2016-08-09 Qualcomm Incorporated User input back channel for wireless displays
US9787725B2 (en) 2011-01-21 2017-10-10 Qualcomm Incorporated User input back channel for wireless displays
US10108386B2 (en) 2011-02-04 2018-10-23 Qualcomm Incorporated Content provisioning for wireless back channel
US9503771B2 (en) * 2011-02-04 2016-11-22 Qualcomm Incorporated Low latency wireless display for graphics
TW201251429A (en) * 2011-06-08 2012-12-16 Hon Hai Prec Ind Co Ltd System and method for sending streaming of desktop sharing
US9525998B2 (en) 2012-01-06 2016-12-20 Qualcomm Incorporated Wireless display with multiscreen service
TW201419868A (zh) * 2012-09-11 2014-05-16 Nec Corp 通訊系統與方法以及伺服器裝置與終端設備
CN112567751A (zh) * 2018-09-26 2021-03-26 华为技术有限公司 一种3d图形数据压缩和解压缩的方法及装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4120711B2 (ja) * 1996-11-15 2008-07-16 株式会社日立製作所 映像表示システム
JPH11341027A (ja) * 1998-05-26 1999-12-10 Canon Inc バス管理方法及び装置
JP4600875B2 (ja) * 2000-08-28 2010-12-22 ソニー株式会社 マルチメディア情報処理装置及び方法

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000033217A1 (fr) * 1998-11-30 2000-06-08 Siebel Systems, Inc. Systeme client-serveur a architecture de client minimale
WO2001075610A1 (fr) * 2000-03-31 2001-10-11 Siebel Systems, Inc. Procede et systeme pour client leger destine a generer une sortie de langue d'expedition de page a partir des appliquettes, vues et definition de l'ecran
WO2001092973A2 (fr) * 2000-05-26 2001-12-06 Citrix Systems, Inc. Procede et systeme permettant de reduire efficacement les donnees d'affichage graphique en vue de les transmettre via un mecanisme de protocole de transport a faible largeur de bande
EP1320240A2 (fr) * 2000-05-26 2003-06-18 Citrix Systems, Inc. Procédé et système permettant de réduire de manière efficace des données graphiques d'affichage destinées à une émission sur un mécanisme de protocole de transport à faible largeur de bande
US20030014476A1 (en) * 2001-01-03 2003-01-16 Peterson David Allen Thin client computer operating system
US20030065715A1 (en) * 2001-08-20 2003-04-03 Burdick William R. System and method of a wireless thin-client, server-centric framework
US20030055889A1 (en) * 2001-08-27 2003-03-20 Meng-Cheng Chen Cache method
US20040103438A1 (en) * 2002-11-27 2004-05-27 Yong Yan Methods and systems for transferring events including multimedia data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHEN J ET AL: "Multimedia over ip for thin clients : building a collaborative resource-sharing prototype", MULTIMEDIA AND EXPO, 2001. ICME 2001. IEEE INTERNATIONAL CONFERENCE ON 22-25 AUG. 2001, PISCATAWAY, NJ, USA,IEEE, 22 August 2001 (2001-08-22), pages 431 - 434, XP010661867, ISBN: 0-7695-1198-8 *
CHIA-CHEN KUO ET AL: "Design and implementation of a network application architecture for thin clients", PROCEEDINGS OF THE 26TH. ANNUAL INTERNATIONAL COMPUTER SOFTWARE AND APPLICATIONS CONFERENCE. COMPSAC 2002. OXFORD, ENGLAND, AUG. 26 - 29, 2002, ANNUAL INTERNATIONAL COMPUTER SOFTWARE AND APPLICATIONS CONFERENCE, LOS ALAMITOS, CA : IEEE COMP. SOC, US, vol. CONF. 26, 26 August 2002 (2002-08-26), pages 193 - 198, XP010611116, ISBN: 0-7695-1727-7 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009006564A2 (fr) * 2007-07-05 2009-01-08 Mediaport Entertainment, Inc. Systèmes et procédés pour la surveillance de dispositifs, de systèmes, d'utilisateurs et d'activité d'utilisateurs dans des sites éloignés
WO2009006564A3 (fr) * 2007-07-05 2009-02-26 Mediaport Entertainment Inc Systèmes et procédés pour la surveillance de dispositifs, de systèmes, d'utilisateurs et d'activité d'utilisateurs dans des sites éloignés
WO2009146938A2 (fr) * 2008-06-06 2009-12-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Interface utilisateur portable avec accès à un ordinateur hôte
WO2009146938A3 (fr) * 2008-06-06 2010-05-14 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Interface utilisateur portable avec accès à un ordinateur hôte
EP2403251A1 (fr) 2010-07-01 2012-01-04 Fujitsu Limited Transmission de mises à jour d'images d'un serveur à un client léger
US8819270B2 (en) 2010-07-01 2014-08-26 Fujitsu Limited Information processing apparatus, computer-readable non transitory storage medium storing image transmission program, and computer-readable storage medium storing image display program
US8953676B2 (en) 2010-07-01 2015-02-10 Fujitsu Limited Information processing apparatus, computer-readable storage medium storing image transmission program, and computer-readable non transitory storage medium storing image display program
US8411972B2 (en) 2010-12-03 2013-04-02 Fujitsu Limited Information processing device, method, and program
US8982135B2 (en) 2011-01-31 2015-03-17 Fujitsu Limited Information processing apparatus and image display method
GB2514777A (en) * 2013-06-03 2014-12-10 Displaylink Uk Ltd Management of memory for storing display data
GB2514777B (en) * 2013-06-03 2018-12-19 Displaylink Uk Ltd Management of memory for storing display data
EP3160156A1 (fr) * 2015-10-21 2017-04-26 Nagravision S.A. Système, dispositif et procédé pour améliorer du contenu audio-vidéo à l'aide d'images d'application

Also Published As

Publication number Publication date
WO2005029864A8 (fr) 2006-11-09
AU2004305808A1 (en) 2005-03-31
JP2007505580A (ja) 2007-03-08
EP1665798A1 (fr) 2006-06-07
KR20060110267A (ko) 2006-10-24
CA2538340A1 (fr) 2005-03-31

Similar Documents

Publication Publication Date Title
WO2005029864A1 (fr) Procede et dispositif destines a generer des affichages graphiques et multimedia au niveau d'un client
AU2009251123B2 (en) Methods and apparatus for generating graphical and media displays at a client
US5838927A (en) Method and apparatus for compressing a continuous, indistinct data stream
JP4716645B2 (ja) ドキュメントビューイング方法
US7853648B1 (en) System and method for providing interactive images
US7653749B2 (en) Remote protocol support for communication of large objects in arbitrary format
US20100005187A1 (en) Enhanced Streaming Operations in Distributed Communication Systems
US20040080533A1 (en) Accessing rendered graphics over the internet
US9325759B2 (en) Methods and apparatus for generating graphical and media displays at a client
US7769900B1 (en) System and method for providing interframe compression in a graphics session
EP1821490A1 (fr) Procédé pour la transmission de données graphiques vers un client à fonctionalités réduites
US7659907B1 (en) System and method for providing dynamic control of a graphics session

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MK MN MW MX MZ NA NI NO NZ PG PH PL PT RO RU SC SD SE SG SK SY TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IT MC NL PL PT RO SE SI SK TR BF CF CG CI CM GA GN GQ GW ML MR SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2538340

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2004784000

Country of ref document: EP

Ref document number: 2004305808

Country of ref document: AU

Ref document number: 2006526396

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 174245

Country of ref document: IL

WWE Wipo information: entry into national phase

Ref document number: 1020067005104

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 649/KOLNP/2006

Country of ref document: IN

ENP Entry into the national phase

Ref document number: 2004305808

Country of ref document: AU

Date of ref document: 20040913

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 2004305808

Country of ref document: AU

WWP Wipo information: published in national office

Ref document number: 2004784000

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1020067005104

Country of ref document: KR