US20120210205A1 - System and method for using an application on a mobile device to transfer internet media content - Google Patents


Publication number
US20120210205A1
Authority
US
United States
Prior art keywords
content
application
media
web
mobile device
Prior art date
Legal status
Abandoned
Application number
US13/370,751
Inventor
Greg Sherwood
James J. Kosmach
Osama Al-Shaykh
Richard June
Eva MacKay
Current Assignee
III Holdings 2 LLC
Original Assignee
PacketVideo Corp
Priority date
Filing date
Publication date
Priority to U.S. Provisional Application 61/463,088
Application filed by PacketVideo Corp
Priority to US 13/370,751, published as US 2012/0210205 A1
Assigned to PacketVideo Corp. (assignors: Osama Al-Shaykh, Richard June, Eva MacKay, James J. Kosmach, Greg Sherwood)
Assigned to PacketVideo Corporation (assignors: Osama Al-Shaykh, James J. Kosmach, Eva MacKay, Greg Sherwood, Richard June)
Assigned to III Holdings 2, LLC (assignor: PacketVideo Corporation)
Priority claimed from US 14/830,322, published as US 2016/0048485 A1
Application status: Abandoned


Classifications

    • H04M1/72583 Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status for operating the terminal by selecting telephonic functions from a plurality of displayed items, e.g. menus, icons
    • G06F16/44 Browsing; Visualisation therefor
    • G06F16/9574 Browsing optimisation, e.g. caching or content distillation of access to content, e.g. by caching
    • G06F16/972 Access to data in other repository systems, e.g. legacy data or dynamic Web page generation
    • H04L12/2809 Exchanging configuration information on appliance services in a home automation network indicating that an appliance service is present in a home automation network
    • H04L12/281 Exchanging configuration information on appliance services in a home automation network indicating a format for calling an appliance service function in a home automation network
    • H04L12/2812 Exchanging configuration information on appliance services in a home automation network describing content present in a home automation network, e.g. audio video content
    • H04L12/2836 Protocol conversion between an external network and a home network
    • H04N21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N21/43615 Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • H04N21/47217 End-user interface for requesting content, additional data or services, e.g. for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H04N21/4782 Web browsing, e.g. WebTV
    • H04N21/6125 Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
    • H04N21/6175 Network physical structure; Signal processing specially adapted to the upstream path of the transmission network involving transmission via Internet
    • H04L2012/2849 Audio/video appliances

Abstract

A system and a method use an application on a mobile device to transfer internet media content to a rendering device in a home network. The application may use an HTML rendering engine to display a web page to a user of the mobile device, and the web page may have controls for accessing the internet media content. The application may receive a user interaction signal which may indicate that a user invoked one of the controls for accessing the internet media content. In response, the application may initiate transfer of the internet media content to the rendering device in the home network and/or may queue the internet media content for later playback using the rendering device.

Description

  • This application claims the benefit of U.S. Provisional Application Ser. No. 61/463,088 filed Feb. 11, 2011.
  • BACKGROUND OF THE INVENTION
  • The present invention generally relates to a system and method for using an application on a mobile device to transfer internet media content to a rendering device in a home network. More specifically, the application may use an HTML rendering engine to display a web page to a user of the mobile device, and the web page may have controls for accessing the internet media content. The application may initiate transfer of the internet media content to the rendering device in the home network and/or may queue the internet media content for later playback using the rendering device.
  • The internet has a large and still growing number of content sites which offer access to digital media content, such as digital photographs, digital music files and/or streams, digital video files and/or streams, and/or the like. Such internet media content is typically accessible through web pages provided by the content site or by a third party site, such as a search engine provider or a content indexing service. Therefore, end users typically access internet media content using a web browser to discover, select and/or play back the internet media content. The internet media content is typically played using media playback capabilities of the web browser. Alternatively, the internet media content may be played by a companion media player which may be a separate application or which may be added to the web browser in the form of a “browser plug-in.”
  • For example, a user may access Flickr (trademark of Yahoo! Inc., domain at www.flickr.com) in a web browser to search for and view photographs uploaded by users of the Flickr photo sharing site. As another example, a user may visit YouTube (trademark of Google Inc., domain at www.youtube.com) to search for and view video content uploaded by users of the YouTube video sharing service. As another example, a user may pull up the Google search engine (trademark of Google Inc., domain at www.google.com) to perform a free-form search, and, as a result, the user may obtain a page of search results which includes and provides access to internet media content from various content sites. Accordingly, a user has many avenues to explore content sites and/or to discover digital media content available on the internet.
  • Web browsers were originally created for full-function personal computer devices, such as desktop and laptop PC's. However, as a result of the constant evolution of computing technology, a wide array of consumer electronics devices now have web browsers. For example, web browsers are currently present in mobile phones, smartphones, PDA's, tablet computing devices, televisions and gaming devices. Web browsers in such electronic devices often have browsing capabilities similar to the browsing capabilities of a PC browser. For example, the web browser in a modern smartphone may have full HTML rendering capabilities; may be capable of running scripting languages, such as JavaScript (trademark of Oracle America, Inc.) and Flash (trademark of Adobe Systems Inc.); may be capable of playing media content discovered through web pages; and/or similar web browsing capabilities.
  • However, web browsers on such electronic devices still lack the extensibility and flexibility of a typical PC browser. For example, web browsers for mobile phones, smartphones and PDA's typically do not have a “plug-in” architecture and, therefore, do not support the ability to accept external plug-ins or toolbars for adding new functionality to the web browser. There are practical reasons for this limitation. For example, such extensibility may present security concerns for the mobile device web browser. As another example, mobile devices often have limited screen size and user input facilities for which display of a toolbar or implementation of additional user interface functions for interaction with a plug-in may be impractical.
  • The discovery of and access to digital media content is, of course, not limited to web browsers. Multimedia home networking technologies, such as UPnP AV (trademark of UPnP Forum Non-Profit Corp.) and DLNA (trademark of Digital Living Network Alliance Corp.), allow users to consume digital media content through a growing array of consumer electronics devices. For example, a user may have a library of digital music files on a media server device in a home network, and the user may access and play back the digital music files using a DLNA-compliant networked stereo device. In a similar fashion, the user may have digital video files stored in the home network, and the user may access and play back the digital video files using a DLNA-compliant television attached to the home network. A DLNA-compliant rendering device may have an internal control point which enables the device to present a user interface by which the user may browse media servers and sources to discover, select, retrieve, and/or play back the media content available through the home network. Alternatively or additionally, a DLNA-compliant rendering device may support external control so that an external control point device and/or an external control point application may be used to send media content to the rendering device. Moreover, an external control point device and/or an external control point application may instruct the rendering device to retrieve media from a media server and/or may control the media playback on the rendering device.
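As a concrete illustration of the external control point described above, the sketch below builds the UPnP AVTransport `SetAVTransportURI` SOAP request that a control point would POST to a renderer's control URL to hand it a media URL for playback. The control URL and media URL here are hypothetical placeholders; in practice they would come from UPnP discovery and device description.

```python
# Sketch of an external control point instructing a UPnP/DLNA renderer
# to play a media URL. Builds the SetAVTransportURI SOAP request; the
# renderer's AVTransport control URL (where this would be POSTed) is
# assumed to be known from prior discovery. URLs are hypothetical.

SOAP_ACTION = '"urn:schemas-upnp-org:service:AVTransport:1#SetAVTransportURI"'

def build_set_uri_request(media_url, instance_id=0):
    """Return (headers, body) for a SetAVTransportURI SOAP call."""
    body = f"""<?xml version="1.0" encoding="utf-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>{instance_id}</InstanceID>
      <CurrentURI>{media_url}</CurrentURI>
      <CurrentURIMetaData></CurrentURIMetaData>
    </u:SetAVTransportURI>
  </s:Body>
</s:Envelope>"""
    headers = {
        "Content-Type": 'text/xml; charset="utf-8"',
        "SOAPAction": SOAP_ACTION,
    }
    return headers, body

headers, body = build_set_uri_request("http://example.com/video.mp4")
print(headers["SOAPAction"])
```

A typical sequence pairs this call with a `Play` action on the same service, which is how "instruct the rendering device to retrieve media" and "control the media playback" map onto the protocol.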
  • The assignee of the present application created a web browser plug-in product called “Twonky Beam” which enables media content discovered in a PC web browser to be sent to an external rendering device, such as a DLNA-compliant television, stereo and/or photo frame device. Embodiments of the web browser plug-in product are disclosed in the application for U.S. patent published as U.S. App. Pub. No. 2010/0332565, herein incorporated by reference in its entirety. The web browser plug-in product disclosed in U.S. App. Pub. No. 2010/0332565 utilizes the plug-in architecture available on most PC web browsers; however, this plug-in architecture may be lacking on the web browsers available for other electronic devices, such as mobile phones, smartphones and PDA's.
  • The assignee of the present application also created a networked media controller product which may reside in a home network and may provide control point functionality, media playback control, queued playback, and/or other media control functions. The networked media controller may enable other electronic devices, such as mobile devices, to access such media control functions through the home network using one or more control interfaces provided by the networked media controller. Embodiments of the networked media controller product are disclosed in the application for U.S. patent published as U.S. Pub. No. 2010/0095332, herein incorporated by reference in its entirety.
  • Many non-PC computing devices now support downloadable applications. For example, the “App Store” (trademark of Apple Inc.) provides downloadable applications for Apple devices such as the iPhone, iPad and iPod Touch (all trademarks of Apple Inc.). As another example, the Android Marketplace provides downloadable applications for use on Android OS devices, such as Android smartphones and tablets (trademark of Google Inc.). As a third example, the ability to download and execute applications is built into many devices, such as televisions, media player devices, Blu-ray players (trademark of Blu-ray Disc Association) and gaming consoles.
  • Operating systems, such as iOS (trademark of Apple Inc.) and the Android OS, provide a rich application development environment. As a result, the downloadable applications may access various OS-provided functions and services related to graphics display, user input, network access, rendering of web content, scripting, media playback, and/or the like. For example, the Android operating system has a WebView class which is capable of rendering web pages, executing JavaScript (trademark of Oracle America, Inc.) code, processing user interaction, and the like. However, current web browsers, such as the web browsers provided by iOS and the Android OS, do not have a plug-in architecture to enable the mobile device to use a plug-in product to transfer internet media content to a rendering device in a home network.
  • SUMMARY OF THE INVENTION
  • The present invention generally relates to a system and method for using an application on a mobile device to transfer internet media content to a rendering device in a home network. More specifically, the application may use an HTML rendering engine to display a web page to a user of the mobile device, and the web page may have controls for accessing the internet media content. The application may receive a user interaction signal which may indicate that a user invoked one of the controls for accessing the internet media content. In response, the application may initiate transfer of the internet media content to the rendering device in the home network and/or may queue the internet media content for later playback using the rendering device.
  • It is, therefore, an advantage of the present invention to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network.
  • Another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which use a standard HTML rendering engine to render a web page on the mobile device and provide controls in the rendered web page for accessing the internet media content.
  • And, another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which analyze web content to identify controls for accessing the internet media content.
  • A further advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which modify web content to insert additional media controls into the web content and use a standard HTML rendering engine to render the modified web content.
  • Another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which provide script fragments for processing a standard representation of a web page.
  • Yet another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which enable the application to receive a user interaction signal from a standard HTML rendering engine and respond by instructing the rendering device to render the internet media content.
  • A further advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which present playback mode selection controls for selecting a playback mode which determines media playback behavior for internet media content selected in a rendered web page.
  • Another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which have a media transcoder component provided by the application for transcoding the internet media content based on capabilities of the rendering device.
  • Yet another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which have a media server component provided by the application for proxying the transfer of internet media content to the rendering device.
  • Another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in the home network which transfer the internet media content using a rendering control component in the home network which is separate from the mobile device and the rendering device.
  • And, another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which retrieve and render a web page on the mobile device and retrieve an alternative version of the web page having controls for accessing a different version of the internet media content of the web page.
  • Yet another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which do not require that the mobile device browser have a plug-in architecture.
  • A further advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which enable web browsing to occur with enhanced multimedia functions for the internet media content.
  • Another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which use an HTML rendering engine external to the application.
  • And, another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which render the internet media content in the home network without downloading the internet media content to a media server in the home network.
  • Another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which enable the user to send the internet multimedia content to any compatible rendering device in the home network.
  • Yet another advantage of the present invention is to provide a system and a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network which implement a user interface which combines web browsing tasks with the tasks of selecting, managing and controlling rendering devices in the home network.
  • Additional features and advantages of the present invention are described in, and will be apparent from, the detailed description of the presently preferred embodiments and from the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1-4 and 6 illustrate block diagrams of systems for using an application on a mobile device to transfer internet media content to a rendering device in a home network in embodiments of the present invention.
  • FIG. 5 illustrates a block diagram of web content and modified web content used in an embodiment of the present invention.
  • FIG. 7 illustrates a flowchart of a method for using an application on a mobile device to transfer internet media content to a rendering device in a home network in an embodiment of the present invention.
  • FIGS. 8 and 9 illustrate embodiments of a user interface for using an application on a mobile device to transfer internet media content to a rendering device in a home network in embodiments of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present invention generally relates to a system and method for using an application on a mobile device to transfer internet media content to a rendering device in a home network. More specifically, the application may use an HTML rendering engine to display a web page to a user of the mobile device, and the web page may have controls for accessing the internet media content. The web page may be based on web content retrieved from a content source. The application may analyze the web content to identify the controls for accessing the internet media content, and/or the application may modify the web content to insert additional media controls. The application may receive a user interaction signal which may indicate that a user invoked one of the controls for accessing the internet media content and/or one of the additional media controls. In response, the application may initiate transfer of the internet media content to the rendering device in the home network and/or may queue the internet media content for later playback using the rendering device.
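The "analyze and modify" step above can be sketched minimally as follows: scan fetched HTML for attributes that reference media and append an additional control after each matching tag. This is an illustrative simplification, not the patented implementation; the `send-to-renderer` control markup and the extension-based heuristic are assumptions, and a real application would use a full DOM and site-specific rules.

```python
# Minimal sketch of analyzing web content for media references and
# injecting an additional "Send to TV" control after each match.
# The button markup and media-extension heuristic are hypothetical.
import re

MEDIA_EXTENSIONS = (".mp4", ".mp3", ".jpg", ".png")
SEND_BUTTON = '<button class="send-to-renderer" data-url="{url}">Send to TV</button>'

def find_media_urls(html):
    """Collect src/href attribute values that look like media content."""
    urls = re.findall(r'(?:src|href)="([^"]+)"', html)
    return [u for u in urls if u.lower().endswith(MEDIA_EXTENSIONS)]

def inject_controls(html):
    """Insert a send-to-renderer control after each media-bearing tag."""
    def add_button(match):
        tag, url = match.group(0), match.group(1)
        if url.lower().endswith(MEDIA_EXTENSIONS):
            return tag + SEND_BUTTON.format(url=url)
        return tag
    return re.sub(r'<[^>]*(?:src|href)="([^"]+)"[^>]*>', add_button, html)

page = '<p><a href="http://example.com/clip.mp4">Watch</a></p>'
print(inject_controls(page))
```

The modified HTML is then handed to the standard HTML rendering engine unchanged; when the user taps the injected control, the resulting user interaction signal carries the media URL back to the application.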
  • Referring now to the drawings wherein like numerals refer to like parts, FIG. 1 generally illustrates a network configuration 10 in which an embodiment of the present invention may operate. The network configuration 10 may have a mobile device 11 connected to a local network 12 which may have one or more rendering devices. For example, the local network 12 may have a first rendering device 21, a second rendering device 22, and/or a third rendering device 23 (collectively hereafter “the rendering devices 21, 22, 23”).
  • The present invention is not limited to a specific number of rendering devices, and the local network 12 may have any number of rendering devices.
  • The local network 12 may be, for example, a local area network, a home network, an office network, and/or the like. The local network 12 may be based on one or more network technologies such as IEEE 802.11 (“WiFi”), Ethernet, Firewire (trademark of Apple Inc.), Multimedia over Coax (“MoCA”), Bluetooth, and/or the like. The local network 12 is not limited to these examples, and the local network 12 may incorporate any network connection technology known to one skilled in the art. The local network 12 may incorporate various connection technologies using wires, routers, network adapter devices, antennas and/or the like.
  • The local network 12 may be connected to and/or may be in communication with the internet 13 and/or a similar wide area network which provides access to content sources. For example, the local network 12 may use a cable modem, a DSL modem, a home networking router, and/or a similar device which provides an internet connection. The present invention is not limited to a specific technique for connecting the local network 12 to the internet 13, and the local network 12 may be connected to the internet 13 using any means known to one skilled in the art.
  • The mobile device 11 may be a mobile phone, a smartphone, a personal digital assistant, a laptop PC, a tablet device, a portable gaming device, a portable media player, and/or the like. The mobile device 11 may be a general purpose computing device with capability to execute applications. The mobile device 11 may have pre-installed applications. Further, the mobile device 11 may enable a user 15 of the mobile device 11 to download and/or install applications which may be subsequently accessed and used on the mobile device 11. The mobile device 11 may have web browsing capabilities, media playback capabilities, networking capabilities, and/or the like.
  • As previously set forth, the local network 12 may have and/or may be connected to one or more rendering devices, such as, for example, the rendering devices 21, 22, 23. Each of the one or more rendering devices may be a networked television device, a networked stereo device, a digital photo frame device, a desktop PC, a laptop PC, another mobile device, a gaming console, and/or the like. One or more of the rendering devices may be compatible with multimedia networking standards, such as, for example, UPnP AV and/or DLNA. The one or more rendering devices may have rendering capabilities; for example, the one or more rendering devices may be capable of rendering multimedia content of a particular format, multimedia content encoded using particular codecs, multimedia content encoded using a particular set of properties, multimedia content encoded using particular profiles and/or levels, multimedia content protected by a particular DRM technology, multimedia content transported using a particular streaming format, and/or the like.
  • The rendering devices may be capable of advertising and/or communicating the rendering capabilities to other devices. For example, the rendering devices may communicate the rendering capabilities based on UPnP discovery and device description processes and/or based on other techniques, such as exchange of an SDP session description. The present invention is not limited to these example techniques, and the rendering devices may communicate the rendering capabilities using any technique known to one skilled in the art.
  • The user 15 may use the mobile device 11 to browse the web, to view and interact with web pages, and to access and play back internet multimedia content discovered using the web pages. The mobile device 11 may request and/or may retrieve web content from one or more content sources available through the internet 13. For example, the mobile device 11 may use the internet 13 to request and/or retrieve web content from a first content source 31, a second content source 32 and/or a third content source 33 (collectively hereafter “the content sources 31, 32, 33”). The present invention is not limited to a specific number of content sources, and the mobile device 11 may use the internet 13 to request and/or retrieve web content from any number of content sources.
  • The mobile device 11 may display the web content as one or more web pages to the user 15. The user 15 may interact with the web pages to search for, discover and/or select internet multimedia content. The mobile device 11 may play back internet multimedia content selected from the web pages by the user 15. Further, the mobile device 11 may present enhanced multimedia functions, such as, for example, transferring the selected internet multimedia content to one of the rendering devices in the local network 12, and/or adding the selected internet multimedia content to a playback queue. For example, the mobile device 11 may transfer the selected internet multimedia content to one of the rendering devices 21, 22, 23.
  • The content sources, such as, for example, the content sources 31, 32, 33, may provide various types of internet content. For example, the content sources may provide web content, web pages, search results, text, web formatting, images, graphics, style sheets, URLs, digital audio files, digital audio streams, digital video files, digital video streams, scripting code, embedded media players, and/or the like. The scripting code may be, for example, JavaScript (trademark of Oracle America, Inc.), ECMAScript, Flash (trademark of Adobe Systems Inc.) and/or the like. Each of the content sources may be, for example, a web server, a media server, a search engine, a content service, a media sharing site, a social networking site, a news site, a sports site, and/or the like. One or more of the content sources may be multiple independent servers; for example, a content source may be a server farm with well-known properties, such as load balancing and failover.
  • A single web page may reference and/or may embed elements from various servers and/or content sources. Thus, retrieval and/or display of a single web page may require the mobile device 11 to request and/or retrieve elements from multiple different web servers, media servers and/or content sources. Therefore, actions for requesting and/or retrieving a web page, as discussed in more detail hereafter, encompass all of the requests and retrievals required to obtain the various elements necessary to construct and/or to render the web page.
  • The local network 12 may have and/or may be connected to a rendering control component 25 which may be external to the mobile device 11. If available, the rendering control component 25 may provide various rendering control functions to other devices in the local network 12. For example, the rendering control component 25 may provide standard UPnP AV and/or DLNA functions, such as device discovery, capability exchange, and/or control point functions. The rendering control component 25 may instruct a rendering device in the local network 12, such as one of the rendering devices 21, 22, 23, to request, retrieve, and/or render media content. The rendering control component 25 may control how a rendering device in the local network 12 renders the media content. For example, the rendering control component 25 may raise or lower the playback volume. Further, the rendering control component 25 may move playback forward or backward in a media object and/or in a playlist of media objects. The rendering control component 25 may provide queued playback to a rendering device in the local network 12. For example, the rendering control component 25 may internally maintain a queue state which tracks media objects which the rendering control component 25 is directing a rendering device to play back in sequence.
  • The rendering control component 25 may provide an interface by which another device, such as the mobile device 11, may access, invoke, and/or control the various rendering control functions provided by the rendering control component 25. In an embodiment, the rendering control component 25 may be the “Control Element 100” of U.S. App. Pub. No. 2010/0095332, herein incorporated by reference in its entirety. The present invention is not limited to a specific embodiment of the rendering control component 25, and the rendering control component 25 may be any known component for providing rendering control functions to other devices in the local network 12.
  • The rendering control component 25 may have additional functionality. For example, the rendering control component 25 may have a media server element capable of proxying internet media content and/or making proxied internet media content available to a rendering device in the local network 12. The media server element may support well-known server protocols; for example, the media server element may be a UPnP AV compliant media server. The rendering control component 25 may have a media transcoder element capable of transcoding the internet media content for compatibility with the rendering capabilities of a rendering device.
  • In an embodiment, the mobile device 11 may utilize the rendering control component 25 to transcode and/or to proxy the internet media content before the internet media content is transferred to a rendering device in the local network 12. In another embodiment, the mobile device 11 may transcode and/or may proxy the internet media content, and, as a result, the mobile device 11 may provide such functions without relying on the rendering control component 25.
  • FIG. 2 generally illustrates a system 40 for using an application 41 executed by the mobile device 11 to transfer internet media content to a rendering device 42 in the local network 12. In an embodiment, the local network 12 may be a home network. The rendering device 42 may be, for example, one of the rendering devices 21, 22, 23. The application 41 may use an HTML rendering engine 48 to present enhanced media functions to the user 15 of the mobile device 11. In an embodiment, the HTML rendering engine 48 may be an element provided by the mobile device 11 separate from the application 41. For example, the HTML rendering engine 48 may be provided by the operating system of the mobile device 11. The enhanced media functions may enable the user 15 to initiate transfer of selected internet media content to the rendering device 42 in the local network 12, to control the rendering of the internet media content, to add the internet media content to a playback queue, and/or the like.
  • The application 41 may be pre-installed on the mobile device 11. For example, the application 41 may be installed on the mobile device 11 before the mobile device 11 is purchased by the user 15. Alternatively, the application 41 may be installed at the direction of the user 15. For example, the application 41 may be downloaded from an application store at the direction of the user 15. The application 41 may be provided by a non-transitory computer-readable medium, such as register memory, processor cache and Random Access Memory (RAM), which may have program instructions executed by the mobile device 11.
  • The user 15 may interact with the application 41 and/or the HTML rendering engine 48 through one or more user interface components of the mobile device 11, such as a user interface 49 of the mobile device 11. In an embodiment, the application 41 may be used for web browsing instead of the default web browser of the mobile device 11. As discussed in more detail hereafter, the user interface 49 may be the user interface facilities provided by the operating system of the mobile device 11.
  • The HTML rendering engine 48 may be capable of rendering web content. For example, the HTML rendering engine 48 may process web content to create a rendered web page. Then, the rendered web page may be displayed to the user 15 of the mobile device 11.
  • The HTML rendering engine 48 may have additional functionality related to the rendering of web content. For example, the HTML rendering engine 48 may retrieve the web content corresponding to a web page from one or more web servers, media servers, and/or content sources. For example, the HTML rendering engine 48 may retrieve the web content corresponding to a web page from a content source 44, and the content source 44 may be one of the content sources 31, 32, 33. The HTML rendering engine 48 may construct, may maintain and/or may provide access to a structured representation of the web page and elements related to the web page. For example, the HTML rendering engine 48 may support a Document Object Model (“DOM”) as known to one skilled in the art. As a result, the HTML rendering engine 48 may enable the application 41 to use the DOM to access the web page and/or elements related to the web page. As another example, the HTML rendering engine 48 may have one or more scripting engines. As a result, the HTML rendering engine 48 may execute scripting code, such as JavaScript, ECMAScript, Flash and/or the like. The scripting code may be included in the web page as originally retrieved from a content source. Alternatively, the scripting code may be added to the web page by the application 41 and/or may be directly communicated to the HTML rendering engine 48 by the application 41.
  • The HTML rendering engine 48 may provide a rendered web page for display on the mobile device 11. The rendered web page may be displayed within a larger application user interface created and/or maintained by the application 41, and the application user interface may be displayed in the user interface 49 of the mobile device 11. Various communication models known to one skilled in the art may support display of the rendered web page within a larger application user interface. As a first example, the application 41 may designate a portion of a screen area for use by the HTML rendering engine 48, and subsequently the HTML rendering engine 48 may be responsible for rendering the rendered web page in the designated portion of the screen area as needed. The HTML rendering engine 48 may communicate directly with the user interface 49 of the mobile device 11 to display the rendered web page.
  • As a second example, the application 41 may request the rendered web page directly from the HTML rendering engine 48 as needed, and the application 41 may subsequently create and/or maintain an application user interface which provides the rendered web page. In a preferred embodiment, the user interface 49 of the mobile device 11 may provide the application user interface. The present invention is not limited to a specific means for displaying the rendered web page to the user 15 of the mobile device 11. The division of responsibility for displaying the application user interface and/or the data flow by which the rendered web page is displayed may depend on the user interface facilities available from the operating system of the mobile device 11.
  • The HTML rendering engine 48 may support user interaction. For example, the HTML rendering engine 48 may detect user selection of a particular control, icon, button and/or link within the web page. As a result, the HTML rendering engine 48 may communicate to the application 41 that the user 15 selected the control, the icon, the button and/or the link. Alternatively or additionally, the HTML rendering engine 48 may detect lower level user interaction events. For example, the HTML rendering engine 48 may detect that the user 15 pressed a physical button on the mobile device 11, moved a pointing device, touched a particular location and/or area of a touchscreen display, rotated the mobile device 11 from one display orientation to another, and/or the like. As a result, the HTML rendering engine 48 may communicate such lower level user interaction events to the application 41.
  • The HTML rendering engine 48 may support additional browsing functions. For example, the HTML rendering engine 48 may maintain a browsing history to enable the user 15 and/or the application 41 to access history functions, such as a “Go Back” function to return to the previous web page and/or a “Go Forward” function to move to the next web page in the browsing history. The HTML rendering engine 48 may support functions to add the current web page to a favorites list. The HTML rendering engine 48 may support in-page navigation techniques, such as panning, vertical scrolling, zooming the page view in and out, and/or the like. The HTML rendering engine 48 may support standard browsing interaction techniques, such as tables, text boxes, and/or form submission.
  • The HTML rendering engine 48 may support, may facilitate and/or may enable playback on the mobile device 11 of internet media content selected from a web page by the user 15. The HTML rendering engine 48 may access a default media player and/or another media playback resource to support the playback of the selected internet media content.
  • The HTML rendering engine 48 may be provided by the operating system of the mobile device 11. Alternatively, the HTML rendering engine 48 may be provided by another component of the mobile device 11 and/or may be accessible to the application 41. For example, the HTML rendering engine 48 may provide one or more classes, function calls, application programming interfaces (“API's”), libraries and/or the like by which the application 41 may access the functionality of the HTML rendering engine 48.
  • The HTML rendering engine 48 may be any collection of one or more components, services, classes, libraries and/or API's which provide the functionality described herein for the HTML rendering engine 48. The present invention does not require that a single component, service, class, library or API implements all of the functionality of the HTML rendering engine 48. For example, the operating system of the mobile device 11 may have one service which renders web content and may have a different service which handles user interaction for the displayed web content, and each of the two services may be a portion of the HTML rendering engine 48.
  • The operating system of the mobile device 11 may have a collection of classes, functions, API's, libraries, services and/or the like for displaying the user interface 49 of the mobile device 11 and/or for enabling the user 15 to interact with the user interface 49. The application 41 may communicate with the user interface 49 to display the application user interface. For example, the application 41 may display text, graphics, controls and/or the like which are not part of the rendered web page. As previously set forth, the application 41 may be responsible for obtaining the rendered web page from the HTML rendering engine 48 and/or communicating the rendered view to the user interface 49 for display to the user 15. Alternatively, the HTML rendering engine 48 may directly communicate the rendered web page to the user interface 49 through a communication connection 50 as shown in FIG. 2. Examples of the user interface 49 in embodiments of the present invention are detailed in FIGS. 7 and 8.
  • Referring again to FIG. 2, the application 41 may communicate with the HTML rendering engine 48 to request the retrieval of a new web page, analyze web content for the current web page, modify the web content of the current web page, receive user interaction signals, and/or the like. Specific examples of such communication are provided in subsequent figures.
  • The mobile device 11 may have a network interface 52. The application 41 and/or the HTML rendering engine 48 may use the network interface 52 to communicate with the local network 12 and/or the internet 13. The application 41 and/or the HTML rendering engine 48 may communicate with one or more content sources through the network, such as, for example, the content sources 31, 32, 33. For example, the application 41 and/or the HTML rendering engine 48 may request web content from a content source, and the content source may respond by transmitting the web content to the application 41 and/or the HTML rendering engine 48 through the local network 12 and/or the internet 13.
  • The application 41 may use the network interface 52 and/or the local network 12 to communicate with the rendering control component 25 and/or the rendering device 42, such as one of the rendering devices 21, 22, 23. For example, the application 41 may instruct the rendering control component 25 to initiate rendering of media content on the rendering device 42. As another example, the application 41 may instruct the rendering control component 25 to add media content to a playback queue maintained by the rendering control component 25. As yet another example, the application 41 may directly instruct the rendering device 42 to initiate rendering of media content and/or may directly communicate with the rendering device 42 to control the rendering of media content.
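As a non-limiting sketch of how an application could directly instruct a UPnP AV renderer as described above, the following builds the SOAP body for the standard AVTransport “SetAVTransportURI” action, by which a control point hands a media URI to a rendering device. The media URL is a hypothetical placeholder; transmitting the body over HTTP POST to the renderer's control URL is omitted.

```python
from xml.sax.saxutils import escape

def set_av_transport_uri_body(media_url):
    # SOAP envelope for the UPnP AV "SetAVTransportURI" action on
    # AVTransport service version 1; InstanceID 0 is the default instance.
    return (
        '<?xml version="1.0"?>'
        '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" '
        's:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">'
        '<s:Body>'
        '<u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">'
        '<InstanceID>0</InstanceID>'
        f'<CurrentURI>{escape(media_url)}</CurrentURI>'
        '<CurrentURIMetaData></CurrentURIMetaData>'
        '</u:SetAVTransportURI>'
        '</s:Body></s:Envelope>'
    )

body = set_av_transport_uri_body("http://example.com/video.mp4")
```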
  • FIG. 3 generally illustrates a system 60 for using the application 41 to retrieve, analyze and/or modify web content for the purpose of transferring internet media content to a rendering device in an embodiment of the present invention. The system 60 may use the structure and connections previously described for FIG. 2.
  • Referring again to FIG. 3, the application 41 may use a content request transmission 61 to send a content request to a content source 55. The content source 55 may be, for example, one of the content sources 31, 32, 33 and the content source 44. In response, the application 41 may receive a web content transmission 62 from the content source 55. The web content transmission 62 may provide the web content requested by the application 41. In this way, the application 41 may send one or more content requests to one or more content sources to obtain the web content corresponding to a web page. Each content request may be an HTTP GET request or a similar request as known to one skilled in the art.
  • Each content request may have a requester identification field which may identify the mobile device 11, the web browser, the software client, and/or the application sending the content request. For example, the content request may identify the mobile device 11 and/or the application 41. In an embodiment, the requester identification field may be a “user-agent” field. The content source 55 may adapt the web content based on the requester identification field. For example, the content source 55 may format a web page differently depending on whether the requester identification field indicates a mobile device web browser targeting a small screen on a mobile device, or a PC web browser targeting a high resolution computer monitor. Therefore, the application 41 may use a requester identification field associated with the mobile device 11 and/or a mobile device web browser to obtain a version of the web content suitable for display on the mobile device 11.
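The content request with a mobile “user-agent” field described above could be formed as in the following sketch; the user-agent string shown is illustrative, not a string used by any particular product.

```python
import urllib.request

# Illustrative mobile user-agent string; a real application would use
# the string appropriate for its platform and browser engine.
MOBILE_UA = "Mozilla/5.0 (Linux; Android) AppleWebKit/537.36 Mobile Safari/537.36"

def build_content_request(url):
    # An HTTP GET request whose requester identification field signals
    # a mobile browser, so the content source returns its mobile layout.
    req = urllib.request.Request(url)
    req.add_header("User-Agent", MOBILE_UA)
    return req

req = build_content_request("http://example.com/videos")
print(req.get_header("User-agent"))  # the mobile UA string
```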
  • The application 41 may analyze the web content to identify media controls for accessing internet media content. The media controls may enable the user 15 to select, access, retrieve and/or play back the internet media content using a standard web browser. For example, the application 41 may search the web content for HTML “<a>” tags which link to media content files available from an internet content source. As another example, the application 41 may search the web content for scripting code which is capable of accessing media content from an internet content source. The application 41 may identify the media controls which may be links, active thumbnail images, icons, buttons, regions of the displayed web page, scripting code segments, and/or the like.
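The search for HTML “&lt;a&gt;” tags linking to media content files could be sketched as follows, assuming a simple heuristic of matching known media file extensions (the extension list is illustrative):

```python
from html.parser import HTMLParser

# Illustrative set of file extensions treated as internet media content.
MEDIA_EXTENSIONS = (".mp3", ".mp4", ".flv", ".mov", ".avi", ".wmv")

class MediaLinkFinder(HTMLParser):
    """Collects the href of each <a> tag that links to a known media file type."""
    def __init__(self):
        super().__init__()
        self.media_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        if href.lower().endswith(MEDIA_EXTENSIONS):
            self.media_links.append(href)

finder = MediaLinkFinder()
finder.feed('<p><a href="/clips/cats.flv">Cats</a> <a href="/about.html">About</a></p>')
print(finder.media_links)  # ['/clips/cats.flv']
```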
  • The application 41 may modify the web content to insert additional media controls. The additional media controls may be provided using links, graphic images, icons, buttons, scripting code, and/or the like. The application 41 may insert the additional media controls in proximity to corresponding media controls identified during the web content analysis described above. Moreover, the application 41 may create and/or may insert the additional media controls so that the additional media controls are displayed in proximity to the corresponding media controls when the web page is displayed by the HTML rendering engine 48.
  • For example, the application 41 may analyze the web content to detect a first media control for accessing first internet media content. The first media control may enable the user 15 to play back the first internet media content in the web browser and/or using a media player helper application selected by the web browser. After detecting the first media control, the application 41 may modify the web content to insert a first additional media control and/or a second additional media control for display in proximity to the first media control. For example, the first additional media control may correspond to transferring the internet media content to an external rendering device, and/or the second additional media control may correspond to adding the internet media content to a playback queue. The first additional media control and/or the second additional media control may have, may include and/or may correspond to a link which the user 15 may invoke in the web page rendered from the web content, and the link may reference a URL created by and/or known to the application 41. Alternatively, the first additional media control and/or the second additional media control may correspond to scripting code which may respond to user input in the web page rendered from the web content. For example, the scripting code may interact with the application using a function call, a class instance, an API, and/or a similar mechanism in response to user input in the web page rendered from the web content.
  • These steps for analyzing and modifying the web content may utilize information about organizational and/or formatting properties of a content site and/or a search engine. For example, a content site may provide access to a library of video content and may have a particular way of organizing, formatting and/or presenting search results in order to present active links to the video content relevant to the search. For example, a specific content site may provide web pages with twenty video results per page, and the controls for accessing the individual video contents may be JPEG thumbnail images which are active links pointing to HTTP URLs with a .flv (Flash Video File) extension. The application 41 may use this information when analyzing web content retrieved from the specific content site. As a result, the application 41 may identify the various JPEG thumbnail images which form the media controls for accessing internet video content from the specific content site.
  • Moreover, the application 41 may have information regarding how to optimally insert additional media controls into the web content provided by a specific content site. The information may be, for example, a graphical style for the additional media controls, an absolute or relative location for placement of the additional media controls, a parameter for sizing the additional media controls, a list of the additional media controls relevant to the specific content site, and/or a rule for determining the location to place the additional media controls within web content, such as HTML code and/or scripting code. The information may enable the application 41 to insert the additional media controls with minimal impact to the graphical layout and appearance of the resulting web page. The information may be programmed into the application 41 and/or may be retrieved by the application 41 from a database of information corresponding to specific content sites. Thus, the information corresponding to popular internet content sites may be maintained in the database and/or may be updated based on changes to the organizational and/or formatting properties of the specific content sites. The database may be accessible to the application 41 through the local network 12 and/or the internet 13.
  • For the specific content site of the preceding example, the application 41 may identify the HTML <a> tag which contains the JPEG thumbnail image for accessing a particular internet video content stream. The application 41 may have information for the particular content site, and the information may have a predetermined rule for inserting two additional HTML <a> tags which reference and display graphic button images corresponding to the first additional media control and/or the second additional media control. The application 41 may process the web content corresponding to the complete search result page in a similar manner. As a result, the application 41 may identify each of the twenty JPEG thumbnail images present in the web content of the search result page from the particular content site. Then, the application 41 may modify the web content by inserting forty additional HTML <a> tags representing the additional media controls, namely the graphic button images which expose enhanced media functionality provided by the application 41.
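The insertion of the two additional media controls beside each identified media control could be sketched as below. The `pvapp://` handler URLs and button image names are hypothetical, illustrating the kind of application-known links that the additional controls could reference.

```python
import re

# Hypothetical additional media controls: a "transfer to renderer" button
# and an "add to queue" button, each a link to a URL known to the application.
EXTRA_CONTROLS = (
    '<a href="pvapp://transfer?src={src}"><img src="btn_transfer.png"></a>'
    '<a href="pvapp://queue?src={src}"><img src="btn_queue.png"></a>'
)

def insert_media_controls(html):
    # Find each <a> tag linking to a .flv video (including its enclosed
    # thumbnail image) and append the two additional controls after it.
    pattern = re.compile(r'(<a\s+[^>]*href="([^"]+\.flv)"[^>]*>.*?</a>)',
                         re.IGNORECASE | re.DOTALL)
    def add_buttons(match):
        return match.group(1) + EXTRA_CONTROLS.format(src=match.group(2))
    return pattern.sub(add_buttons, html)

page = '<a href="/v/clip1.flv"><img src="thumb1.jpg"></a>'
print(insert_media_controls(page))
```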
  • The application 41 may access a database which may have scripting code fragments. Each of the scripting code fragments may correspond to a content source. In an embodiment, each of the scripting code fragments may analyze web content provided by the corresponding content source to identify media controls for accessing internet media content available from the content source. The application 41 may query the database for scripting code fragments corresponding to a particular content source and may retrieve one or more scripting code fragments from the database to analyze web content from the particular content source. In an embodiment, the database may have scripting code fragments which modify web content provided by the content source to insert additional media controls. The application 41 may retrieve one or more scripting code fragments from the database to insert media controls into the web content from a particular content source. In an embodiment, the database may be accessible to the application using the internet 13.
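A minimal sketch of such a per-site fragment lookup follows, assuming a simple mapping keyed by content-source hostname; the hostnames and fragment contents are hypothetical placeholders.

```python
from urllib.parse import urlparse

# Hypothetical database of scripting code fragments keyed by content
# source hostname; each fragment would locate that site's media controls.
FRAGMENT_DB = {
    "videos.example.com": "document.querySelectorAll('a.thumb')",
    "music.example.org": "document.querySelectorAll('li.track a')",
}

def fragments_for(page_url):
    # Return the fragment for the page's content source, or None so the
    # caller can fall back to default analysis logic.
    host = urlparse(page_url).hostname
    return FRAGMENT_DB.get(host)

print(fragments_for("http://videos.example.com/search?q=cats"))
```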
  • The application 41 may not have information regarding the organizational properties and/or the formatting properties of a particular content site. For example, the information may be missing from a database built into the application 41 and/or a remote information database accessible to the application 41. In this case, the application 41 may have default logic for analyzing the web content to identify the media controls for accessing internet media content and/or for modifying the web content to insert the additional media controls. For example, the application 41 may search the web content for structures commonly used to access internet media content. Such structures may be HTML <a> tags which link to objects which have a MIME type or a file extension associated with internet media content. Further, such structures may be embedded scripting code which matches code known to access internet media content. The application 41 may have a default approach to inserting the additional media controls in web content from content sites for which the application has no information. For example, the application 41 may have default button images which correspond to the additional media controls and/or which may be inserted into the web content in proximity to the identified media controls.
  • In an embodiment, the application 41 may access, may examine and/or may analyze a description of the web page to identify the media controls for accessing internet media content. The description of the web page may have a page source which may include a markup source, links, scripts and/or active objects. The markup source may include, for example, HTML, xHTML, XML and/or the like. The links may be, for example, URLs which may reference additional markup source, scripts, active objects and/or media content. The scripts and/or the active objects may include, for example, JavaScript, ECMAScript, VBScript, Flash ActionScript, and/or code written in other scripting languages which may be executed during interaction with and/or rendering of the web page. Alternatively, the description of the web page may be an internal representation of a previously retrieved and/or parsed web page. For example, the description of the web page may be a Document Object Model (“DOM”) representation of a webpage accessed using a standard API provided by a web browser as known to one having ordinary skill in the art. The DOM representation may enable the application 41 to access the structure, the content, the links, the scripts and/or the active objects of the web page. The present invention is not limited to a specific embodiment of the description of the web page, and the present invention may utilize any description of the web page known to one having ordinary skill in the art.
  • The identification of the media controls for accessing internet media content may utilize a set of known media types, file types, file extensions and/or MIME types relevant to media content to identify the internet media content. Relevant image file types may be, for example, bitmap files, JPEG files, TIFF files, PNG files, SVG files and/or the like. Relevant audio file types may be, for example, MP3 files, AAC audio files, Windows Media Audio files, FLAC files, Ogg audio files and/or the like. Relevant video types may be, for example, Flash Video files, MP4 files, 3GPP media files, 3GPP2 media files, Windows Media Video files, AVI files, ASF files, QuickTime files, Ogg video files and/or the like. The identification of the internet media content is not limited to file detection, and streaming representations of the various media types may be detected. For example, “rtsp” links that direct to streams representing audio content and/or video content may be identified.
  • The relevant media types may be detected using known file extensions. For example, JPEG image files typically have a “.jpg” extension, MP3 audio files typically have a “.mp3” extension, and QuickTime files typically have a “.mov” extension. Alternatively, the relevant media types may be detected using known MIME type associations as defined by the Internet Assigned Numbers Authority (IANA). For example, JPEG image files may be associated with an “image/jpeg” description, MP3 audio files may be associated with an “audio/mpeg” description, and MP4 video files may be associated with a “video/mp4” description. Therefore, the application 41 may analyze the description of the web page for web content, links and/or references which have the known media types, file types, file extensions and/or MIME types associated with the web content.
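The extension-based and MIME-based checks described above may be sketched in JavaScript as follows. The extension table and MIME prefixes shown are illustrative examples rather than an exhaustive set, and the function names are not part of the described system.

```javascript
// Illustrative sketch of extension- and MIME-based media type detection.
// The lists below are small example sets; a real implementation would
// cover many more types.
var MEDIA_EXTENSIONS = {
  ".jpg": "image", ".jpeg": "image", ".png": "image",
  ".mp3": "audio", ".aac": "audio", ".flac": "audio",
  ".mp4": "video", ".mov": "video", ".flv": "video"
};
var MEDIA_MIME_PREFIXES = ["image/", "audio/", "video/"];

// Classify a URL by its file extension; returns "image", "audio",
// "video", or null if the extension is not recognized.
function mediaTypeFromUrl(url) {
  var path = url.split("?")[0].split("#")[0];
  var dot = path.lastIndexOf(".");
  if (dot < 0) return null;
  var ext = path.substring(dot).toLowerCase();
  return MEDIA_EXTENSIONS[ext] || null;
}

// Classify a MIME type string such as "audio/mpeg".
function mediaTypeFromMime(mime) {
  for (var i = 0; i < MEDIA_MIME_PREFIXES.length; i++) {
    if (mime.indexOf(MEDIA_MIME_PREFIXES[i]) === 0) {
      return MEDIA_MIME_PREFIXES[i].replace("/", "");
    }
  }
  return null;
}
```

Either helper may be applied to links and references found in the description of the web page; a link may be classified by its extension first, with the MIME check used when a protocol exchange supplies a MIME type.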
  • In an embodiment, protocol exchanges with a remote web server and/or media server may be observed, may be initiated and/or may be analyzed. The protocol exchanges may be observed, may be initiated and/or may be analyzed to recognize the media types, the file types, the file extensions and/or the MIME types. For example, the MIME type associated with media content may be returned in response to a HTTP GET message requesting the media content. Thus, header information in an HTTP GET protocol exchange may be analyzed to determine whether the MIME type of the media content sent in response corresponds to a known media type.
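The header analysis described above may be sketched as follows, assuming the raw response headers of the protocol exchange are available as a string. The list of known media MIME types is an illustrative example.

```javascript
// Illustrative sketch: inspect HTTP response headers from a protocol
// exchange to decide whether the returned object is media content.
// The known-type list is an example, not an exhaustive set.
var KNOWN_MEDIA_MIME_TYPES = [
  "image/jpeg", "image/png",
  "audio/mpeg", "audio/mp4",
  "video/mp4", "video/x-flv"
];

// Extract the Content-Type value (without parameters such as charset)
// from a raw header block and report whether it is a known media type.
function isMediaResponse(rawHeaders) {
  var lines = rawHeaders.split("\r\n");
  for (var i = 0; i < lines.length; i++) {
    var m = lines[i].match(/^content-type:\s*([^;]+)/i);
    if (m) {
      return KNOWN_MEDIA_MIME_TYPES.indexOf(m[1].trim().toLowerCase()) >= 0;
    }
  }
  return false;
}
```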
  • In an embodiment, a portion of a media content object of the web page may be requested using a link and/or a reference to the media content object identified using the description of the web page. Analysis of the portion of the object may be used to determine whether to identify the media content object as a whole as internet media content. For example, most media content types have up-front identifiers, known to one having ordinary skill in the art as “Magic Numbers,” placed at and/or near the front of the media content file. The up-front identifiers may be sufficient to identify the object as a media content file. For example, a Flash video file may begin with an up-front identifier of an ASCII representation of “FLV.” As a further example, leading portions of an MP4 or 3GPP file may have an up-front identifier of an “ftyp” atom having recognizable brands represented in ASCII form as “3gp4,” “3gp5,” “isom,” “mp41” and/or other brands. The definition of the recognizable brands may be found in standard specifications from ISO/IEC, 3GPP and/or other standards organizations, and such brands are known to one having ordinary skill in the art. Thus, the identification of the media controls for accessing internet media content may involve requesting a portion of a media content object to enable the portion of the media content object to be parsed and/or analyzed to determine whether up-front identifiers and/or other identifying information are present.
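The up-front identifier checks described above may be sketched as follows, operating on the leading bytes of a partially requested object. The brand list is a small illustrative subset of the brands defined in the relevant standards.

```javascript
// Illustrative sketch of "Magic Number" detection on the leading bytes
// of a partially downloaded media object.
var MP4_BRANDS = ["3gp4", "3gp5", "isom", "mp41"]; // example subset

// Read `length` bytes starting at `offset` as an ASCII string.
function asciiAt(bytes, offset, length) {
  var s = "";
  for (var i = 0; i < length; i++) s += String.fromCharCode(bytes[offset + i]);
  return s;
}

// Returns "flv", "mp4", or null for the leading bytes of a file.
function identifyByMagicNumber(bytes) {
  // Flash video files begin with the ASCII characters "FLV".
  if (bytes.length >= 3 && asciiAt(bytes, 0, 3) === "FLV") return "flv";
  // MP4/3GPP files begin with a 4-byte box size, then "ftyp", then the brand.
  if (bytes.length >= 12 && asciiAt(bytes, 4, 4) === "ftyp") {
    var brand = asciiAt(bytes, 8, 4);
    if (MP4_BRANDS.indexOf(brand) >= 0) return "mp4";
  }
  return null;
}
```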
  • The identification of the media controls for accessing internet media content may use media publication and/or syndication standards, such as, for example, RSS, to detect the media controls. For example, if the web page has and/or references an RSS feed, the identification of the media controls for accessing internet media content may involve analysis of the RSS feed to detect the internet media content in the RSS feed. The present invention may make use of one or more of the methods disclosed herein for identifying the media controls for accessing internet media content; however, the present invention is not limited to these methods and may employ other methods for identifying the media controls for accessing internet media content known to one having ordinary skill in the art.
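As a sketch of the RSS case, media content is commonly attached to RSS feed items through enclosure elements carrying a url and a type attribute. The regex-based scan below is a simplification for illustration; a real implementation would use an XML parser and tolerate attribute reordering.

```javascript
// Illustrative sketch: scan RSS feed text for <enclosure> elements,
// which RSS uses to attach media objects to feed items.
function findRssEnclosures(rssText) {
  var enclosures = [];
  var re = /<enclosure\b[^>]*\burl="([^"]+)"[^>]*\btype="([^"]+)"[^>]*\/?>/g;
  var m;
  while ((m = re.exec(rssText)) !== null) {
    enclosures.push({ url: m[1], type: m[2] });
  }
  return enclosures;
}
```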
  • The application 41 may provide the web content and/or the modified web content to the HTML rendering engine 48 in a web content and/or modified web content transmission 63. In an embodiment, the application 41 modifies the web content to insert the additional media controls; in another embodiment, the application 41 does not modify the web content to insert the additional media controls. The HTML rendering engine 48 may use the web content and/or the modified web content received in the web content and/or modified web content transmission 63 to create a rendered web page which may be displayed to the user 15 of the mobile device 11. If the application 41 modified the web content, the rendered web page may have the additional media controls.
  • The user 15 may interact with the rendered web page. For example, the user 15 may select, may point at, may touch, may invoke, and/or may click on one of the media controls or the additional media controls. In response, the HTML Rendering Engine 48 may generate a user interaction signal 64 and/or may communicate the user interaction signal 64 to the application 41. The user interaction signal 64 may indicate that the user 15 selected, pointed at, touched, invoked, and/or clicked on an object, image, button and/or link in the rendered web page. The user interaction signal 64 may identify the object, the image, the button and/or the link. The user interaction signal 64 may identify and/or may provide a URL corresponding to the object, the image, the button and/or the link. The application 41 may use the user interaction signal 64 and/or the URL to determine that the user 15 invoked one of the media controls previously identified by the application 41 and/or that the user 15 invoked one of the additional media controls inserted by the application 41.
  • Alternatively, the user interaction signal 64 may convey a lower level user interaction event. For example, the user interaction signal 64 may indicate that the user 15 selected, pointed at, touched and/or clicked on a particular location within the rendered web page. In this case, the application 41 may compare the particular location to the known locations of the media controls previously identified by the application 41 and/or the additional media controls inserted by the application 41. As a result, the application 41 may use the user interaction signal 64 to determine the particular control invoked by the user 15.
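The location comparison described above may be sketched as follows, assuming the application 41 maintains records of each control's rendered bounding box. The record shape shown is a hypothetical example.

```javascript
// Illustrative sketch: map a low-level (x, y) interaction location to
// one of the media controls whose rendered locations are known.
// Each control record is assumed to hold {id, left, top, width, height}.
function findControlAtPoint(controls, x, y) {
  for (var i = 0; i < controls.length; i++) {
    var c = controls[i];
    if (x >= c.left && x < c.left + c.width &&
        y >= c.top && y < c.top + c.height) {
      return c.id;
    }
  }
  return null; // location does not fall within any known control
}
```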
  • Still further, the user interaction signal 64 may cause a portion of the application 41 to execute. For example, one of the additional media controls may be scripting code capable of interacting with the application 41 in response to user input in the rendered web page. In this case, the scripting code may invoke the application 41 using a particular function, class, API and/or library associated with the application. Therefore, the user interaction signal 64 may be a direct invocation of the application 41 or part of the application 41 by the additional media controls, and/or the application 41 or the part of the application 41 may be in the form of scripting code executed by the HTML rendering engine 48.
  • As an example, the following JavaScript code fragment is suitable for addition to mobile device web pages provided by the YouTube video sharing service (domain at www.youtube.com). The application 41 may insert the code fragment into web content retrieved from the YouTube video sharing service and/or may provide the code fragment to the HTML rendering engine 48. The HTML rendering engine 48 may execute the code fragment in a scripting engine of the HTML rendering engine 48. The code fragment may process a touch event invoked by the user 15 touching a location (x,y) on the touchscreen of the mobile device 11. The code fragment may examine the web content to determine whether the touched location corresponds to a media control for playing YouTube video content accessible from the corresponding web page. If the touched location corresponds to a media control, the code fragment may invoke the application 41 using a function provided by the application 41, namely mediaControlClicked(). As a result, the mediaControlClicked() function and/or the application 41 may initiate an appropriate media action.
  •  function processTouchEvent_YouTube(x, y) {
        var touchedElement = document.elementFromPoint(x, y);
        if (touchedElement) {
          var WATCH_THUMB = "watch_thumb";
          var PAGE_ELEMENT_ID = "page_element_id";
          var elem = touchedElement;
          var mediaControlFound = false;
          while (elem != null && !mediaControlFound) {
            if (elem.nodeName == "DIV") {
              var node = elem.getAttributeNode(PAGE_ELEMENT_ID);
              if (node && (node.value == WATCH_THUMB)) {
                mediaControlFound = true;
              }
            }
            elem = elem.offsetParent;
          }
          if (mediaControlFound) {
            window.HTMLOUT.mediaControlClicked();
          } else {
            window.HTMLOUT.mediaControlNotClicked();
          }
        } else {
          window.HTMLOUT.mediaControlNotClicked();
        }
      }
  • The application 41 may initiate a media action in response to receipt of the user interaction signal 64. For example, the application 41 may play the internet media content corresponding to the control invoked by the user 15 on the mobile device 11. Further, the application 41 may transfer the internet media content corresponding to the control invoked by the user 15 to a rendering device in the local network 12, such as one of the rendering devices 21, 22, 23. Still further, the application 41 may add the internet media content corresponding to the control invoked by the user 15 to a playback queue. The media action may depend on the specific control invoked by the user 15 as determined by the application 41 based on the user interaction signal 64.
  • FIG. 4 generally illustrates an embodiment of a system 70 for using the application 41 to retrieve, analyze and/or modify web content for the purpose of transferring internet media content to a rendering device in an embodiment of the present invention. The system 70 may use the structure and connections previously described for FIG. 2.
  • Referring again to FIG. 4, the HTML rendering engine 48 may request and/or retrieve web content corresponding to a web page. The request and/or the retrieval of the web content may be in response to instructions from the application 41 and/or may be based on user interaction with a previously rendered web page. For example, the user 15 may have selected a link for a new web page, and the HTML rendering engine 48 may attempt to load the new web page in response to the selection of the link by the user 15.
  • The HTML rendering engine 48 may use a content request transmission 71 to send a content request to a content source 75. The content source 75 may be, for example, one of the content sources 31, 32, 33 and the content source 44. In response, the HTML rendering engine 48 may receive a web content transmission 72 from the content source 75. The web content transmission 72 may provide the web content requested by the HTML rendering engine 48. In this way, the HTML rendering engine 48 may send one or more content requests to one or more content sources to obtain the web content corresponding to a web page. The HTML rendering engine 48 may utilize any of the various techniques for retrieving web content previously described herein.
  • The HTML rendering engine 48 may construct a structured representation of the web content corresponding to the web page. For example, the HTML rendering engine 48 may construct a DOM model of the web page. The HTML rendering engine 48 may enable the application 41 to access the structured representation of the web page. Such access may be exposed to the application 41 using an API, a class, a function call, and/or another suitable mechanism.
  • The application 41 may perform an analysis and/or a modification 73 of the web content. For example, the application 41 may access the structured representation of the web page to analyze the web content, to identify the media controls for accessing the internet media content, and/or to modify the web content by inserting additional media controls. The application 41 may access the structured representation of the web page through the HTML rendering engine 48 and may utilize any of the techniques for analyzing and/or modifying the web content previously described for FIG. 3.
  • Alternatively, the application 41 may provide scripting code to the HTML rendering engine 48 so that the scripting code may use the structured representation to perform the analysis and/or the modification 73 of the web content. For example, the application 41 may provide a JavaScript code fragment to the HTML rendering engine 48. The HTML rendering engine 48 may have a scripting engine to execute the JavaScript code fragment. The JavaScript code fragment may access the structured representation of the web page to identify the media controls for accessing the internet media content. The JavaScript code fragment may generate a result which identifies the media controls, and/or the HTML rendering engine 48 may forward the result to the application 41.
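A scripting code fragment of the kind described above may be sketched as follows. For a self-contained illustration, a tree of plain objects stands in for the structured representation of the web page; in the HTML rendering engine 48 the same walk would operate on real DOM nodes (e.g., via childNodes) rather than the assumed children arrays, and the extension pattern is an example subset.

```javascript
// Illustrative sketch: walk a structured representation of a web page
// and collect links whose targets look like media content. The node
// shape {nodeName, href, children} is a stand-in for real DOM nodes.
function collectMediaLinks(node, found) {
  found = found || [];
  // Treat anchor elements linking to known media extensions as
  // candidate media controls (example extensions only).
  if (node.nodeName === "A" && node.href &&
      /\.(mp3|mp4|mov|flv)(\?|#|$)/i.test(node.href)) {
    found.push(node.href);
  }
  var children = node.children || [];
  for (var i = 0; i < children.length; i++) {
    collectMediaLinks(children[i], found);
  }
  return found;
}
```

The result list produced by such a fragment corresponds to the result that the HTML rendering engine 48 may forward to the application 41.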
  • The HTML rendering engine 48 may use the web content and/or the modified web content to create a rendered web page displayed to the user 15 of the mobile device 11. If the application 41 modified the web content, the rendered web page may have the additional media controls. The user 15 may interact with the rendered web page. For example, the user 15 may select, may point at, may touch, may invoke and/or may click on one of the media controls and/or the additional media controls. In response, the HTML rendering engine 48 may generate a user interaction signal 74 and/or may communicate the user interaction signal 74 to the application 41. The user interaction signal 74 may indicate that the user selected, pointed at, touched, invoked, and/or clicked on an object, an image, a button and/or a link in the rendered web page. The user interaction signal 74 may identify the object, the image, the button and/or the link. The user interaction signal 74 may identify and/or may provide a URL corresponding to the object, the image, the button and/or the link. The application 41 may use the user interaction signal 74 and/or the URL to determine that the user 15 invoked one of the media controls previously identified by the application 41 and/or one of the additional media controls inserted by the application 41.
  • Alternatively, the user interaction signal 74 may convey a lower level user interaction event. For example, the user interaction signal 74 may indicate that the user 15 selected, pointed at, touched and/or clicked on a particular location within the rendered web page. In this case, the application 41 may compare the particular location to the known locations of the media controls previously identified by the application 41 and/or the additional media controls inserted by the application 41. As a result, the application 41 may use the user interaction signal 74 to determine the particular control invoked by the user 15.
  • Still further, the user interaction signal 74 may cause a portion of the application 41 to execute. For example, one of the additional media controls may be scripting code capable of interacting with the application 41 in response to user input in the rendered web page. In this case, the scripting code may invoke the application 41 using a particular function, class, API and/or library associated with the application. Therefore, the user interaction signal 74 may be a direct invocation of the application 41 or part of the application 41 by the additional media controls, and/or the application 41 or the part of the application 41 may be in the form of scripting code executed by the HTML rendering engine 48.
  • The application 41 may initiate a media action in response to receipt of the user interaction signal 74. For example, the application 41 may play the internet media content corresponding to the control invoked by the user 15 on the mobile device 11. Further, the application 41 may transfer the internet media content corresponding to the control invoked by the user 15 to a rendering device in the local network 12, such as one of the rendering devices 21, 22, 23 and/or the rendering device 42. Still further, the application 41 may add the internet media content corresponding to the control invoked by the user 15 to a playback queue. The media action may depend on the specific control invoked by the user as determined by the application 41 based on the user interaction signal 74.
  • Referring again to FIGS. 3 and 4, the systems 60, 70 may enable the mobile device 11 to receive web content for a web page having original media controls for accessing internet media content. Each original media control may have a corresponding internet media content object. In an embodiment, the application 41 may receive the web content in response to a content request message sent by the application 41. In another embodiment, the HTML rendering engine 48 may receive the web content in response to a content request message sent by the HTML rendering engine 48.
  • Then, the application 41 on the mobile device 11 may process the web content to identify the original media controls. Then, the application 41 may modify the web content to form modified web content having one or more additional media controls. In an embodiment, the application 41 may modify the web content using a structured representation of the web page created by the HTML rendering engine 48. Each of the additional media controls may correspond to one of the original media controls. One or more of the additional media controls may be an HTML <a> tag which links to a first URL conveyed by the user interaction signal 64, 74 to the application 41. One or more of the additional media controls may be created by a scripting code fragment provided by the application 41 to the HTML rendering engine 48.
  • One or more of the additional media controls may be created by a scripting code fragment retrieved by the application 41 from a database of scripting code fragments. The database may be remote from the mobile device 11 and/or may be accessible to the mobile device 11 through the internet 13. Each of the scripting code fragments in the database may correspond to a content source. The application 41 may retrieve the scripting code fragment by querying the database using a content source identifier.
  • Then, the modified web content may be displayed as a rendered web page using the HTML rendering engine 48. The application 41 may receive the user interaction signal 64, 74 which indicates user interaction with the rendered web page. In an embodiment, the HTML rendering engine 48 may generate the user interaction signal 64, 74 in response to user interaction with the rendered web page. The user interaction signal 64, 74 may be generated by a scripting code fragment inserted into the web content by the application 41.
  • The application 41 may process the user interaction signal 64, 74 to identify one of the additional media controls invoked by the user 15. Then, the application 41 may initiate a media action corresponding to the identified additional media control, and the media action may involve the internet media content object corresponding to the original media control to which the identified additional media control corresponds. The media action may instruct an external rendering device to play back the internet media content object; alternatively, the media action may add the internet media content object to a playback queue.
  • FIG. 5 generally illustrates web content 80 and modified web content 81 used in an embodiment of the present invention. FIG. 5 depicts a conceptual illustration of the web content 80 corresponding to a web page, and FIG. 5 is not intended to represent a web page in rendered form.
  • The web content 80 may be retrieved by the application 41 and/or the HTML rendering engine 48. The web content 80 may be and/or may have a single element, such as an HTML file, or may be and/or may have multiple files that reference multiple elements, such as graphic images, style sheets, borders, scripting code, media objects, and/or the like. The web content 80 may be stored in a structured representation by the application 41 and/or the HTML rendering engine 48. For example, the web content 80 may be stored in a DOM format.
  • The application 41 may identify media controls 88. In the example illustrated in FIG. 5, the application 41 may identify a first media control 91, entitled “C1”; a second media control 92, entitled “C2”; and/or a third media control 93, entitled “C3.” The media controls 88 may enable the user 15 to access internet media content present in the original web content retrieved from a content source.
  • The application 41 may insert additional media controls 89 in the web content 80 to form the modified web content 81. For example, in the example illustrated in FIG. 5, the application 41 may insert a first additional media control 101, entitled “C1 a”; a second additional media control 102, entitled “C1 b”; a third additional media control 103, entitled “C2 a”; a fourth additional media control 104, entitled “C2 b”; a fifth additional media control 105, entitled “C3 a”; and/or a sixth additional media control 106, entitled “C3 b.” The present invention is not limited to a specific number of media controls or additional media controls, and any number of media controls and additional media controls may be used.
  • The additional media controls 89 inserted by the application 41 may correspond to the media controls 88 of the web content 80. For example, the first additional media control 101 and/or the second additional media control 102 of the modified web content 81 may correspond to the first media control 91 of the web content 80. The third additional media control 103 and/or the fourth additional media control 104 of the modified web content 81 may correspond to the second media control 92 of the web content 80. The fifth additional media control 105 and/or the sixth additional media control 106 of the modified web content 81 may correspond to the third media control 93 of the web content 80.
  • Therefore, the first additional media control 101 and/or the second additional media control 102 may correspond to enhanced media functions which operate on the internet media content accessible using the first media control 91. In a similar fashion, the third additional media control 103 and/or the fourth additional media control 104 may correspond to enhanced media functions which operate on the internet media content accessible using the second media control 92. The fifth additional media control 105 and/or the sixth additional media control 106 may correspond to enhanced media functions which operate on the internet media content accessible using the third media control 93.
  • The additional media controls 89 inserted by the application 41 may depend on properties of the internet media content. For example, the application 41 may determine that an additional media control for transferring media content for playback on a DLNA-compatible stereo device may be relevant to a digital music track found in the web content for a web page. However, the additional media control may not be relevant to a digital video clip identified in the web content for the same web page. Therefore, the application 41 may insert a different set of the additional media controls 89 for each of the media controls 88 identified by the application 41, and each set of the additional media controls 89 may be appropriate for the internet media content accessible by the corresponding one of the media controls 88.
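The selection of type-appropriate additional media controls may be sketched as follows; the action names are hypothetical examples rather than controls defined by the system, and the media type strings are assumed to come from the identification steps described earlier.

```javascript
// Illustrative sketch: choose a set of additional media controls
// appropriate for the type of the underlying media object. The action
// names are hypothetical examples.
function additionalControlsFor(mediaType) {
  switch (mediaType) {
    case "audio":
      // e.g., transfer to a DLNA-compatible stereo device
      return ["play-local", "transfer-to-stereo", "add-to-queue"];
    case "video":
      return ["play-local", "transfer-to-tv", "add-to-queue"];
    case "image":
      return ["view-local", "transfer-to-frame"];
    default:
      return []; // no additional controls for unrecognized types
  }
}
```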
  • FIG. 6 generally illustrates a system 200 for using the application 41 on the mobile device 11 to transfer internet media content to a rendering device in a home network in an embodiment of the present invention. As previously set forth, the application 41 may communicate with one or more user interface components provided by the mobile device 11 and/or the operating system of the mobile device 11 to display a user interface 49 and/or receive user input. The application 41 may communicate with the network interface 52 provided by the mobile device 11 and/or the operating system of the mobile device 11 to access the local network 12 and/or a wide area network, such as the internet 13. In this way, the application 41 may communicate with other devices, such as rendering devices, rendering control components, content sources, and/or the like. The application 41 may access a remote database through the local network 12 and/or the internet 13 to retrieve content-site specific information for analyzing and/or modifying web content from a content site.
  • The application 41 may communicate with the HTML rendering engine 48 using one or more classes, functions, APIs, and/or the like. The HTML rendering engine 48 may be provided by the operating system of the mobile device 11 and/or may be present on the mobile device 11 for access and use by the application 41. The application 41 may provide the web content 80 and/or the modified web content 81 to the HTML rendering engine 48. The application 41 may communicate with the HTML rendering engine 48 to request the HTML rendering engine 48 to retrieve the web content 80 for a web page, to render the web page, and/or to display the web page. The application 41 may communicate with the HTML rendering engine 48 to analyze and/or modify the web content 80 stored and/or maintained in a structured representation by the HTML rendering engine 48. The application 41 may provide scripting code fragments to the HTML rendering engine 48 for insertion into the web content 80 of a web page and/or for execution in a scripting engine of the HTML rendering engine 48. The application 41 may receive the user interaction signal 64, 74 from the HTML rendering engine 48 and may execute media actions in response to receipt of the user interaction signal 64, 74.
  • The application 41 may have application logic elements 201 which may perform the various logical functions described herein. The application 41 may have a retrieval element 202 which may retrieve the web content 80, an analysis element 203 which may analyze the web content 80, a modification element 204 which may modify the web content 80 to form the modified web content 81, a UI control element 205 which may control the application user interface, an interaction processing element 206 which may process the user interaction signal 64, 74, and/or a queue control element 207 which may control queued playback. An embodiment of the application 41 may not have one or more of the application logic elements 201. For example, an embodiment of the application 41 may rely on the HTML rendering engine 48 to retrieve the web content; therefore, the application 41 may not have the retrieval element 202. As another example, an embodiment of the application 41 may not modify the web content 80 to insert the additional media controls 89; therefore, the application 41 may not have the modification element 204. As yet another example, an embodiment of the application 41 may not support queued playback; therefore, the application 41 may not have the queue control element 207.
  • The retrieval element 202 may send one or more content requests to one or more content sources to obtain the web content 80 corresponding to a web page. Each of the content requests may be an HTTP GET request or a similar request as known to one skilled in the art. The retrieval element 202 may generate the content requests to have a requester identification field associated with the mobile device 11 and/or a web browser of the mobile device 11 to obtain the web content 80 in a form suitable for display on the mobile device 11. The content requests may be sent and/or the corresponding responses may be received through the network interface 52.
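The construction of a content request carrying a requester identification field may be sketched as follows. The request object shape is illustrative, and the User-Agent string shown is a generic example rather than an actual mobile browser identification.

```javascript
// Illustrative sketch: build an HTTP GET request whose requester
// identification (User-Agent) field matches the mobile device's web
// browser, so the content source returns the mobile form of the page.
function buildContentRequest(url, mobileUserAgent) {
  return {
    method: "GET",
    url: url,
    headers: {
      "User-Agent": mobileUserAgent, // requester identification field
      "Accept": "text/html,*/*"
    }
  };
}
```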
  • The analysis element 203 may examine the web content 80 for a web page to identify the media controls 88 for accessing internet media content. The analysis element 203 may examine a representation of the web page created, stored and/or maintained by the application 41 as previously set forth in the description of the embodiment depicted in FIG. 3. Alternatively, the analysis element 203 may examine a representation of the web page created, stored and/or maintained by the HTML rendering engine 48 as previously set forth in the description of the embodiment depicted in FIG. 4. If the analysis element 203 examines a representation of the web page created, stored and/or maintained by the HTML rendering engine 48, the analysis element 203 may communicate with and/or may interact with the HTML rendering engine 48 to perform the analysis of the web content 80.
  • Analysis of the web content 80 may be performed directly by the analysis element 203 to analyze the web content 80. For example, the application 41 may have executable classes, functions and/or code which may directly examine the web content 80 and/or a structured representation of the web content 80. The executable classes, functions and/or code may identify the media controls 88 for accessing the internet media content. The executable classes, functions and/or code may identify HTML <a> tags which may link to media content objects, scripting code present in the web content 80 for accessing media content, and/or the like.
  • Alternatively, analysis of the web content 80 may be performed by scripting code provided by and/or generated by the analysis element 203 to analyze the web content 80. The scripting code may be provided to the HTML rendering engine 48 and/or may be embedded in the web content 80 for the web page. The scripting code may be executed by a scripting engine of the HTML rendering engine 48. The scripting code may identify the media controls 88 and/or may communicate information about the media controls 88 to the application 41. The communication may indicate the number of the media controls 88, the type of the media controls 88, the location of the media controls 88 in the web content 80 and/or the rendered web page, information about the internet media content retrievable using the media controls 88, and/or the like. For example, the information may have and/or may be a URL for accessing the media content, a MIME type of the media content, metadata describing the media content, and/or the like.
  • Analysis of the web content 80 may rely on information retrieved from a database of content-source specific information. The information may specify how a particular content source provides media controls 88 in web pages. The information may specify rules and/or instructions for analyzing the web content 80 from a particular content source to identify the media controls 88. The information may have and/or may be executable code and/or scripting code capable of analyzing the web content 80 from a particular content source to identify the media controls 88. The application 41 and/or the analysis element 203 may query the database using a content source identifier, such as the domain name of the content source. As a result, the application 41 may obtain the information relevant to the content source and/or may use the information to analyze the web content 80 from the content source.
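The database query by content source identifier may be sketched as follows. The rules object is a local stand-in for the remote database, and its entries (including the "watch_thumb" marker, borrowed from the earlier YouTube example) are hypothetical.

```javascript
// Illustrative sketch: look up content-source specific analysis rules
// keyed by domain name. A local object stands in for the remote
// database described in the text; entries are hypothetical examples.
var SOURCE_RULES = {
  "www.youtube.com": { controlMarker: "watch_thumb" },
  "www.example.com": { controlMarker: "media-link" }
};

// Derive the content source identifier (the domain) from a page URL
// and return the matching rules, or null if none are known.
function rulesForPage(pageUrl) {
  var m = pageUrl.match(/^https?:\/\/([^\/:?#]+)/i);
  if (!m) return null;
  return SOURCE_RULES[m[1].toLowerCase()] || null;
}
```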
  • The modification element 204 may modify the web content 80 of a web page to insert the additional media controls 89. The modification element 204 may operate on a representation of the web page created, stored and/or maintained by the application 41 as previously set forth in the description of the embodiment depicted in FIG. 3. Alternatively, the modification element 204 may examine a representation of the web page created, stored and/or maintained by the HTML rendering engine 48 as previously set forth in the description of the embodiment depicted in FIG. 4. If the modification element 204 examines a representation of the web page created, stored and/or maintained by the HTML rendering engine 48, the modification element 204 may communicate with and/or may interact with the HTML rendering engine 48 to modify the web content 80 to generate the modified web content 81.
  • Modification of the web content 80 may insert HTML code into the web content 80 of the web page. The inserted HTML code may display one or more of the additional media controls 89, such as a text link, an icon, a button, a thumbnail image, and/or the like. The inserted HTML code may provide a link to a URL created by and/or known to the application 41. The application 41 may determine that the user 15 invoked one of the additional media controls 89 based on receipt of a user interaction signal 64, 74 which contains and/or references the URL.
  • Alternatively, modification of the web content 80 may utilize scripting code. The modification element 204 may embed the scripting code into the web content 80 and/or may provide the scripting code to the HTML rendering engine 48. The scripting code may display the additional media controls 89 in the rendered web page, may interact with the user 15 through the rendered web page, and/or may communicate user interaction signals to the application 41.
  • The inserted HTML code, graphic images representing the additional media controls 89, information on how to insert the HTML code in the web content 80, and/or the like may be content-source specific and/or may be retrieved by the application 41 from a database of content-source specific information. In a similar fashion, the scripting code, the graphic images referenced by the scripting code, the information on how to insert the scripting code in the web content 80, and/or the like may be content-source specific and/or may be retrieved by the application 41 from a database of content-source specific information.
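The HTML-insertion technique of the modification element 204 may be sketched as follows. The `APP_URL_PREFIX` scheme, the control markup and the regular expression are hypothetical; they stand in for URLs created by and known to the application 41.

```python
import re

# Hypothetical URL scheme known to the application; invocations of the
# inserted controls are later recognized by this prefix in the user
# interaction signals received by the application.
APP_URL_PREFIX = "app-action://send-to-renderer?media="

def insert_additional_controls(web_content):
    """Insert an additional media control (a text link) immediately after
    each anchor referencing a media object, returning the modified content."""
    def add_control(match):
        anchor, url = match.group(0), match.group("url")
        control = '<a href="%s%s">[Send to TV]</a>' % (APP_URL_PREFIX, url)
        return anchor + control
    pattern = re.compile(r'<a[^>]+href="(?P<url>[^"]+\.mp4)"[^>]*>.*?</a>')
    return pattern.sub(add_control, web_content)
```

The modified web content 81 then renders both the original media control 88 and the inserted additional media control 89 side by side.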
  • The UI control element 205 may communicate with one or more user interface elements of the mobile device 11 and/or the user interface 49 of the mobile device 11 to display the application user interface and/or enable the user 15 to interact with the application user interface. The application user interface which may be presented by the user interface 49 of the mobile device 11 may have application controls, such as buttons, menus, dialog boxes, text entry boxes, and/or other well-known user interface elements, to enable the user 15 to control and/or to interact with the application 41. Such application controls may be separate from and/or in addition to the rendered web page which may have controls, buttons, links, text entry boxes, and/or the like. In FIGS. 8 and 9, application controls presented by the application 41 are distinguished from user interface elements which are part of the rendered web page.
  • Referring again to FIG. 6, the UI control element 205 may control display of the application controls and/or may accept user input events relevant to the application controls. For example, application controls may have and/or may be a “Back” button and a “Forward” button for web browsing. The UI control element 205 may communicate with one or more user interface components of the operating system of the mobile device 11 to cause the “Back” button and the “Forward” button to be displayed in the user interface 49 on the mobile device 11 and/or to register to receive user input events relevant to these buttons. In a similar fashion, the UI control element 205 may control the presentation of additional application controls, such as a control for managing the visual layout of the screen of the mobile device 11. The UI control element 205 may manage user input events related to the various application controls described herein for the various embodiments.
  • The application may have a discover and control renderers component 208, a renderer control component interface 209 (hereafter “the RCC interface 209”), a media transcoder component 210, and/or a media server component 211. The discover and control renderers component 208 may discover rendering devices available in the local network 12. Further, the discover and control renderers component 208 may determine rendering capabilities of the rendering devices to determine whether media content may be compatible with the rendering devices. Still further, the discover and control renderers component 208 may track the presence and/or the absence of specific rendering devices available in the network and/or may maintain records tracking the rendering capabilities of the available renderers. Moreover, the discover and control renderers component 208 may communicate with the rendering devices to initiate, maintain and/or control the rendering of media content on the rendering devices. The discover and control renderers component 208 may have any of the functions and/or capabilities described for the “renderer discovery and control component 110” of U.S. Patent App. Pub. No. 2010/0332565, herein incorporated by reference in its entirety. In an embodiment, the discover and control renderers component 208 may be and/or may function as a UPnP AV control point.
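Where the discover and control renderers component 208 functions as a UPnP AV control point, discovery typically begins with an SSDP M-SEARCH request multicast to 239.255.255.250 on port 1900. A minimal sketch of building the request and parsing a response follows; the MX value of 2 seconds is an arbitrary choice, and the UDP socket transmission is omitted.

```python
SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_msearch(search_target="urn:schemas-upnp-org:device:MediaRenderer:1"):
    """Build an SSDP M-SEARCH request asking UPnP media renderers on the
    local network to announce themselves."""
    return ("M-SEARCH * HTTP/1.1\r\n"
            "HOST: %s:%d\r\n"
            'MAN: "ssdp:discover"\r\n'
            "MX: 2\r\n"
            "ST: %s\r\n"
            "\r\n" % (SSDP_ADDR, SSDP_PORT, search_target))

def parse_ssdp_response(response):
    """Extract the headers of an SSDP response; LOCATION gives the URL of
    the device description, from which rendering capabilities may be read."""
    headers = {}
    for line in response.split("\r\n")[1:]:
        if ":" in line:
            name, _, value = line.partition(":")
            headers[name.strip().upper()] = value.strip()
    return headers
```

Responses collected this way allow the component to maintain records tracking the presence and the rendering capabilities of the available renderers.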
  • The RCC interface 209 may communicate with the rendering control component 25 which may be external to the mobile device 11 and/or may provide renderer control functions to other devices. The application 41 may communicate with the rendering control component 25 to control rendering devices and/or to access queued playback functionality provided by the rendering control component 25. The various rendering control functions and interfaces for accessing such functions over a network are disclosed in U.S. Patent App. Pub. No. 2010/0095332, herein incorporated by reference in its entirety.
  • The media transcoder component 210, if present, may provide transcoding functionality by adapting the internet media content for compatibility with the rendering capabilities of a rendering device. The media transcoder component 210 may perform audio codec transcoding, video codec transcoding, format transcoding, and/or the like. Internet media content selected by the user 15 for playback on an external rendering device may be received through the network interface 52 and/or may be transcoded by the media transcoder component 210. The resulting transcoded media content may be made available to rendering devices in the local network 12 through the media server component 211 as described hereafter. The media transcoder component 210 may have any of the functions and/or behavior described for the “transcoding engine 90” of U.S. Patent App. Pub. No. 2010/0332565, herein incorporated by reference in its entirety.
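The compatibility decision underlying the media transcoder component 210 may be sketched as a comparison of the content MIME type against the rendering capabilities recorded for a rendering device. The function names and the preference order are hypothetical.

```python
def needs_transcoding(content_mime, renderer_supported_mimes):
    """Return True if the renderer cannot play the content as-is, so the
    media transcoder component must adapt it."""
    return content_mime not in renderer_supported_mimes

def choose_output_mime(renderer_supported_mimes,
                       preferred=("video/mp4", "video/mpeg")):
    """Pick a target format for transcoding from the renderer's supported
    MIME types, in a fixed order of preference."""
    for mime in preferred:
        if mime in renderer_supported_mimes:
            return mime
    raise ValueError("no compatible output format for this renderer")
```

When transcoding is needed, the transcoded output would then be made available to the renderer through the media server component 211.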
  • The media server component 211, if present, may make the internet media content available to rendering devices in the local network 12. The media server component 211 may function as an HTTP server, a UPnP AV media server, a DLNA compliant media server, a proxy server, and/or the like. The media server component 211 may have any of the functions and/or behavior described for the “media server 100” disclosed in U.S. Patent App. Pub. No. 2010/0332565, herein incorporated by reference in its entirety.
  • The media transcoder component 210 and/or the media server component 211 may be provided in the application 41 as shown in FIG. 6. Alternatively, one or both of the media transcoder component 210 and the media server component 211 may be provided by another component available in the network. For example, the media transcoder component 210 and/or the media server component 211 may be combined in a stand-alone proxy server and/or may be incorporated into the rendering control component 25 which may be external to the mobile device 11. The application 41 and/or the RCC interface 209 may communicate with the separate component which provides the media transcoder component 210 and/or the media server component 211. For example, the application 41 and/or the RCC interface 209 may instruct the separate component to request, retrieve, transcode and/or make available a particular internet media content object which is to be transferred to a rendering device as the result of a media action requested by the user 15 of the mobile device 11.
  • The interaction processing element 206 may receive user interaction signals 64, 74 from the HTML rendering engine 48. The interaction processing element 206 may process the user interaction signals 64, 74 to determine and/or to initiate corresponding media actions. As a first example, the HTML rendering engine 48 may provide a user interaction signal 64, 74 indicating that the user 15 invoked a media control 88 corresponding to a particular URL in the rendered web page. The interaction processing element 206 may receive the user interaction signal 64, 74 and/or may resolve the particular URL to determine the relevant media content object and the associated media action. For example, the particular URL may correspond to one of the additional media controls 89 previously inserted into the web content 80 by the modification element 204. The additional media control 89 may correspond to transfer of a first internet media content object to the first rendering device 21 in the local network 12.
  • As a result, the interaction processing element 206 may initiate the media action. For example, the interaction processing element 206 may direct the discover and control renderers component 208 of the application 41 to instruct the first rendering device 21 to request, retrieve and/or initiate playback of the first internet media content object. Alternatively, the interaction processing element 206 may use the RCC interface 209 of the application 41 to instruct the rendering control component 25 to communicate with the first rendering device 21 to initiate transfer of the first internet media content for playback on the first rendering device 21.
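The URL resolution in the first example may be sketched with a registry that maps application-created URLs to media actions. The registry structure and the names are hypothetical; a real implementation would populate the registry when the modification element 204 inserts the additional media controls 89.

```python
# Hypothetical registry populated by the modification element: each
# application-created URL maps to a media object, a media action, and a
# target rendering device.
ACTION_REGISTRY = {}

def register_control(control_url, media_url, action, renderer_id):
    """Record the media action associated with an inserted media control."""
    ACTION_REGISTRY[control_url] = {
        "media_url": media_url, "action": action, "renderer": renderer_id}

def resolve_interaction(control_url):
    """Resolve a URL carried by a user interaction signal to the media
    object and media action of an additional media control. Returns None
    when the URL is an ordinary link rather than an inserted control."""
    return ACTION_REGISTRY.get(control_url)
```

A non-None result would cause the interaction processing element to initiate the corresponding media action through the discover and control renderers component or the RCC interface.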
  • Initiation of a media action may involve one or more of the discover and control renderers component 208, the RCC interface 209, the media transcoder component 210 and/or the media server component 211 of the application 41. The role of each of these components may vary depending on the embodiment and the media action being initiated.
  • As a second example of the interaction processing element 206 processing the user interaction signals 64, 74, the HTML rendering engine 48 may provide a user interaction signal 64, 74 which may indicate that the user 15 pointed at, selected, touched and/or clicked on a particular location in the rendered web page. The user interaction signal 64, 74 may indicate the location using (x,y) coordinates into the rendered web page. The interaction processing element 206 may determine whether the location matches any of the media controls 88 and/or the additional media controls 89 in the rendered web page. If the location matches one of the media controls 88 or one of the additional media controls 89, the interaction processing element 206 may determine which of the media controls 88 or additional media controls 89 matches the location. Then, the interaction processing element 206 may initiate a corresponding media action. For example, the (x,y) location specified in the user interaction event may match an additional media control 89 which corresponds to adding a second internet media content object to a playback queue. Then, the interaction processing element 206 may cause the second internet media content object to be added to the playback queue.
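The (x,y) matching in the second example may be sketched as a hit test of the interaction coordinates against bounding boxes recorded for the media controls 88 and the additional media controls 89. The layout values in the test are hypothetical.

```python
from collections import namedtuple

# A media control's position in the rendered web page, as a bounding box
# in page coordinates.
Control = namedtuple("Control", "name x y width height")

def hit_test(controls, x, y):
    """Return the first control whose bounding box contains (x, y), or
    None when the location matches no media control."""
    for c in controls:
        if c.x <= x < c.x + c.width and c.y <= y < c.y + c.height:
            return c
    return None
```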
  • The interaction processing element 206 may communicate with the queue control element 207 to add the second internet media content object to an internal playback queue. The interaction processing element 206 may use the RCC interface 209 to communicate with the rendering control component 25 to add the second internet media content object to a playback queue created, controlled and/or maintained by the rendering control component 25. The specific steps used to initiate the media action may vary based on the embodiment and/or based on the presence or absence of the rendering control component 25.
  • As a third example of the interaction processing element 206 processing the user interaction signals 64, 74, the user interaction signal 64, 74 may be a direct invocation of the application 41 or of part of the application 41 by a scripting code fragment. The scripting code fragment may have been inserted into the web content 80 by the application 41 and/or previously provided by the application 41 to the HTML rendering engine 48. The scripting code fragment may have determined that the user 15 invoked an additional media control 89 associated with the scripting code fragment. The scripting code fragment may have determined that the additional media control 89 invoked by the user 15 corresponds to transfer of a third internet media content object to the second rendering device 22 for playback on the second rendering device 22. The scripting code fragment may directly call, invoke and/or communicate with a function of the application 41 designed to respond to user interaction signals 64, 74 from such scripting code fragments. When calling, invoking and/or communicating with the function, the scripting code fragment may provide information which may identify which of the additional media controls 89 the user invoked and/or more specific information which may specify the third internet media content object, the media action, and/or the second rendering device 22.
  • As a result, the function and/or the application 41 may initiate the media action. For example, the function and/or the application 41 may cause the discover and control renderers component 208 of the application 41 to instruct the second rendering device to request, retrieve and/or initiate playback of the third internet media content object. Alternatively, the function and/or the application 41 may use the RCC interface 209 to instruct the rendering control component 25 to communicate with the second rendering device 22 to initiate transfer of the third internet media content for playback on the second rendering device.
  • If initiating transfer of internet media content to an external rendering device, the application 41 may determine an alternate version of the internet media content for display using the external rendering device. Many content sources adapt their web pages and media content based on the client device which receives the web pages and the media content. For example, a content source may have a first web page format optimized for use on mobile devices, and a second web page format optimized for use within full PC web browsers. The first web page format may, for example, provide access to versions of the internet media content having a low quality, bitrate and/or resolution suitable for display on a small mobile device screen. In contrast, the second web page format may provide access to versions of the internet media content which have a higher quality, bitrate and/or resolution suitable for display on a large computer monitor display. Further, the external rendering device may have a high resolution visual display. For example, the external rendering device may be a DLNA-compatible high definition television. Therefore, the version of the internet media content accessible through the second web page format may be more suitable for display on the external rendering device relative to the version of the internet media content accessible through the first web page format.
  • The content source may provide the first web page format in response to content requests which specify the mobile device 11 and/or a web browser of the mobile device 11 in a requester identification field of the content requests. In a similar fashion, the content source may provide the second web page format in response to content requests which specify a PC and/or a PC web browser in the requester identification field.
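In HTTP terms, the requester identification field corresponds to the User-Agent request header. A sketch of building the two request variants follows; the User-Agent strings are illustrative, and the requests are constructed without being sent.

```python
import urllib.request

# Illustrative requester identification strings; real content sources key
# their web page format selection on strings like these.
MOBILE_UA = "Mozilla/5.0 (Linux; Android) Mobile Safari"
DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0) Chrome Safari"

def make_request(url, as_mobile):
    """Build a content request whose requester identification field selects
    the mobile or the full-browser web page format."""
    ua = MOBILE_UA if as_mobile else DESKTOP_UA
    return urllib.request.Request(url, headers={"User-Agent": ua})
```

Issuing the same URL with the desktop identification retrieves the second web page format, through which the higher quality version of the internet media content may be identified.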
  • Therefore, the application 41 may determine a higher quality version of the internet media content suitable for display using the external rendering device. For example, the application 41 may identify a higher quality version of the internet media content using the method 300 generally illustrated in FIG. 7. The method 300 may be performed by the mobile device 11 executing program instructions provided by a non-transitory computer-readable medium, such as register memory, processor cache and Random Access Memory (RAM).
  • At step 301, the first web content 80 for the web page may be retrieved for display on the mobile device 11. For example, the retrieval may use content requests which specify the mobile device 11 and/or a web browser of the mobile device 11 in a requester identification field of the content requests. At step 303, the first web content 80 may be analyzed to identify the media controls 88 in the web page as previously set forth. At step 305, the first web content 80 may be modified to insert the additional media controls 89. Step 305 is optional: some embodiments modify the first web content 80 to insert the additional media controls 89, and other embodiments do not.
  • At step 307, a rendered web page may be created from the first web content 80 and/or the modified first web content 81. The HTML rendering engine 48 may create the rendered web page for display to the user 15. The application user interface which may be presented by the user interface 49 of the mobile device 11 may display the rendered web page to the user 15. At step 309, the user 15 may invoke one of the media controls 88 and/or one of the additional media controls 89. The media control 88 and/or the additional media control 89 invoked by the user 15 may be associated with internet media content. As a result, the HTML rendering engine 48 may send the user interaction signal 64, 74 to the application 41. At step 311, second web content 80 for the web page may be retrieved using content requests which specify a full web browser in a requester identification field of the content requests. At step 313, the second web content 80 may be analyzed to determine a higher quality version of the internet media content suitable for display on the external rendering device. At step 315, the higher quality version of the internet media content may be transferred to the external rendering device.
  • For the method 300, retrieval of the second web content 80 at step 311 does not require retrieval of all web content elements necessary for rendering of the web page. The second web content 80 is not used to render a web page on the mobile device 11 or on any other device; the second web content 80 is only used to determine the higher quality version of the internet media content suitable for display on the external rendering device. Consequently, the retrieval of the second web content 80 at step 311 may omit elements of the web page, such as graphics, images, style sheets, and/or other elements which are only used for visual rendering of the web page.
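The lightweight retrieval at step 311 may be sketched as a filter that skips resources used only for visual rendering. The extension list is an assumption; a real implementation might also consult MIME types announced by the content source.

```python
# Hypothetical list of resource types used only for visual rendering.
VISUAL_ONLY_EXTENSIONS = (".png", ".jpg", ".jpeg", ".gif", ".css", ".ico")

def should_retrieve(resource_url):
    """Return False for resources used only for visual rendering, which
    step 311 may omit because the second web content is analyzed rather
    than displayed."""
    path = resource_url.split("?", 1)[0].lower()
    return not path.endswith(VISUAL_ONLY_EXTENSIONS)
```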
  • The steps of the method 300 may be carried out in any order, and the present invention is not limited to the order of the steps presented in the preceding text and FIG. 7. For example, step 311 and/or step 313 may occur at any time earlier in the method 300. In an embodiment, the first web content and/or the second web content may be retrieved “up-front” to avoid delay and achieve faster response time between the user input accepted in step 309 and the transfer of the higher quality version of the internet media content to the external rendering device in step 315.
  • The queue control element 207 may maintain one or more playback queues for media playback. As known to one skilled in the art, a playback queue is typically a first-in, first-out (“FIFO”) queue which may track multiple media objects and may enable sequential playback of the multiple media objects. A playback queue may play the multiple media objects using the mobile device 11 and/or an external rendering device.
  • For example, the user 15 of the mobile device 11 may designate a playback queue for playback of digital music content to an external rendering device which may be a DLNA-compatible networked stereo device. Then, the user 15 may add a first digital music track, a second digital music track, and/or a third digital music track to the playback queue. As a result, the playback queue may play the first digital music track on the DLNA-compatible networked stereo device, followed by the second digital music track, and then the third digital music track.
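The queued playback described above may be sketched with a minimal first-in, first-out structure. The class and method names are hypothetical.

```python
from collections import deque

class PlaybackQueue:
    """Minimal FIFO playback queue tracking media objects for sequential
    playback on a target rendering device."""
    def __init__(self, renderer_id):
        self.renderer_id = renderer_id
        self._items = deque()

    def add(self, media_url):
        """Append a media object to the end of the queue."""
        self._items.append(media_url)

    def next_item(self):
        """Return the next media object to play, or None when the queue
        is exhausted."""
        return self._items.popleft() if self._items else None
```

As each media content object finishes playing, the queue control element would call `next_item()` and instruct the external rendering device to retrieve and play the returned object.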
  • The application 41 may support creation, management and/or use of internal playback queues. Consequently, the queue control element 207 may create one or more playback queue structures to provide playback queue functionality to the user 15 of the mobile device 11. The user 15 may add media objects, such as internet media content objects, to the playback queue. For example, the user 15 may invoke the media controls 88 and/or the additional media controls 89 in a rendered web page to add internet media content objects to a playback queue. Then, the queue control element 207 may instruct an external rendering device to play back the internet media content objects in sequence based on the position of the internet media objects within the playback queue. As a media content object in the playback queue is finished playing on the external rendering device, the queue control element 207 may instruct the external rendering device to begin to request, retrieve and/or play back the next media content object in the playback queue.
  • The queue control element 207 may use the discover and control renderers component 208 of the application 41 to directly control the external rendering device. Alternatively, the queue control element 207 may use the RCC interface 209 of the application 41 to communicate with the rendering control component 25. The rendering control component 25, if present, may have the ability to create, manage and use playback queues. In this case, the queue control element 207 may use the RCC interface 209 to instruct the rendering control component 25 to create a playback queue which targets a specific external rendering device. Subsequently, the queue control element 207 may use the RCC interface 209 to instruct the rendering control component 25 to add media content objects to the playback queue and/or to control the queued playback of media objects by the external rendering device. As a result, the queue state may be maintained in the rendering control component 25 instead of the application 41. In a similar fashion, the responsibility for maintaining and playing the playback queue may be delegated to the rendering control component 25 which may be external to the mobile device 11. Queue functionality which may be provided by the rendering control component 25 is described in detail in U.S. App. Pub. No. 2010/0095332, herein incorporated by reference in its entirety.
  • The queue control element 207 may support playback queue management functions which may be exposed to the user 15 through application controls displayed in the user interface 49 by the UI control element 205. For example, the user 15 may be able to view the current contents of a playback queue, edit the playback queue to remove and/or rearrange media content objects, delete a playback queue, create a new playback queue, skip ahead to the next media content object in the playback queue, skip backward to a preceding media content object in the playback queue, and/or the like.
  • FIG. 8 generally illustrates an embodiment of the application user interface displayed by the user interface 49 provided by the mobile device 11. The application user interface displayed by the user interface 49 of the mobile device 11 may be provided by the application 41 and/or the HTML rendering engine 48. In the example depicted in FIG. 8, the user 15 may have selected, may have specified and/or may have navigated to a specific content site, namely www.tvholic.com. The content site may provide internet video content and/or may provide a search facility for finding video content available through the content site. The user 15 may have used the search facility to execute a search. For example, the user 15 may have used text entry capabilities of the mobile device 11 and/or the HTML rendering engine 48 to enter “two cats” into a search form at the top of a rendered web page.
  • As a result, the application 41 and/or the HTML rendering engine 48 may have requested and/or retrieved the web content corresponding to the search results web page using techniques previously described. Further, the application 41 may have analyzed the web content using any of the techniques described herein to identify the media controls 88 for accessing internet media content. For example, the application 41 may have identified an active thumbnail image 110 for accessing a video clip entitled “In the Alley” as shown in FIG. 8. The application 41 may have identified other media controls not labeled in FIG. 8, and the present invention is not limited to the specific embodiment of the application user interface depicted in FIG. 8.
  • The HTML rendering engine 48 may process the web content 80 corresponding to the search results web page to create a rendered web page 105. The rendered web page 105 may be displayed in the application user interface. The rendered web page 105 may have the media controls 88, such as the active thumbnail image 110. The rendered web page 105 may have text, graphics, borders, supplementary information about the internet media content, and/or the like. The rendered web page 105 may have any information specified in the web content retrieved from the content site. The present invention is not limited to the specific embodiment of the rendered web page 105 depicted in FIG. 8.
  • The application 41 may display application controls 115 in the application user interface. As shown in FIG. 8, the application controls 115 may have and/or may be navigation controls 116 and/or playback mode selection controls 117. The navigation controls 116 may have and/or may be web browser functions, such as, for example, a back button, a forward button, a button to add the current page to a list of favorites, a button to reload and/or refresh the current page, a button to access a list of web browser configuration options, and/or the like.
  • The playback mode selection controls 117 may enable the user 15 to select a method used to play back media selected in and/or invoked from the rendered web page 105. For example, the playback mode selection controls 117 may have and/or may be a local playback mode control 118, an external renderer playback mode control 119, an add to queue playback mode control 120, and/or the like. The user may select one of the playback mode selection controls 117 to specify a selected playback mode. As a result, the application user interface may identify the selected playback mode. For example, the selected playback mode may be highlighted, may be displayed brightly, may be circled, and/or may be otherwise graphically distinguished from the other playback modes.
  • The local playback mode control 118 may correspond to playing the internet media content on the mobile device 11. For example, if the local playback mode control 118 is the selected playback mode control, the user 15 may select the active thumbnail image 110 to play the video clip “In the Alley” on the display screen of the mobile device 11. In a similar fashion, the user 15 may select any of the media controls 88 in the rendered web page 105, and the application 41 and/or the HTML rendering engine 48 may respond by using the display screen of the mobile device 11 to play the internet media content corresponding to the selected media control.
  • The external renderer playback mode control 119 may correspond to playing the internet media content using an external rendering device, such as one of the rendering devices 21, 22, 23 and/or the rendering device 42. For example, if the external renderer playback mode control 119 is the selected playback mode control, the user 15 may select the active thumbnail image 110 to play the video clip “In the Alley” on an external rendering device, such as a DLNA-compliant television. In a similar fashion, the user 15 may select any of the media controls 88 in the rendered web page 105, and the application 41 may respond by using the external rendering device to play the internet media content corresponding to the selected media control. As previously set forth, the application 41 may communicate directly with the external rendering device using the discover and control renderers component 208 of the application 41, and/or the application 41 may instruct the rendering control component 25 which may be external to the mobile device 11 to initiate the transfer of media content to the external rendering device.
  • The add to queue playback mode control 120 may correspond to adding the internet media content to a queue for queued playback. For example, if the add to queue playback mode control 120 is the selected playback mode control, the user 15 may select the active thumbnail image 110 to add the internet video clip “In the Alley” to the queue. In a similar fashion, the user 15 may select any of the media controls 88 in the rendered web page 105, and the application 41 may respond by adding the internet media content corresponding to the selected media control to the queue. As previously set forth, the application 41 may have an internal representation of the playback queue and/or may use a playback queue provided and maintained by the rendering control component 25 which may be accessible through the local network 12.
  • The application user interface may present the playback mode selection controls 117 adaptively so that the playback mode selection controls 117 may depend on the available rendering devices in the local network 12, on the presence or absence of the rendering control component 25, and/or on the properties of the internet media content corresponding to the media controls 88 identified by the application 41 for the rendered web page 105. For example, the application user interface may not present the external renderer playback mode control 119 if no rendering devices are available in the local network 12. As another example, the application user interface may present multiple external renderer playback mode controls 119 if multiple rendering devices are available in the local network 12, and each of the multiple external renderer playback mode controls 119 may correspond to one of the multiple rendering devices. As yet another example, the application user interface may not present the add to queue playback mode control 120 if the application 41 relies on the presence of the rendering control component 25 to implement queued playback and the rendering control component 25 is unavailable in the local network 12.
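The adaptive presentation logic may be sketched as a function computing the visible playback mode selection controls 117 from the discovered renderers and the availability of the rendering control component 25. The control identifiers are hypothetical.

```python
def visible_playback_controls(renderers, rcc_available,
                              app_has_internal_queue=False):
    """Compute the playback mode selection controls to present, given the
    rendering devices discovered in the local network and the presence or
    absence of the rendering control component."""
    controls = ["local"]                 # local playback is always offered
    for r in renderers:                  # one external control per renderer
        controls.append("external:%s" % r)
    if rcc_available or app_has_internal_queue:
        controls.append("add-to-queue")  # queued playback needs a queue host
    return controls
```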
  • The application user interface provided by the user interface 49 of the mobile device 11 may have additional functions not shown in FIG. 8. For example, the application user interface may have a function for selecting a rendering device from the rendering devices in the local network 12. The selected rendering device may be the target rendering device for the external renderer playback mode invoked by selection of the external renderer playback mode control 119. The application user interface may have a function for accessing rendering controls which control playback on the external rendering device. The application user interface may have a function to view, access, manage and/or control the playback queue.
  • One or more of the additional functions may be accessible using the playback mode selection controls 117. In an embodiment, the user 15 may “touch” one of the playback mode selection controls 117 to select a playback mode and/or may “long press” one of the playback mode selection controls 117 to configure the playback mode. For example, a “long press” on the external renderer playback mode control 119 may be used to access a screen. The screen may enable the user 15 to select a new target rendering device to use for external renderer playback mode, and/or the screen may enable the user 15 to access rendering controls for controlling rendering on the current target rendering device. In a similar fashion, a “long press” on the add to queue playback mode control 120 may be used to access a screen for viewing the current playback queue and for editing, managing and/or playing back the playback queue.
  • As a result, the application 41 may override the original functions of the media controls 88 presented within the web page by enabling the media controls 88 to invoke playback methods which may not have been originally supported by the web page. For example, the application 41 may use the playback mode selection controls 117 to provide playback methods which may not have been furnished by the content provider or the designer of the web page.
  • For example, an embodiment of the application 41 may retrieve a web page having the media controls 88 for accessing internet media content. Then, the web page may be displayed in the user interface 49 of the mobile device 11. Then, the playback mode selection controls 117 may be presented to enable the user 15 to select a playback mode from multiple playback modes. User input may select a playback mode using the playback mode selection controls 117 and/or may select one of the media controls 88 provided by the web page. Then, the internet media content accessed by the selected media control 88 may be played back using the selected playback mode. The internet media content accessed by the selected media control 88 may be played in response to selection of the playback mode using the playback mode selection controls 117 and/or may be played in response to selection of the media control 88 provided by the web page. The internet media content accessed by the selected media control may be played on the mobile device 11 and/or an external rendering device and/or may be added to a playback queue.
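The playback flow in this paragraph amounts to dispatching the media content referenced by the selected media control to the chosen playback mode. A minimal sketch, with illustrative mode names and an in-memory list standing in for the playback queue:

```python
def dispatch_media_action(media_url, mode, queue):
    """Route the media item referenced by the selected media control
    to the chosen playback mode; returns a description of the action."""
    if mode == "local":
        return f"play {media_url} on the mobile device"
    if mode.startswith("external:"):
        # Transfer the content to the named rendering device.
        device = mode.split(":", 1)[1]
        return f"transfer {media_url} to {device}"
    if mode == "queue":
        queue.append(media_url)
        return f"queued {media_url} at position {len(queue)}"
    raise ValueError(f"unknown playback mode: {mode}")
```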
  • In an embodiment, the playback mode selection controls 117 may be implemented independently from other elements disclosed herein. For example, an embodiment of the application 41 may implement the playback mode selection controls 117 without implementing the other elements disclosed herein.
  • FIG. 9 generally illustrates an embodiment of the application user interface displayed by the user interface 49 of the mobile device 11. The application user interface may be provided by the application 41 and/or the HTML rendering engine 48. The example depicted in FIG. 9 uses the same search results web page described for FIG. 8. However, FIG. 9 illustrates an embodiment in which the application 41 modifies the web content to insert additional media controls.
  • As previously set forth, the application 41 and/or the HTML rendering engine 48 may have requested and/or retrieved the web content 80 corresponding to the search results web page. The application 41 may have analyzed the web content 80 to identify the media controls 88 for accessing internet media content. Further, the application 41 may have modified the web content 80 to insert the additional media controls 89 corresponding to the media controls 88. The steps of analyzing the web content 80 and forming the modified web content 81 may have used any of the techniques previously described.
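The analyze-and-modify step can be illustrated with a toy version that treats anchor tags linking to media files as the media controls 88 and appends two extra anchors as the additional media controls 89. The regular expression and the `app://` URL scheme are assumptions made for this sketch; an actual implementation might instead use a structured representation of the page or injected scripting code, as described elsewhere herein.

```python
import re

# Treat anchors that link to media files as the original media controls.
MEDIA_CONTROL = re.compile(
    r'<a href="(?P<url>[^"]+\.(?:mp4|mp3|webm))">[^<]*</a>', re.I)

def insert_additional_controls(web_content):
    """Return modified web content with an external-renderer control
    and an add-to-queue control appended after each media control."""
    def add_controls(match):
        url = match.group("url")
        return (match.group(0)
                + f'<a href="app://render?src={url}">[TV]</a>'
                + f'<a href="app://queue?src={url}">[+]</a>')
    return MEDIA_CONTROL.sub(add_controls, web_content)
```

Applied to a page containing one media link, the sketch yields three anchors: the original control followed by the two inserted controls.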
  • The HTML rendering engine 48 may have processed the modified web content 81 to create the rendered web page 105. The rendered web page 105 may be displayed in the application user interface. The rendered web page 105 may have the media controls 88, such as the active thumbnail image 110, and/or the additional media controls 89 inserted by the application 41. The rendered web page 105 may have text, graphics, borders, supplementary information about the internet media content, and/or the like. The rendered web page 105 may have any information specified in the web content retrieved from the content site.
  • The additional media controls 89 may have and/or may be an external renderer control 130 and/or an add to queue control 131 as shown in FIG. 9. The external renderer control 130 may correspond to playback of the associated internet media content on an external rendering device. The add to queue control 131 may correspond to addition of the associated internet media content to a queue for queued playback. Each of the media controls 88 identified by the application 41 may have corresponding additional media controls 89. Therefore, each of the additional media controls may correspond to a particular internet media content object, namely the internet media content object accessible using the corresponding media control 88.
  • For example, the user 15 may touch the active thumbnail image 110 for the “In the Alley” video clip to play the video clip on the mobile device 11. In addition, the user 15 may touch the external renderer control 130 corresponding to the “In the Alley” video clip to initiate playback of the video clip using the external rendering device. In a similar fashion, the user 15 may touch the add to queue control 131 corresponding to the “In the Alley” video clip to add the video clip to a playback queue.
  • The application user interface may display application controls 135. For example, the application controls 135 may have and/or may be the navigation controls 116 which may have web browser functions, such as a back button, a forward button, a button to add the current page to a list of favorites, a button to reload and/or refresh the current page, a button to access a list of web browser configuration options, and/or the like; controls for selecting a new target rendering device; playback controls for the target rendering device; playback queue controls; and/or the like. The present invention is not limited to these examples, and the application user interface may have additional application controls not listed herein.
  • In an embodiment, the application 41 may insert the additional media controls 89 adaptively based on the rendering devices available in the local network 12, the presence or absence of the rendering control component 25, and/or properties of the internet media content corresponding to the media controls 88 identified by the application 41 for the rendered web page 105. For example, the application 41 may insert a separate external renderer control 130 for each available rendering device capable of playing the internet media content. As a result, a particular internet media content may have zero, one or multiple external renderer controls 130 inserted by the application 41, depending on the number of compatible rendering devices available at that time. As another example, an embodiment of the application 41 may support queued playback of music content but not of video content. As a result, the application 41 may only insert the add to queue control 131 for music content objects.
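The per-item adaptive insertion described above (for example, queued playback of music but not of video) can be sketched as a function that decides which additional controls a given media item receives. The file-extension heuristic and the default queue-supported types are illustrative assumptions, not part of the disclosure.

```python
def controls_for_item(media_url, renderers, queue_types=("mp3",)):
    """Decide which additional media controls to insert for one media
    item: one external-renderer control per compatible rendering
    device, and an add-to-queue control only for content types the
    embodiment supports in the playback queue."""
    ext = media_url.rsplit(".", 1)[-1].lower()
    controls = [f"external:{name}" for name in renderers]
    if ext in queue_types:
        controls.append("add_to_queue")
    return controls
```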
  • The embodiments of the application user interface illustrated in the preceding examples do not limit the present invention. Variations of the preceding examples may be used in an embodiment of the present invention. For example, a modified user interface based on the preceding examples may present the rendered web page 105 corresponding to the original web content as illustrated in FIG. 8, but the modified user interface may not present the playback mode selection controls 117 shown in FIG. 8. The rendered web page 105 may have the media controls 88, such as the active thumbnail image 110, and the user 15 may select, may touch, may invoke and/or may click on one of the media controls 88 in the rendered web page 105. As a result, the HTML rendering engine 48 may generate the user interaction signal 64, 74 and/or may communicate the user interaction signal 64, 74 to the application 41.
  • In response to receipt of the user interaction signal 64/74, the application 41 may display controls to select a media action for the internet media content corresponding to the one of the media controls 88 selected by the user 15. For example, the application 41 may display a dialog window and/or a similar user input mechanism. The dialog window may prompt the user 15 to select local playback of the internet media content on the mobile device 11, playback of the internet media content on an external rendering device, or addition of the internet media content to a queue for queued playback. The user 15 may select one of these playback options using the dialog window, and, as a result, the application 41 may initiate a media action corresponding to the selected playback option.
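One way the user interaction signal 64, 74 can carry both the requested media action and the affected media item is as navigation to an application-scheme URL, as in the HTML <a>-tag technique of claim 7. A sketch, assuming a hypothetical `app://action?src=...` URL convention introduced purely for illustration:

```python
from urllib.parse import parse_qs, urlparse

def parse_interaction_signal(link_url):
    """Decode a user interaction signal delivered as navigation to an
    application-scheme URL; returns (media_action, media_url), or
    None for ordinary navigation the application should not intercept."""
    parts = urlparse(link_url)
    if parts.scheme != "app":
        return None
    action = parts.netloc  # e.g. "render" or "queue"
    src = parse_qs(parts.query).get("src", [None])[0]
    return action, src
```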
  • FIGS. 8 and 9 depict a search results web page as an example. The techniques disclosed herein are not limited to a search results web page and may be applied to any web page having controls for accessing internet media content. The application 41 may retrieve, may analyze, may modify, may present and/or may utilize any web page renderable using the HTML rendering engine 48.
  • As a result of the system and method described herein, the application 41 executed by a mobile device 11 may transfer internet media content to a rendering device in a home network. The application 41 may use the HTML rendering engine 48 to display a web page to the user 15 of the mobile device 11, and the web page may have the media controls 88 for accessing the internet media content. The web page may be based on web content 80 retrieved from a content source. The application 41 may analyze the web content 80 to identify the media controls 88 for accessing the internet media content, and/or the application 41 may form the modified web content 81 having the additional media controls 89. The application 41 may receive the user interaction signal 64, 74 which may indicate that the user 15 invoked one of the media controls 88 and/or one of the additional media controls 89. In response, the application 41 may initiate transfer of the internet media content to the rendering device in the home network and/or may queue the internet media content for later playback using the rendering device.
  • Various changes and modifications to the presently preferred embodiments described herein will be apparent to those skilled in the art. Such changes and modifications may be made without departing from the spirit and scope of the present invention and without diminishing its attendant advantages. Such changes and modifications are covered by the appended claims.

Claims (32)

1. A method for transferring internet media content using an application on a mobile device to transfer internet media content, the method comprising the steps of:
receiving web content for a web page on the mobile device wherein the web content has original media controls for accessing the internet media content and further wherein each of the original media controls has a corresponding internet media content object;
identifying the original media controls wherein the application on the mobile device processes the web content to identify the original media controls;
modifying the web content to form modified web content having one or more additional media controls wherein the application modifies the web content and further wherein each of the additional media controls corresponds to one of the original media controls;
displaying the modified web content as a rendered web page using an HTML rendering engine;
receiving a user interaction signal which indicates user interaction with the rendered web page wherein the application receives the user interaction signal;
processing the user interaction signal wherein the application processes the user interaction signal to identify one of the additional media controls invoked by the user; and
initiating a media action corresponding to the identified additional media control wherein the media action involves the internet media content object which corresponds to the original media control to which the identified additional media control corresponds.
2. The method of claim 1 further comprising the step of:
the application receiving the web content in response to a content request message sent by the application.
3. The method of claim 1 further comprising the step of:
the HTML rendering engine receiving the web content in response to a content request message sent by the HTML rendering engine.
4. The method of claim 1 further comprising the step of:
the HTML rendering engine generating the user interaction signal in response to user interaction with the rendered web page.
5. The method of claim 1 wherein the mobile device has an operating system which provides the HTML rendering engine.
6. The method of claim 1 further comprising the step of:
the application modifying the web content using a structured representation of the web page created by the HTML rendering engine.
7. The method of claim 1 wherein one of the additional media controls is an HTML <a> tag which links to a first URL wherein the user interaction signal conveys the first URL to the application.
8. The method of claim 1 wherein one of the additional media controls is created by a scripting code fragment provided by the application to the HTML rendering engine.
9. The method of claim 1 wherein one of the additional media controls is created by a scripting code fragment retrieved by the application from a database of scripting code fragments wherein each of the scripting code fragments in the database corresponds to a content source.
10. The method of claim 9 wherein the database is remote from the mobile device and accessible to the mobile device via the internet.
11. The method of claim 9 further comprising the step of:
the application retrieving the scripting code fragment by querying the database using a content source identifier.
12. The method of claim 1 wherein the user interaction signal is generated by a scripting code fragment inserted into the web content by the application.
13. The method of claim 1 wherein the media action instructs an external rendering device to play the internet media content object.
14. The method of claim 1 wherein the media action adds the internet media content object to a playback queue.
15. A method for transferring internet media content using an application on a mobile device to transfer internet media content, the method comprising the steps of:
retrieving a web page having media controls for accessing internet media content;
displaying the web page in a user interface of the mobile device;
presenting application controls for selecting a playback mode from a plurality of playback modes;
accepting first user input which selects a first playback mode using the application controls;
accepting second user input which selects a first media control from the media controls of the web page wherein the first media control is for accessing first internet media content; and
playing the first internet media content using the first playback mode.
16. The method of claim 15 wherein the first internet media content is played in response to the first user input.
17. The method of claim 15 wherein the first internet media content is played in response to the second user input.
18. The method of claim 15 wherein the first playback mode is one of playback on the mobile device, playback on an external rendering device connected to the mobile device by a local network, and addition to a playback queue.
19. A method for transferring internet media content using an application on a mobile device to transfer internet media content, the method comprising the steps of:
retrieving first web content for a web page for display on the mobile device using a first content request having a first requester identification field which specifies one of the mobile device and a mobile device web browser;
creating a rendered web page from the first web content wherein an HTML rendering engine creates the rendered web page and further wherein the rendered web page is displayed to a user of the mobile device;
receiving a user interaction signal generated in response to user input on the mobile device wherein the user input is accepted after display of the rendered web page and further wherein the application receives the user interaction signal;
retrieving second web content for the web page using a second content request having a second requester identification field which is different than the first requester identification field;
analyzing the second web content to determine an alternate version of the internet media content for display on an external rendering device in communication with the mobile device; and
initiating transfer of the alternate version of the internet media content to the external rendering device.
20. The method of claim 19 further comprising the steps of:
analyzing the first web content to identify media controls; and
modifying the first web content to insert additional media controls displayed on the mobile device.
21. The method of claim 19 wherein the HTML rendering engine generates the user interaction signal.
22. The method of claim 19 wherein the second web content is not used to display a web page on the mobile device.
23. The method of claim 19 wherein the second web content has a set of visual elements necessary for display of the second web content as a web page and further wherein retrieval of the second web content does not retrieve the visual elements.
24. The method of claim 19 wherein the application initiates the transfer of the alternate version of the internet media content in response to receiving the user interaction signal.
25. The method of claim 19 wherein the second requester identification field specifies a full web browser which the mobile device is not capable of running.
26. A method for transferring internet media content using an application on a mobile device to transfer internet media content, the method comprising the steps of:
retrieving web content for a web page wherein the web content is retrieved from a first content source;
querying a database of script code fragments wherein the application queries the database and further wherein the query identifies the first content source wherein the database identifies one or more script code fragments available from the database which correspond to the first content source in response to the query;
receiving a first script code fragment from the database wherein the application receives the first script code fragment from the database wherein the first script code fragment is one of the one or more script code fragments identified by the database in response to the query;
inserting the first script code fragment into the web content to produce modified web content; and
displaying the modified web content as a web page on the mobile device.
27. The method of claim 26 wherein the database is remote from the mobile device and accessible to the mobile device via a network.
28. The method of claim 26 wherein the first script code fragment identifies one or more original media controls in the web page wherein the one or more original media controls allow a user to access internet media content from the web page.
29. The method of claim 26 wherein the first script code fragment adds one or more additional media controls to the web page wherein the one or more additional media controls correspond to media actions which apply to internet media content accessible from the web page.
30. The method of claim 26 wherein the first script code fragment directly invokes a function of the application in response to user interaction with the web page.
31. The method of claim 26 wherein the database of script code fragments has script code fragments which correspond to each content source of a plurality of content sources wherein the first content source is one of the plurality.
32. The method of claim 26 wherein an HTML rendering engine displays the modified web content and further wherein the HTML rendering engine uses a scripting engine to execute the first script code fragment.
US13/370,751 2011-02-11 2012-02-10 System and method for using an application on a mobile device to transfer internet media content Abandoned US20120210205A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201161463088P 2011-02-11 2011-02-11
US13/370,751 US20120210205A1 (en) 2011-02-11 2012-02-10 System and method for using an application on a mobile device to transfer internet media content

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/US2012/024700 WO2012109568A1 (en) 2011-02-11 2012-02-10 System and method for using an application on a mobile device to transfer internet media content
US13/370,751 US20120210205A1 (en) 2011-02-11 2012-02-10 System and method for using an application on a mobile device to transfer internet media content
US14/830,322 US20160048485A1 (en) 2009-06-26 2015-08-19 System and method for using an application on a mobile device to transfer internet media content

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/459,090 Continuation-In-Part US9195775B2 (en) 2009-06-26 2009-06-26 System and method for managing and/or rendering internet multimedia content in a network

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/830,322 Continuation-In-Part US20160048485A1 (en) 2009-06-26 2015-08-19 System and method for using an application on a mobile device to transfer internet media content

Publications (1)

Publication Number Publication Date
US20120210205A1 true US20120210205A1 (en) 2012-08-16

Family

ID=46637852

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/370,751 Abandoned US20120210205A1 (en) 2011-02-11 2012-02-10 System and method for using an application on a mobile device to transfer internet media content

Country Status (2)

Country Link
US (1) US20120210205A1 (en)
WO (1) WO2012109568A1 (en)



US10341736B2 (en) 2013-01-23 2019-07-02 Sonos, Inc. Multiple household management interface
US20140213243A1 (en) * 2013-01-30 2014-07-31 Electronics & Telecommunications Research Institute Service equipment control method and user equipment for performing the same
US10424009B1 (en) 2013-02-27 2019-09-24 Amazon Technologies, Inc. Shopping experience using multiple computing devices
WO2014137131A1 (en) * 2013-03-04 2014-09-12 Samsung Electronics Co., Ltd. Method and apparatus for manipulating data on electronic device display
US20140372239A1 (en) * 2013-06-13 2014-12-18 Microsoft Corporation Event-based versioning and visibility for content releases
CN105493063A (en) * 2013-06-13 2016-04-13 微软技术许可有限责任公司 Event-based versioning and visibility for content releases
US10423992B2 (en) * 2013-06-13 2019-09-24 Microsoft Technology Licensing, Llc Method, system, and medium for event based versioning and visibility for content releases
US9313255B2 (en) 2013-06-14 2016-04-12 Microsoft Technology Licensing, Llc Directing a playback device to play a media item selected by a controller from a media server
WO2014200548A1 (en) * 2013-06-14 2014-12-18 Microsoft Corporation Directing a playback device to play a media item selected by a controller from a media server
US20150006696A1 (en) * 2013-06-26 2015-01-01 Qualcomm Incorporated Semantic mappings from human readable messages to programmatic interfaces
US9609062B2 (en) * 2013-06-26 2017-03-28 Qualcomm Incorporated Semantic mappings from human readable messages to programmatic interfaces
US20160149982A1 (en) * 2013-08-12 2016-05-26 Google Inc. Dynamic resizable media item player
US10194189B1 (en) 2013-09-23 2019-01-29 Amazon Technologies, Inc. Playback of content using multiple devices
US10360290B2 (en) 2014-02-05 2019-07-23 Sonos, Inc. Remote creation of a playback queue for a future event
US20160283461A1 (en) * 2014-02-26 2016-09-29 Tencent Technology (Shenzhen) Company Limited Method and terminal for extracting webpage content, and non-transitory storage medium
US10430514B2 (en) * 2014-02-26 2019-10-01 Tencent Technology (Shenzhen) Company Limited Method and terminal for extracting webpage content, and non-transitory storage medium
US9679054B2 (en) 2014-03-05 2017-06-13 Sonos, Inc. Webpage media playback
EP3114639A4 (en) * 2014-03-05 2017-04-12 Sonos, Inc. Webpage media playback
US10425454B2 (en) * 2014-03-31 2019-09-24 Orange Device and method for transferring the rendering of multimedia content
CN106464954A (en) * 2014-03-31 2017-02-22 奥兰治 Device and method for transferring the rendering of multimedia content
US20150356084A1 (en) * 2014-06-05 2015-12-10 Sonos, Inc. Social Queue
US10339975B2 (en) 2014-07-23 2019-07-02 Gopro, Inc. Voice-based video tagging
US9792502B2 (en) 2014-07-23 2017-10-17 Gopro, Inc. Generating video summaries for a video using video summary templates
US10074013B2 (en) 2014-07-23 2018-09-11 Gopro, Inc. Scene and activity identification in video summary generation
US10126916B2 (en) 2014-08-08 2018-11-13 Sonos, Inc. Social playback queues
US9874997B2 (en) 2014-08-08 2018-01-23 Sonos, Inc. Social playback queues
US10262695B2 (en) 2014-08-20 2019-04-16 Gopro, Inc. Scene and activity identification in video summary generation
US10192585B1 (en) 2014-08-20 2019-01-29 Gopro, Inc. Scene and activity identification in video summary generation based on motion detected in a video
US9959087B2 (en) 2014-09-24 2018-05-01 Sonos, Inc. Media item context from social media
US9690540B2 (en) 2014-09-24 2017-06-27 Sonos, Inc. Social media queue
US9723038B2 (en) 2014-09-24 2017-08-01 Sonos, Inc. Social media connection recommendations based on playback information
US9860286B2 (en) 2014-09-24 2018-01-02 Sonos, Inc. Associating a captured image with a media item
US10096341B2 (en) 2015-01-05 2018-10-09 Gopro, Inc. Media identifier generation for camera-captured media
US10129721B2 (en) * 2015-01-06 2018-11-13 Samsung Electronics Co., Ltd. Method for supporting situation specific information sharing and electronic device supporting the same
US9666233B2 (en) * 2015-06-01 2017-05-30 Gopro, Inc. Efficient video frame rendering in compliance with cross-origin resource restrictions
US10338955B1 (en) 2015-10-22 2019-07-02 Gopro, Inc. Systems and methods that effectuate transmission of workflow between computing platforms
US9871994B1 (en) 2016-01-19 2018-01-16 Gopro, Inc. Apparatus and methods for providing content context using session metadata
US10078644B1 (en) 2016-01-19 2018-09-18 Gopro, Inc. Apparatus and methods for manipulating multicamera content using content proxy
US10402445B2 (en) 2016-01-19 2019-09-03 Gopro, Inc. Apparatus and methods for manipulating multicamera content using content proxy
US9787862B1 (en) 2016-01-19 2017-10-10 Gopro, Inc. Apparatus and methods for generating content proxy
US10129464B1 (en) 2016-02-18 2018-11-13 Gopro, Inc. User interface for creating composite images
US9972066B1 (en) 2016-03-16 2018-05-15 Gopro, Inc. Systems and methods for providing variable image projection for spherical visual content
US10402938B1 (en) 2016-03-31 2019-09-03 Gopro, Inc. Systems and methods for modifying image distortion (curvature) for viewing distance in post capture
US9838730B1 (en) 2016-04-07 2017-12-05 Gopro, Inc. Systems and methods for audio track selection in video editing
US10341712B2 (en) 2016-04-07 2019-07-02 Gopro, Inc. Systems and methods for audio track selection in video editing
US20170310873A1 (en) * 2016-04-25 2017-10-26 Olympus Corporation Terminal apparatus, information acquisition system, information acquisition method, and computer-readable recording medium
US10229719B1 (en) 2016-05-09 2019-03-12 Gopro, Inc. Systems and methods for generating highlights for a video
US9953679B1 (en) 2016-05-24 2018-04-24 Gopro, Inc. Systems and methods for generating a time lapse video
US9922682B1 (en) 2016-06-15 2018-03-20 Gopro, Inc. Systems and methods for organizing video files
US9967515B1 (en) 2016-06-15 2018-05-08 Gopro, Inc. Systems and methods for bidirectional speed ramping
US10045120B2 (en) 2016-06-20 2018-08-07 Gopro, Inc. Associating audio with three-dimensional objects in videos
US10395119B1 (en) 2016-08-10 2019-08-27 Gopro, Inc. Systems and methods for determining activities performed during video capture
US9953224B1 (en) 2016-08-23 2018-04-24 Gopro, Inc. Systems and methods for generating a video summary
US10268898B1 (en) 2016-09-21 2019-04-23 Gopro, Inc. Systems and methods for determining a sample frame order for analyzing a video via segments
US10282632B1 (en) 2016-09-21 2019-05-07 Gopro, Inc. Systems and methods for determining a sample frame order for analyzing a video
US10397415B1 (en) 2016-09-30 2019-08-27 Gopro, Inc. Systems and methods for automatically transferring audiovisual content
US10044972B1 (en) 2016-09-30 2018-08-07 Gopro, Inc. Systems and methods for automatically transferring audiovisual content
US10002641B1 (en) 2016-10-17 2018-06-19 Gopro, Inc. Systems and methods for determining highlight segment sets
US10339443B1 (en) 2017-02-24 2019-07-02 Gopro, Inc. Systems and methods for processing convolutional neural network operations using textures
US9916863B1 (en) 2017-02-24 2018-03-13 Gopro, Inc. Systems and methods for editing videos based on shakiness measures
US10360663B1 (en) 2017-04-07 2019-07-23 Gopro, Inc. Systems and methods to create a dynamic blur effect in visual content
US10395122B1 (en) 2017-05-12 2019-08-27 Gopro, Inc. Systems and methods for identifying moments in videos
US10402698B1 (en) 2017-07-10 2019-09-03 Gopro, Inc. Systems and methods for identifying interesting moments within videos

Also Published As

Publication number Publication date
WO2012109568A1 (en) 2012-08-16

Similar Documents

Publication Publication Date Title
US9286045B2 (en) Method and system for providing applications to various devices
JP4837919B2 (en) System and method for coaxial navigation of a user interface
US9232021B2 (en) Dynamically rehosting web content
US8843816B2 (en) Document collaboration by transforming and reflecting a document object model
JP5325286B2 (en) Apparatus and method for interacting with multiple forms of information between multiple types of computing devices
EP2357573A1 (en) Method and apparatus for showing web page-related resources
US8307286B2 (en) Methods and systems for online video-based property commerce
US7631260B1 (en) Application modification based on feed content
US9305060B2 (en) System and method for performing contextual searches across content sources
US20180218756A1 (en) Video preview creation with audio
JP2007533015A (en) Media package and media package management system and method
US9538229B2 (en) Media experience for touch screen devices
US20120233639A1 (en) Media Playlist Management and Viewing Remote Control
US8793282B2 (en) Real-time media presentation using metadata clips
US8010629B2 (en) Systems and methods for unification of local and remote resources over a network
US20050071864A1 (en) Systems and methods for using interaction information to deform representations of digital content
US20120304068A1 (en) Presentation format for an application tile
KR20110065338A (en) Interactive video player component for mashup interfaces
US8170395B2 (en) Methods and systems for handling montage video data
US20080294694A1 (en) Method, apparatus, system, medium, and signals for producing interactive video content
US8701008B2 (en) Systems and methods for sharing multimedia editing projects
US20160173960A1 (en) Methods and systems for generating audiovisual media items
US20100268694A1 (en) System and method for sharing web applications
EP3107267B1 (en) Techniques to push content to a connected device
CN102460412B (en) For managing and/or reproduce the system and method for internet multimedia content in a network

Legal Events

Date Code Title Description
AS Assignment

Owner name: PACKETVIDEO CORP., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHERWOOD, GREG;KOSMACH, JAMES J.;AL-SHAYKH, OSAMA;AND OTHERS;SIGNING DATES FROM 20110209 TO 20110211;REEL/FRAME:033527/0858

AS Assignment

Owner name: PACKETVIDEO CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHERWOOD, GREG;KOSMACH, JAMES J.;AL-SHAYKH, OSAMA;AND OTHERS;SIGNING DATES FROM 20141008 TO 20141114;REEL/FRAME:034175/0812

AS Assignment

Owner name: III HOLDINGS 2, LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PACKETVIDEO CORPORATION;REEL/FRAME:034730/0584

Effective date: 20141120

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION