US20150213839A1 - Media application backgrounding - Google Patents

Media application backgrounding

Info

Publication number
US20150213839A1
US20150213839A1 (application US14/228,199; US201414228199A)
Authority
US
United States
Prior art keywords
media item
video
playback
message
video portion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/228,199
Other versions
US9558787B2 (en)
Inventor
Oliver John Woodman
Matt Doucleff
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Assigned to GOOGLE INC. Assignment of assignors interest (see document for details). Assignors: DOUCLEFF, MATT; WOODMAN, OLIVER JOHN
Priority to US14/228,199 (US9558787B2)
Priority to EP20176571.6A (EP3739584A1)
Priority to EP15743253.5A (EP3100452B1)
Priority to PCT/US2015/013537 (WO2015116827A1)
Priority to KR1020177035850A (KR102108949B1)
Priority to CN201580006269.3A (CN105940671A)
Priority to KR1020207012803A (KR102233785B1)
Priority to BR112016017559A (BR112016017559A2)
Priority to KR1020167023225A (KR20160113230A)
Priority to CN202110381902.4A (CN113518070A)
Priority to JP2016549106A (JP2017508368A)
Priority to AU2015210937A (AU2015210937B2)
Publication of US20150213839A1
Priority to US15/404,045 (US10432695B2)
Publication of US9558787B2
Application granted
Assigned to GOOGLE LLC. Change of name (see document for details). Assignor: GOOGLE INC.
Priority to AU2018202378A (AU2018202378B2)
Priority to JP2018092056A (JP6557380B2)
Priority to JP2018092812A (JP2018125891A)
Priority to US16/589,054 (US10841359B2)
Priority to AU2020204121A (AU2020204121B2)
Legal status: Active
Adjusted expiration

Classifications

    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102 - Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105 - Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/02 - Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 - Power supply means, e.g. regulation thereof
    • G06F1/32 - Means for saving power
    • G06F1/3203 - Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206 - Monitoring of events, devices or parameters that trigger a change in power modality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 - Sound input; Sound output
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/448 - Execution paradigms, e.g. implementations of programming paradigms
    • G06F9/4488 - Object-oriented
    • G06F9/449 - Object-oriented method invocation or resolution
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/005 - Reproducing at a different information rate from the information rate of recording
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 - Indicating arrangements
    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/36 - Monitoring, i.e. supervising the progress of recording or reproducing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 - Network streaming of media packets
    • H04L65/61 - Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/611 - Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio, for multicast or broadcast
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 - Network streaming of media packets
    • H04L65/70 - Media network packetisation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/10 - Protocols in which an application is distributed across nodes in the network
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/76 - Television signal recording
    • H04N5/84 - Television signal recording using optical recording
    • H04N5/85 - Television signal recording using optical recording on discs or drums
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/478 - Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 - Supplemental services, e.g. displaying phone caller identification, shopping application, communicating with other users, e.g. chatting
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/76 - Television signal recording

Definitions

  • Implementations of the present disclosure relate to content delivery, and more specifically, to media playback on a device.
  • the Internet allows people to obtain information, connect with others and share information with each other.
  • Common Internet destinations include news websites, content sharing platforms, social networking websites, and the like.
  • Many websites and platforms include a content sharing aspect that allows users to view, upload, and share media items, such as video content, image content, audio content, and so on. Users can consume the media items from their user devices.
  • a method includes providing, by an application executed by a processing device, a playback of a media item that includes a video portion and an audio portion.
  • the method further includes receiving, by the application, a first message during the playback of the media item.
  • the method further includes in response to the first message, stopping the playback of the video portion of the media item while continuing to provide the audio portion of the media item.
  • the method further includes receiving, by the application, a second message while providing the audio portion of the media item.
  • the method further includes in response to the second message, resuming the playback of the video portion of the media item in synchronization with the audio portion being provided.
  • the method can include providing the playback of the media item, receiving a request to playback the media item, and reserving one or more resources associated with the playback of the media item in response to the request.
  • the method can further include stopping the playback of the video portion of the media item and releasing at least one of the one or more resources.
  • the one or more resources can include at least one of: a memory, a video decoder, or a video player. Releasing at least one of the one or more resources can include at least one of: clearing video data from a buffer, stopping a download of the video portion, or closing a network connection used for requesting the video portion.
  • Resuming the presentation of the video portion of the media item can include reacquiring the released one or more resources.
  • The first message can be at least one of: an indication that the application has entered a background state on a mobile device, an indication that a display of the mobile device is powered off, or an indication that a second application has entered a foreground state on the mobile device.
  • The second message can be an indication that the application has entered a foreground state on the mobile device.
  • The presentation of the video portion of the media item can be resumed without interrupting the presentation of the audio portion.
  • The video portion can be initially presented at a first quality, and resuming the presentation of the video portion of the media item can include presenting the video portion at a second quality.
  • the method can include presenting, via a graphical user interface, a message that the presentation of the video portion is to resume.
  • the method can include receiving the video portion and the audio portion from different sources.
  • the method can include receiving the video portion and the audio portion of the media item as a single file.
  • the method can further include separating the video portion and the audio portion into separate files.
  • Computing devices for performing the operations of the above described implementations are also disclosed. Additionally, in implementations of the disclosure, a computer-readable storage medium stores instructions for performing the operations of the above described implementations.
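  • To make the summarized operations concrete, the following is a minimal Kotlin sketch of an interface an application might expose for this behavior. The names (PlayerMessage, MediaBackgroundingPlayer, and their members) are illustrative assumptions, not terms from the disclosure.

```kotlin
// Hypothetical interface mirroring the summarized method; names are illustrative only.
enum class PlayerMessage { ENTERED_BACKGROUND, ENTERED_FOREGROUND }

interface MediaBackgroundingPlayer {
    /** Provide playback of a media item that includes a video portion and an audio portion. */
    fun startPlayback(mediaItemId: String)

    /** First message: stop the video portion while continuing to provide the audio portion.
     *  Second message: resume the video portion in synchronization with the audio portion. */
    fun onMessage(message: PlayerMessage)

    /** Current position of the audio portion, used to resume the video portion in sync. */
    fun currentAudioPositionMs(): Long
}
```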
  • FIG. 1 illustrates an example system architecture, in accordance with one implementation of the disclosure.
  • FIG. 2 is a flow diagram illustrating a method for providing efficient media application backgrounding on a client device, according to some implementations of the disclosure.
  • FIG. 3 illustrates two example timelines for providing efficient media application backgrounding on a client device, according to some implementations of the disclosure.
  • FIG. 4 is a block diagram illustrating an exemplary computer system, according to some implementations.
  • Described herein is a mechanism for improving media consumption on a client device.
  • Conventional client devices can stream media items from content sharing platforms.
  • a media application on the client device can play the streamed media items on the client device using a media player.
  • a user may desire to operate another feature or other application on the client device such that the audio portion of the video continues to play while the video portion is no longer displayed on the client device. For example, the user may wish to use other applications or turn the client device screen off while the audio portion continues to play.
  • the user can minimize, close or otherwise cause the media application to enter a background state.
  • A background state can refer to a state in which an application continues to run while no longer being visible; an application in the background state can be referred to as a “backgrounded” application.
  • Conventional client devices typically do not permit dynamically and independently adding and/or removing video and audio streams during playback of a media item.
  • Although a conventional client device can background a media application that is playing a video, the media application usually continues to run and process audio and video data (e.g., download, decode, cache) in spite of the video not being visible on-screen.
  • When backgrounding a media application, conventional client devices typically also hold on to system resources, such as memory (for buffering data), network sockets, and video decoders.
  • In addition, some conventional devices can receive media items in non-multiplexed formats, where synchronized video and audio portions of a video are delivered separately. During playback, the client device generally renders the synchronized video and audio streams simultaneously.
  • Conventional systems usually do not operate differently when the media application is backgrounded, which may cause an unnecessary consumption of resources.
  • Mobile devices typically have a limited number of resources, which means that conventional approaches can prevent another application from using the resources.
  • Implementations of the present disclosure address the above deficiencies of conventional systems by providing a mechanism for controlling independent playback of a video portion of a media item and an audio portion of the media item.
  • An application on a client device provides playback of a media item that includes a video portion and an audio portion. At any time, the application can stop playback of the video portion of the media item while continuing to provide the audio portion. Later, the application can resume playback of the video portion in synchronization with the audio portion that is being provided.
  • Techniques described herein can reduce the cost of operating an application in background mode. While the application is not presenting the video portion, the application can stop downloading the video portion, pause decoding any downloaded video data and clear any associated memory. By releasing these and other resources, network and battery consumption of the client device can be reduced as a result. Users often do not desire to download video they do not intend to watch, as it uses data and bandwidth. Further, the audio portion can be seamlessly provided throughout the application transitioning from foreground to background and back to foreground.
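  • As a rough illustration of this mechanism, the sketch below shows a playback controller that drops the video pipeline on the first message and rejoins it to the still-playing audio on the second message. VideoPipeline, AudioPipeline, and BackgroundAwarePlayback are hypothetical stand-ins, not components named in the disclosure.

```kotlin
// Illustrative sketch only; the pipeline classes are placeholders for a real player's renderers.
class VideoPipeline {
    fun stopAndRelease() { /* clear video buffers, release the decoder, close the video connection */ }
    fun prepare() { /* reacquire the decoder, reopen the connection, refill the buffer */ }
    fun renderFrom(positionMs: Long) { /* start rendering video at the given media position */ }
}

class AudioPipeline {
    var positionMs: Long = 0   // advances as audio keeps playing
        private set
    fun keepPlaying() { /* continue decoding and rendering the audio portion */ }
}

class BackgroundAwarePlayback(private val video: VideoPipeline, private val audio: AudioPipeline) {
    // First message (e.g., the application was backgrounded): drop video, keep audio running.
    fun onFirstMessage() {
        video.stopAndRelease()
        audio.keepPlaying()
    }

    // Second message (e.g., the application was foregrounded): rejoin video to the playing audio.
    fun onSecondMessage() {
        video.prepare()
        video.renderFrom(audio.positionMs)   // resume at the audio's current position
    }
}
```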
  • portions of the media item can include audio, video, subtitle data, a stream of overlaid data, annotations, advertisements, comments, metadata, information about the content of the media item (e.g., actors, related movies, music tracks, facial recognition, and the like). Any of the portions can be handled using the techniques described herein.
  • FIG. 1 illustrates an example system architecture 100 , in accordance with one implementation of the disclosure, for independently providing a playback of a media item that includes a video portion and an audio portion.
  • the system architecture 100 includes any number of client devices 102 , a network 104 , a data store 106 , and a content sharing platform 110 .
  • network 104 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN) or wide area network (WAN)), a wired network (e.g., Ethernet network), a wireless network (e.g., an 802.11 network or a Wi-Fi network), a cellular network (e.g., a Long Term Evolution (LTE) network), routers, hubs, switches, server computers, and/or a combination thereof.
  • the data store 106 may be a memory (e.g., random access memory), a cache, a drive (e.g., a hard drive), a flash drive, a database system, a distributed database, a distributed storage, or another type of component or device capable of storing data.
  • the data store 106 may also include multiple storage components (e.g., multiple drives or multiple databases) that may also span multiple computing devices (e.g., multiple server computers).
  • the data store 106 can store media items and portions of media items, such as audio portions and video portions.
  • the data store 106 can include a video source 118 of a video portion of a media item.
  • the data store 106 can include an audio source 120 of an audio portion of a media item.
  • the video source 118 and the audio source 120 are stored on the same data store 106 .
  • the video source 118 and the audio source 120 are stored on different data stores.
  • The different data stores can be owned and/or operated by one entity. Alternatively, the different data stores can be owned and/or operated by multiple separate entities.
  • the video source 118 is owned and operated by a first entity and the audio source 120 is owned and operated by a second entity.
  • the client device 102 can receive the separate video and audio streams from these two different entities.
  • the client devices 102 may each include computing devices such as personal computers (PCs), laptops, mobile phones, smart phones, tablet computers, netbook computers etc. In some implementations, client device 102 may also be referred to as a “user device.” Each client device includes a media player 112 .
  • the media player 112 may be an application or part of an application that allows users to view content, such as images, videos, web pages, documents, etc.
  • the media player 112 may be a web browser that can access, retrieve, present, and/or navigate content (e.g., web pages such as Hyper Text Markup Language (HTML) pages, digital media items, etc.) served by a web server.
  • the media player 112 may render, display, and/or present the content (e.g., a web page, a media viewer) to a user, such as via a graphical user interface (GUI).
  • the media player 112 may also display an embedded media player (e.g., a Flash® player or an HTML5 player) that is embedded in a web page (e.g., a web page that may provide information about a product sold by an online merchant).
  • the media player 112 may be a standalone application that allows users to view digital media items (e.g., digital videos, digital images, electronic books, etc.).
  • the media player 112 may be provided to the client devices 102 by a server (not shown) and/or the content sharing platform 110 .
  • the media player 112 may be an embedded media player that is embedded in a web page provided by the content sharing platform 110 .
  • the media player 112 may be an application that is downloaded from the server.
  • the content sharing platform 110 may be one or more computing devices (such as a rackmount server, a router computer, a server computer, a personal computer, a mainframe computer, a laptop computer, a tablet computer, a desktop computer, etc.), data stores (e.g., hard disks, memories, databases), networks, software components, and/or hardware components that may be used to provide a user with access to media items and/or provide the media items to the user.
  • The content sharing platform 110 may allow a user to consume, upload, search for, approve of (“like”), dislike, and/or comment on media items.
  • the content sharing platform 110 may also include a website (e.g., a webpage) that may be used to provide a user with access to the media items.
  • Content sharing platform 110 may include any type of content delivery network providing access to content and/or media items and can include a social network, a news outlet, a media aggregator, and the like.
  • the content sharing platform 110 can use the media item data store 106 to provide a media item to the client device 102 .
  • the content sharing platform 110 causes the client device 102 to receive media items from one or more data stores, such as from media item data store 106 .
  • the content sharing platform 110 includes the media item data store 106 .
  • the media item data store 106 is not part of the content sharing platform 110 .
  • the content sharing platform 110 can be communicably coupled to the media item data store 106 . When handling a user request for a media item, the content sharing platform 110 can interact with the media item data store 106 to provide the requested media item to the client device 102 .
  • the content sharing platform 110 can present or provide a list of available media items to a client device 102 .
  • Examples of a media item can include, and are not limited to, digital video, digital movies, digital photos, photo albums, digital music, website content, social media updates, video-on-demand, live-streamed media, electronic books (ebooks), electronic magazines, digital newspapers, digital audio books, electronic journals, web log (blog) entries, real simple syndication (RSS) feeds, electronic comic books, software applications, advertisements, etc.
  • A media item is also referred to as a content item.
  • a media item may be consumed via the Internet and/or via a client application, such as the media player 112 of client device 102 .
  • An online video (also referred to herein as a video) is used as an example of a media item throughout this document.
  • “media,” “media item,” “online media item,” “digital media,” “digital media item,” “content,” and “content item” can include one or more electronic files that can be executed or loaded using software, firmware or hardware configured to present the digital media item to an entity.
  • the client device 102 includes a media item playback manager 114 .
  • the media item playback manager 114 controls playback of a media item that includes two separate portions or streams. One of the portions can be an audio portion and the other can be a video portion.
  • the client device 102 receives the video portion from the video source 118 and the audio portion from the audio source 120 .
  • the video and audio portions can be delivered in any format and using any technology or protocol, such as HTTP Live Streaming (HLS), Dynamic Adaptive Streaming over HTTP (DASH), Adobe Dynamic Streaming for Flash®, Microsoft® Smooth Streaming, Octoshape Multi-BitRate, etc.
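  • As a minimal sketch of non-multiplexed delivery, the snippet below opens separate connections for the video portion and the audio portion, which may live on different hosts. The MediaItemSources type and its fields are assumptions for illustration; the disclosure does not prescribe a particular API.

```kotlin
import java.io.InputStream
import java.net.URL

// Hypothetical pointers to the two portions of one media item; they may be served by
// different entities (e.g., a video source and an audio source on different data stores).
data class MediaItemSources(val videoUrl: String, val audioUrl: String)

fun openStreams(sources: MediaItemSources): Pair<InputStream, InputStream> {
    // Each portion gets its own connection, so either one can be stopped independently.
    val videoStream = URL(sources.videoUrl).openStream()
    val audioStream = URL(sources.audioUrl).openStream()
    return videoStream to audioStream
}
```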
  • the media item playback manager 114 coordinates synchronized playback of the video and audio portions of the media item in the media player 112 and can reserve resources, as described herein.
  • the media item playback manager 114 can receive an indication to alter playback of the media item.
  • the indication can be, for example, a message (such as a preference or command) from an operating system 116 .
  • the media item playback manager 114 can receive the indication from any component of the client device 102 or from a user.
  • the indication can be at least one of: an indication that the media player 112 is no longer a primary application, an indication that the media player 112 has entered a background state on the client device 102 , an indication that a display of the client device 102 is powered off, an indication that a second application has entered a foreground state on the client device 102 , that video associated with the media player 112 has a lower priority, or a request received from a user via an interface or button to background the media player 112 .
  • the indication may be provided by the operating system 116 to the media item playback manager 114 to stop playback of the video portion of the media player. In response to the indication, the media item playback manager 114 can stop the playback of the video portion of the media item within the media player 112 while continuing to provide the audio portion of the media item.
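  • On Android, for example, such backgrounding and foregrounding indications surface as activity lifecycle callbacks. The sketch below forwards them to a playback manager; PlaybackManager and its methods are hypothetical, and only Activity.onStop()/onStart() are real framework callbacks.

```kotlin
import android.app.Activity

// Hypothetical manager; stands in for the media item playback manager described in the text.
class PlaybackManager {
    fun stopVideoKeepAudio() { /* stop the video portion, keep providing the audio portion */ }
    fun resumeVideoInSync() { /* resume the video portion in sync with the playing audio */ }
}

class PlayerActivity : Activity() {
    private val playbackManager = PlaybackManager()

    override fun onStop() {            // roughly: the application entered a background state
        playbackManager.stopVideoKeepAudio()
        super.onStop()
    }

    override fun onStart() {           // roughly: the application entered a foreground state
        super.onStart()
        playbackManager.resumeVideoInSync()
    }
}
```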
  • the operating system 116 can track when an application presenting a media item (e.g., the media player 112 ) enters the foreground state and when it enters the background state.
  • the media item playback manager 114 can receive a signal, indication or message from the operating system 116 when the application (media player 112 ) has entered the background state.
  • the media item playback manager 114 can release any resources associated with downloading and playing a video.
  • the media item playback manager 114 can start a process to clear a video buffer and release video memory, a video decoder, close a network socket, and the like.
  • the media item playback manager 114 can hold some resources for some period of time. Holding resources can be beneficial for performance reasons, such as when an application enters a background state and then quickly enters a foreground state.
  • the media item playback manager 114 can release a video decoder, but continue to download data such that when the application enters the foreground state, the media player can begin decoding the video portion for a quick resume.
  • the media item playback manager 114 can hold some resources and then release them after a period of time.
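  • A sketch of that hold-then-release behavior is shown below: the video decoder is released immediately, while the download is kept alive for a short grace period in case the application returns to the foreground quickly. The class name, the grace period, and the callback parameters are assumptions for illustration.

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.ScheduledFuture
import java.util.concurrent.TimeUnit

class DeferredVideoRelease(private val graceMillis: Long = 5_000) {
    private val scheduler = Executors.newSingleThreadScheduledExecutor()
    private var pendingRelease: ScheduledFuture<*>? = null

    fun onBackgrounded(releaseDecoder: () -> Unit, stopDownload: () -> Unit) {
        releaseDecoder()   // free the decoder right away
        // Keep downloading for a while so a quick return to the foreground resumes fast.
        pendingRelease = scheduler.schedule(Runnable { stopDownload() }, graceMillis, TimeUnit.MILLISECONDS)
    }

    fun onForegrounded() {
        pendingRelease?.cancel(false)   // returned quickly: keep the held resources
        pendingRelease = null
    }
}
```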
  • the media item playback manager 114 can receive a second indication to alter playback of the media item while the media player 112 is providing the audio portion of the media item.
  • the second indication can be to initiate or resume playback of the video portion during playback of the audio portion (e.g., when the media player 112 enters a foreground state).
  • the second indication can be, for example, a message (such as a preference or command) from an operating system 116 that the media player 112 is now a primary application, an indication that the media player 112 has entered a foreground state on the client device 102 , an indication that a display of the client device 102 is powered on, or that video associated with the media player 112 has a high display priority.
  • the media item playback manager 114 can instruct the media player 112 to resume the playback of the video portion of the media item in synchronization with the audio portion that is being provided.
  • the content sharing platform 110 provides an index of the audio portion and the video portion that informs the client device 102 how to synchronize the audio and video portions during playback.
  • the audio and video portions can be time-indexed.
  • the media item playback manager 114 can identify where in time the playing audio falls in relation to the index. The media item playback manager 114 can then instruct the media player 112 to resume the video portion at a corresponding place in time.
  • the media item playback manager 114 can reacquire any released resources (e.g., video decoder, buffer (reallocate memory), open network sockets to download the video portion, etc.). While this is happening, the media player 112 continues to play the audio. Once the video portion is available and ready for playback, the media item playback manager 114 can again identify where in time the playing audio falls in relation to the index. By knowing the position of the audio, the media item playback manager 114 can instruct the media player 112 to resume playback of the video portion in synchronization with the playing audio portion.
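  • The snippet below sketches that synchronization step: once resources are reacquired, the current audio position is read again and mapped through the shared index to the matching point in the video portion. VideoTimeIndex and SyncResumer are illustrative names, not structures defined in the disclosure.

```kotlin
// Maps a playback position to the video segment (or offset) that covers it.
fun interface VideoTimeIndex {
    fun segmentForPosition(positionMs: Long): Int
}

class SyncResumer(private val index: VideoTimeIndex) {
    fun resume(currentAudioPositionMs: () -> Long, startVideoSegment: (segment: Int, positionMs: Long) -> Unit) {
        // Re-read the audio position after resources are reacquired, since audio kept playing.
        val positionMs = currentAudioPositionMs()
        val segment = index.segmentForPosition(positionMs)
        startVideoSegment(segment, positionMs)   // begin the video at the corresponding place in time
    }
}
```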
  • functions described in one implementation as being performed by the client device 102 can also be performed on the content sharing platform 110 in other implementations if appropriate.
  • functions described in one implementation as being performed by the content sharing platform 110 can also be performed on the client devices 102 in other implementations if appropriate.
  • the functionality attributed to a particular component can be performed by different or multiple components operating together.
  • the media player 112 and the media item playback manager 114 can operate within a single application.
  • the content sharing platform 110 can also be accessed as a service provided to other systems or devices through appropriate application programming interfaces (APIs), and thus is not limited to use in websites.
  • the media player 112 and the media item playback manager 114 are part of the same application. In an implementation, the media item playback manager 114 is part of the operating system 116 .
  • the media item playback manager 114 can download a lower quality video when in a background state for fast resume and minimal network usage.
  • the media item playback manager 114 can request higher quality video which the media player 112 can play when received.
  • the media item playback manager 114 can decrease the quality of the audio portion to give more bandwidth to resume the video.
  • the media item playback manager 114 can identify current network conditions. When the media item playback manager 114 determines that it cannot reliably receive the video portion, such as due to the current network conditions, the media item playback manager 114 can trigger an audio-only mode until the video portion can be reliably received. The media item playback manager 114 can prompt a user via a GUI of the low video reliability. The user can elect to proceed with audio only and the media item playback manager 114 can receive such input via the GUI and can proceed accordingly.
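  • The decision logic in the preceding paragraphs might look something like the sketch below: resume at the lowest viable video bitrate for a fast rejoin, or stay in an audio-only mode when current network conditions cannot support the video portion. The NetworkEstimate type and the use of a bitrate list are assumptions; the disclosure does not fix particular thresholds.

```kotlin
data class NetworkEstimate(val bandwidthKbps: Int)

sealed class ResumeDecision {
    object AudioOnly : ResumeDecision()                          // prompt the user; keep providing audio
    data class VideoAtBitrate(val kbps: Int) : ResumeDecision()
}

fun chooseResumeMode(net: NetworkEstimate, availableVideoBitratesKbps: List<Int>): ResumeDecision {
    // The lowest bitrate the current network can carry gives the fastest resume; quality can
    // be stepped up afterwards as conditions allow.
    val lowestAffordable = availableVideoBitratesKbps.filter { it <= net.bandwidthKbps }.minOrNull()
    return if (lowestAffordable == null) ResumeDecision.AudioOnly
    else ResumeDecision.VideoAtBitrate(lowestAffordable)
}
```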
  • the audio portion and the video portion are delivered via a single connection, stream or file.
  • The media item playback manager 114 or the operating system 116 can inform the content sharing platform 110 when the media player 112 enters the background state, or can request that the platform send the audio portion and not the video portion.
  • In response, the content sharing platform 110 can stop the delivery of the video portion via the single connection.
  • the file can include identifiers of the video and audio portions that the media item playback manager 114 can use to separate the two portions at the client.
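  • A minimal sketch of that client-side separation is shown below, assuming each packet in the single file carries a track identifier. The Packet type and the track-id constants are illustrative; real containers use their own identifiers.

```kotlin
// Hypothetical packetized view of a single multiplexed file.
data class Packet(val trackId: Int, val payload: ByteArray)

const val VIDEO_TRACK = 1
const val AUDIO_TRACK = 2

fun separatePortions(packets: Sequence<Packet>, videoEnabled: Boolean): Pair<List<Packet>, List<Packet>> {
    val video = mutableListOf<Packet>()
    val audio = mutableListOf<Packet>()
    for (packet in packets) {
        when (packet.trackId) {
            AUDIO_TRACK -> audio += packet
            // While the application is backgrounded, video packets can simply be dropped.
            VIDEO_TRACK -> if (videoEnabled) video += packet
        }
    }
    return video to audio
}
```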
  • the media item can be associated with a live stream or live event (e.g., an online lecture or presentation, a video game stream).
  • the media item can be packaged in segments, such as by time (e.g., five second packets) or by size (e.g., one megabyte packets).
  • Audio packets can have a different size than the video packets.
  • the audio packets can be five seconds in length and the video packets can be ten seconds.
  • the audio and video portions of the media item can correspond to each other but can be packaged separately.
  • For example, a media item can be divided into 200 sequenced packets.
  • The audio and video portions can likewise each be divided into 200 sequenced packets, with each packet corresponding to the same moment in time (e.g., the 27th packet of the video portion and the 27th packet of the audio portion correspond to the same moment in time).
  • To stop the video portion, the media item playback manager 114 can instruct the media player 112 not to play the next sequenced video packet.
  • the media item playback manager 114 can identify which audio packet is currently playing and can instruct the media player 112 to start playing the next video packet at the same time it starts to play the next audio packet.
  • The media item playback manager 114 can also request and/or receive the next video packet from the content sharing platform 110.
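  • The packet arithmetic described above might look like the following sketch, which uses the example durations from the text (five-second audio packets, ten-second video packets) and computes where the video portion should rejoin. The function and type names are illustrative.

```kotlin
const val AUDIO_PACKET_MS = 5_000L   // example duration from the text
const val VIDEO_PACKET_MS = 10_000L  // example duration from the text

data class ResumePoint(val videoPacketIndex: Int, val startAtMs: Long)

fun resumeAtNextAudioPacket(currentAudioPacketIndex: Int): ResumePoint {
    // Media time at which the next audio packet begins.
    val nextAudioStartMs = (currentAudioPacketIndex + 1) * AUDIO_PACKET_MS
    // Video packet covering that moment; start it at the same time as the next audio packet.
    val videoPacketIndex = (nextAudioStartMs / VIDEO_PACKET_MS).toInt()
    return ResumePoint(videoPacketIndex, startAtMs = nextAudioStartMs)
}
```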
  • a software developer or an end user can configure the media item playback manager 114 .
  • a GUI can be provided to allow an end user to view some or all of the functionality of the media item playback manager 114 and modify it as needed.
  • the content sharing platform 110 can provide an API to allow a software developer to configure the media item playback manager 114 .
  • information that corresponds to the media item currently playing back in the media player 112 in the background may be shown in a media item information portion of a user interface of the client device 102 .
  • information about a video such as its title, view count, likes, etc. can be presented. Additional menu options related to the media item currently playing in the background may also be presented, such as play, pause, stop, fast forward, rewind, social post, add comment, etc.
  • the information that corresponds to the media item may be presented in a semi-transparent format (e.g., as a semi-transparent overlay or layer) while the user is performing other activities on the client device 102 .
  • FIG. 2 is a flow diagram illustrating a method 200 for providing efficient media application backgrounding on a client device, according to some implementations of the disclosure.
  • the method 200 may be performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (e.g., instructions run on a processing device to perform hardware simulation), or a combination thereof.
  • method 200 may be performed by an application, such as the media player 112 or the media item playback manager 114 , or a combination thereof, as further described in conjunction with FIG. 1 .
  • method 200 begins at block 202 when processing logic receives a request to playback a media item having a video portion and an audio portion.
  • the request can be received as user input, such as via a touch screen of a client device.
  • the processing logic reserves one or more resources associated with the playback of the media item in response to the request.
  • the one or more resources can be, for example, a memory, a buffer, a video decoder, network socket, or a video player.
  • the processing logic receives the media item from one or more sources.
  • the processing logic presents the media item, which can include providing a playback of the media item via a display of the client device.
  • the processing logic receives a first message during the playback of the media item.
  • the processing logic stops the playback of the video portion of the media item while continuing to provide the audio portion of the media item.
  • The processing logic releases at least one of the one or more resources associated with the playback of the media item. For example, releasing the at least one of the one or more resources can include at least one of: clearing video data from a buffer, stopping a download of the video portion, or closing a network connection used for requesting the video portion.
  • the processing logic receives a second message while providing the audio portion of the media item.
  • the second message is an indication that an application executed by the processing logic has entered a foreground state on the client device.
  • the processing logic reacquires the released resource(s).
  • the processing logic resumes the playback of the video portion of the media item in synchronization with the audio portion being provided. In implementations, the processing logic resumes presentation of the video portion of the media item without interrupting the presentation of the audio portion.
  • the processing logic presents the video portion initially at a first quality level or bitrate and when resuming the presentation of the video portion of the media item, the processing logic presents the video portion at a second quality level or bitrate.
  • the second quality can be higher than the first quality due to improved network resources.
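  • Taken together, the flow of method 200 can be summarized by the sketch below, in which each call stands for one of the steps described above. The Player interface and its method names are hypothetical; they are not block names from the figure.

```kotlin
// Hypothetical helpers standing in for the processing logic of method 200.
interface Player {
    fun reserveResources()
    fun receiveAndPresentMediaItem()
    fun awaitFirstMessage()
    fun stopVideoKeepAudio()
    fun releaseVideoResources()
    fun awaitSecondMessage()
    fun reacquireVideoResources()
    fun resumeVideoInSyncWithAudio()
}

fun runMethod200(player: Player) {
    player.reserveResources()              // memory, buffer, video decoder, network socket
    player.receiveAndPresentMediaItem()    // play both the video portion and the audio portion
    player.awaitFirstMessage()             // e.g., the application entered a background state
    player.stopVideoKeepAudio()
    player.releaseVideoResources()         // clear buffer, stop download, close connection
    player.awaitSecondMessage()            // e.g., the application entered a foreground state
    player.reacquireVideoResources()
    player.resumeVideoInSyncWithAudio()    // resume without interrupting the audio portion
}
```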
  • FIG. 3 illustrates two example timelines 302 , 304 for providing efficient media application backgrounding on a client device, according to some implementations of the disclosure.
  • Intervals and events are used to illustrate activities. Length or dimensions of the intervals are not indicative of a particular time or duration, nor are they indicative of any time or duration relative to each other. Any interval can be any amount or duration of time.
  • an application on a client device plays both an audio portion and a video portion of a media item during interval 306 .
  • the application receives a first message and, in response to the message, stops playback of the video portion while continuing to playback the audio portion, as described herein.
  • the application releases video resources and by the end of interval 312 , the video resources are released.
  • the application receives a second message. In response to the second message received at event 314 , the application acquires video resources during interval 316 . Once the video resources are acquired, the application can resume playback of the video portion of the media item while continuing to playback the audio portion during interval 306 .
  • the second example further includes a joining window 320 which is an allowable duration for the application to acquire video resources and to begin playing the video portion in synchronization with the audio portion.
  • the duration of the joining window 320 can be a predetermined amount of time (e.g., five seconds).
  • the duration of the joining window 320 can be dynamic and can depend on any variable pertaining to the media item.
  • a duration of the joining window 320 can be a remaining duration of an audio data packet that is currently being played.
  • the client device can render a prompt to indicate to a user that the video is loading, buffering, etc.
  • the prompt can include a thumbnail, a spinner, a message, and the like.
  • the prompt can be displayed until the video portion resumes.
  • The client device can resume the video portion at a lower quality or bitrate for a fast resume and can dynamically adjust the video quality or bitrate. If the video portion is not played or resumed within the joining window 320, then the application can pause the audio (e.g., for the duration of interval 322) until the video portion is ready for playback.
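  • The joining-window behavior might be sketched as below: the application waits until the video portion is ready, pausing the audio only if the window elapses first. The window length, the polling interval, and the callback parameters are assumptions for illustration.

```kotlin
class JoiningWindow(private val windowMillis: Long = 5_000) {
    fun rejoin(
        videoReady: () -> Boolean,       // returns true once the video portion can be rendered
        pauseAudio: () -> Unit,
        resumeBothInSync: () -> Unit,
    ) {
        val deadline = System.currentTimeMillis() + windowMillis
        var audioPaused = false
        while (!videoReady()) {
            if (!audioPaused && System.currentTimeMillis() >= deadline) {
                pauseAudio()             // missed the window: hold the audio until video is ready
                audioPaused = true
            }
            Thread.sleep(50)             // audio keeps playing (or stays paused) while video loads
        }
        resumeBothInSync()               // video is ready: play both portions in synchronization
    }
}
```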
  • FIG. 4 illustrates a diagrammatic representation of a machine in the example form of a computer system 400 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.
  • the machine may be connected (e.g., networked) to other machines in a LAN, an intranet, an extranet, or the Internet.
  • the machine may operate in the capacity of a server or a client machine in client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a server, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the example computer system 400 includes a processing device (processor) 402 , a main memory 404 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 406 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 416 , which communicate with each other via a bus 408 .
  • Processor 402 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processor 402 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets.
  • the processor 402 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like.
  • the processor 402 is configured to execute instructions 426 for performing the operations and steps discussed herein.
  • the computer system 400 may further include a network interface device 422 .
  • the computer system 400 also may include a video display unit 410 (e.g., a liquid crystal display (LCD), a cathode ray tube (CRT), or a touch screen), an alphanumeric input device 412 (e.g., a keyboard), a cursor control device 414 (e.g., a mouse), and a signal generation device 420 (e.g., a speaker).
  • the data storage device 416 may include a computer-readable storage medium 424 on which is stored one or more sets of instructions 426 (e.g., software) embodying any one or more of the methodologies or functions described herein.
  • the instructions 426 may also reside, completely or at least partially, within the main memory 404 and/or within the processor 402 during execution thereof by the computer system 400 , the main memory 404 and the processor 402 also constituting computer-readable storage media.
  • the instructions 426 may further be transmitted or received over a network 418 via the network interface device 422 .
  • the instructions 426 include instructions for a media player or a media item playback manager, which may correspond, respectively, to the media player 112 or the media item playback manager 114 described with respect to FIG. 1 , and/or a software library containing methods that provide a media player or a media item playback manager.
  • While the computer-readable storage medium 424 is shown in an example implementation to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • The term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
  • the disclosure also relates to an apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may include a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions.
  • The words “example” or “exemplary” are used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the words “example” or “exemplary” is intended to present concepts in a concrete fashion.
  • the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to mean any of the natural inclusive permutations.

Abstract

A media application is disclosed. The media application provides a playback of a media item that includes a video portion and an audio portion. The media application stops the playback of the video portion of the media item while continuing to provide the audio portion of the media item. The media application resumes the playback of the video portion of the media item in synchronization with the audio portion being provided.

Description

    RELATED APPLICATION
  • This application is related to and claims the benefit of U.S. Provisional Patent Application No. 61/933,296, filed Jan. 29, 2014, the entirety of which is incorporated herein by reference.
  • TECHNICAL FIELD
  • Implementations of the present disclosure relate to content delivery, and more specifically, to media playback on a device.
  • BACKGROUND
  • The Internet allows people to obtain information, connect with others and share information with each other. Common Internet destinations include news websites, content sharing platforms, social networking websites, and the like. Many websites and platforms include a content sharing aspect that allows users to view, upload, and share media items, such as video content, image content, audio content, and so on. Users can consume the media items from their user devices.
  • SUMMARY
  • The following is a simplified summary of the disclosure in order to provide a basic understanding of some aspects of the disclosure. This summary is not an extensive overview of the disclosure. It is intended to neither identify key or critical elements of the disclosure, nor delineate any scope of the particular implementations of the disclosure or any scope of the claims. Its sole purpose is to present some concepts of the disclosure in a simplified form as a prelude to the more detailed description that is presented later.
  • In one aspect, a method includes providing, by an application executed by a processing device, a playback of a media item that includes a video portion and an audio portion. The method further includes receiving, by the application, a first message during the playback of the media item. The method further includes in response to the first message, stopping the playback of the video portion of the media item while continuing to provide the audio portion of the media item. The method further includes receiving, by the application, a second message while providing the audio portion of the media item. The method further includes in response to the second message, resuming the playback of the video portion of the media item in synchronization with the audio portion being provided.
  • Implementations can include any, all, or none of the following features. The method can include providing the playback of the media item, receiving a request to playback the media item, and reserving one or more resources associated with the playback of the media item in response to the request. The method can further include stopping the playback of the video portion of the media item and releasing at least one of the one or more resources. The one or more resources can include at least one of: a memory, a video decoder, or a video player. Releasing at least one of the one or more resources can include at least one of: clearing video data from a buffer, stopping a download of the video portion, or closing a network connection used for requesting the video portion. Resuming the presentation of the video portion of the media item can include reacquiring the released one or more resources. The first message can be at least one of: an indication that the application has entered a background state on a mobile device, an indication that a display of the mobile device is powered off, or an indication that a second application has entered a foreground state on the mobile device. The second message can be an indication that the application has entered a foreground state on the mobile device. The presentation of the video portion of the media item can be resumed without interrupting the presentation of the audio portion. The video portion can be initially presented at a first quality, and resuming the presentation of the video portion of the media item can include presenting the video portion at a second quality. The method can include presenting, via a graphical user interface, a message that the presentation of the video portion is to resume. The method can include receiving the video portion and the audio portion from different sources. The method can include receiving the video portion and the audio portion of the media item as a single file. The method can further include separating the video portion and the audio portion into separate files.
  • In additional implementations, computing devices for performing the operations of the above described implementations are also disclosed. Additionally, in implementations of the disclosure, a computer-readable storage medium stores instructions for performing the operations of the above described implementations.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present disclosure is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings.
  • FIG. 1 illustrates an example system architecture, in accordance with one implementation of the disclosure.
  • FIG. 2 is a flow diagram illustrating a method for providing efficient media application backgrounding on a client device, according to some implementations of the disclosure.
  • FIG. 3 illustrates two example timelines for providing efficient media application backgrounding on a client device, according to some implementations of the disclosure.
  • FIG. 4 is a block diagram illustrating an exemplary computer system, according to some implementations.
  • DETAILED DESCRIPTION
  • Described herein is a mechanism for improving media consumption on a client device. Conventional client devices can stream media items from content sharing platforms. A media application on the client device can play the streamed media items on the client device using a media player. A user may desire to operate another feature or other application on the client device such that the audio portion of the video continues to play while the video portion is no longer displayed on the client device. For example, the user may wish to use other applications or turn the client device screen off while the audio portion continues to play. In turn, the user can minimize, close or otherwise cause the media application to enter a background state. A background state can refer to a state in which an application continues to run while no longer being visible; an application in the background state can be referred to as a “backgrounded” application.
  • Conventional client devices typically do not permit dynamically and independently adding and/or removing video and audio streams during playback of a media item. Although a conventional client device can background a media application that is playing a video, the media application usually continues to run and process audio and video data (e.g., download, decode, cache) in spite of the video not being visible on-screen. When backgrounding a media application, conventional client devices typically also hold on to system resources, such as memory (for buffering data), network sockets and video decoders. In addition, some conventional devices can receive media items in non-multiplexed formats, where synchronized video and audio portions of a video are delivered separately. During playback, the client device generally renders the synchronized video and audio streams simultaneously. Conventional systems usually do not operate differently when the media application is backgrounded, which may cause an unnecessary consumption of resources. Mobile devices typically have a limited number of resources, which means that conventional approaches can prevent another application from using the resources.
  • Implementations of the present disclosure address the above deficiencies of conventional systems by providing a mechanism for controlling independent playback of a video portion of a media item and an audio portion of the media item. An application on a client device provides playback of a media item that includes a video portion and an audio portion. At any time, the application can stop playback of the video portion of the media item while continuing to provide the audio portion. Later, the application can resume playback of the video portion in synchronization with the audio portion that is being provided.
  • Techniques described herein can reduce the cost of operating an application in background mode. While the application is not presenting the video portion, the application can stop downloading the video portion, pause decoding any downloaded video data and clear any associated memory. By releasing these and other resources, network and battery consumption of the client device can be reduced as a result. Users often do not desire to download video they do not intend to watch, as it uses data and bandwidth. Further, the audio portion can be seamlessly provided throughout the application transitioning from foreground to background and back to foreground.
  • For brevity and simplicity, implementations herein are described with respect to a media item that includes an audio portion and a video portion. The media item can include any number of portions of any type. For example, portions of the media item can include audio, video, subtitle data, a stream of overlaid data, annotations, advertisements, comments, metadata, information about the content of the media item (e.g., actors, related movies, music tracks, facial recognition, and the like). Any of the portions can be handled using the techniques described herein.
  • FIG. 1 illustrates an example system architecture 100, in accordance with one implementation of the disclosure, for independently providing a playback of a media item that includes a video portion and an audio portion. The system architecture 100 includes any number of client devices 102, a network 104, a data store 106, and a content sharing platform 110. In one implementation, network 104 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN) or wide area network (WAN)), a wired network (e.g., Ethernet network), a wireless network (e.g., an 802.11 network or a Wi-Fi network), a cellular network (e.g., a Long Term Evolution (LTE) network), routers, hubs, switches, server computers, and/or a combination thereof.
  • In one implementation, the data store 106 may be a memory (e.g., random access memory), a cache, a drive (e.g., a hard drive), a flash drive, a database system, a distributed database, a distributed storage, or another type of component or device capable of storing data. The data store 106 may also include multiple storage components (e.g., multiple drives or multiple databases) that may also span multiple computing devices (e.g., multiple server computers). The data store 106 can store media items and portions of media items, such as audio portions and video portions. The data store 106 can include a video source 118 of a video portion of a media item. Similarly, the data store 106 can include an audio source 120 of an audio portion of a media item. In implementations, the video source 118 and the audio source 120 are stored on the same data store 106. In another implementation, the video source 118 and the audio source 120 are stored on different data stores. In implementations, the different data stores can be owned and/or operated by one entity. Alternatively, the different data stores can be owned and/or operated by multiple separate entities. For example, the video source 118 is owned and operated by a first entity and the audio source 120 is owned and operated by a second entity. The client device 102 can receive the separate video and audio streams from these two different entities.
  • The client devices 102 may each include computing devices such as personal computers (PCs), laptops, mobile phones, smart phones, tablet computers, netbook computers etc. In some implementations, client device 102 may also be referred to as a “user device.” Each client device includes a media player 112. In one implementation, the media player 112 may be an application or part of an application that allows users to view content, such as images, videos, web pages, documents, etc. For example, the media player 112 may be a web browser that can access, retrieve, present, and/or navigate content (e.g., web pages such as Hyper Text Markup Language (HTML) pages, digital media items, etc.) served by a web server. The media player 112 may render, display, and/or present the content (e.g., a web page, a media viewer) to a user, such as via a graphical user interface (GUI). The media player 112 may also display an embedded media player (e.g., a Flash® player or an HTML5 player) that is embedded in a web page (e.g., a web page that may provide information about a product sold by an online merchant). In another example, the media player 112 may be a standalone application that allows users to view digital media items (e.g., digital videos, digital images, electronic books, etc.).
  • The media player 112 may be provided to the client devices 102 by a server (not shown) and/or the content sharing platform 110. For example, the media player 112 may be an embedded media player that is embedded in a web page provided by the content sharing platform 110. In another example, the media player 112 may be an application that is downloaded from the server.
  • In one implementation, the content sharing platform 110 may be one or more computing devices (such as a rackmount server, a router computer, a server computer, a personal computer, a mainframe computer, a laptop computer, a tablet computer, a desktop computer, etc.), data stores (e.g., hard disks, memories, databases), networks, software components, and/or hardware components that may be used to provide a user with access to media items and/or provide the media items to the user. For example, the content sharing platform 110 may allow a user to consume, upload, search for, approve of ("like"), dislike, and/or comment on media items. The content sharing platform 110 may also include a website (e.g., a webpage) that may be used to provide a user with access to the media items. Content sharing platform 110 may include any type of content delivery network providing access to content and/or media items and can include a social network, a news outlet, a media aggregator, and the like. The content sharing platform 110 can use the media item data store 106 to provide a media item to the client device 102. The content sharing platform 110 causes the client device 102 to receive media items from one or more data stores, such as from the media item data store 106. In implementations, the content sharing platform 110 includes the media item data store 106. In other implementations, the media item data store 106 is not part of the content sharing platform 110. The content sharing platform 110 can be communicably coupled to the media item data store 106. When handling a user request for a media item, the content sharing platform 110 can interact with the media item data store 106 to provide the requested media item to the client device 102.
  • The content sharing platform 110 can present or provide a list of available media items to a client device 102. Examples of a media item can include, but are not limited to, digital video, digital movies, digital photos, photo albums, digital music, website content, social media updates, video-on-demand, live-streamed media, electronic books (ebooks), electronic magazines, digital newspapers, digital audio books, electronic journals, web log (blog) entries, real simple syndication (RSS) feeds, electronic comic books, software applications, advertisements, etc. In some implementations, a media item is also referred to as a content item.
  • A media item may be consumed via the Internet and/or via a client application, such as the media player 112 of client device 102. For brevity and simplicity, an online video (also herein referred to as a video) is used as an example of a media item throughout this document. As used herein, "media," "media item," "online media item," "digital media," "digital media item," "content," and "content item" can include one or more electronic files that can be executed or loaded using software, firmware or hardware configured to present the digital media item to an entity.
  • In implementations, the client device 102 includes a media item playback manager 114. The media item playback manager 114 controls playback of a media item that includes two separate portions or streams. One of the portions can be an audio portion and the other can be a video portion. The client device 102 receives the video portion from the video source 118 and the audio portion from the audio source 120. The video and audio portions can be delivered in any format and using any technology or protocol, such as HTTP Live Streaming (HLS), Dynamic Adaptive Streaming over HTTP (DASH), Adobe Dynamic Streaming for Flash®, Microsoft® Smooth Streaming, Octoshape Multi-BitRate, etc. The media item playback manager 114 coordinates synchronized playback of the video and audio portions of the media item in the media player 112 and can reserve resources, as described herein.
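  • As one concrete illustration only (the Pipeline and MediaItemPlaybackManager names below are assumptions made for this sketch, not an API from the disclosure), the coordination of two separately delivered portions can be modeled as a manager that owns two independently controllable pipelines sharing a single playback clock:

```kotlin
// Hypothetical sketch of a manager that drives independent audio and video
// pipelines for one media item; all names here are illustrative.
interface Pipeline {
    fun start(fromPositionMs: Long)   // begin downloading, decoding, and rendering
    fun stop()                        // stop and release this pipeline's resources
    val isRunning: Boolean
}

class MediaItemPlaybackManager(
    private val audio: Pipeline,                 // audio portion, delivered as its own stream
    private val video: Pipeline,                 // video portion, delivered separately
    private val playbackPositionMs: () -> Long   // shared playback clock for both portions
) {
    fun play() {
        audio.start(fromPositionMs = 0)
        video.start(fromPositionMs = 0)
    }

    // First indication (e.g., the application is backgrounded): stop only the video.
    fun onFirstIndication() {
        if (video.isRunning) video.stop()
    }

    // Second indication (e.g., the application returns to the foreground): rejoin
    // the video at the position the audio has reached in the meantime.
    fun onSecondIndication() {
        if (!video.isRunning) video.start(fromPositionMs = playbackPositionMs())
    }
}
```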
  • The media item playback manager 114 can receive an indication to alter playback of the media item. The indication can be, for example, a message (such as a preference or command) from an operating system 116. Alternatively, the media item playback manager 114 can receive the indication from any component of the client device 102 or from a user. The indication can be at least one of: an indication that the media player 112 is no longer a primary application, an indication that the media player 112 has entered a background state on the client device 102, an indication that a display of the client device 102 is powered off, an indication that a second application has entered a foreground state on the client device 102, an indication that video associated with the media player 112 has a lower priority, or a request received from a user via an interface or button to background the media player 112. The indication may be provided by the operating system 116 to the media item playback manager 114 to stop playback of the video portion of the media item. In response to the indication, the media item playback manager 114 can stop the playback of the video portion of the media item within the media player 112 while continuing to provide the audio portion of the media item.
  • In implementations, the operating system 116 can track when an application presenting a media item (e.g., the media player 112) enters the foreground state and when it enters the background state. The media item playback manager 114 can receive a signal, indication or message from the operating system 116 when the application (media player 112) has entered the background state.
  • Upon receiving the signal, the media item playback manager 114 can release any resources associated with downloading and playing a video. The media item playback manager 114 can start a process to clear a video buffer, release video memory and a video decoder, close a network socket, and the like. In some implementations, the media item playback manager 114 can hold some resources for some period of time. Holding resources can be beneficial for performance reasons, such as when an application enters a background state and then quickly enters a foreground state. In another example, the media item playback manager 114 can release a video decoder, but continue to download data such that when the application enters the foreground state, the media player can begin decoding the video portion for a quick resume. In some implementations, the media item playback manager 114 can hold some resources and then release them after a period of time.
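  • A minimal sketch of such a staged release is shown below, assuming hypothetical resource holders and an illustrative five-second grace period; a quick return to the foreground cancels the pending release so playback can resume cheaply:

```kotlin
import java.util.Timer
import java.util.TimerTask
import kotlin.concurrent.schedule

// Hypothetical holders for the video-side resources discussed above.
class VideoResources(
    var buffer: ByteArray? = ByteArray(4 * 1024 * 1024), // buffered video data
    var decoderOpen: Boolean = true,                     // a platform video decoder
    var socketOpen: Boolean = true                       // network socket for the video stream
)

// Release video resources after a grace period; returning to the foreground
// before the grace period expires cancels the release.
class DeferredVideoRelease(private val resources: VideoResources) {
    private val timer = Timer(true)
    private var pending: TimerTask? = null

    fun onBackgrounded(graceMs: Long = 5_000) {
        pending?.cancel()
        pending = timer.schedule(graceMs) {
            resources.buffer = null        // clear the video buffer, freeing memory
            resources.decoderOpen = false  // release the video decoder
            resources.socketOpen = false   // close the network socket used for video
        }
    }

    fun onForegrounded() {
        pending?.cancel()                  // resources still held: resume immediately
        pending = null
    }
}
```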
  • Subsequently, the media item playback manager 114 can receive a second indication to alter playback of the media item while the media player 112 is providing the audio portion of the media item. The second indication can be to initiate or resume playback of the video portion during playback of the audio portion (e.g., when the media player 112 enters a foreground state). The second indication can be, for example, a message (such as a preference or command) from an operating system 116 that the media player 112 is now a primary application, an indication that the media player 112 has entered a foreground state on the client device 102, an indication that a display of the client device 102 is powered on, or that video associated with the media player 112 has a high display priority. In response to the second indication, the media item playback manager 114 can instruct the media player 112 to resume the playback of the video portion of the media item in synchronization with the audio portion that is being provided.
  • In implementations, the content sharing platform 110 provides an index of the audio portion and the video portion that informs the client device 102 how to synchronize the audio and video portions during playback. For example, the audio and video portions can be time-indexed. When the media player 112 begins playing the media item from the beginning, the media player 112 plays back the audio and video portions from time=0, as indicated by the index. When resuming the video portion, the media item playback manager 114 can identify where in time the playing audio falls in relation to the index. The media item playback manager 114 can then instruct the media player 112 to resume the video portion at a corresponding place in time. Also, when resuming the video portion, the media item playback manager 114 can reacquire any released resources (e.g., video decoder, buffer (reallocate memory), open network sockets to download the video portion, etc.). While this is happening, the media player 112 continues to play the audio. Once the video portion is available and ready for playback, the media item playback manager 114 can again identify where in time the playing audio falls in relation to the index. By knowing the position of the audio, the media item playback manager 114 can instruct the media player 112 to resume playback of the video portion in synchronization with the playing audio portion.
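  • The rejoin step might look like the following sketch, under the assumption that both portions are indexed against a common timeline and that the hypothetical ResumableVideoPipeline stands in for reacquiring the decoder, buffer, and network connection; note that the audio position is re-read at the moment the video is actually ready, so the first rendered frame matches what the listener is hearing then:

```kotlin
// Hypothetical interfaces: audioPositionMs() reports the playing audio's position on
// the shared timeline; prepare() reacquires video resources and buffers data from a
// given position; render() starts presenting frames once the pipeline is ready.
interface ResumableVideoPipeline {
    fun prepare(fromMs: Long)
    fun isReady(): Boolean
    fun render(fromMs: Long)
}

fun resumeVideoInSync(audioPositionMs: () -> Long, video: ResumableVideoPipeline) {
    video.prepare(fromMs = audioPositionMs())   // start reacquiring near the current audio position
    while (!video.isReady()) {
        Thread.sleep(50)                        // audio continues playing meanwhile
    }
    video.render(fromMs = audioPositionMs())    // re-sample the audio position at join time
}
```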
  • In general, functions described in one implementation as being performed by the client device 102 can also be performed on the content sharing platform 110 in other implementations if appropriate. Similarly, functions described in one implementation as being performed by the content sharing platform 110 can also be performed on the client devices 102 in other implementations if appropriate. In addition, the functionality attributed to a particular component can be performed by different or multiple components operating together. For example, the media player 112 and the media item playback manager 114 can operate within a single application. The content sharing platform 110 can also be accessed as a service provided to other systems or devices through appropriate application programming interfaces (APIs), and thus is not limited to use in websites.
  • In some implementations, the media player 112 and the media item playback manager 114 are part of the same application. In an implementation, the media item playback manager 114 is part of the operating system 116.
  • In further implementations, the media item playback manager 114 can download a lower quality video when in a background state for fast resume and minimal network usage. The media item playback manager 114 can request higher quality video, which the media player 112 can play when received. In some implementations, when resuming the video portion download, the media item playback manager 114 can decrease the quality of the audio portion to give more bandwidth to resume the video.
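  • One way such a quality policy could be expressed is sketched below; the track model, the decision rules, and the idea of a resume phase are illustrative assumptions rather than details given in the disclosure:

```kotlin
// Hypothetical selection of a video representation depending on whether the
// application is backgrounded and whether it is in the middle of resuming.
data class VideoTrack(val bitrateBps: Int)

fun selectVideoTrack(
    tracks: List<VideoTrack>,   // available representations, e.g. from a streaming manifest
    isBackgrounded: Boolean,
    isResuming: Boolean
): VideoTrack? {
    val sorted = tracks.sortedBy { it.bitrateBps }
    return when {
        sorted.isEmpty() -> null
        // Backgrounded: keep only a cheap representation for a fast, low-cost resume.
        isBackgrounded -> sorted.first()
        // Just resumed: start low so video appears quickly, then adapt upward.
        isResuming -> sorted.first()
        // Foreground steady state: request the highest quality available.
        else -> sorted.last()
    }
}
```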
  • In implementations, the media item playback manager 114 can identify current network conditions. When the media item playback manager 114 determines that it cannot reliably receive the video portion, such as due to the current network conditions, the media item playback manager 114 can trigger an audio-only mode until the video portion can be reliably received. The media item playback manager 114 can notify the user via a GUI of the low video reliability. The user can elect to proceed with audio only, and the media item playback manager 114 can receive such input via the GUI and can proceed accordingly.
  • In some implementations, the audio portion and the video portion are delivered via a single connection, stream or file. The media item playback manager 114 or the operating system 116 can inform the content sharing platform 110 when the media player 112 enters the background state, or can instruct the content sharing platform 110 to send the audio portion and not the video portion. The content sharing platform 110 can stop the delivery of the video portion via the single connection. In some implementations, when the audio portion and the video portion are delivered as a single combined file, the file can include identifiers of the video and audio portions that the media item playback manager 114 can use to separate the two portions at the client.
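  • If both portions do arrive over one connection or in one combined file, the client-side split might look like the following sketch, where the per-chunk track identifier is an assumption about the container rather than a detail given here:

```kotlin
// Hypothetical chunk of a combined stream, tagged with the identifier of the
// portion (track) it belongs to, e.g. "audio" or "video".
data class Chunk(val trackId: String, val payload: ByteArray)

// Separate a combined stream into per-portion payload lists using the identifiers.
fun splitPortions(chunks: Sequence<Chunk>): Map<String, List<ByteArray>> =
    chunks.groupBy({ it.trackId }, { it.payload })
```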
  • In implementations, the media item can be associated with a live stream or live event (e.g., an online lecture or presentation, a video game stream). For live streams and events, the media item can be packaged in segments, such as by time (e.g., five second packets) or by size (e.g., one megabyte packets). Audio packets can have a different size than the video packets. For example, the audio packets can be five seconds in length and the video packets can be ten seconds. The audio and video portions of the media item can correspond to each other but can be packaged separately. For example, a media item can be divided into 200 sequenced packets. The audio and video portions can likewise be divided into 200 sequenced packets, with each packet corresponding to the same moment in time (e.g., the 27th packet of the video and audio portions correspond to the same moment in time). When stopping the video portion, the media item playback manager 114 can instruct the media player 112 to not play the next sequenced packet. When resuming the video portion, the media item playback manager 114 can identify which audio packet is currently playing and can instruct the media player 112 to start playing the next video packet at the same time it starts to play the next audio packet. In implementations, the media item playback manager 114 can also request and/or receive the next video packet from the content sharing platform 110.
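  • For illustration only (the segment durations and indices below are hypothetical), mapping the playing audio's position back to the shared timeline lets the client identify which video segment to request when resuming, even when the audio and video packets differ in length:

```kotlin
// Hypothetical sketch: audio and video are packaged separately and may use
// different segment durations (e.g., 5 s audio, 10 s video). Converting the
// playing audio position to a timeline position identifies the video segment
// that covers the same moment.
data class SegmentedTrack(val segmentDurationMs: Long)

fun timelinePositionMs(track: SegmentedTrack, segmentIndex: Int, offsetInSegmentMs: Long): Long =
    segmentIndex * track.segmentDurationMs + offsetInSegmentMs

fun segmentIndexAt(track: SegmentedTrack, positionMs: Long): Int =
    (positionMs / track.segmentDurationMs).toInt()

fun main() {
    val audio = SegmentedTrack(segmentDurationMs = 5_000)
    val video = SegmentedTrack(segmentDurationMs = 10_000)

    // Audio is 2 seconds into its 27th segment (zero-based index 26).
    val nowMs = timelinePositionMs(audio, segmentIndex = 26, offsetInSegmentMs = 2_000)

    // Request the video segment covering the same moment when resuming playback.
    println("Resume with video segment index ${segmentIndexAt(video, nowMs)}")
}
```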
  • In other implementations, a software developer or an end user can configure the media item playback manager 114. For example, a GUI can be provided to allow an end user to view some or all of the functionality of the media item playback manager 114 and modify it as needed. In another example, the content sharing platform 110 can provide an API to allow a software developer to configure the media item playback manager 114.
  • In some implementations, information, such as metadata, that corresponds to the media item currently playing back in the media player 112 in the background may be shown in a media item information portion of a user interface of the client device 102. For example, information about a video, such as its title, view count, likes, etc. can be presented. Additional menu options related to the media item currently playing in the background may also be presented, such as play, pause, stop, fast forward, rewind, social post, add comment, etc. In another implementation, the information that corresponds to the media item may be presented in a semi-transparent format (e.g., as a semi-transparent overlay or layer) while the user is performing other activities on the client device 102.
  • FIG. 2 is a flow diagram illustrating a method 200 for providing efficient media application backgrounding on a client device, according to some implementations of the disclosure. The method 200 may be performed by processing logic that comprises hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (e.g., instructions run on a processing device to perform hardware simulation), or a combination thereof.
  • For simplicity of explanation, the methods of this disclosure are depicted and described as a series of acts. However, acts in accordance with this disclosure can occur in various orders and/or concurrently, and with other acts not presented and described herein. Furthermore, not all illustrated acts may be required to implement the methods in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the methods could alternatively be represented as a series of interrelated states via a state diagram or events. Additionally, it should be appreciated that the methods disclosed in this specification are capable of being stored on an article of manufacture to facilitate transporting and transferring such methods to computing devices. The term “article of manufacture,” as used herein, is intended to encompass a computer program accessible from any computer-readable device or storage media. In one implementation, method 200 may be performed by an application, such as the media player 112 or the media item playback manager 114, or a combination thereof, as further described in conjunction with FIG. 1.
  • Referring to FIG. 2, method 200 begins at block 202 when processing logic receives a request to playback a media item having a video portion and an audio portion. The request can be received as user input, such as via a touch screen of a client device. At block 204, the processing logic reserves one or more resources associated with the playback of the media item in response to the request. The one or more resources can be, for example, a memory, a buffer, a video decoder, a network socket, or a video player. The processing logic receives the media item from one or more sources.
  • At block 206, the processing logic presents the media item, which can include providing a playback of the media item via a display of the client device. At block 208, the processing logic receives a first message during the playback of the media item.
  • At block 210, in response to the first message, the processing logic stops the playback of the video portion of the media item while continuing to provide the audio portion of the media item. At block 212, the processing logic releases at least one of the one or more resources associated with the playback of the media item. For example, when releasing the at least one of the one or more resources, the processing logic can perform at least one of: clearing video data from a buffer, stopping a download of the video portion, or closing a network connection used for requesting the video portion.
  • At block 214, the processing logic receives a second message while providing the audio portion of the media item. In implementations, the second message is an indication that an application executed by the processing logic has entered a foreground state on the client device. At block 216, the processing logic reacquires the released resource(s). At block 218 and in response to the second message, the processing logic resumes the playback of the video portion of the media item in synchronization with the audio portion being provided. In implementations, the processing logic resumes presentation of the video portion of the media item without interrupting the presentation of the audio portion.
  • In implementations, the processing logic presents the video portion initially at a first quality level or bitrate and when resuming the presentation of the video portion of the media item, the processing logic presents the video portion at a second quality level or bitrate. For example, the second quality can be higher than the first quality due to improved network resources.
  • FIG. 3 illustrates two example timelines 302, 304 for providing efficient media application backgrounding on a client device, according to some implementations of the disclosure. Intervals and events are used to illustrate activities. Length or dimensions of the intervals are not indicative of a particular time or duration, nor are they indicative of any time or duration relative to each other. Any interval can be any amount or duration of time.
  • In a first example, run 302, an application on a client device, such as the client device 102 illustrated in conjunction with FIG. 1, plays both an audio portion and a video portion of a media item during interval 306. At event 308, the application receives a first message and, in response to the message, stops playback of the video portion while continuing to play back the audio portion, as described herein. During interval 310, the application releases video resources and by the end of interval 312, the video resources are released. At event 314, the application receives a second message. In response to the second message received at event 314, the application acquires video resources during interval 316. Once the video resources are acquired, the application can resume playback of the video portion of the media item while continuing to play back the audio portion during interval 306.
  • In a second example, run 304, the application functions similarly to the first example, run 302. The second example further includes a joining window 320, which is an allowable duration for the application to acquire video resources and to begin playing the video portion in synchronization with the audio portion. The duration of the joining window 320 can be a predetermined amount of time (e.g., five seconds). Alternatively, the duration of the joining window 320 can be dynamic and can depend on any variable pertaining to the media item. For example, a duration of the joining window 320 can be a remaining duration of an audio data packet that is currently being played. During the joining window 320, the client device can render a prompt to indicate to a user that the video is loading, buffering, etc. For example, the prompt can include a thumbnail, a spinner, a message, and the like. The prompt can be displayed until the video portion resumes. In implementations, the client device can resume the video portion at a lower quality or bitrate for fast resume and can dynamically adjust the video quality or bitrate. If the video portion is not played or resumed within the joining window 320, then the application can pause the audio (e.g., for the duration of interval 322) until the video portion is ready for playback.
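  • A sketch of the joining-window decision just described follows, with the five-second figure taken from the example above and the callback names assumed for illustration:

```kotlin
// Hypothetical joining-window logic: wait up to joinWindowMs for the video
// pipeline to become ready; if it joins in time, start it in sync with the
// still-playing audio, otherwise pause the audio until the video is ready.
fun joinWithinWindow(
    isVideoReady: () -> Boolean,
    audioPositionMs: () -> Long,
    startVideo: (atMs: Long) -> Unit,
    pauseAudio: () -> Unit,
    resumeAudio: () -> Unit,
    joinWindowMs: Long = 5_000
) {
    val deadline = System.currentTimeMillis() + joinWindowMs
    while (System.currentTimeMillis() < deadline) {
        if (isVideoReady()) {
            startVideo(audioPositionMs())   // video joins the playing audio seamlessly
            return
        }
        Thread.sleep(50)                    // e.g., a spinner or thumbnail is shown meanwhile
    }
    pauseAudio()                            // window expired: hold the audio (cf. interval 322)
    while (!isVideoReady()) Thread.sleep(50)
    startVideo(audioPositionMs())
    resumeAudio()                           // restart both portions together once video is ready
}
```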
  • FIG. 4 illustrates a diagrammatic representation of a machine in the example form of a computer system 400 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. In alternative implementations, the machine may be connected (e.g., networked) to other machines in a LAN, an intranet, an extranet, or the Internet. The machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a server, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • The example computer system 400 includes a processing device (processor) 402, a main memory 404 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 406 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 416, which communicate with each other via a bus 408.
  • Processor 402 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processor 402 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processor 402 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processor 402 is configured to execute instructions 426 for performing the operations and steps discussed herein.
  • The computer system 400 may further include a network interface device 422. The computer system 400 also may include a video display unit 410 (e.g., a liquid crystal display (LCD), a cathode ray tube (CRT), or a touch screen), an alphanumeric input device 412 (e.g., a keyboard), a cursor control device 414 (e.g., a mouse), and a signal generation device 420 (e.g., a speaker).
  • The data storage device 416 may include a computer-readable storage medium 424 on which is stored one or more sets of instructions 426 (e.g., software) embodying any one or more of the methodologies or functions described herein. The instructions 426 may also reside, completely or at least partially, within the main memory 404 and/or within the processor 402 during execution thereof by the computer system 400, the main memory 404 and the processor 402 also constituting computer-readable storage media. The instructions 426 may further be transmitted or received over a network 418 via the network interface device 422.
  • In one implementation, the instructions 426 include instructions for a media player or a media item playback manager, which may correspond, respectively, to the media player 112 or the media item playback manager 114 described with respect to FIG. 1, and/or a software library containing methods that provide a media player or a media item playback manager. While the computer-readable storage medium 424 is shown in an example implementation to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
  • In the foregoing description, numerous details are set forth. It will be apparent, however, to one of ordinary skill in the art having the benefit of this disclosure, that the present disclosure may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present disclosure.
  • Some portions of the detailed description have been presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
  • It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “segmenting”, “analyzing”, “determining”, “enabling”, “identifying,” “modifying” or the like, refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (e.g., electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
  • The disclosure also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may include a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions.
  • The words “example” or “exemplary” are used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the words “example” or “exemplary” is intended to present concepts in a concrete fashion. As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to mean any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then “X includes A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form. Moreover, use of the term “an implementation” or “one implementation” throughout is not intended to mean the same implementation unless described as such.
  • Reference throughout this specification to “one implementation” or “an implementation” means that a particular feature, structure, or characteristic described in connection with the implementation is included in at least one implementation. Thus, the appearances of the phrase “in one implementation” or “in an implementation” in various places throughout this specification are not necessarily all referring to the same implementation. In addition, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.”
  • It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other implementations will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims (20)

What is claimed is:
1. A method comprising:
providing, by an application executed by a processing device, a playback of a media item comprising a video portion and an audio portion;
receiving, by the application, a first message during the playback of the media item;
in response to the first message, stopping the playback of the video portion of the media item while continuing to provide the audio portion of the media item;
receiving, by the application, a second message while providing the audio portion of the media item; and
in response to the second message, resuming the playback of the video portion of the media item in synchronization with the audio portion being provided.
2. The method of claim 1, wherein:
providing the playback of the media item comprises receiving a request to playback the media item, and reserving one or more resources associated with the playback of the media item in response to the request; and
stopping the playback of the video portion of the media item comprises releasing at least one of the one or more resources.
3. The method of claim 2, wherein the one or more resources comprises at least one of: a memory, a buffer, a video decoder, or a video player.
4. The method of claim 2, wherein releasing at least one of the one or more resources comprises at least one of: clearing video data from a buffer, stopping a download of the video portion, or closing a network connection used for requesting the video portion.
5. The method of claim 2, wherein resuming the presentation of the video portion of the media item comprises reacquiring the released one or more resources.
6. The method of claim 1, wherein the first message is at least one of: an indication that the application has entered a background state on a mobile device, an indication that a display of the mobile device is powered off, or an indication that a second application has entered a foreground state on the mobile device.
7. The method of claim 1, wherein the second message is an indication that the application has entered a foreground state on a mobile device.
8. The method of claim 1, wherein the presentation of the video portion of the media item is resumed without interrupting the presentation of the audio portion.
9. The method of claim 1, wherein the video portion is initially presented at a first quality and wherein resuming the presentation of the video portion of the media item comprises presenting the video portion at a second quality.
10. The method of claim 1 further comprising presenting, via a graphical user interface, a message that the presentation of the video portion is to resume.
11. The method of claim 1 further comprising receiving the video portion and the audio portion from different sources.
12. The method of claim 1 further comprising:
receiving the video portion and the audio portion of the media item as a single file; and
separating the video portion and the audio portion into separate files.
13. An apparatus comprising:
a display device;
a memory communicably coupled to the display device; and
a processing device communicably coupled to the memory, the processing device to execute instructions to:
provide a playback of a media item comprising a video portion and an audio portion;
receive a first message during the playback of the media item;
in response to the first message, stop the playback of the video portion of the media item while continuing to provide the audio portion of the media item;
receive a second message while providing the audio portion of the media item; and
in response to the second message, resume the playback of the video portion of the media item in synchronization with the audio portion being provided.
14. The apparatus of claim 13, wherein:
when providing the playback of the media item, the processing device is to receive a request to playback the media item, and reserve one or more resources associated with the playback of the media item in response to the request; and
when stopping the playback of the video portion of the media item, the processing device is to release at least one of the one or more resources.
15. The apparatus of claim 14, wherein the one or more resources comprises at least one of: a memory, a buffer, a video decoder, or a video player.
16. The apparatus of claim 14, wherein when releasing at least one of the one or more resources, the processing device is to perform at least one of: clear video data from a buffer, stop a download of the video portion, or close a network connection used for requesting the video portion.
17. The apparatus of claim 14, wherein when resuming the presentation of the video portion of the media item, the processing device is to reacquire the released one or more resources.
18. A non-transitory machine-readable storage medium storing instructions which, when executed, cause a processing device to perform operations comprising:
providing, by an application executed by the processing device, a playback of a media item comprising a video portion and an audio portion;
receiving, by the application, a first message during the playback of the media item;
in response to the first message, stopping the playback of the video portion of the media item while continuing to provide the audio portion of the media item;
receiving, by the application, a second message while providing the audio portion of the media item; and
in response to the second message, resuming the playback of the video portion of the media item in synchronization with the audio portion being provided.
19. The non-transitory machine-readable storage medium of claim 18, wherein the first message is at least one of: an indication that the application has entered a background state on a mobile device, an indication that a display of the mobile device is powered off, or an indication that a second application has entered a foreground state on the mobile device.
20. The non-transitory machine-readable storage medium of claim 18, wherein the second message is an indication that the application has entered a foreground state on a mobile device.
US14/228,199 2014-01-29 2014-03-27 Media application backgrounding Active 2034-10-07 US9558787B2 (en)

Priority Applications (18)

Application Number Priority Date Filing Date Title
US14/228,199 US9558787B2 (en) 2014-01-29 2014-03-27 Media application backgrounding
CN202110381902.4A CN113518070A (en) 2014-01-29 2015-01-29 Media application background processing
AU2015210937A AU2015210937B2 (en) 2014-01-29 2015-01-29 Media application backgrounding
EP15743253.5A EP3100452B1 (en) 2014-01-29 2015-01-29 Media application backgrounding
PCT/US2015/013537 WO2015116827A1 (en) 2014-01-29 2015-01-29 Media application backgrounding
KR1020177035850A KR102108949B1 (en) 2014-01-29 2015-01-29 Media application backgrounding
CN201580006269.3A CN105940671A (en) 2014-01-29 2015-01-29 Media application backgrounding
KR1020207012803A KR102233785B1 (en) 2014-01-29 2015-01-29 Media application backgrounding
BR112016017559A BR112016017559A2 (en) 2014-01-29 2015-01-29 BACKGROUND MEDIA APPLICATION
KR1020167023225A KR20160113230A (en) 2014-01-29 2015-01-29 Media application backgrounding
EP20176571.6A EP3739584A1 (en) 2014-01-29 2015-01-29 Media application backgrounding
JP2016549106A JP2017508368A (en) 2014-01-29 2015-01-29 Media application background processing
US15/404,045 US10432695B2 (en) 2014-01-29 2017-01-11 Media application backgrounding
AU2018202378A AU2018202378B2 (en) 2014-01-29 2018-04-04 Media application backgrounding
JP2018092056A JP6557380B2 (en) 2014-01-29 2018-05-11 Media application background processing
JP2018092812A JP2018125891A (en) 2014-01-29 2018-05-14 Background processing for media application
US16/589,054 US10841359B2 (en) 2014-01-29 2019-09-30 Media application backgrounding
AU2020204121A AU2020204121B2 (en) 2014-01-29 2020-06-19 Media application backgrounding

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461933296P 2014-01-29 2014-01-29
US14/228,199 US9558787B2 (en) 2014-01-29 2014-03-27 Media application backgrounding

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/404,045 Continuation US10432695B2 (en) 2014-01-29 2017-01-11 Media application backgrounding

Publications (2)

Publication Number Publication Date
US20150213839A1 true US20150213839A1 (en) 2015-07-30
US9558787B2 US9558787B2 (en) 2017-01-31

Family

ID=53679607

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/228,199 Active 2034-10-07 US9558787B2 (en) 2014-01-29 2014-03-27 Media application backgrounding
US15/404,045 Active 2035-01-24 US10432695B2 (en) 2014-01-29 2017-01-11 Media application backgrounding
US16/589,054 Active US10841359B2 (en) 2014-01-29 2019-09-30 Media application backgrounding

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/404,045 Active 2035-01-24 US10432695B2 (en) 2014-01-29 2017-01-11 Media application backgrounding
US16/589,054 Active US10841359B2 (en) 2014-01-29 2019-09-30 Media application backgrounding

Country Status (8)

Country Link
US (3) US9558787B2 (en)
EP (2) EP3100452B1 (en)
JP (3) JP2017508368A (en)
KR (3) KR20160113230A (en)
CN (2) CN113518070A (en)
AU (3) AU2015210937B2 (en)
BR (1) BR112016017559A2 (en)
WO (1) WO2015116827A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016010698A1 (en) * 2014-07-15 2016-01-21 Google Inc. Adaptive background playback behavior
CN105979355A (en) * 2015-12-10 2016-09-28 乐视网信息技术(北京)股份有限公司 Method and device for playing video
US20160323482A1 (en) * 2015-04-28 2016-11-03 Rovi Guides, Inc. Methods and systems for synching supplemental audio content to video content
US20170171288A1 (en) * 2014-06-27 2017-06-15 Cheetah Mobile Inc. Video playing method and device for video playing application program
US20170201560A1 (en) * 2016-01-12 2017-07-13 Funai Electric Co., Ltd. Distribution device and information device
CN106998495A (en) * 2016-01-22 2017-08-01 百度在线网络技术(北京)有限公司 A kind of video broadcasting method and device
CN108012584A (en) * 2015-08-06 2018-05-08 谷歌有限责任公司 Offer is suitable for the only method of the video content of audio playback, system and medium
CN108566561A (en) * 2018-04-18 2018-09-21 腾讯科技(深圳)有限公司 Video broadcasting method, device and storage medium
CN108900889A (en) * 2018-06-29 2018-11-27 上海哔哩哔哩科技有限公司 Barrage echo display methods, device, system and computer readable storage medium
US20190132398A1 (en) * 2017-11-02 2019-05-02 Microsoft Technology Licensing, Llc Networked User Interface Back Channel Discovery Via Wired Video Connection
US20190238939A1 (en) * 2016-09-07 2019-08-01 Lg Electronics Inc. Image display device and system thereof
US11200022B2 (en) 2017-11-24 2021-12-14 Tencent Music Entertainment Technology [Shenzhen] Co., Ltd. Method and apparatus of playing audio data
CN114125576A (en) * 2021-11-29 2022-03-01 广州繁星互娱信息科技有限公司 Multimedia resource synchronization method and device, storage medium and electronic equipment
EP3866481A4 (en) * 2019-01-30 2022-06-29 Shanghai Bilibili Technology Co., Ltd. Audio/video switching method and apparatus, and computer device and readable storage medium

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
MX2017012275A (en) * 2015-03-26 2018-05-28 Maxxian Tech Inc Systems and methods for detecting and interfering with compromised devices and unauthorized device relocation in a communication network.
CN110708579B (en) * 2016-08-04 2021-09-07 联咏科技股份有限公司 Electronic device capable of executing video playing
JP6786342B2 (en) * 2016-10-18 2020-11-18 キヤノン株式会社 Information processing equipment, information processing methods and programs
CN108459837A (en) * 2017-02-22 2018-08-28 深圳市中兴微电子技术有限公司 A kind of audio data processing method and device
CN106937167A (en) * 2017-02-25 2017-07-07 杭州领娱科技有限公司 A kind of background audio processing method and its mobile terminal
CN109714640B (en) * 2017-10-26 2022-01-21 创盛视联数码科技(北京)有限公司 Method for playing live video
CN109088997B (en) * 2018-10-26 2021-05-21 努比亚技术有限公司 Game audio control method, terminal and computer readable storage medium
US10992490B2 (en) * 2018-12-17 2021-04-27 Rovi Guides, Inc. System and method for controlling playback or recording of media assets based on a state of a secondary device
CN111510755A (en) * 2019-01-30 2020-08-07 上海哔哩哔哩科技有限公司 Audio and video switching method and device, computer equipment and readable storage medium
CN113796090A (en) * 2019-05-10 2021-12-14 电影音频私人有限公司 System and method for synchronizing audio content on a mobile device to a separate visual display system
US11005909B2 (en) 2019-08-30 2021-05-11 Rovi Guides, Inc. Systems and methods for providing content during reduced streaming quality
US10986378B2 (en) 2019-08-30 2021-04-20 Rovi Guides, Inc. Systems and methods for providing content during reduced streaming quality
US11184648B2 (en) * 2019-08-30 2021-11-23 Rovi Guides, Inc. Systems and methods for providing content during reduced streaming quality
CN112911364A (en) * 2021-01-18 2021-06-04 珠海全志科技股份有限公司 Audio and video playing method, computer device and computer readable storage medium
KR20220104548A (en) * 2021-01-18 2022-07-26 삼성전자주식회사 Method and apparatus for synchronizing audio and video signals of multimedia content
US11893651B2 (en) 2022-04-04 2024-02-06 Motorola Solutions, Inc. Systems for collecting digital witness statements and detecting electronic resources referenced during collection

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5596420A (en) * 1994-12-14 1997-01-21 Cirrus Logic, Inc. Auto latency correction method and apparatus for MPEG playback system
US5818547A (en) * 1995-06-30 1998-10-06 Sony Corporation Timing detection device and method
US20090271831A1 (en) * 2007-09-26 2009-10-29 Joe Louis Binno Mobile Instructional Videos
US20110299586A1 (en) * 2010-06-04 2011-12-08 Mobitv, Inc. Quality adjustment using a fragmented media stream
US20130279877A1 (en) * 2012-04-19 2013-10-24 Qnx Software Systems Limited System and Method Of Video Decoder Resource Sharing
US20140259046A1 (en) * 2013-03-08 2014-09-11 Verizon Patent And Licensing, Inc. User censoring of content delivery service streaming media

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4450488A (en) * 1980-10-31 1984-05-22 Discovision Associates System for recording continuous-play and stop-motion signal
AU2001288592B2 (en) 2000-09-01 2008-03-13 Alcatel Dynamic quality adjustment based on changing streaming constraints
JP2003087748A (en) * 2001-09-17 2003-03-20 Funai Electric Co Ltd Optical disk reproducing device
JP3660649B2 (en) * 2002-06-07 2005-06-15 株式会社東芝 File information reproducing apparatus and file information reproducing method
US8064755B2 (en) * 2002-11-08 2011-11-22 Lg Electronics Inc. Method and apparatus for recording a multi-component stream and a high-density recording medium having a multi-component stream recorded thereon and reproducing method and apparatus of said recording medium
JP2005004840A (en) * 2003-06-10 2005-01-06 Mitsubishi Electric Corp Disk reproducing equipment
KR100585718B1 (en) 2003-07-31 2006-06-07 엘지전자 주식회사 Multimedia streaming service method for mobile communication terminal
JP2005210636A (en) 2004-01-26 2005-08-04 Nec Micro Systems Ltd Digital data playback method, and apparatus
US20080231686A1 (en) * 2007-03-22 2008-09-25 Attune Interactive, Inc. (A Delaware Corporation) Generation of constructed model for client runtime player using motion points sent over a network
JP2009016907A (en) * 2007-06-29 2009-01-22 Toshiba Corp Conference system
CN104263291A (en) 2007-08-08 2015-01-07 日立化成工业株式会社 Adhesive Composition, Film-like Adhesive, And Connection Structure For Circuit Member
CN101378496A (en) * 2007-08-30 2009-03-04 朱晓阳 Highgrade integration management system for monitoring remote video dynamically
CN101222296B (en) * 2008-01-31 2010-06-09 上海交通大学 Self-adapting transmission method and system in ascending honeycomb video communication
US8347210B2 (en) * 2008-09-26 2013-01-01 Apple Inc. Synchronizing video with audio beats
CN101753924B (en) * 2008-11-28 2013-01-02 康佳集团股份有限公司 Standby control method for television
CN101510998B (en) * 2009-02-24 2010-12-01 山东大学 Self-adapting flow control method for data transmission of wireless video monitoring system
JP2011044976A (en) 2009-08-24 2011-03-03 Canon Inc Video playback apparatus
CN101753946A (en) * 2009-12-22 2010-06-23 北京中星微电子有限公司 Merge method and system for video file and audio file
KR20110092713A (en) 2010-02-10 2011-08-18 삼성전자주식회사 System and method for offering real time multimedia service
CN102163073B (en) * 2010-02-23 2012-11-21 华为终端有限公司 Terminal power consumption optimization processing method and device
CN101984648A (en) * 2010-11-02 2011-03-09 中兴通讯股份有限公司 Audio video file playing method and terminal thereof
CN102045595B (en) * 2010-11-05 2012-11-14 中国华录集团有限公司 System for realizing standby and awakening of set top box by utilizing singlechip
US20120209413A1 (en) * 2011-02-14 2012-08-16 Microsoft Corporation Background Audio on Mobile Devices
CN102123303B (en) * 2011-03-25 2012-10-24 天脉聚源(北京)传媒科技有限公司 Audio/video file playing method and system as well as transmission control device
WO2011100901A2 (en) * 2011-04-07 2011-08-25 华为技术有限公司 Method, device and system for transmitting and processing media content
KR20120134440A (en) * 2011-06-02 2012-12-12 크루셜텍 (주) Sound reproductoin device providing time synchronized music video and method therefor
CN103842941B (en) * 2011-09-09 2016-12-07 泰利斯航空电子学公司 Gesticulate action in response to the passenger sensed and perform the control of vehicle audio entertainment system
US8250228B1 (en) * 2011-09-27 2012-08-21 Google Inc. Pausing or terminating video portion while continuing to run audio portion of plug-in on browser
JP5821523B2 (en) * 2011-10-25 2015-11-24 株式会社Jvcケンウッド Distribution system, distribution method, and receiving apparatus
TWI502977B (en) 2012-02-13 2015-10-01 Acer Inc Audio/video playing device, audio/video processing device, systems, and method thereof
US8856815B2 (en) * 2012-04-27 2014-10-07 Intel Corporation Selective adjustment of picture quality features of a display
US9009619B2 (en) * 2012-09-19 2015-04-14 JBF Interlude 2009 Ltd—Israel Progress bar for branched videos
CN102930881B (en) * 2012-11-20 2015-08-05 广东欧珀移动通信有限公司 A kind of Blu-ray player and control method thereof
US8955060B2 (en) * 2013-05-10 2015-02-10 Verizon Patent And Licensing Inc. Collaborative parental control of streaming media
US20150095758A1 (en) * 2013-10-01 2015-04-02 Microsoft Corporation Web content suspension compatibility and suspended web content lifetime

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5596420A (en) * 1994-12-14 1997-01-21 Cirrus Logic, Inc. Auto latency correction method and apparatus for MPEG playback system
US5818547A (en) * 1995-06-30 1998-10-06 Sony Corporation Timing detection device and method
US20090271831A1 (en) * 2007-09-26 2009-10-29 Joe Louis Binno Mobile Instructional Videos
US20110299586A1 (en) * 2010-06-04 2011-12-08 Mobitv, Inc. Quality adjustment using a fragmented media stream
US20130279877A1 (en) * 2012-04-19 2013-10-24 Qnx Software Systems Limited System and Method Of Video Decoder Resource Sharing
US20140259046A1 (en) * 2013-03-08 2014-09-11 Verizon Patent And Licensing, Inc. User censoring of content delivery service streaming media

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170171288A1 (en) * 2014-06-27 2017-06-15 Cheetah Mobile Inc. Video playing method and device for video playing application program
US10656803B2 (en) 2014-07-15 2020-05-19 Google Llc Adaptive background playback behavior
US9665248B2 (en) 2014-07-15 2017-05-30 Google Inc. Adaptive background playback behavior
WO2016010698A1 (en) * 2014-07-15 2016-01-21 Google Inc. Adaptive background playback behavior
US20160323482A1 (en) * 2015-04-28 2016-11-03 Rovi Guides, Inc. Methods and systems for synching supplemental audio content to video content
US10142585B2 (en) * 2015-04-28 2018-11-27 Rovi Guides, Inc. Methods and systems for synching supplemental audio content to video content
US11722746B2 (en) 2015-08-06 2023-08-08 Google Llc Methods, systems, and media for providing video content suitable for audio-only playback
CN108012584A (en) * 2015-08-06 2018-05-08 谷歌有限责任公司 Offer is suitable for the only method of the video content of audio playback, system and medium
EP4024874A1 (en) * 2015-08-06 2022-07-06 Google LLC Methods, systems, and media for providing video content suitable for audio-only playback
US11109109B2 (en) 2015-08-06 2021-08-31 Google Llc Methods, systems, and media for providing video content suitable for audio-only playback
CN105979355A (en) * 2015-12-10 2016-09-28 Leshi Internet Information & Technology (Beijing) Co., Ltd. Method and device for playing video
US20170168542A1 (en) * 2015-12-10 2017-06-15 Le Holdings (Beijing) Co., Ltd. Method for playing video and electronic device
US20170201560A1 (en) * 2016-01-12 2017-07-13 Funai Electric Co., Ltd. Distribution device and information device
US10862935B2 (en) * 2016-01-12 2020-12-08 Funai Electric Co., Ltd. Distribution device and information device
CN106998495A (en) * 2016-01-22 2017-08-01 Baidu Online Network Technology (Beijing) Co., Ltd. Video playback method and device
US20190238939A1 (en) * 2016-09-07 2019-08-01 Lg Electronics Inc. Image display device and system thereof
US11095939B2 (en) * 2016-09-07 2021-08-17 Lg Electronics Inc. Image display device and system thereof
US20190132398A1 (en) * 2017-11-02 2019-05-02 Microsoft Technology Licensing, Llc Networked User Interface Back Channel Discovery Via Wired Video Connection
US11200022B2 (en) 2017-11-24 2021-12-14 Tencent Music Entertainment Technology (Shenzhen) Co., Ltd. Method and apparatus of playing audio data
CN108566561A (en) * 2018-04-18 2018-09-21 Tencent Technology (Shenzhen) Co., Ltd. Video playback method, device and storage medium
CN108900889A (en) * 2018-06-29 2018-11-27 Shanghai Bilibili Technology Co., Ltd. Echo bullet screen display method, device, system and computer-readable storage medium
US11706496B2 (en) * 2018-06-29 2023-07-18 Shanghai Bilibili Technology Co., Ltd. Echo bullet screen
US20200007940A1 (en) * 2018-06-29 2020-01-02 Shanghai Bilibili Technology Co., Ltd. Echo bullet screen
EP3866481A4 (en) * 2019-01-30 2022-06-29 Shanghai Bilibili Technology Co., Ltd. Audio/video switching method and apparatus, and computer device and readable storage medium
US11490173B2 (en) 2019-01-30 2022-11-01 Shanghai Bilibili Technology Co., Ltd. Switch of audio and video
CN114125576A (en) * 2021-11-29 2022-03-01 Guangzhou Fanxing Huyu Information Technology Co., Ltd. Multimedia resource synchronization method and device, storage medium and electronic device

Also Published As

Publication number Publication date
US10432695B2 (en) 2019-10-01
AU2020204121B2 (en) 2021-10-07
AU2018202378A1 (en) 2018-04-26
CN113518070A (en) 2021-10-19
EP3739584A1 (en) 2020-11-18
JP2018157579A (en) 2018-10-04
CN105940671A (en) 2016-09-14
KR102108949B1 (en) 2020-05-12
EP3100452A4 (en) 2017-10-04
EP3100452B1 (en) 2020-07-08
JP2018125891A (en) 2018-08-09
JP2017508368A (en) 2017-03-23
US10841359B2 (en) 2020-11-17
AU2018202378B2 (en) 2020-03-19
AU2015210937A1 (en) 2016-07-28
KR20200051061A (en) 2020-05-12
EP3100452A1 (en) 2016-12-07
KR20160113230A (en) 2016-09-28
KR102233785B1 (en) 2021-03-29
BR112016017559A2 (en) 2017-08-08
US20170126774A1 (en) 2017-05-04
AU2020204121A1 (en) 2020-07-09
KR20170141281A (en) 2017-12-22
WO2015116827A1 (en) 2015-08-06
US20200106825A1 (en) 2020-04-02
US9558787B2 (en) 2017-01-31
AU2015210937B2 (en) 2018-02-01
JP6557380B2 (en) 2019-08-07

Similar Documents

Publication Publication Date Title
US10841359B2 (en) Media application backgrounding
US9946449B2 (en) Display mode based media player switching
US10298902B1 (en) Previewing and playing media items based on scenes
US20240036714A1 (en) Presenting content items and performing actions with respect to content items
US11907279B2 (en) Mechanism to handle interrupted playback experience for media items in playlists
EP3100267B1 (en) Method for improving offline content playback
US20190018572A1 (en) Content item players with voice-over on top of existing media functionality
US20150281317A1 (en) Requesting media content segments via multiple network connections
US9733794B1 (en) System and method for sharing digital media item with specified start time

Legal Events

Date Code Title Description
AS Assignment

Owner name: GOOGLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WOODMAN, OLIVER JOHN;DOUCLEFF, MATT;REEL/FRAME:032546/0452

Effective date: 20140327

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044097/0658

Effective date: 20170929

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4