CN112585986A - Synchronization of digital content consumption


Info

Publication number
CN112585986A
Authority
CN
China
Prior art keywords
computing device
user computing
user
digital content
content
Prior art date
Legal status
Granted
Application number
CN201980054904.3A
Other languages
Chinese (zh)
Other versions
CN112585986B (en)
Inventor
陈泰嘉
阿迪蒂亚·阿吉
奥利维尔·阿兰·皮埃尔·诺特戈姆
格雷戈里·斯蒂芬·威廉姆斯
Current Assignee
Meta Platforms Technologies LLC
Original Assignee
Facebook Technologies LLC
Priority date
Filing date
Publication date
Application filed by Facebook Technologies LLC
Publication of CN112585986A
Application granted granted Critical
Publication of CN112585986B
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/632Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing using a connection between clients on a wide area network, e.g. setting up a peer-to-peer communication via Internet for retrieving video segments from the hard-disk of other client devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L7/00Arrangements for synchronising receiver with transmitter
    • H04L7/0008Synchronisation information channels, e.g. clock distribution lines
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

In one embodiment, a method includes connecting two or more user computing devices to provide a synchronized presentation session. Each user computing device may present the digital content through an instance of a third-party application associated with a third-party content producer. If one or more of these user computing devices lose synchronization with one or more other devices, a synchronization message may be sent to the unsynchronized devices, whereby the synchronization message may synchronize the presentation of the digital content on all devices.

Description

Synchronization of digital content consumption
Technical Field
The present disclosure relates generally to consuming digital content in virtual reality.
Background
Traditionally, a user who wants to consume digital content, such as a movie or television program, would be limited to viewing the content on a traditional display device, such as a phone, a personal computer (e.g., a laptop or desktop computer), or a television. These conventional display devices are typically only suitable for use by individuals or small groups of users due to the small size and limited display capabilities of the screens associated with these devices. In some cases, users may congregate at movie theaters or friends' homes for a more social viewing experience, but in all cases the viewing experience is limited to the physical environment (e.g., movie theaters, living rooms, etc.).
To address some of these limitations, methods of consuming digital content in a Virtual Reality (VR) environment have been developed. VR devices allow users to consume digital content in any virtual environment, which may provide users with a more immersive and diverse viewing experience. For example, a user may watch a movie at a virtual cinema at sea, rather than watching the movie on a living room television. Conventional VR devices, however, have the significant disadvantage that they are stand-alone devices. Because VR devices are typically head-mounted devices (headsets) that are intended to provide a visual and auditory experience for only one user, typical content viewing experiences through VR lack social interaction between users in the VR environment (e.g., each user will watch a movie alone).
Summary of Particular Embodiments
Embodiments of the present invention relate to allowing multiple users of a virtual reality device to collectively view (co-watch) digital content. It should be understood that "co-viewing" may mean any synchronized consumption of digital content, and does not necessarily mean merely synchronized co-consumption of visual content. Instead, co-viewing may include synchronized consumption of content based on visual, auditory, olfactory, tactile, or gustatory senses. Further, it should be understood that embodiments described herein that relate to virtual reality devices may also relate to any other type of personal computing device, including a telephone, a television, a desktop or laptop computer, a gaming device, or any other device capable of presenting digital content to a user.
In some embodiments, multiple users may share a space in a virtual environment. These users may wish to collectively consume digital content, such as movies, music, or television programs. In some embodiments, a single server may host digital content and stream it directly and simultaneously to the user devices of each user in the virtual space. These embodiments may track which user devices should be included in the viewing session and track playback of content on each user device. Some embodiments may allow direct control over the playback of each stream and may allow playback controls operating on any one device to affect playback on all remaining devices. However, these embodiments require the operator of the virtual space to have direct access and rights to the digital content being streamed. This can take a significant amount of time and money to arrange and lacks scalability. Thus, other embodiments of the invention may allow multiple users in a shared virtual space to seamlessly and simultaneously consume digital content, where each user independently streams the content directly from a third-party content application. In this manner, embodiments of the invention may allow multiple users (each running their own instance of a third-party content application) to collectively consume the same digital content and enjoy a social collective viewing experience within virtual reality.
In some embodiments, multiple users may use third-party content applications to stream content directly to the users' virtual reality devices. Some embodiments may begin when each of a plurality of users downloads a third-party application associated with a third-party content provider onto a corresponding plurality of virtual reality devices. The third-party application may communicate with the third-party content provider to stream content to the user and to enable and control playback through the virtual reality device. In some embodiments, the content may be played on the entire screen of the virtual reality device. In other embodiments, the content may be re-projected onto a virtual screen within the virtual environment. Re-projection is a technique that may allow third-party content to be shared directly from a third-party content application to a virtual reality device and projected directly into the virtual reality environment on that device. In some embodiments, this may include re-projecting content from the third-party content application onto a virtual television or movie theater screen in the VR environment. In some embodiments, each user may be represented by an avatar within the virtual environment. Avatars may be represented as human characters and may sit together on a couch, in a movie theater, or in any other suitable location or position. A user's reaction to the presented content may be reflected in real time by the avatar, providing the users with a social, communal viewing experience.
Some embodiments may utilize one of at least three models to synchronize digital content. These models are referred to as the "host-client" model, the "server" model, and the "omni" model. Each model may differ in how the virtual reality devices connect and communicate and in the level of control any one device has over the remaining devices. Each of these three models will be described in more detail below.
The embodiments disclosed herein are merely examples, and the scope of the present disclosure is not limited to them. Particular embodiments may include all, some, or none of the components, elements, features, functions, operations, or steps of the above-disclosed embodiments. Embodiments in accordance with the present invention are specifically disclosed in the accompanying claims directed to methods, storage media, systems, and computer program products, wherein any feature referred to in one claim category (e.g., method) may also be claimed in another claim category (e.g., system). The dependencies or back-references in the appended claims are chosen for formal reasons only. However, any subject matter resulting from an intentional back-reference (especially multiple references) to any preceding claim may also be claimed, such that any combination of a claim and its features is disclosed and may be claimed, irrespective of the dependencies chosen in the appended claims. The subject matter which can be claimed comprises not only the combination of features as set forth in the appended claims, but also any other combination of features in the claims, wherein each feature mentioned in the claims can be combined with any other feature or combination of other features in the claims. Furthermore, any of the embodiments and features described or depicted herein may be claimed in a separate claim and/or in any combination with any of the embodiments or features described or depicted herein or in any combination with any of the features of the appended claims.
Brief Description of Drawings
FIG. 1 illustrates an embodiment in which multiple user computing devices present the same digital content.
FIG. 2 illustrates an example embodiment in which users of multiple user computing devices collectively view content provided by an instance of a third-party application.
FIG. 3 illustrates an example embodiment utilizing a "host-client" model.
FIG. 4 illustrates an example embodiment utilizing a "server" model.
FIG. 5 illustrates an example embodiment utilizing the "omni" model.
FIG. 6 illustrates an example method for presenting and synchronizing digital content across multiple user computing devices.
FIG. 7 illustrates an example network environment 700 associated with a social networking system.
FIG. 8 illustrates an example computer system.
Description of example embodiments
In some embodiments, users of multiple user computing devices may seek to collectively view the same digital content. For example, two or more users may be separated by a geographic distance, but may still wish to view the same content together. As another example, two or more users may be in the same geographic location, but may only have access to, or may otherwise wish to use, a user computing device (e.g., a virtual reality headset) that is best suited for use by a single user. In such embodiments, the two or more users may desire a method by which they may view the digital content together such that the digital content is synchronized between the two or more user computing devices.
In some embodiments, users of multiple user computing devices may consume the same digital content without the content being synchronized. FIG. 1 illustrates an embodiment in which multiple user computing devices present the same digital content. In this embodiment, user computing devices 101 and 102 each present digital content from the movie "Jaws". In some embodiments, the device may retrieve the content from local storage on the device. In other embodiments, the content may be obtained through a streaming service (including a third-party content provider's streaming service). User computing devices 101 and 102 may be virtual reality devices. In some embodiments, the user computing device may also be any other type of user computing device, including a cellular telephone, a television, a desktop monitor, a laptop, or any other device capable of presenting digital content to one or more users. In some embodiments, the user computing device 101 may be presenting frame 103 of the movie, while the user computing device 102 may be presenting frame 104. In some embodiments, users of the user computing devices may attempt to synchronize playback of content between multiple user computing devices by selecting similar or identical viewing settings (screen size, volume, playback speed, etc.) and selecting play buttons 105 and 106 at the same time. Despite these efforts, in some embodiments, the content presented on the two devices may not be synchronized (e.g., frame 104 may be a short time before or after frame 103). For example, at the point in time when computing device 101 is displaying the head of the shark (frame 103), computing device 102 may have already displayed the shark's head and instead be displaying its tail (frame 104). This time difference may be caused by a number of factors, including, for example, slight differences in when the play buttons 105 and 106 are selected on each user computing device, or differences in network connectivity, buffering, or playback speed between the devices. Many additional factors, including software- or hardware-based factors (e.g., differences in software versions or hardware models), may also lead to playback differences between devices.
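As a rough illustration of the drift described above, the following Python sketch shows one way a viewing application might quantify how far apart two devices are, given playback reports like those implied by frames 103 and 104. All names and values here are illustrative assumptions, not part of this disclosure.

```python
from dataclasses import dataclass


@dataclass
class PlaybackReport:
    device_id: str
    content_id: str      # e.g., an identifier for the movie being presented
    position_ms: int     # current playback position within the content
    reported_at_ms: int  # wall-clock time at which the report was generated


def playback_drift_ms(a: PlaybackReport, b: PlaybackReport) -> int:
    """Estimate how far device ``a`` is ahead of device ``b`` in milliseconds.

    The two reports may have been generated at slightly different wall-clock
    times, so the elapsed time between reports is subtracted out before the
    playback positions are compared.
    """
    elapsed = a.reported_at_ms - b.reported_at_ms
    return (a.position_ms - b.position_ms) - elapsed


# Example: device 101 is showing frame 103 while device 102 is already showing frame 104.
report_101 = PlaybackReport("device-101", "jaws", position_ms=3_721_000, reported_at_ms=1_000)
report_102 = PlaybackReport("device-102", "jaws", position_ms=3_723_500, reported_at_ms=1_000)
print(playback_drift_ms(report_102, report_101))  # 2500 -> device 102 is about 2.5 s ahead
```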
FIG. 2 illustrates an example embodiment in which users of multiple user computing devices collectively view content provided by an instance of a third-party application. In some embodiments, users 208 and 209 may interact with their respective user computing devices through viewing applications 201 and 202. In some embodiments, these applications may be virtual reality applications, co-viewing applications, or any other application on the user computing device with which the user may interact. These user computing devices may include any user computing device capable of rendering digital content. In some embodiments, instances 203 and 204 of the same third-party application may host the digital content that users 208 and 209 desire to view collectively on their respective user computing devices. Each instance 203 and 204 of the third-party application may share information back and forth with the user computing devices 201 and 202. The information may include digital content to be rendered on the user computing device, playback and other controls related to the digital content, and state information related to playback of the digital content on the user computing device. Each instance 203 and 204 may receive content as a digital stream from third-party content providers 210 and 211. In some embodiments, third-party content providers 210 and 211 may be the same third-party content provider. In other embodiments, third-party content providers 210 and 211 may be different third-party content providers. In embodiments where 210 and 211 are different third-party content providers, the third-party content providers may each provide the same digital content for display to users 208 and 209. In some embodiments, the information may be communicated from the instances 203 and 204 of the third-party application by software written using Software Development Kits (SDKs) 205 and 206 implemented by each instance of the third-party application. In some embodiments, the SDK-based software 205 and 206 may communicate with the user computing device by generating a broadcast intent and sending the broadcast intent to the corresponding viewing application running on the user computing device (e.g., the SDK-based software 205 may communicate with 201, and the SDK-based software 206 may communicate with 202). In some embodiments, the third-party applications 203 and 204 may contain two or more states, where at least one state is directed to sending information to the user computing device and one state is directed to receiving information from the user computing device. In some embodiments, the information being delivered may include status information including a timestamp associated with a piece of digital content (e.g., the current playback time of a movie or television program), identifying information related to a piece of digital content (e.g., the name of a movie or television program), play or pause status information, volume, brightness, color, or playback speed information, or information related to whether a piece of digital content or an advertisement is being presented on user computing devices 201 and 202.
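The following is a minimal sketch of the kind of state payload that SDK-based software such as 205 and 206 might attach to a broadcast intent for the viewing application. The field names and serialization are assumptions made purely for illustration and do not reflect an actual SDK.

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class ContentState:
    content_id: str         # identifying information (e.g., the name of a movie or program)
    position_ms: int        # timestamp within the piece of digital content
    is_playing: bool        # play or pause status
    playback_speed: float   # e.g., 1.0 for normal-speed playback
    volume: float           # 0.0 - 1.0
    is_advertisement: bool  # whether an advertisement, rather than content, is being presented


def to_broadcast_payload(state: ContentState) -> str:
    """Serialize the state so it can be attached to a broadcast intent."""
    return json.dumps(asdict(state))


# The viewing application (201 or 202) could parse this payload and forward the
# relevant fields to the other device over communication method 207.
payload = to_broadcast_payload(ContentState("jaws", 3_721_000, True, 1.0, 0.8, False))
print(payload)
```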
As an example in which two users are viewing content together, the first user 208 may desire to fast-forward through the content. The user may select a fast-forward control on his user computing device such that a fast-forward command is passed from the viewing application 201, through the SDK-based software 205, to the first user's instance of the third-party application 203. The third-party application may in turn request third-party content from the third-party content provider 210. At the same time, the first user's viewing application may send the fast-forward command to the viewing application 202 of the second user 209, which in turn may pass the command through the third-party application 204 to the third-party content provider 211.
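A hedged sketch of that fan-out is shown below: a command selected by the first user drives the local third-party application instance and is relayed to the peer's viewing application, which drives its own instance. The class and method names are assumptions, not the patent's API.

```python
class ThirdPartyAppInstance:
    """Stand-in for an instance of the third-party application (e.g., 203 or 204)."""

    def __init__(self, label: str):
        self.label = label
        self.last_command = None

    def apply_command(self, command: str, position_ms: int) -> None:
        # In a real system this would request the corresponding content
        # (e.g., fast-forwarded frames) from the third-party content provider.
        self.last_command = (command, position_ms)


class ViewingApplication:
    """Stand-in for a viewing application (e.g., 201 or 202)."""

    def __init__(self, app: ThirdPartyAppInstance):
        self.app = app
        self.peers: list["ViewingApplication"] = []

    def on_user_command(self, command: str, position_ms: int) -> None:
        self.app.apply_command(command, position_ms)   # drive local playback
        for peer in self.peers:                        # relay to peer viewing applications
            peer.on_peer_command(command, position_ms)

    def on_peer_command(self, command: str, position_ms: int) -> None:
        self.app.apply_command(command, position_ms)   # apply without re-relaying


first = ViewingApplication(ThirdPartyAppInstance("203"))
second = ViewingApplication(ThirdPartyAppInstance("204"))
first.peers.append(second)
first.on_user_command("fast_forward", 60_000)
assert second.app.last_command == ("fast_forward", 60_000)
```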
In some embodiments, the user of each user computing device may retain separate or personalized control over some aspects or settings related to the presentation of the digital content. In other embodiments, all settings may be collectively controlled by the users. In some embodiments, a single user may have separate controls for aspects or settings including volume and subtitles. As an example, the user 208 may use the user computing device 201, which has louder native audio than the user computing device 202 of the user 209. This may be because, for example, the user computing device 201 may be a virtual reality headset with speakers very close to the ears of the user 208, while the user computing device 202 may be a television mounted on a wall far from the user 209.
In some embodiments, each user 208 and 209 may be presented with the same video content but different audio content. For example, users 208 and 209 may both speak English to communicate with each other, but may be more comfortable listening to sports commentary in their native languages, which may be Spanish for user 208 and Russian for user 209. In this embodiment, user 208 may be presented with the same video content as user 209 but with Spanish audio content, while user 209 may be presented with this video content with Russian audio content. In some embodiments, each user may be presented with personalized or customized video content, which may vary from user to user. For example, in some embodiments, user 208 may be presented with a football game using a standard viewing angle and a chat window consisting of comments from other users about the football game, while user 209 may be presented with a bird's-eye view of the game but no chat window. As another example, each user's presentation options and digital environment may differ from those of any other user. For example, each user may individually control the lighting within the virtual reality environment in which they are viewing digital content. Similarly, each user may individually control the position of the screen, the settings, or the background in which the screen is located in the virtual reality environment (e.g., each user may independently choose to view content in a virtual reality environment corresponding to a movie theater, a family living room, a tablet in bed, etc.). It should be understood that these are merely examples, and that the personalization or customization of video, audio, or other digital content may take any form.
Because each user computing device 201 and 202 communicates independently with its own instance of the third-party application 203 and 204, the user computing devices may communicate with each other through communication method 207. In some embodiments, communication method 207 may be direct communication between the user computing devices, such as Bluetooth Low Energy (BLE) or infrared technology. In other embodiments, communication method 207 may be indirect communication through a server associated with viewing applications 201 and 202 or some other computing device associated with viewing applications 201 and 202. Additional details regarding the communication method are described in the following figures.
FIG. 3 illustrates an example embodiment utilizing a "host-client" model. The host-client model may allow a first user computing device 301 to be identified as a host, while all other user computing devices (e.g., 302 and 303) may be identified as clients. In some embodiments, the host may directly control the digital content (e.g., selection of content, volume, playback of content, etc.), while the clients may not have any control over the digital content. In these embodiments, playback selections made on the host user computing device 301 may be communicated from the host user computing device 301 to one or more client user computing devices 302 and 303. For example, during a particular scene in the movie "Jaws", with all three user computing devices paused, a play button on the host user computing device 301 may be selected and the content of the movie may begin playing in a synchronized manner on all three user computing devices 301, 302, and 303. In some embodiments, only the host user computing device may have visible control features associated with playback of the digital content. For example, in FIG. 3, only the host user computing device 301 has a play button 304, while the client user computing devices 302 and 303 do not. In other embodiments, all user computing devices may display the same playback functionality, but only the control features of the host may be operable.
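The asymmetry of the host-client model can be sketched as follows, assuming (purely for illustration) that only the host exposes an operable play control and that clients simply apply whatever the host sends.

```python
class ClientDevice:
    def __init__(self, device_id: str):
        self.device_id = device_id
        self.position_ms = 0
        self.is_playing = False

    def apply(self, command: str, position_ms: int) -> None:
        # Clients have no operable controls of their own; they only follow the host.
        self.position_ms = position_ms
        self.is_playing = command == "play"


class HostDevice:
    def __init__(self, clients: list[ClientDevice]):
        self.clients = clients
        self.position_ms = 0
        self.is_playing = False

    def select_play(self) -> None:
        # Selecting the host's play button (e.g., 304) starts playback locally
        # and on every connected client at the host's current position.
        self.is_playing = True
        for client in self.clients:
            client.apply("play", self.position_ms)


clients = [ClientDevice("302"), ClientDevice("303")]
host = HostDevice(clients)
host.select_play()
assert all(client.is_playing for client in clients)
```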
In some embodiments, the client user computing devices 302 and 303 may communicate their playback status and other information to the host computing device 301. Based on the timestamp information and other playback information received from the client user computing devices, the host user computing device 301 may send instructions to take action to the client user computing devices 302 and 303 so that they remain in a synchronized presentation state with the host user computing device and with each other. The action may include pausing the digital content or altering a timestamp of the digital content such that playback jumps forward or backward. In some embodiments, the action may also include speeding up or slowing down playback of the digital content such that content playback on all devices is resynchronized without requiring a user computing device to completely pause or to jump forward and skip over previously unviewed content. In some embodiments, the host user computing device 301 may relay its playback status to each of the client user computing devices 302 and 303 so that the client user computing devices may synchronize their playback of the content. In other embodiments, the client user computing devices 302 and 303 may relay their status to the host user computing device 301, and the host user computing device 301 may consider the status of all client user computing devices in determining which action should be taken (e.g., how to synchronize content playback on all devices). It should be understood that these actions, and all other possible actions described in this application to synchronize content, may be performed by all embodiments of the present invention and need not be associated only with the host-client model or any other model described herein.
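One way the host might choose among those actions (pause, seek, or speed change) based on a client's reported timestamp is sketched below. The thresholds are illustrative assumptions only.

```python
def corrective_action(host_position_ms: int, client_position_ms: int,
                      seek_threshold_ms: int = 2_000,
                      speed_threshold_ms: int = 250) -> dict:
    """Return an instruction the host could send so a client re-synchronizes."""
    drift = client_position_ms - host_position_ms
    if abs(drift) <= speed_threshold_ms:
        return {"action": "none"}
    if abs(drift) <= seek_threshold_ms:
        # Moderate drift: nudge the client's playback speed so it catches up
        # (or falls back) without pausing or skipping unviewed content.
        return {"action": "set_speed", "speed": 0.95 if drift > 0 else 1.05}
    # Large drift: jump the client directly to the host's position.
    return {"action": "seek", "position_ms": host_position_ms}


print(corrective_action(3_721_000, 3_721_100))  # {'action': 'none'}
print(corrective_action(3_721_000, 3_722_000))  # slow a slightly leading client
print(corrective_action(3_721_000, 3_730_000))  # seek a badly drifted client back to the host
```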
FIG. 4 illustrates an example embodiment utilizing a "server" model. The server model may allow a server associated with the viewing application, or some other computing device associated with the viewing application, to track the presentation of digital content on all user computing devices, and may be configured to keep the users synchronized in their consumption. In some embodiments, the server 401 may send and receive playback information associated with all user computing devices 402, 403, and 404. In some embodiments, all user computing devices may include playback controls (e.g., play/pause button 405) that may allow a user of any one user computing device to control playback on all user computing devices. In some embodiments, the server 401 may be one of the users' user computing devices. In other embodiments, the server 401 may be a stand-alone device, distinct from the user computing devices on which the content is presented. In some embodiments, the stand-alone device may be a user's device, or it may be a remote server accessed by the users' computing devices via a network (e.g., the internet).
In the server model, a server can track playback of content on all user computing devices and employ rules to keep playback of the content synchronized. In some embodiments, any number of users (from zero to all) may have full control over the digital content. In some embodiments, the server may maintain an independently running clock that may determine the speed at which the digital content should be presented on the user devices. If any one user or subset of users leads or lags the independent clock, those users may be forced to skip or re-watch content in order to return to the pace set by the clock. In some embodiments, the rules for keeping content synchronized across all user computing devices may be preset by the user computing devices or by third-party content applications, or may be determined by one or more of the users of the user computing devices. In some embodiments, the rules may be behavior rules based on previous behaviors of the user computing devices. For example, if the playback of digital content on user computing device 402 has lagged behind the playback of digital content on user computing devices 403 and 404 two or more times in a session, and each time the user of user computing device 402 has selected to jump forward, skipping content but regaining synchronization with the other user computing devices, the rules for user computing device 402 may adapt to automatically jump forward whenever user computing device 402 lags behind in the presentation of the content. These rules may be learned on a per-session, per-user, or per-content-item basis, over the lifetime of the devices, or on any other basis.
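An independently running server clock of the kind described above might look like the following sketch, in which any device outside a tolerance of the clock is told to jump to the clock's position (and so skips or re-watches content). The class names and the tolerance are assumptions made for illustration.

```python
import time


class SyncServer:
    """Keeps an independent clock and issues corrections to reporting devices."""

    def __init__(self, tolerance_ms: int = 1_000):
        self.tolerance_ms = tolerance_ms
        self.started_at = time.monotonic()

    def expected_position_ms(self) -> int:
        # The position the content *should* be at according to the server's clock.
        return int((time.monotonic() - self.started_at) * 1000)

    def instruction_for(self, reported_position_ms: int) -> dict:
        expected = self.expected_position_ms()
        if abs(reported_position_ms - expected) <= self.tolerance_ms:
            return {"action": "none"}
        # The device leads or lags the clock: force it to skip or re-watch.
        return {"action": "seek", "position_ms": expected}


server = SyncServer()
print(server.instruction_for(server.expected_position_ms() + 5_000))  # forces a seek
```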
FIG. 5 illustrates an example embodiment utilizing the "omni" model. The omni model may allow each user's device to track and synchronize with every other user's device. For example, under the omni model, each user device may communicate playback status or other information related to the playback of content on that device (e.g., playback control selections) separately to every other connected user device. In some embodiments, user devices 501, 502, and 503 may all communicate with each other and may all provide equal playback control (represented by 504). For example, the users of user computing devices 501, 502, and 503 may each be presented with the same playback control options. If any one user selects a playback control option (e.g., fast forward), the selection may be communicated to each of the other user computing devices. In some embodiments, any user may then stop, fast forward, pause, or rewind the content regardless of the playback control options selected by any other user. In some embodiments, when the user of user computing device 501 makes a selection to fast forward the content, the selection may be communicated to the other user computing devices 502 and 503. Each user computing device may then pass the selection to its viewing application, then to its instance of the third-party content application, and from there to the third-party content provider, which provides the correct, fast-forwarded content. It should be understood that 504, and any other representation of playback control, may include any variable, controllable information or settings associated with the content or the user computing device.
Like the server model, the omni model may allow various rules to be implemented to ensure that all user computing devices are synchronized in the presentation of their digital content. These rules may be presented to the users as selections as they consume the content. For example, if two users are not synchronized, with one user leading the other, the users may be presented with options such as: 1) have the slower user jump forward, 2) force the faster user to jump back and re-watch, or to wait until the slower user catches up, or 3) do nothing. As another example, if three or more users are viewing content together and the playback of the content on each user's device is not synchronized with all of the others, the users may be presented with options such as: 1) jump forward to the position of the fastest user, 2) jump back to the position of the slowest user, or 3) average the playback times across all devices and jump to that average playback time. Any number of additional rule options may be utilized. In some embodiments, a user of a user computing device may select a threshold by which the user computing devices must be out of sync before a rule or policy is applied. In other embodiments, the threshold may be learned based on past user selections. Similarly, in some embodiments, an option may be selected as a default, or the choice of option may be learned based on a user's selection history.
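The rule options listed above can be expressed compactly, as in the sketch below; the rule names and the out-of-sync threshold are illustrative assumptions rather than terms used by this disclosure.

```python
def needs_resolution(positions_ms: list[int], threshold_ms: int = 1_500) -> bool:
    # A rule or policy is applied only once the spread between the fastest and
    # slowest device exceeds a threshold, which could be user-selected or learned.
    return max(positions_ms) - min(positions_ms) > threshold_ms


def resolution_target(positions_ms: list[int], rule: str) -> int:
    """Return the playback position that all devices should jump to."""
    if rule == "jump_to_fastest":
        return max(positions_ms)
    if rule == "jump_to_slowest":
        return min(positions_ms)
    if rule == "jump_to_average":
        return sum(positions_ms) // len(positions_ms)
    raise ValueError(f"unknown rule: {rule}")


positions = [3_721_000, 3_723_500, 3_722_000]  # three devices, out of sync
if needs_resolution(positions):
    print(resolution_target(positions, "jump_to_average"))  # 3722166
```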
FIG. 6 illustrates an example method 600 for presenting and synchronizing digital content across multiple user computing devices. The method may begin at step 610, where a first user computing device connects with at least one other user computing device that is in a synchronized presentation session with the first user computing device. In some embodiments, step 610 may include determining whether other user computing devices are within range, are presenting the same content, or may otherwise be identified as being in a synchronized presentation session with the first user computing device. As an example, in some embodiments, if a first user desires to view content together with a second user, the first user and the second user may log in to or otherwise activate an application on their respective user computing devices. Once logged in, the users may choose to join the synchronized presentation session so that the application may begin searching for and connecting the user computing devices of the first user and the second user. At step 620, the first user computing device may present the digital content through an instance of the third-party application. In some embodiments, while the first user computing device is presenting digital content through an instance of the third-party application, one or more other user computing devices may also be presenting the same digital content through another instance of the third-party application. At step 630, a determination may be made as to whether to send a synchronization message to one or more additional user computing devices in the same presentation session as the first user computing device. If the user computing devices are all synchronized in their presentation of the digital content, the synchronization message may not be sent. However, if one or more of the user computing devices are not synchronized, or if a synchronization message should be sent based on some other determination (e.g., automatically per unit time, each time a user selects a playback control option, each time a new episode or scene from a piece of digital content begins, etc.), the method may proceed to step 640. At step 640, a synchronization message may be generated based on the state of the digital content being presented by the first user computing device. For example, if playback of digital content on the first user computing device precedes playback of the same content on a second user computing device by two seconds, the message may include instructions to pause the presentation of the digital content on the first user computing device for two seconds or to rewind it by two seconds. At step 650, the synchronization message may be sent to the at least one other user computing device or to a server associated with the at least one other user computing device. The synchronization message may be configured to synchronize the presentation of the digital content by the other user computing devices with the presentation of the digital content by the first user computing device. Particular embodiments may repeat one or more steps of the method of FIG. 6, where appropriate. Although this disclosure describes and illustrates particular steps of the method of FIG. 6 as occurring in a particular order, this disclosure contemplates any suitable steps of the method of FIG. 6 occurring in any suitable order.
Moreover, although this disclosure describes and illustrates an example method for presenting and synchronizing digital content on multiple user computing devices that includes particular steps of the method of FIG. 6, this disclosure contemplates any suitable method for presenting and synchronizing digital content on multiple user computing devices that includes any suitable steps, which may include all, some, or none of the steps of the method of FIG. 6, where appropriate. Moreover, although this disclosure describes and illustrates particular components, devices, or systems performing particular steps of the method of FIG. 6, this disclosure contemplates any suitable combination of any suitable components, devices, or systems performing any suitable steps of the method of FIG. 6.
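For concreteness, the steps of FIG. 6 can be lined up with a small, self-contained sketch like the one below. The device model, threshold, and message fields are all assumptions made for illustration; they are not the claimed method itself.

```python
from dataclasses import dataclass, field


@dataclass
class Device:
    device_id: str
    position_ms: int = 0
    is_playing: bool = False
    peers: list = field(default_factory=list)

    def connect(self, other: "Device") -> None:             # step 610
        self.peers.append(other)

    def present(self, position_ms: int) -> None:            # step 620
        self.position_ms = position_ms
        self.is_playing = True

    def receive_sync_message(self, message: dict) -> None:  # applied on the receiving device
        self.position_ms = message["position_ms"]
        self.is_playing = message["is_playing"]


def synchronize(first: Device, threshold_ms: int = 1_000) -> None:
    for peer in first.peers:
        if abs(first.position_ms - peer.position_ms) > threshold_ms:  # step 630
            message = {                                               # step 640
                "position_ms": first.position_ms,
                "is_playing": first.is_playing,
            }
            peer.receive_sync_message(message)                        # step 650


first, second = Device("first"), Device("second")
first.connect(second)
first.present(10_000)
second.present(7_500)
synchronize(first)
assert second.position_ms == first.position_ms
```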
FIG. 7 illustrates an example network environment 700 associated with a social networking system. Network environment 700 includes client system 730, social-networking system 760, and third-party system 770 connected to each other via network 710. Although fig. 7 illustrates a particular arrangement of client system 730, social-networking system 760, third-party system 770, and network 710, this disclosure contemplates any suitable arrangement of client system 730, social-networking system 760, third-party system 770, and network 710. By way of example and not by way of limitation, two or more of client system 730, social-networking system 760, and third-party system 770 may be directly connected to each other, bypassing network 710. As another example, two or more of client system 730, social-networking system 760, and third-party system 770 may all or partially be physically or logically co-located with each other. Moreover, although fig. 7 illustrates a particular number of client systems 730, social-networking systems 760, third-party systems 770, and networks 710, the present disclosure contemplates any suitable number of client systems 730, social-networking systems 760, third-party systems 770, and networks 710. By way of example, and not by way of limitation, network environment 700 may include a plurality of client systems 730, social-networking systems 760, third-party systems 770, and networks 710.
The present disclosure contemplates any suitable network 710. By way of example and not limitation, one or more portions of network 710 may include an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the internet, a portion of a Public Switched Telephone Network (PSTN), a cellular telephone network, or a combination of two or more of these. The network 710 may include one or more networks 710.
Link 750 may connect client system 730, social-networking system 760, and third-party system 770 to communication network 710 or to each other. The present disclosure contemplates any suitable link 750. In particular embodiments, one or more links 750 include one or more wired (e.g., Digital Subscriber Line (DSL) or Data Over Cable Service Interface Specification (DOCSIS)) links, wireless (e.g., Wi-Fi or Worldwide Interoperability for Microwave Access (WiMAX)) links, or optical (e.g., Synchronous Optical Network (SONET) or Synchronous Digital Hierarchy (SDH)) links. In particular embodiments, one or more links 750 each include an ad hoc network, an intranet, an extranet, a VPN, a LAN, a WLAN, a WAN, a WWAN, a MAN, a portion of the internet, a portion of the PSTN, a cellular technology-based network, a satellite communication technology-based network, another link 750, or a combination of two or more such links 750. Link 750 need not be the same throughout network environment 700. One or more first links 750 may differ in one or more respects from one or more second links 750.
In particular embodiments, client system 730 may be an electronic device that includes hardware, software, or embedded logic components, or a combination of two or more such components, and is capable of performing the appropriate functions implemented or supported by client system 730. By way of example, and not limitation, client system 730 may include a computer system such as a desktop computer, notebook or laptop computer, netbook, tablet computer, e-book reader, GPS device, camera, Personal Digital Assistant (PDA), handheld electronic device, cellular telephone, smartphone, augmented/virtual reality device, other suitable electronic device, or any suitable combination thereof. The present disclosure contemplates any suitable client systems 730. Client system 730 may enable a network user at client system 730 to access network 710. Client system 730 may enable its user to communicate with other users at other client systems 730.
In particular embodiments, client system 730 may include a web browser 732, such as MICROSOFT INTERNET EXPLORER, GOOGLE CHROME, or MOZILLA FIREFOX, and may have one or more add-ons (add-on), plug-ins (plug-in), or other extensions (extensions), such as TOOLBAR or YAHOO TOOLBAR. A user at client system 730 may enter a Uniform Resource Locator (URL) or other address directing web browser 732 to a particular server (e.g., server 762 or a server associated with third-party system 770), and web browser 732 may generate and communicate a hypertext transfer protocol (HTTP) request to the server. The server may accept the HTTP request and communicate one or more hypertext markup language (HTML) files to client system 730 in response to the HTTP request. Client system 730 may render a web page based on an HTML file from a server for presentation to a user. The present disclosure contemplates any suitable web page files. By way of example and not limitation, web pages may be rendered from HTML files, extensible hypertext markup language (XHTML) files, or extensible markup language (XML) files, according to particular needs. Such pages may also execute scripts, such as, without limitation, scripts written in JAVASCRIPT, JAVA, MICROSOFT SILVERLIGHT, combinations of markup languages and scripts (e.g., AJAX (asynchronous JAVASCRIPT and XML)), and the like. Herein, reference to a web page includes one or more corresponding web page files (which a browser may use to render the web page), and vice versa, where appropriate.
In particular embodiments, social-networking system 760 may be a network-addressable computing system that may host an online social network. Social-networking system 760 may generate, store, receive, and send social-networking data (e.g., user-profile data, concept-profile data, social-graph information, or other suitable data related to an online social network). Social-networking system 760 may be accessed by other components of network environment 700, either directly or via network 710. By way of example and not limitation, client system 730 may access social-networking system 760 directly or via network 710 using web browser 732 or a native application associated with social-networking system 760 (e.g., a mobile social-networking application, a messaging application, another suitable application, or any combination thereof). In particular embodiments, social-networking system 760 may include one or more servers 762. Each server 762 may be a unitary server or a distributed server spanning multiple computers or multiple data centers. The server 762 may be of various types, such as, for example and without limitation, a web server, a news server, a mail server, a messaging server, an advertising server, a file server, an application server, an exchange server, a database server, a proxy server, another server suitable for performing the functions or processes described herein, or any combination thereof. In particular embodiments, each server 762 may include hardware, software, or embedded logic components, or a combination of two or more such components, for performing the appropriate functions implemented or supported by server 762. In particular embodiments, social-networking system 760 may include one or more data stores 764. The data stores 764 may be used to store various types of information. In particular embodiments, the information stored in the data stores 764 may be organized according to particular data structures. In particular embodiments, each data store 764 may be a relational, columnar, correlation, or other suitable database. Although this disclosure describes or illustrates a particular type of database, this disclosure contemplates any suitable type of database. Particular embodiments may provide an interface that enables client system 730, social-networking system 760, or third-party system 770 to manage, retrieve, modify, add, or delete information stored in data store 764.
In particular embodiments, social-networking system 760 may store one or more social graphs in one or more data stores 764. In particular embodiments, the social graph may include a plurality of nodes, which may include a plurality of user nodes (each corresponding to a particular user) or a plurality of concept nodes (each corresponding to a particular concept), and a plurality of edges connecting the nodes. Social-networking system 760 may provide users of an online social network with the ability to communicate and interact with other users. In particular embodiments, a user may join an online social network via social networking system 760, and then add connections (e.g., relationships) with a plurality of other users in social networking system 760 to which they want to be related. As used herein, the term "friend" may refer to any other user of social-networking system 760 with whom the user forms a connection, association, or relationship via social-networking system 760.
In particular embodiments, social-networking system 760 may provide a user with the ability to take actions on various types of items or objects supported by social-networking system 760. By way of example and not by way of limitation, items and objects may include groups or social networks to which a user of social-networking system 760 may belong, events or calendar entries that may be of interest to the user, computer-based applications that may be used by the user, transactions that allow the user to purchase or sell goods via a service, interactions with advertisements that the user may perform, or other suitable items or objects. The user may interact with anything that can be represented in social-networking system 760 or by a system external to third-party system 770, third-party system 770 being separate from social-networking system 760 and coupled to social-networking system 760 via network 710.
In particular embodiments, social-networking system 760 is capable of linking various entities. By way of example and not limitation, social-networking system 760 may enable users to interact with each other and receive content from third-party systems 770 or other entities, or allow users to interact with these entities through an Application Programming Interface (API) or other communication channel.
In particular embodiments, third-party system 770 may include one or more types of servers, one or more data stores, one or more interfaces (including but not limited to APIs), one or more web services, one or more content sources, one or more networks, or any other suitable components (e.g., with which a server may communicate). Third-party system 770 may be operated by an entity different from the entity operating social-networking system 760. However, in particular embodiments, social-networking system 760 and third-party system 770 may operate in conjunction with each other to provide social-networking services to users of social-networking system 760 or third-party system 770. In this sense, the social networking system 760 may provide a platform or backbone (backbone) that other systems (e.g., third party systems 770) may use to provide social networking services and functionality to users over the internet.
In particular embodiments, third party system 770 may include a third party content object provider. The third-party content object provider may include one or more sources of content objects that may be transmitted to the client system 730. By way of example and not limitation, content objects may include information about things or activities of interest to a user, such as movie show times, movie reviews, restaurant menus, product information and reviews, or other suitable information. As another example and not by way of limitation, the content object may include an incentive content object (e.g., a coupon, discount coupon, gift coupon, or other suitable incentive object).
In particular embodiments, social-networking system 760 also includes user-generated content objects that may enhance a user's interaction with social-networking system 760. User-generated content may include any content that a user may add, upload, send, or "post" to social-networking system 760. By way of example and not by way of limitation, a user may communicate a post from client system 730 to social-networking system 760. Posts may include data such as status updates or other textual data, location information, photos, videos, links, music, or other similar data or media. Content may also be added to social-networking system 760 by third parties through "communication channels" (e.g., news feeds or streams).
In particular embodiments, social-networking system 760 may include various servers, subsystems, programs, modules, logs, and data stores. In particular embodiments, social-networking system 760 may include one or more of the following: web servers, action recorders, API request servers, relevance and ranking engines, content object classifiers, notification controllers, action logs, third-party content object exposure logs, inference modules, authorization/privacy servers, search modules, ad-targeting modules, user interface modules, user-profile stores, connection stores, third-party content stores, or location stores. Social-networking system 760 may also include suitable components, such as network interfaces, security mechanisms, load balancers, failover servers, management and network operations consoles, other suitable components, or any suitable combination thereof. In particular embodiments, social-networking system 760 may include one or more user-profile stores for storing user profiles. The user profile may include, for example, biographical information, demographic information, behavioral information, social information, or other types of descriptive information (e.g., work experience, educational history, hobbies or preferences, interests, affinities, or locations). The interest information may include interests associated with one or more categories. The categories may be general or specific. By way of example and not by way of limitation, if a user "likes" an article about a brand of shoes, that category may be the brand, or a general category of "shoes" or "clothing". A connection store may be used to store connection information about users. The connection information may indicate users who have similar or common work experiences, group memberships, hobbies, or educational history, or who are related or share common attributes in any manner. The connection information may also include user-defined connections between different users and content (internal and external). A web server may be used to link social-networking system 760 to one or more client systems 730 or one or more third-party systems 770 via network 710. The web server may include a mail server or other messaging functionality for receiving and routing messages between social-networking system 760 and one or more client systems 730. The API request server may allow third-party systems 770 to access information from social-networking system 760 by calling one or more APIs. The action recorder may be used to receive communications from the web server regarding a user's actions on or off of social-networking system 760. In conjunction with the action log, a third-party content object log of user exposure to third-party content objects may be maintained. The notification controller may provide information about content objects to client system 730. The information may be pushed to client system 730 as a notification, or the information may be pulled from client system 730 in response to a request received from client system 730. The authorization server may be used to enforce one or more privacy settings of the users of social-networking system 760. The privacy settings of a user determine how particular information associated with the user may be shared.
The authorization server may allow users to opt in to or opt out of having their actions logged by social-networking system 760 or shared with other systems (e.g., third-party system 770), for example, by setting appropriate privacy settings. The third-party content store may be used to store content objects received from third parties (e.g., third-party systems 770). The location store may be used to store location information associated with users that is received from client systems 730. The advertisement-pricing module may combine social information, the current time, location information, or other suitable information to provide relevant advertisements, in the form of notifications, to the user.
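For illustration only, the authorization-server behavior described above can be summarized in a minimal sketch. The `PrivacySettings` structure, the `can_share` helper, and the audience labels below are hypothetical names and are not part of the disclosed system; they merely show how a per-user privacy setting might gate whether an action is logged or shared with another system such as third-party system 770.

```python
# Minimal illustrative sketch (hypothetical names, not the disclosed implementation):
# a check that gates logging/sharing of a user's action based on that user's
# privacy settings, as the authorization server described above might enforce.

from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Whether the user has opted in to having actions logged at all.
    allow_action_logging: bool = False
    # Audiences the user allows logged actions to be shared with, e.g. {"friends"}.
    allowed_audiences: set = field(default_factory=set)

def can_share(settings: PrivacySettings, audience: str) -> bool:
    """Share only if logging is opted in and the requested audience is permitted."""
    return settings.allow_action_logging and audience in settings.allowed_audiences

# Example: a user who allows logging but shares only with friends.
settings = PrivacySettings(allow_action_logging=True, allowed_audiences={"friends"})
print(can_share(settings, "friends"))      # True
print(can_share(settings, "third_party"))  # False: do not share with a third-party system
```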
Fig. 8 illustrates an example computer system 800. In particular embodiments, one or more computer systems 800 perform one or more steps of one or more methods described or illustrated herein. In certain embodiments, one or more computer systems 800 provide the functionality described or illustrated herein. In particular embodiments, software running on one or more computer systems 800 performs one or more steps of one or more methods described or illustrated herein or provides functions described or illustrated herein. Particular embodiments include one or more portions of one or more computer systems 800. Herein, reference to a computer system may include a computing device, and vice versa, where appropriate. Further, references to a computer system may include one or more computer systems, where appropriate.
This disclosure contemplates any suitable number of computer systems 800. This disclosure contemplates computer system 800 taking any suitable physical form. By way of example and not limitation, computer system 800 may be an embedded computer system, a system on a chip (SOC), a single-board computer system (SBC) (e.g., a computer-on-module (COM) or a system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile phone, a Personal Digital Assistant (PDA), a server, a tablet computer system, an augmented/virtual reality device, or a combination of two or more of these. Where appropriate, computer system 800 may include one or more computer systems 800; be monolithic or distributed; span multiple locations; span multiple machines; span multiple data centers; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 800 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. By way of example, and not by way of limitation, one or more computer systems 800 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 800 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
In a particular embodiment, the computer system 800 includes a processor 802, a memory 804, a storage device 806, an input/output (I/O) interface 808, a communication interface 810, and a bus 812. Although this disclosure describes and illustrates a particular computer system with a particular number of particular components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.
In a particular embodiment, the processor 802 includes hardware for executing instructions (e.g., those making up a computer program). By way of example, and not limitation, to execute instructions, processor 802 may retrieve (or fetch) instructions from an internal register, an internal cache, memory 804, or storage 806; decode the instructions and execute them; and then write one or more results to an internal register, internal cache, memory 804, or storage 806. In particular embodiments, processor 802 may include one or more internal caches for data, instructions, or addresses. The present disclosure contemplates processor 802 including any suitable number of any suitable internal caches, where appropriate. By way of example, and not limitation, processor 802 may include one or more instruction caches, one or more data caches, and one or more Translation Lookaside Buffers (TLBs). The instructions in the instruction cache may be copies of the instructions in the memory 804 or storage 806, and the instruction cache may accelerate retrieval of those instructions by the processor 802. The data in the data cache may be: a copy of the data in memory 804 or storage 806 for manipulation by instructions executed at processor 802; the results of previous instructions executed at processor 802 for access by subsequent instructions executed at processor 802 or for writing to memory 804 or storage 806; or other suitable data. The data cache may speed up read or write operations by the processor 802. The TLB may accelerate virtual address translations for the processor 802. In particular embodiments, processor 802 may include one or more internal registers for data, instructions, or addresses. The present disclosure contemplates processor 802 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, the processor 802 may include one or more Arithmetic Logic Units (ALUs); may be a multi-core processor; or may include one or more processors 802. Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.
In a particular embodiment, the memory 804 includes a main memory for storing instructions for execution by the processor 802 or data for the processor 802 to operate on. By way of example, and not limitation, computer system 800 may load instructions from storage 806 or another source (e.g., another computer system 800) to memory 804. The processor 802 may then load the instructions from the memory 804 into an internal register or internal cache. To execute instructions, processor 802 may retrieve instructions from an internal register or internal cache and decode them. During or after execution of the instructions, the processor 802 may write one or more results (which may be intermediate results or final results) to an internal register or internal cache. The processor 802 may then write one or more of these results to the memory 804. In a particular embodiment, the processor 802 executes only instructions in one or more internal registers or internal caches or in the memory 804 (as opposed to storage 806 or elsewhere) and operates only on data in one or more internal registers or internal caches or in the memory 804 (as opposed to storage 806 or elsewhere). One or more memory buses (which may each include an address bus and a data bus) may couple processor 802 to memory 804. The bus 812 may include one or more memory buses, as described below. In particular embodiments, one or more Memory Management Units (MMUs) reside between processor 802 and memory 804 and facilitate accesses to memory 804 requested by processor 802. In a particular embodiment, the memory 804 includes Random Access Memory (RAM). The RAM may be volatile memory, where appropriate. The RAM may be dynamic RAM (DRAM) or static RAM (SRAM), where appropriate. Further, the RAM may be single-port RAM or multi-port RAM, where appropriate. The present disclosure contemplates any suitable RAM. The memory 804 may include one or more memories 804, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
In a particular embodiment, the storage 806 includes mass storage for data or instructions. By way of example, and not limitation, storage 806 may include a Hard Disk Drive (HDD), a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, magnetic tape, or a Universal Serial Bus (USB) drive, or a combination of two or more of these. Storage 806 may include removable or non-removable (or fixed) media, where appropriate. Storage 806 may be internal or external to computer system 800, where appropriate. In a particular embodiment, the storage 806 is non-volatile solid-state memory. In certain embodiments, storage 806 comprises Read Only Memory (ROM). Where appropriate, the ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory, or a combination of two or more of these. The present disclosure contemplates mass storage 806 taking any suitable physical form. The storage 806 may include one or more storage control units that facilitate communication between the processor 802 and the storage 806, where appropriate. Storage 806 may include one or more storages 806, where appropriate. Although this disclosure describes and illustrates a particular storage device, this disclosure contemplates any suitable storage device.
In certain embodiments, the I/O interface 808 comprises hardware, software, or both that provide one or more interfaces for communication between the computer system 800 and one or more I/O devices. Computer system 800 may include one or more of these I/O devices, where appropriate. One or more of these I/O devices may enable communication between a person and computer system 800. By way of example, and not limitation, an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet computer, touch screen, trackball, video camera, another suitable I/O device, or a combination of two or more of these. The I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 808 for them. The I/O interface 808 may include one or more device or software drivers that enable the processor 802 to drive one or more of these I/O devices, where appropriate. The I/O interfaces 808 may include one or more I/O interfaces 808, where appropriate. Although this disclosure describes and illustrates particular I/O interfaces, this disclosure contemplates any suitable I/O interfaces.
In particular embodiments, communication interface 810 includes hardware, software, or both that provide one or more interfaces for communication (e.g., packet-based communication) between computer system 800 and one or more other computer systems 800 or one or more networks. By way of example, and not limitation, communication interface 810 may include a Network Interface Controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network, or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network (e.g., a WI-FI network). The present disclosure contemplates any suitable network and any suitable communication interface 810 for it. By way of example, and not by way of limitation, computer system 800 may communicate with an ad hoc network, a Personal Area Network (PAN), a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), or one or more portions of the Internet, or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. By way of example, computer system 800 may communicate with a Wireless PAN (WPAN) (e.g., a Bluetooth WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (e.g., a Global System for Mobile Communications (GSM) network), or other suitable wireless network, or a combination of two or more of these. Computer system 800 may include any suitable communication interface 810 for any of these networks, where appropriate. Communication interface 810 may include one or more communication interfaces 810, where appropriate. Although this disclosure describes and illustrates a particular communication interface, this disclosure contemplates any suitable communication interface.
In particular embodiments, bus 812 includes hardware, software, or both coupling components of computer system 800 to each other. By way of example, and not limitation, the bus 812 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Extended Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or any other suitable bus or combination of two or more of these. Bus 812 may include one or more buses 812, where appropriate. Although this disclosure describes and illustrates a particular bus, this disclosure contemplates any suitable bus or interconnect.
Herein, where appropriate, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other Integrated Circuits (ICs) (e.g., Field Programmable Gate Arrays (FPGAs) or application-specific ICs (ASICs)), Hard Disk Drives (HDDs), hybrid hard disk drives (HHDs), optical disks, Optical Disk Drives (ODDs), magneto-optical disks, magneto-optical disk drives, floppy disks, Floppy Disk Drives (FDDs), magnetic tape, Solid State Drives (SSDs), RAM drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these. Computer-readable non-transitory storage media may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate.
Herein, unless explicitly indicated otherwise or indicated otherwise by context, "or" is inclusive and not exclusive. Thus, herein, "A or B" means "A, B, or both" unless explicitly indicated otherwise or indicated otherwise by context. Further, "and" is both joint and several unless expressly indicated otherwise or indicated otherwise by context. Thus, herein, "A and B" means "A and B, jointly or individually," unless expressly indicated otherwise or indicated otherwise by context.
The scope of the present disclosure includes all changes, substitutions, variations, alterations, and modifications to the example embodiments described or illustrated herein that a person having ordinary skill in the art would comprehend. The scope of the present disclosure is not limited to the example embodiments described or illustrated herein. Moreover, although the present disclosure describes and illustrates respective embodiments herein as including particular components, elements, features, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, features, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would comprehend. Furthermore, reference in the appended claims to an apparatus or system, or a component of an apparatus or system, that is adapted to, arranged to, capable of, configured to, implemented, operable to, or operative to perform a particular function encompasses that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, provided that the apparatus, system, or component is so adapted, arranged, enabled, configured, implemented, operable, or operative. Moreover, although this disclosure describes or illustrates particular embodiments as providing particular advantages, particular embodiments may provide some, all, or none of these advantages.

Claims (20)

1. A method, comprising:
connecting, by a user computing device, to at least one other user computing device in a synchronized presentation session with the user computing device;
presenting, by the user computing device, digital content obtained by an instance of a third-party application on the user computing device;
determining, by the user computing device, to send a synchronization message;
generating, by the user computing device, the synchronization message based on a state of the digital content being presented by the user computing device; and
transmitting, by the user computing device, the synchronization message to the other user computing device or a server associated with the other user computing device, wherein the synchronization message is configured to synchronize presentation of the digital content by the other user computing device with presentation of the digital content by the user computing device.
2. The method of claim 1, wherein connecting to the at least one other user computing device in a synchronized presentation session with the user computing device comprises directly connecting with the at least one other user computing device.
3. The method of claim 1, wherein the digital content obtained by the instance of the third-party application is streamed from a third-party content source associated with the third-party application.
4. The method of claim 3, wherein the user computing device and the at least one other user computing device comprise virtual reality devices, and wherein presenting the digital content on the user computing device comprises re-projecting digital content streamed from the third-party content source onto a screen in a virtual reality environment.
5. The method of claim 4, wherein the user of the user computing device is represented by a digital avatar in the virtual reality environment.
6. The method of claim 1, wherein connecting to the at least one other user computing device in a synchronized presentation session with the user computing device comprises connecting with the at least one other user computing device through a server.
7. The method of claim 1, wherein determining to send the synchronization message is based on at least one of: a predetermined schedule, user input, or a comparison between the state of the digital content being presented by the user computing device and state information from the at least one other user computing device.
8. The method of claim 1, wherein the digital content presented by the at least one other user computing device is obtained through a second instance of the third-party application on the at least one other user computing device.
9. The method of claim 8, wherein the synchronization message is configured to control playback of the digital content presented by the at least one other user computing device by the second instance of the third-party application.
10. A system, comprising:
one or more processors; and
one or more computer-readable non-transitory storage media coupled to one or more of the processors and comprising instructions that when executed by one or more of the processors are operable to cause the system to:
connecting to at least one other user computing device in a synchronized presentation session with the user computing device;
presenting digital content obtained by an instance of a third-party application on the user computing device;
determining to send a synchronization message;
generating the synchronization message based on a state of the digital content being presented by the user computing device; and
sending the synchronization message to the other user computing device or a server associated with the other user computing device, wherein the synchronization message is configured to synchronize presentation of the digital content by the other user computing device with presentation of the digital content by the user computing device.
11. The system of claim 10, wherein connecting to the at least one other user computing device in a synchronized presentation session with the user computing device comprises directly connecting with the at least one other user computing device.
12. The system of claim 10, wherein the digital content obtained by the instance of the third-party application is streamed from a third-party content source associated with the third-party application.
13. The system of claim 12, wherein the user computing device and the at least one other user computing device comprise virtual reality devices, and wherein presenting the digital content on the user computing device comprises re-projecting digital content streamed from the third-party content source onto a screen in a virtual reality environment.
14. The system of claim 13, wherein the user of the user computing device is represented by a digital avatar in the virtual reality environment.
15. The system of claim 10, wherein connecting to the at least one other user computing device in a synchronized presentation session with the user computing device comprises connecting with the at least one other user computing device through a server.
16. The system of claim 10, wherein determining to send the synchronization message is based on at least one of: a predetermined schedule, user input, or a comparison between the state of the digital content being presented by the user computing device and state information from the at least one other user computing device.
17. The system of claim 10, wherein the processors, when executing the instructions, are further operable to cause the system to: present, by the at least one other user computing device, the digital content through a second instance of the third-party application on the at least one other user computing device.
18. The system of claim 17, wherein the synchronization message is configured to control playback of the digital content presented by the at least one other user computing device by the second instance of the third-party application.
19. One or more computer-readable non-transitory storage media embodying software that is operable when executed by a computer system to:
connecting to at least one other user computing device in a synchronized presentation session with the user computing device;
presenting digital content obtained by an instance of a third-party application on the user computing device;
determining to send a synchronization message;
generating the synchronization message based on a state of the digital content being presented by the user computing device; and
sending the synchronization message to the other user computing device or a server associated with the other user computing device, wherein the synchronization message is configured to synchronize presentation of the digital content by the other user computing device with presentation of the digital content by the user computing device.
20. The media of claim 19, wherein connecting to the at least one other user computing device in a synchronized presentation session with the user computing device comprises directly connecting with the at least one other user computing device.
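For readers who find code easier to follow than claim language, the following sketch restates the client-side flow recited in claims 1 and 7: present digital content obtained by a third-party application, decide whether a synchronization message should be sent (on a predetermined schedule, on user input, or when the local state diverges from state information reported by the other device), generate the message from the local playback state, and send it toward the other user computing device or an associated server. All names (`PlaybackState`, `should_send`, `make_sync_message`, `send_sync_message`), the message fields, and the drift threshold are hypothetical illustrations under assumed conventions, not the actual implementation of the claimed method.

```python
# Hypothetical sketch of the synchronization flow described in claims 1 and 7.
# Names, message fields, and thresholds are illustrative assumptions only.

import json
import time
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class PlaybackState:
    content_id: str    # identifies the digital content from the third-party source
    position_s: float  # current playback position, in seconds
    playing: bool      # whether playback is currently running
    timestamp: float   # local wall-clock time when the state was sampled

def should_send(local: PlaybackState, remote: Optional[PlaybackState],
                user_requested: bool, last_sent: float,
                interval_s: float = 5.0, drift_s: float = 0.5) -> bool:
    """Decide to send a synchronization message based on a schedule, user input,
    or a comparison of local and reported remote playback states (claim 7)."""
    if user_requested:
        return True
    if time.time() - last_sent >= interval_s:
        return True
    if remote is not None and abs(local.position_s - remote.position_s) > drift_s:
        return True
    return False

def make_sync_message(local: PlaybackState) -> str:
    """Generate the synchronization message from the local content state (claim 1)."""
    return json.dumps({"type": "sync", "state": asdict(local)})

def send_sync_message(connection, message: str) -> None:
    """Send the message to the other device directly (claim 2) or via a server
    (claim 6). `connection` is any object exposing a send(str) method; how the
    synchronized presentation session is established is outside this sketch."""
    connection.send(message)
```

On the receiving side, a second instance of the third-party application (claims 8 and 9) would parse such a message and adjust its own playback position or play/pause state so that its presentation tracks that of the sending device.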
CN201980054904.3A 2018-08-21 2019-08-14 Synchronization of digital content consumption Active CN112585986B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US16/107,817 2018-08-21
US16/107,817 US10628115B2 (en) 2018-08-21 2018-08-21 Synchronization of digital content consumption
PCT/US2019/046548 WO2020041071A1 (en) 2018-08-21 2019-08-14 Synchronization of digital content consumption

Publications (2)

Publication Number Publication Date
CN112585986A true CN112585986A (en) 2021-03-30
CN112585986B CN112585986B (en) 2023-11-03

Family

ID=67811021

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980054904.3A Active CN112585986B (en) 2018-08-21 2019-08-14 Synchronization of digital content consumption

Country Status (5)

Country Link
US (3) US10628115B2 (en)
EP (1) EP3841757A1 (en)
JP (1) JP2021534606A (en)
CN (1) CN112585986B (en)
WO (1) WO2020041071A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113032491A (en) * 2021-04-07 2021-06-25 工银科技有限公司 Method, device, electronic equipment and medium for realizing static data synchronization

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11508128B2 (en) * 2018-09-21 2022-11-22 New York University Shared room scale virtual and mixed reality storytelling for a multi-person audience that may be physically co-located
EP3934264A1 (en) 2020-06-30 2022-01-05 Spotify AB Systems and methods for creating a shared playback session

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070271338A1 (en) * 2006-05-18 2007-11-22 Thomas Anschutz Methods, systems, and products for synchronizing media experiences
CN102450031A (en) * 2009-05-29 2012-05-09 微软公司 Avatar integrated shared media experience
CN103091844A (en) * 2011-12-12 2013-05-08 微软公司 Connecting head mounted displays to external displays and other communication networks
US20150120817A1 (en) * 2013-10-30 2015-04-30 Samsung Electronics Co., Ltd. Electronic device for sharing application and control method thereof
US20160184708A1 (en) * 2014-12-29 2016-06-30 Ebay Inc. Audience adjusted gaming

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8190680B2 (en) * 2004-07-01 2012-05-29 Netgear, Inc. Method and system for synchronization of digital media playback
US7627890B2 (en) * 2006-02-21 2009-12-01 At&T Intellectual Property, I,L.P. Methods, systems, and computer program products for providing content synchronization or control among one or more devices
US20090262084A1 (en) * 2008-04-18 2009-10-22 Shuttle Inc. Display control system providing synchronous video information
US20100083324A1 (en) * 2008-09-30 2010-04-01 Microsoft Corporation Synchronized Video Playback Among Multiple Users Across A Network
US8836771B2 (en) * 2011-04-26 2014-09-16 Echostar Technologies L.L.C. Apparatus, systems and methods for shared viewing experience using head mounted displays
WO2013103580A1 (en) * 2012-01-06 2013-07-11 Thomson Licensing Method and system for providing a display of social messages on a second screen which is synched to content on a first screen
KR20140138763A (en) * 2012-03-23 2014-12-04 톰슨 라이센싱 Method of buffer management for synchronization of correlated media presentations
US20150189012A1 (en) * 2014-01-02 2015-07-02 Nvidia Corporation Wireless display synchronization for mobile devices using buffer locking
US20170180769A1 (en) * 2015-12-18 2017-06-22 Cevat Yerli Simultaneous experience of media
US20170295229A1 (en) * 2016-04-08 2017-10-12 Osterhout Group, Inc. Synchronizing head-worn computers

Also Published As

Publication number Publication date
WO2020041071A1 (en) 2020-02-27
US20200065051A1 (en) 2020-02-27
US10990345B2 (en) 2021-04-27
US20200218495A1 (en) 2020-07-09
US11334310B2 (en) 2022-05-17
US10628115B2 (en) 2020-04-21
CN112585986B (en) 2023-11-03
US20210208836A1 (en) 2021-07-08
EP3841757A1 (en) 2021-06-30
JP2021534606A (en) 2021-12-09

Similar Documents

Publication Publication Date Title
US10499010B2 (en) Group video session
US11223868B2 (en) Promotion content push method and apparatus, and storage medium
US10558675B2 (en) Systems and methods for capturing images with augmented-reality effects
JP6730335B2 (en) Streaming media presentation system
US10498781B2 (en) Interactive spectating interface for live videos
US10897637B1 (en) Synchronize and present multiple live content streams
US11334310B2 (en) Synchronization of digital content consumption
US20230134355A1 (en) Efficient Processing For Artificial Reality
US20200169586A1 (en) Perspective Shuffling in Virtual Co-Experiencing Systems
US11406896B1 (en) Augmented reality storytelling: audience-side
US10425378B2 (en) Comment synchronization in a video stream
WO2020060856A1 (en) Shared live audio
US11950165B2 (en) Provisioning content across multiple devices
US20180191999A1 (en) Group Video Session
EP3316204A1 (en) Targeted content during media downtimes
US20230221797A1 (en) Ephemeral Artificial Reality Experiences
EP3389049B1 (en) Enabling third parties to add effects to an application
US20240005608A1 (en) Travel in Artificial Reality
EP3509024A1 (en) Multivision display

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: California, USA

Applicant after: Meta Platforms Technologies, LLC

Address before: California, USA

Applicant before: Facebook Technologies, LLC

GR01 Patent grant