WO2023059586A1 - Methods, architectures, apparatuses and systems directed to dynamically enhance interaction of multiple users consuming content


Info

Publication number
WO2023059586A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
content
content consumption
interaction
data
Prior art date
Application number
PCT/US2022/045610
Other languages
French (fr)
Inventor
Lu Liu
Dale Seed
Sanghoon Kim
Paul Dougherty
Original Assignee
Interdigital Patent Holdings, Inc.
Priority date
Filing date
Publication date
Application filed by Interdigital Patent Holdings, Inc. filed Critical Interdigital Patent Holdings, Inc.
Publication of WO2023059586A1 publication Critical patent/WO2023059586A1/en

Classifications

    • H04N 21/4788 Supplemental services communicating with other users, e.g. chatting
    • G06F 16/337 Profile generation, learning or modification (information retrieval; filtering based on additional data, e.g. user or group profiles)
    • H04N 21/252 Processing of multiple end-users' preferences to derive collaborative data
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; content or additional data rendering
    • H04N 21/4508 Management of client data or end-user data
    • H04N 21/4661 Deriving a combined profile for a plurality of end-users of the same client, e.g. for family members within a home
    • H04N 21/4667 Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections
    • H04N 21/4668 Learning process for intelligent management for recommending content, e.g. movies
    • H04N 21/4786 Supplemental services, e.g. e-mailing
    • H04N 21/488 Data services, e.g. news ticker
    • H04N 21/8133 Additional data specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program

Definitions

  • The present disclosure generally relates to networking websites and, more particularly, to techniques for enhancing interaction between users of a network or website who are simultaneously consuming content.
  • Video conferencing applications may be one such technology that may allow people to communicate with each other in remote locations through use of audio and video media in real time or otherwise.
  • Similar other technologies may allow the broadcast and dissemination of such viewings and interactions.
  • Single content can be consumed in such manners simultaneously or otherwise and be enjoyed during get togethers, meetings and conferences, and other popular events. During many of these events or conferences, participant responses may be captured or disseminated live.
  • Technology may provide secondary users and consumers of content with ways to connect with one another socially, e.g., when located remotely.
  • A popular method to enable user interaction may be a video commenting feature, which may be provided by some streaming platforms and video sharing services.
  • the commenting feature may accompany the media content (e.g., a movie, a video clip, a live event streaming) and users may post their comments regarding the same media content.
  • the comments may be posted by the users after watching the content or while watching the content.
  • the management of comments may be performed by the service/content provider or the platform manager/moderator. This may not allow user interactions to be further customized according to user preferences and/or based on a particular user interaction. Embodiments described herein have been designed with the foregoing in mind.
  • a method for collecting user data may be implemented in a content consumption device.
  • (e.g., first social interaction) data may be collected based on a consumed content or in anticipation of consuming an upcoming content.
  • the (e.g., first social interaction) data may be associated with a consumption of a media content on the content consumption device, wherein at least a part of the first social interaction data may be for transmission to other content consumption devices consuming the media content.
  • the (e.g., first social interaction) data may include any of (i) one or more first comments related to the media content and (ii) one or more first chat messages occurring during the consumption of the media content.
  • a user profile may be established (e.g., generated) based on the collected (e.g., first social interaction) data and e.g., analyzed for preferences.
  • second (e.g., social interaction) data may be received including any of (i) one or more second comments related to the media content and (ii) one or more second chat messages occurring during the consumption of the media content.
  • the second (e.g., social interaction) data may be filtered based on the user profile; and the filtered second (e.g., social interaction) data may be displayed on the content consumption device or transmitted to at least one other content consumption device consuming the media content.
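  • As an illustration only, the following non-normative Python sketch outlines how the above flow might run on a content consumption device: first social interaction data is collected, a user profile is derived from it, and second social interaction data received from other devices is filtered against that profile before display. The function names (e.g., build_profile, filter_by_profile) and the simple keyword-based preference model are assumptions of this sketch, not part of the disclosure.
      # Hypothetical sketch: collect first social interaction data, build a user
      # profile from it, then filter second social interaction data received from
      # other content consumption devices before displaying it.

      def build_profile(first_interaction_data):
          """Derive simple topic preferences from the user's own comments/chats."""
          counts = {}
          for item in first_interaction_data:
              for word in item["text"].lower().split():
                  counts[word] = counts.get(word, 0) + 1
          return {"topics_of_interest": {w for w, n in counts.items() if n >= 2}}

      def filter_by_profile(second_interaction_data, profile):
          """Keep only comments/chat messages matching the profile's topics."""
          kept = []
          for item in second_interaction_data:
              if set(item["text"].lower().split()) & profile["topics_of_interest"]:
                  kept.append(item)
          return kept

      # First data: generated on this device while consuming the media content.
      first_data = [{"text": "great pass by Bob"}, {"text": "Bob is on fire"}]
      # Second data: received from other devices consuming the same media content.
      second_data = [{"text": "Bob scores again"}, {"text": "boring game"}]

      profile = build_profile(first_data)
      for comment in filter_by_profile(second_data, profile):
          print("display:", comment["text"])  # or transmit to other devices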
  • FIG. 1 is a block diagram illustrating an example of a communication network configured for facilitating one or more interactive real-time communications between users and/or (e.g., content consumption) devices, according to one environment;
  • FIG. 2 is a diagram illustrating an example of a dynamic adaption process
  • FIG. 3 is a diagram illustrating an example of a flowchart for establishing an interaction session
  • FIG. 4 is a diagram illustrating an example of a dynamic adaptation process
  • FIG. 5 is a system diagram illustrating an example of deployment of an interaction enhancement service
  • FIG. 6 schematically illustrates a general overview of an encoding and decoding system according to one or more embodiments
  • FIG. 7 is a diagram illustrating an example of a method for enhancing an interaction service
  • FIG. 8 is a diagram illustrating another example of a method for enhancing an interaction service
  • FIG. 9 is a diagram illustrating another example of a method for enhancing an interaction service
  • FIG. 10 is a diagram illustrating another example of a method for enhancing an interaction service.
  • FIG. 11 is a diagram illustrating another example of a method for enhancing an interaction service.
  • identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
  • FIG. 1 is a block diagram illustrating an example of a communication network 110 that may be coupled to a variety of (e.g., content consumption) devices 100 via different alternative means such as busses and cables (shown generally as 150).
  • The terms “content consumption devices”, “client computers”, and “devices” (collectively, devices 100) may be used interchangeably to refer to devices capable of consuming (e.g., any of rendering, displaying) any kind of content.
  • the devices 100 are illustratively shown in FIG. 1 as desktop computers, laptops or mobile or smart devices like mobile phones. These are just a few examples of the devices that can be used, and it is understood that in alternate embodiments other devices can be added or substituted for these examples.
  • The network may be capable of facilitating one or more interactive real-time communications, such as a teleconference, between streaming devices and mini networks that may provide audio and/or video components.
  • the network may contain at least one processor 120 that may be, for example, in one or more servers (shown as 120 as well).
  • the server/processor may be remote or local and can be in processing communication with other processor/servers, via e.g., network interfaces (e.g., any of transmitter, receiver, transceiver).
  • one or more of these processor/servers can be part of a coupled or group of wireless or portable devices via a base station.
  • server 120 may represent an instance among a large set of instances of application servers in a data center, cloud computing environment, or any other mass computing environment.
  • the network 110/130 can also provide internet connectivity and include other components such as a digital television (DTV) as illustratively shown at 140 that may allow viewers to stream any of on-demand videos, online interactive media, over-the-top content, music, browse the internet, view photos, and the like.
  • Some examples of these technologies include Google Chromecast™ and Apple TV™.
  • This technology may enable a user to share content from their personal devices onto a larger screen such that it can be more easily viewed by others. For example, displaying pictures or movies stored on a personal device onto the larger screen of a DTV can be used for others to view the content more easily.
  • At least one processor is provided in one of the devices, for example in the server computer 120, which may be configured to host a conferencing meeting and may transmit and receive any of video, image, and audio data to and from (e.g., each of) the client computers (through the cloud or the network when that is the case).
  • Each of the client computers may also include their own computing device with at least a processor, processing units, graphics processing units (GPU), one or more buses, memory organized as volatile and/or nonvolatile storage, one or more data input devices, I/O interfaces (such as e.g., any of transmitter, receiver, transceiver) and output devices such as loudspeakers or a LINE-OUT jack and associated drivers.
  • Each of the client computers may also include an integrated or separate display unit such as a computer screen, TV screen or other display and have any of mobile or stationary computers including desktop computers, laptops, netbooks, ultrabooks, tablet computers, smartphones, et cetera.
  • the network 130 may comprise the Internet.
  • each (e.g., content consumption) device 100 may have a particular profile and one or more users can have a profile that may reside in the device or client computer or can be located on the cloud under a user account.
  • the network through one or more servers may also maintain a list of accounts or profiles that may (e.g., each) be associated with one of the client computers and/or one or more users of the client computers.
  • the client computers can be used by an attendee of a conference session that may enable any of video (any of live, streaming and recorded images) and audio operations.
  • Devices/client computers 100 can be (e.g., simultaneously) a presenting device or a recipient device (attendee) of a video conference session.
  • a socially suboptimal setting may be one that may lack social cues of someone who would otherwise enhance the experience if present.
  • Such systems may inject or overlay social effects with the primary media content during synchronous or asynchronous viewing.
  • valuable social effects may be lacking from current systems for several reasons.
  • users may be reluctant to share information due to privacy concerns (e.g., for effects that may utilize user-provided audio, a user may be concerned that a microphone may pick up unrelated conversation in a different room).
  • users may be reluctant to share information that may interrupt the viewing experience (e.g., writing a chat message or selecting an emoticon may involve attention from the user that may distract from the primary content).
  • users may be unavailable for viewing of a live event referenced as a media item (e.g., a live broadcast of a sporting event or news show).
  • Media items may be part of a larger media content being consumed or be the entirety of the media content.
  • users may not have viewed content that may be being viewed by other members of their social groups (e.g., asynchronous viewing).
  • Graphical emoticons may be another form of text-based cues that may be selected and inserted during playback of the media item.
  • An example of a system that supports insertion of emoticons is Facebook Live which allows users to select “reaction emojis” which appear layered with the display of a live-streamed media item. These reaction emojis may be preserved with the media item and made available when the media item may be replayed by other users of the system but still do not address the prior art shortcomings.
  • User’s content consumption experience may be accompanied by interactions between the (e.g., content consumption device of the) user and (e.g., other content consumption devices of) other users that may be consuming or may have consumed the same content. For example, a user may post a comment to a video and read comments posted by others, or multiple users may use text chat to share their thoughts on the content while watching at the same time. The interactions among users regarding the content consumption may be viewed as part of the viewing experience.
  • Embodiments described herein provide (e.g., management) mechanisms enabling the user to better control the interaction with others, or tailor the interactions to better fit the user’s interest or preference.
  • Streaming platforms and video sharing services may enable user interaction through the commenting feature that may accompany the media content (e.g., a movie, a video clip, a live event streaming).
  • users may post their comments (e.g., which may be transmitted by content consumption devices) regarding the same media content.
  • the comments may be posted by the users (e.g., transmitted by the content consumption devices) after watching (e.g., displaying) the content or while watching (e.g., displaying) the content.
  • the comments could be organized in a separate section that may be displayed independently of the media content. For comments that may be input during the content consumption process, the display of such comments may be synchronized with the content.
  • a user may react to the other users’ comments through replying or rating (e.g., upvote or downvote).
  • For the commenting features, the management of comments may be performed by the service/content provider or the platform manager/moderator.
  • the manager may block or mute a certain user that violates the terms of service.
  • Some services may allow the user (e.g., content consumption device) to specify (e.g., indicate) keywords to be filtered so that comments containing the keywords will not be displayed to the user (e.g., by the content consumption device).
  • the commenting feature may be applied to group-based content consumption services (e.g., watch party, teleparty).
  • a user may join a group of family members or friends to form a group prior to the content consumption process, and group-wide interaction could be enabled (e.g., group chat window).
  • User 1 may be watching a content.
  • the content may be a live sports game between Team A and Team B.
  • User 1 may be watching by himself, and he would like to interact with the other sports fans that may also be watching the live game to share his excitement and thoughts. He may start to read the real-time comment section and may post his comments.
  • the live game may have attracted a huge number of viewers.
  • The comment section may be filled with an overwhelming number of comments such that User 1 may find it difficult to keep up with what the others may be saying.
  • As a fan of Team A, User 1 may find that he does not want to see the comments that are cheering for Team B.
  • User 1’s favorite player in Team A may be Bob, so he may prefer to have more discussions with the other fans of Bob.
  • User 1 may find from the comment section that several other viewers may also be talking about Bob, so he would like to have more discussions with them.
  • Voice chatting with those viewers may be preferred as typing comments may be time consuming.
  • those viewers may be strangers to User 1 and he may not be able to have voice chats with them.
  • User X may be watching a suspense movie which may be based on the parallel universe theory. She may like this theory and would like to see what the other viewers of this movie may think about this topic. She may take out her smart phone to check on social media for more discussions on this topic (e.g., searching with “#ParallelUniverseMovie”). While she may be looking at her smart phone, she may miss an important scene that may explain the plot. Having difficulty understanding the plot, she may decide to turn on the synchronized commenting window to see if there may be any other viewer talking about this. But she may see a spoiler comment, which may make her unhappy.
  • Embodiments described herein provide features and services that can enable management and control over these types of interactions by enabling customizing the user interaction based on user context or preference or by allowing the user to manage or control the interaction.
  • FIG. 2 is an illustration of an example providing customized user interaction techniques.
  • the input may come from a variety of sources.
  • Sources may include a user interface of a content consumption device associated with the user, shown in FIG. 2 by reference numeral 200, or a user device associated with the content consumption device (automatically generated input or otherwise), as shown by reference numeral 202.
  • Input may also come from other users (e.g., other content consumption devices).
  • a user interaction enhancement service 204 can also be provided to improve user-awareness and customizability of the users’ interactions that may accompany the media consumption process.
  • This service may be capable of collecting and analyzing users’ content consumption and interaction activity to detect user preferences, based on which the user interaction may be customized and processed to enhance the media consumption experience.
  • the interaction enhancement service 204 can also be implemented at the user/client side to provide customized enhancement for the client user.
  • the interaction enhancement service 204 at one user (e.g., content consumption device) may interact with another user (e.g., content consumption device) directly or through the counterpart service enabled at another user (e.g., content consumption device).
  • the described service may also be implemented in a centralized manner, providing services to multiple users (e.g., content consumption devices).
  • the service may support the customization of individual users (e.g., content consumption devices) and the coordination among multiple users (e.g., content consumption devices).
  • the described service may be implemented as a combination of the above, with, for example, the client-side service supporting user (e.g., content consumption device) customization and the central service supporting multi-user (e.g., content consumption devices) coordination.
  • Interaction data in this disclosure may refer to any data associated with media consumption, such as user inputs responsive to or related to the content or to other interaction data.
  • the interaction data may be made visible (e.g., public, for transmission) to other users (e.g., content consumption devices) consuming the same content. Examples may include comments to a video, messages in real-time chat window accompanying a live event, etc.
  • User interaction may refer to the interaction among users associated with the same media content.
  • the interaction may take place among the users (e.g., content consumption devices) that may be consuming the content at the same time, or among users (e.g., content consumption devices) that may consume the same content at different times.
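  • For illustration, a single piece of interaction data might be represented by a record such as the hypothetical Python data class below; the field names (user_id, content_id, media_timestamp, etc.) are assumptions made for this sketch and are not defined by the disclosure.
      # Hypothetical record for one piece of interaction data tied to a media
      # content item; the media timestamp allows synchronized display.
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class InteractionData:
          user_id: str                      # originating user / device
          content_id: str                   # media content the data relates to
          media_timestamp: float            # position in the content (seconds)
          kind: str                         # e.g., "comment", "chat", "upvote"
          text: Optional[str] = None        # payload for comments/chat messages
          target_id: Optional[str] = None   # e.g., the comment being upvoted

      # Example: a comment and an upvote reacting to it.
      c1 = InteractionData("userA", "movie42", 95.0, "comment", text="great twist")
      c2 = InteractionData("userB", "movie42", 97.5, "upvote", target_id="c1")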
  • How the interaction can be further customized can be reviewed by looking at the steps of this embodiment.
  • Step 1 enumerated as S210, provides for an interaction-related user data collection step.
  • the data collection process may include collecting data that could be used for computing user preference and enhancing user interaction.
  • the data to be collected may include the interaction data from the user (e.g., content consumption device) and other users (e.g., content consumption devices), user’s feedback or reaction to the content or the other users’ interaction data, user context information, etc.
  • Step 2 or S220 provides for the user (e.g., preference analysis and) profile generation step.
  • the system may perform analysis to gain insights on how to improve user interaction, such as the user preference of an interaction topic and an interaction mode, user expectations on interaction frequency and data volume, etc.
  • the user preference and/or expectations could be maintained, for example, in a user profile.
  • Step 3 or S230 provides for an interaction enablement and enhancement step.
  • the interaction process may be customized, and the interaction data may be processed according to (e.g., the user preference captured in) the user profile.
  • Step 4 or S240 provides for a dynamic adaptation.
  • the user preference and/or context may change during the content consumption process.
  • the user profile may be dynamically updated, and interaction enhancement mechanisms may be adapted according to the (e.g., latest) updated user profile.
  • Step 5 or S250 provides for a session-based interaction enablement step.
  • the user interaction process may involve multiple users (e.g., content consumption devices), and the interaction enhancement service may be applied to more than one user (e.g., content consumption device) jointly.
  • an interaction session may be established between the involved (e.g., selected) users (e.g., content consumption devices) to perform session-based enhancement services.
  • Step S210 (1) - User Data Collection
  • a variety of user data may be collected as the basis of e.g., user awareness and analyzing user preference, including the interaction data, user responses and other context information.
  • the interaction data to be collected may include both the interaction data generated by the user (e.g., content consumption device) for transmission to other users (e.g., content consumption devices) and the interaction data received from the other users (e.g., content consumption devices).
  • the information may indicate the user’s interests, which may be used to filter future social interaction data to better suit the user’s interest.
  • the user’s reaction or feedback to the content or to the interaction data may provide supplemental information in identifying user preference or interest.
  • the user may show interest in a particular component/object in the content which may be indicated by the user gazing at that component/object for a while.
  • the user following or subscribing to another user on social media may indicate the user’s interest in other users.
  • For example, after seeing (e.g., receiving) a comment from another user (e.g., content consumption device), the user (e.g., of the content consumption device) may agree or disagree with the comment by putting (transmitting) an upvote or downvote on that comment.
  • The action of showing agreement or disagreement with the social interaction data may indicate that the user has similar or opposite opinions to another user and may thus help identify like-minded users or the interaction data from such users.
  • social interaction data will be used to refer to any kind of data representative of a social interaction between users of content consumption devices related to a consumption of a same media content, such as any of interaction data and responsive data as previously described.
  • context information of the user may be used to enhance the interaction and media consumption experience.
  • any of the user location information, language preference, and information of the device used for content consumption process may be collected and used for any of customizing and filtering the (e.g., social) interaction data.
  • The user contact list (based on e.g., user IDs, emails, social networking contacts) and social map may be used to detect a potential interaction target that the user may be interested in.
  • the context information may include whether the user is consuming the content alone or with a pre-made group, or whether the user is consuming the content with a child.
  • the information of whether the user is watching a movie for the first time may help to determine if the user may desire a spoiler-free interaction with the other users.
  • the information of whether the user is watching a show with their child may determine if additional filtering on language usage should be applied to the (e.g., social) interaction data.
  • context information may indicate the device(s) the user may be using for any of media consumption and user interaction. For example, certain users may only want to interact via the secondary device (e.g., phone) rather than primary device (e.g., DTV).
  • users may only want to interact if they have their phone present with them and it is operating in a specified mode (e.g., interaction app is open, and interaction is enabled) while watching content on the primary device.
  • Any of the presence and the status of a user (e.g., secondary) device may be monitored as (e.g., user) context information.
  • the user may manually configure the interaction enhancement service by setting preferences and expectations e.g., via a user interface.
  • the service may be requested (e.g., configured) to filter out negative or irrelevant interaction data from the other users (e.g., content consumption devices), or to keep both positive and negative comments but throttle the comments to maintain an acceptable amount.
  • the user may also specify (e.g., indicate, configure) what sources may be allowed for the data collection.
  • the social interaction data and other relevant information may be collected from a variety of sources (or specified by the user, if applicable).
  • Many content providing or sharing platforms may provide a commenting feature where a user may post their comments.
  • the comments as well as the user response to the comments may be collected as social interaction data.
  • Social media and communication tools may be the source where the users may share and exchange their thoughts regarding the media content, which could be collected as social interaction data.
  • User context information may be collected from the user content consumption history. For example, from the user’s movie watching history, it may be determined if the user has watched a certain movie and that information may be used to determine if a spoiler-free interaction may be preferred in a case where the user watches the movie in the future.
  • the context information may be collected from the physical environment of content consumption and the devices within the environment. For example, it may be detected from sensing devices if the user is watching the content alone or with family members and/or with a child, then the interaction preference could be adjusted accordingly.
  • Data collection may be (e.g., constantly) performed, including any of before, during, and after a content consumption process.
  • social interaction data such as any of the user’s posts and replies on their social media, may be collected to predict topics of interest for the user for enhancing the interaction of a future content consumption process.
  • the users may talk about how they are expecting to see the performance of a player in a live event which may be on air in two days.
  • The user’s feedback and responses to the content and/or to the interaction data may be collected for future reference.
  • the users may write a review of the movie or TV show that they just watched or post a comment to a video.
  • the social interaction data that may be generated by the user may be used to make dynamic adjustments to the interaction.
  • the social interaction data received from the other users may be processed and integrated into the content to be displayed by the content consumption device to the user.
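  • A minimal sketch of such a data collection step is given below, assuming hypothetical source names ("comments", "chat", "sensors") and a user-configured list of allowed sources; it simply aggregates records collected before, during, and after a content consumption process into one pool for later analysis.
      # Hypothetical collector aggregating social interaction data and context
      # information from several sources for later preference analysis.
      import time

      class UserDataCollector:
          def __init__(self, allowed_sources):
              self.allowed_sources = set(allowed_sources)  # user-configured sources
              self.records = []

          def collect(self, source, payload, phase):
              """phase is 'before', 'during' or 'after' content consumption."""
              if source not in self.allowed_sources:
                  return  # respect the user's data-collection configuration
              self.records.append({"source": source, "phase": phase,
                                   "time": time.time(), "payload": payload})

      collector = UserDataCollector(allowed_sources=["comments", "chat", "sensors"])
      collector.collect("comments", {"text": "can't wait for the game"}, phase="before")
      collector.collect("sensors", {"viewers": 2, "child_present": True}, phase="during")
      collector.collect("social_media", {"text": "ignored"}, phase="after")  # dropped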
  • Step S220 (2) - User Preference Analysis and Profile Generation
  • the system may perform analysis to establish user awareness and gain insights on how to improve user interaction, such as e.g., any of the user preference of an interaction topic, an interaction mode, user expectations on interaction volume, etc.
  • the user preference and expectations may include the following aspects:
  • the user may be interested in certain topics related to the content and would like to interact with others around those topics. For example, the user may prefer to see comments on a specific team, a player, an actor/actress, a theme. For example, the user may prefer to avoid social interaction data related to a certain topic.
  • The topics of interest may be identified by extracting frequently appearing keywords from the interaction data and/or leveraging natural language processing techniques on any of the user interaction data and responsive data.
  • the social interaction data may convey certain emotional stance or attitude of the user who generated the social interaction data.
  • the user may prefer receiving social interaction data in a certain attitude than the others. For example, the user may prefer to see positive, optimistic, or cheerful comments rather than negative or aggressive comments.
  • the emotion/attitude of social interaction may be identified with e.g., emotion detection algorithms. For example, based on user response to interaction data with certain emotion/attitude, the user preference may be determined.
  • the user may specify (e.g., configure via a user interface) a restriction on the volume of interaction data that could be received/displayed during the content consumption process.
  • the restriction may not be a fixed value and may be dynamically adjusted during the content consumption process. For example, the user may be able to follow more comments when the content is at a slower pace but fewer or no comments when the pace of the content is increased.
  • the allowed intensity of interaction may be estimated based on whether the user may be focusing on the content, which may be done by analyzing the desired level of concentration of the specific content (e.g., the climax of a movie, a scoring moment of a sports game) or tracking the user’s gaze.
  • the user may be willing to interact differently with different users.
  • Such specific interaction targets may be specified (e.g., configured) by the user (e.g., via a user interface) or identified based on the user interaction data.
  • the user may be more willing to interact with their acquaintances or close contacts during the content consumption process, who may be identified from the user context information.
  • Other users may be identified as potential targets that the user may like to engage with more often or, conversely, to avoid. For example, if multiple comments from the same person are manually blocked by the user, or automatically blocked based on any other criteria, then the person may be blacklisted so that future comments from this person may not be displayed by the content consumption device to the user.
  • Categorization of users could be viewed as a special case of specifying (e.g., configuring, identifying) an interaction target.
  • the interaction process may be configured so that only the social interaction data from the same or a particular user category may be visible to (e.g., displayed by) the user (e.g., content consumption device).
  • For example, users (e.g., content consumption devices) may be categorized by language, and a user (e.g., content consumption device) may only display the comments from other users (e.g., content consumption devices) in the same category, while the comments in other languages may be filtered out (e.g., not displayed).
  • only the comments from a first-time viewer may be visible to the other first-time viewers, which may help prevent spoilers in the social interaction data.
  • the user may indicate (e.g., configure via a user interface) the preferred interaction mode, such as any of text, voice, and video.
  • the preferred interaction mode may be specified (e.g., configured) by the user or detected by the system during the content consumption process.
  • the interaction mode may be initialized as text messages or comments (default mode) and adjusted as the interaction atmosphere may change (e.g., the users feel they are getting closer socially, the discussions heat up). For example, during the interaction process, a pair of users may share highly similar comments and frequently upvote the comments from the other. An emotional resonance could be detected between them, which may indicate the preference of additional modality of interaction, such as switching from text chatting to audio/video chatting.
  • the above-mentioned user preferences may apply to the social interaction data received from the other users (e.g., content consumption devices).
  • the user preference may also apply to the social interaction data generated by the user (e.g., content consumption device), such as keywords and/or emotion-based filtering.
  • the user may also specify (e.g., configure) other configurations such as where the interaction data could be displayed or presented (e.g., as chat window on the primary screen, on a companion device).
  • a user profile may be created to record the user preference and expectations that may be identified from the data analysis or configured by the user.
  • the service may also provide pre-configured template profiles which the user may directly apply for interaction enhancement or apply with minor modifications.
  • a “friendly mode” profile may be pre-configured featuring mild language usage, non-aggressive messages, positive and encouraging attitude, etc.
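  • As a non-normative illustration of this step, the sketch below derives a few of the preference aspects mentioned above (topics of interest, a volume restriction, a blacklist, a preferred interaction mode) from collected records and explicit user configuration; the thresholds, stop-word list, and field names are assumptions of this example.
      # Hypothetical profile generation: extract frequent keywords as topics of
      # interest and record explicit user configuration (volume limit, blacklist).
      from collections import Counter

      STOP_WORDS = {"the", "a", "is", "and", "to", "of", "by"}

      def generate_profile(collected_records, user_config):
          words = Counter()
          for rec in collected_records:
              text = rec.get("payload", {}).get("text", "")
              words.update(w for w in text.lower().split() if w not in STOP_WORDS)
          topics = {w for w, n in words.most_common(20) if n >= 2}
          return {
              "topics_of_interest": topics,
              "max_comments_per_minute": user_config.get("max_comments_per_minute", 30),
              "blacklist": set(user_config.get("blacklist", [])),
              "preferred_mode": user_config.get("preferred_mode", "text"),
          }

      records = [{"payload": {"text": "Bob is playing great"}},
                 {"payload": {"text": "another great save by Bob"}}]
      profile = generate_profile(records, {"blacklist": ["troll99"],
                                           "max_comments_per_minute": 10})
      print(profile["topics_of_interest"])  # e.g., {'bob', 'great'}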
  • Step S230 (3) - Interaction Enablement and Enhancement
  • The system may customize the user interaction during the content consumption process according to the user profile, which may include any of the processing and presenting (e.g., displaying) of received (e.g., social) interaction data, and managing and assisting the generation of (e.g., social) interaction data.
  • One procedure in interaction enhancement may comprise processing the received (e.g., social) interaction data.
  • the processing of (e.g., social) interaction data may be performed any of before the content consumption process, and during the content consumption process. Different types of operations may be applied to the received (e.g., social) interaction data.
  • a. Filtering: The received (e.g., social) interaction data may be filtered based on e.g., the user profile (such as e.g., the user preference and expectations). For example, (e.g., social) interaction data containing keywords or topics that the user may not be willing to see, or that may not belong to topics of interest, may be removed or made invisible (e.g., not displayed) to the user.
  • a comment from a person that may have been red flagged or blacklisted by the user may be blocked.
  • b. Compression: If the user requested (e.g., via user configuration) the amount of interaction data to be restricted, then the (e.g., social) interaction data volume may be compressed by methods such as any of throttling and combining comments with similar meaning.
  • The term “filtering” may be used to also cover the compression operation (e.g., reducing the volume of the interaction data).
  • c. Integration: There could be multiple sources of (e.g., social) interaction data.
  • (e.g., social) interaction data from multiple resources may be integrated in any of one location and one device such that the user may not need to check multiple locations.
  • filtering may be used to also cover the integration operation, e.g., in a case where interaction data are received from multiple sources, filtered and combined in a single piece of interaction data.
  • d. Transformation: Any of the format and the modality of the (e.g., social) interaction data may be transformed (e.g., adapted) according to the user preference. For example, the user may find it distracting to read the comments during the content consumption process, and the text comments may be transformed into audio comments (e.g., text to speech) such that they could be read to the user.
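  • The listed operations might be chained as in the hedged sketch below, which applies keyword and blacklist filtering followed by simple throttling (compression) to received interaction data; the helper names and profile fields are illustrative only.
      # Hypothetical processing pipeline for received social interaction data:
      # filtering against the user profile, then throttling to a volume limit.

      def filter_items(items, profile):
          kept = []
          for item in items:
              words = set(item["text"].lower().split())
              if item["user_id"] in profile["blacklist"]:
                  continue  # filtering: blocked/blacklisted user
              if profile["topics_of_interest"] and not words & profile["topics_of_interest"]:
                  continue  # filtering: not a topic of interest
              kept.append(item)
          return kept

      def throttle(items, max_items):
          return items[:max_items]  # compression: restrict the data volume

      profile = {"topics_of_interest": {"bob"}, "blacklist": {"troll99"}}
      incoming = [{"user_id": "u1", "text": "Bob with a hat trick"},
                  {"user_id": "troll99", "text": "Bob is terrible"},
                  {"user_id": "u2", "text": "what a boring match"}]
      for item in throttle(filter_items(incoming, profile), max_items=10):
          print("display:", item["text"])  # only the first comment survives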
  • the processed (e.g., social) interaction data may be displayed (e.g., presented) to the user during and/or after the content consumption process in different formats.
  • Differentiated display (e.g., presentation): The (e.g., social) interaction data may be categorized and presented (e.g., displayed) to the user in a differentiated manner. For example, if a comment is on (e.g., belongs to) a topic of interest, the font of the comment may be set larger than the others. If multiple comments are expressing similar meaning, they can be combined and displayed in a larger font indicating that this comment is echoed by many users. If a comment is from a close contact, the comment may be any of annotated and highlighted for easy recognition.
  • an alternative way of differentiating comments may be to transform a highlighted comment into audio comment and read to the user as compared to the rest of the comments that may be displayed to the user in text format.
  • some of the data may be filtered out and not displayed to the user during the content consumption process.
  • The filtered-out data may be any of permanently removed and stored such that it could be displayed to the user at a later time. For example, due to the volume restriction, it may not be possible to display all the (e.g., social) interaction data during the content consumption process. In this case, only a portion of the (e.g., social) interaction data may be displayed to the user in real time and the rest of the data may be stored such that it may be accessible to the user later.
  • The service may be configured to display the (e.g., social) interaction data on any of the common primary device and a (e.g., each) individual user companion device. This may be based in part on whether all of the users have the same or differing interaction preferences.
  • the (e.g., social) interaction data generated by the user may be processed before it is sent to or shared with the other users (e.g., content consumption devices) with any of filtering, compression, integration, and transformation.
  • the user may request to transform their voice commenting into text comments.
  • the user may apply self-censoring on the generated (e.g., social) interaction data, such as blocking or replacing expletive words, blocking entire phrases with expletive words.
  • a time delay may be applied before the generated (e.g., social) interaction data may be shared with (e.g., transmitted to) the others (e.g., content consumption devices) so that additional processing may be performed on the data.
  • special moments of user’s emotion such as any of wow, excitement, frustration, and sigh may be captured by any of eye gaze tracking, facial expression, and voice recognition.
  • This captured information can be converted to current communication format on the streaming session. For example, if the system detects that the user is laughing, a smile icon or emoji may be inserted into the interaction data on behalf of the user. If the user got angry at certain comments made by (e.g., received from) another user (e.g., content consumption device), the angry facial expression may be automatically converted to angry icon/emoji, and inserted in the chatting message.
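  • A minimal sketch of such a conversion is shown below, assuming an external emotion detector whose output is one of a few known labels; the label-to-emoji mapping is an assumption of this example.
      # Hypothetical conversion of a detected user emotion (e.g., from facial
      # expression or voice recognition) into an emoji inserted on the user's behalf.

      EMOTION_TO_EMOJI = {
          "laughing": "\U0001F604",   # smiling face
          "angry": "\U0001F620",      # angry face
          "surprised": "\U0001F62E",  # wow / surprise
          "sad": "\U0001F622",        # frustration / sigh
      }

      def emotion_to_interaction(detected_emotion, media_timestamp):
          emoji = EMOTION_TO_EMOJI.get(detected_emotion)
          if emoji is None:
              return None  # unknown emotion: do not post anything
          return {"kind": "emoji", "text": emoji, "media_timestamp": media_timestamp}

      # E.g., the system detects the user laughing 95 seconds into the content.
      message = emotion_to_interaction("laughing", media_timestamp=95.0)
      if message is not None:
          print("insert into chat:", message["text"])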
  • the user may be assisted in generating the interaction data.
  • The user may want to stay involved in the interaction but may find it distracting to constantly post comments while watching the content.
  • The service may assist the user in generating comments and replying to others’ comments (e.g., by using a message bot) so that the user could stay connected without spending an excessive amount of time in the conversation.
  • the bot can be used to auto reply on behalf of a user. This could be controlled by user configuration (e.g., the bot may be enabled in a case where the user is detected as busy, e.g., doesn’t want to be bothered, etc.).
  • the bot may be used as an assistant/helper to generate a draft reply for a user, which may be updated/tweaked before sending, e.g., by the user if he so chooses.
  • This bot can take into account the computed user preferences and/or expectations when formulating (e.g., generating) interaction message data on behalf of a user.
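  • One way such an assistant could take the computed preferences into account is sketched below; the template-based draft generation is purely illustrative and far simpler than an actual message bot.
      # Hypothetical draft-reply generation: only react to comments on topics of
      # interest, and let the user review/edit the draft before it is sent.

      def draft_reply(incoming_comment, profile):
          words = set(incoming_comment["text"].lower().split())
          matched = words & profile["topics_of_interest"]
          if not matched:
              return None  # stay silent on off-topic comments
          topic = sorted(matched)[0]
          return f"Agreed, {topic} has been amazing tonight!"

      profile = {"topics_of_interest": {"bob"}}
      draft = draft_reply({"text": "Bob just scored again"}, profile)
      if draft is not None:
          print("draft for the user to review:", draft)  # user may edit, then send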
  • the users in the group may specify if the (e.g., social) interaction data generated by them (e.g., the content consumption device) will be shown (e.g., transmitted) to the other users (e.g., content consumption devices) as data originated from a single user or from multiple users (e.g., of the content consumption device).
  • Step S240 (4) - Dynamic Adaptation
  • The user preference or context may change during the content consumption process.
  • the interaction enhancement mechanisms may adapt to the changes. For example, the user may have no preference on the attitude of comments at the beginning of the content consumption process. Later, the user’s mood may change (e.g., the team that the user is supporting is losing). If the user’s mood turns to bad, seeing negative comments may further upset the user, in which case the filter may be dynamically adjusted to remove negative comments.
  • the data collection and e.g., any of the user profile and user preference analysis process may be continuously performed during the content consumption process so that the (e.g., social) interaction data may be processed based on the real-time updated user preference (e.g., profile).
  • The change of the user’s mood or, more generally, the change of user context may be detected as the user responsive data (e.g., bio-signals, emotional state) is collected.
  • the user profile may be updated correspondingly, e.g., the preference of non-negative comments may be added.
  • the updated profile may then be reflected by the processing of received (e.g., social) interaction data where a new (e.g., updated) filter may be applied to the (e.g., social) interaction data.
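  • The sketch below illustrates one possible form of this dynamic adaptation: newly collected responsive data (here, a detected mood) updates the profile, which in turn changes how subsequent interaction data is filtered. The mood labels, the word list, and the helper names are assumptions of the example.
      # Hypothetical dynamic adaptation: the profile gains a "hide negative
      # comments" preference when a bad mood is detected, and the filter follows.

      NEGATIVE_WORDS = {"terrible", "awful", "losing", "worst"}

      def update_profile(profile, detected_mood):
          profile["hide_negative"] = (detected_mood == "bad")
          return profile

      def apply_filter(items, profile):
          if not profile.get("hide_negative"):
              return items
          return [i for i in items
                  if not set(i["text"].lower().split()) & NEGATIVE_WORDS]

      profile = {"hide_negative": False}
      comments = [{"text": "what a terrible defence"}, {"text": "great goal"}]
      print(len(apply_filter(comments, profile)))           # 2: nothing hidden yet
      profile = update_profile(profile, detected_mood="bad")
      print(len(apply_filter(comments, profile)))           # 1: negative comment hidden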
  • the user may be exploring some media content belonging to a genre which the user may not be familiar with. In that case, there may be little information in the user profile that may describe the user preference on interaction, such as whether the user may be interested in a certain topic mentioned in the content.
  • a default profile may be applied at the beginning of the content consumption process.
  • the user profile may be dynamically built up (e.g., updated), based on which the future (e.g., social) interaction data may be processed accordingly. For example, a minimum length of data collection period may allow to ensure enough data may be collected for analysis and creating (e.g., updating) the user profile.
  • a user preference or profile can be temporarily modified due to (e.g., certain) events during the content consumption process, which may apply temporary changes to the (e.g., social) interaction data. For example, the user may post a question during the content consumption process, asking for answers from the other viewers. Then in a following period of time, all the comments that may be answering the question may be highlighted, or the comments not related to the question may be temporarily blocked.
  • A timeline of (e.g., user) data collection is shown at 310.
  • the (e.g., user) data collection may be performed (e.g., constantly) any of during the content consumption process (timestamp A to B, timestamp C to D) and outside the process (before timestamp A, timestamp B to C, after timestamp D).
  • the service may generate (e.g., update) a user profile based on previously gathered user data as well as information of the content. Interaction enablement and enhancement may be performed, for example, based on the latest user profile.
  • The user profile may be (e.g., constantly) updated based on the newly collected information, such that the interaction process of the user can be dynamically adjusted accordingly. Note that, although FIG. 3 only shows the interaction enhancement 330 applied during the content consumption process, the enhancement service may also be applied to the interaction process outside the content consumption process.
  • Step S250 (5) - Session-Based Interaction Enablement
  • So far, the interaction enhancement service has been described from the perspective of individual users (e.g., content consumption devices). Since the interaction process may involve multiple users (e.g., content consumption devices), the interaction enhancement service may apply to more than one user (e.g., content consumption device) jointly in certain cases, such as to obtain consensus among multiple users (e.g., content consumption devices), or to apply additional functionality or limitation within a specific set of users (e.g., content consumption devices). An interaction session may be established between the involved users (e.g., content consumption devices) to perform session-based enhancement services.
  • In a case where enhancement is to be applied among a specific subset of users (e.g., content consumption devices), an interaction session may be established among these users (e.g., content consumption devices).
  • the interaction session may involve operations in addition to the above-mentioned steps.
  • a voice chatting session may be established among a subset of users (e.g., content consumption devices) in a case where voice chatting is the preferred interaction mode.
  • The users may be identified as potential targets for further interaction enhancement, where a more capable or additional modality of communication functionality may be opened between these users (e.g., content consumption devices), such as e.g., moving from a text-chatting-only mode to video/audio-capable communication.
  • An interaction session may be established to enable the additional communication functionality. For example, if the functionality is not supported by the current app, another capable app (Zoom, Skype, etc.) may be provided.
  • any of a competition and a cooperation mechanism may be applied to users (e.g., content consumption devices) in an interaction session.
  • The service may enable the users’ interaction activity to be evaluated by each other. Some user comment or interaction activity may get momentum or support from the other users; this user could then be considered “popular” in this session and may be recognized with a highlighted icon or identity (e.g., a visual crown, cheer audio).
  • Session-specific content modification or adjustment may be applied to users (e.g., content consumption devices) within a session. For example, the content consumption process may be paused in a case where the users (e.g., content consumption devices) in the session are discussing (e.g., transmitting and receiving social interaction data about) the content.
  • FIG. 4 shows an example of a procedure for establishing an interaction session with the interaction enhancement service.
  • In FIG. 4, user A (e.g., content consumption device) and user B (e.g., content consumption device) are shown by reference numerals 400 and 406, respectively, and their respective interaction enhancement services by 402 and 404, respectively.
  • a trigger may be detected to establish an interaction session.
  • For example, a subset of (e.g., two or more) users (e.g., content consumption devices) may be identified (e.g., by the interaction enhancement service hosted locally at user A (e.g., content consumption device) 400 or centrally at a cloud server) where additional features or configurations may be applied to the interactions among the content consumption devices, such as advanced interaction mode, more closely connected interaction, competition or cooperation mechanism, etc.
  • the trigger may be detected based on any of the user preferences and profile (e.g., updated in real-time during the content consumption process).
  • a Session Establishment Request message may be sent from the interaction enhancement service to user A (e.g., content consumption device) 400.
  • the Session Establishment Request message may include information indicating a description of the session to be established, such as the specific feature of the session as compared to the baseline interaction process.
  • the identities (e.g., identifiers) of the other users (e.g., content consumption devices) in the session may or may not be included in the Session Establishment Request message.
  • Step 3 - user A may respond by transmitting a Session Establishment Response message, indicating accepting (or declining) the request to establish the interaction session.
  • a Session Establishment Request message may further be sent to each user (e.g., content consumption device) such as e.g., user B that may be involved in (e.g., selected for) the session to get the consensus of users (e.g., content consumption devices) in the session.
  • the Session Establishment Request message may be processed by the corresponding interaction enhancement service hosted at the target user (e.g., content consumption device), in a case where such a service is hosted there. If not, the Session Establishment Request message may be directed to corresponding applications/services that may be enabling user interactions at the target user (e.g., content consumption device).
  • the user(s) (e.g., content consumption devices) may respond to the interaction enhancement service which initiated the session establishment by transmitting a Session Establishment Response message.
  • the interaction enhancement service may send a notification to each user (e.g., content consumption device). If not all users (e.g., content consumption devices) agree on the establishment, then the session may be established among (e.g., only) the users (e.g., content consumption devices) that may have agreed.
  • an interaction session may be established among the selected set of users (e.g., content consumption devices), and session-specific interaction enhancement features or configurations may be applied to the (e.g., social) interaction data of these users (e.g., content consumption devices).
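  • A minimal sketch of the Session Establishment Request/Response exchange of FIG. 4 is given below, assuming hypothetical message classes, field names (e.g., description, member_ids) and an in-process transport in place of a real network; it only illustrates that the session is established among the users that agreed.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SessionEstablishmentRequest:
    session_id: str
    description: str                                       # feature of the session vs. baseline
    member_ids: List[str] = field(default_factory=list)    # optional; may be omitted

@dataclass
class SessionEstablishmentResponse:
    session_id: str
    user_id: str
    accepted: bool

@dataclass
class ContentConsumptionDevice:
    user_id: str
    accepts_sessions: bool = True

    def handle_request(self, request):
        # A real device could prompt the user or consult the local profile here.
        return SessionEstablishmentResponse(request.session_id, self.user_id,
                                            self.accepts_sessions)

class InteractionEnhancementService:
    """Sends requests, collects responses, and keeps only the users that agreed."""
    def __init__(self):
        self.responses: Dict[str, Dict[str, bool]] = {}

    def send_request(self, device, request):
        # In a real deployment this would travel over HTTP/WebSocket/MQTT, etc.
        return device.handle_request(request)

    def establish(self, devices, request):
        answers = {d.user_id: self.send_request(d, request) for d in devices}
        self.responses[request.session_id] = {uid: r.accepted for uid, r in answers.items()}
        # Session is established among (only) the users that agreed.
        return [uid for uid, r in answers.items() if r.accepted]

service = InteractionEnhancementService()
req = SessionEstablishmentRequest("s1", "voice chat for fans of the same player")
members = [ContentConsumptionDevice("userA"), ContentConsumptionDevice("userB", False)]
print(service.establish(members, req))  # ['userA']
```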
  • a first embodiment can be titled as the System Level Embodiment.
  • the interaction enhancement service may be deployed, for example, in a centralized manner, such as in a cloud server.
  • the interaction enhancement service may be accessed by the client hosted on user (e.g., content consumption) devices.
  • the interaction enhancement service may be deployed in a distributed manner where the interaction enhancement service may be hosted (e.g., implemented) at the user (e.g., content consumption) device.
  • the interaction enhancement services hosted on different user (e.g., content consumption) devices may coordinate with each other to achieve certain functionalities, such as the session-based enhancement as described herein.
  • the interaction enhancement service may be deployed in a hybrid manner where a portion of the functionality may be provided on user (e.g., content consumption) device(s) and the rest provided by the centralized server.
  • the hybrid deployment may provide the flexibility of adjusting the footprint at the user side according to different implementation considerations, such as the capability of a device, latency requirements, privacy preservation, etc.
  • FIG. 5 illustrates an example of deployment of the interaction enhancement service.
  • in a case where the (e.g., content consumption) device has limited computation capability, the client may provide limited functionality and the majority of the interaction enhancement service (e.g., computationally intensive operations such as user data analysis) may be carried out at the cloud server to achieve a light-weight deployment from the device perspective.
  • a distributed deployment may be preferred for privacy preservation where the functionalities may be mainly provided by the (e.g., content consumption) device, such as user data collection and processing.
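  • The centralized/distributed/hybrid deployment trade-off could be expressed as a small placement policy; the capability flags and the particular split of functions in the sketch below are assumptions for illustration, not a prescribed configuration.

```python
def place_functions(device_caps):
    """device_caps: dict with assumed keys 'compute' and 'privacy_sensitive'.
    Returns a mapping of interaction enhancement function -> 'device' or 'cloud'."""
    placement = {}
    # Collection and rendering naturally sit on the content consumption device.
    placement["user_data_collection"] = "device"
    placement["interaction_rendering"] = "device"
    # Privacy-sensitive processing stays local; otherwise weak devices offload it.
    placement["user_data_processing"] = (
        "device" if device_caps.get("privacy_sensitive")
        or device_caps.get("compute") == "high" else "cloud")
    # Computationally intensive analysis goes to the cloud on weak devices.
    placement["user_data_analysis"] = (
        "device" if device_caps.get("compute") == "high" else "cloud")
    return placement

print(place_functions({"compute": "low", "privacy_sensitive": True}))
```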
  • a media (e.g., content) consumption device 510 at the user side may be hosting the interaction enhancement service client 517.
  • Various types of multimedia consumption devices may support (e.g., include, implement) the described interaction enhancement functionality.
  • Such content consumption devices may include but are not limited to a DTV, smart phone, tablet, laptop, HMD, etc.
  • the content consumption devices 510 may support (e.g., include, implement) capabilities such as collecting user data from applications, computing a user profile (e.g., preferences), and processing and presenting (e.g., displaying) interaction data.
  • the user may interact with the service client to provide any of (e.g., social) interaction data and instructions (e.g., configuration information) to build a user profile.
  • the interaction enhancement service 502 may also leverage assistance from supporting capabilities provided by other related applications or services 520, such as any of monitoring user context and gathering user’s interaction data from various sources (e.g., search engine, social media), providing additional modality of interaction (e.g., voice chatting, video calls), obtaining social mining results, etc.
  • applications or services 520 may be located on the same media consumption device 510 where the interaction enhancement service (client) 517 may be hosted, or on other devices 525 that may interact with the media consumption device 510 to support the interaction process.
  • the interaction enhancement service client may interact with the server hosted in the cloud 501.
  • the interaction between the client and server may differ depending on how the interaction enhancement service may be deployed.
  • user (e.g., social interaction) data may be collected and processed locally at the (e.g., content consumption) device, or user (e.g., social interaction) data collection and processing may be done at the server hosted in the cloud, depending on the deployment.
  • the user (e.g., social interaction) data from the client may be combined with the user (e.g., social interaction) data collected from the content/service provider (e.g., the interaction data from other users).
  • the combined user (e.g., social interaction) data may be analyzed and processed, and then transmitted to the client (and to other users’ (e.g., content consumption) device(s)).
  • the interaction enhancement service hosted at one user (e.g., content consumption) device 515 may interact with another user (e.g., content consumption) device 515 any of directly and through the centralized server.
  • the interaction among (e.g., content consumption) devices may include the request and response messages exchanged during the establishment of interaction session.
  • the interaction enhancement service may not be enabled at all the users (e.g., content consumption devices) in an interaction session. In this case, a certain functionality of the interaction enhancement service may still be achieved (e.g., performed).
  • the service hosted on one (e.g., content consumption) device may directly interact with the applications/services hosted on a remote (e.g., content consumption) device to apply (e.g., perform the processing associated with) the enhancement.
  • a second embodiment can be labelled as a Protocol Embodiment.
  • the interaction enhancement service may be carried out via the use of an interaction enhancement protocol.
  • This protocol may be supported by applications and services hosted on user (e.g., content consumption) devices (e.g., DTVs) or the centralized server, as well as other entities in the system that may interact with the service such as those shown in FIG. 5.
  • the applications and/or services hosted by the content consumption devices and related entities can support the exchange of interaction enhancement protocol messages as described herein.
  • an interaction enhancement message protocol can be realized as a client/server messaging protocol where users and/or their media consumption devices can function in the role of a client and/or a server to exchange interaction enhancement request and response messages with other entities in the system (e.g., other supporting applications/services/devices).
  • the information elements of any of the interaction enhancement request and response protocol messages can be encapsulated and carried within the payloads of existing client/server protocols such as HTTP or WebSockets.
  • these information elements can be encapsulated and carried within lower-level protocols such as any of TCP and UDP e.g., without the use of higher layer protocols.
  • the interaction enhancement service messages can be encapsulated and carried within publish/subscribe messaging protocols.
  • entities in the system can support message broker functionality. This broker functionality can be used by the devices to exchange the interaction enhancement service message with other entities in the system. This exchange can be facilitated by each entity subscribing to the message broker to receive messages from other entities.
  • each entity can publish a message to the message broker that may target other entities.
  • the information elements of the interaction enhancement protocol messages can be encapsulated and carried within the payloads of existing publish/subscribe protocols such as message queuing telemetry transport (MQTT) or advanced message queuing protocol (AMQP).
  • the interaction enhancement service protocol may employ a combination of the aforementioned protocol types.
  • request and response protocol messages can be supported by applications and/or services with interaction enhancement service capability.
  • any of request and response protocol messages of the interaction enhancement protocol may comprise information indicating a type of message, which may include but may not be limited to the types of messages described in Table 1.
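  • As one possible (non-normative) realization, an interaction enhancement protocol message could be a small JSON envelope carrying a message-type field (cf. Table 1) and its information elements, which could then be placed in the payload of HTTP, WebSockets, TCP/UDP, or MQTT/AMQP transports; the envelope fields and message-type names below are hypothetical.

```python
import json
import uuid

# Hypothetical message types; the actual set would come from Table 1.
MESSAGE_TYPES = {"SESSION_ESTABLISHMENT_REQUEST", "SESSION_ESTABLISHMENT_RESPONSE",
                 "INTERACTION_DATA", "PROFILE_UPDATE"}

def build_message(msg_type, sender_id, elements):
    """Serialize an interaction enhancement protocol message to JSON so it can be
    carried as the payload of an HTTP body, a WebSocket frame, or an MQTT/AMQP publish."""
    if msg_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown message type: {msg_type}")
    return json.dumps({
        "message_id": str(uuid.uuid4()),
        "type": msg_type,
        "sender": sender_id,
        "elements": elements,          # information elements of the message
    })

def parse_message(raw):
    msg = json.loads(raw)
    if msg.get("type") not in MESSAGE_TYPES:
        raise ValueError("unsupported message type")
    return msg

raw = build_message("INTERACTION_DATA", "deviceA",
                    {"comment": "Great play by Bob!", "content_id": "game-123"})
print(parse_message(raw)["elements"]["comment"])
```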
  • an interaction enhancement service may be capable of any of: a. Collecting and analyzing user data related to interaction (such as e.g., social interaction data) to any of identify a user preference and generate a user profile; b. Enabling and dynamically enhancing the interaction process of users based on any of their preference and profile to improve the user content consumption experience; c. Identifying (e.g., selecting) a specific subset of users (e.g., content consumption devices) to establish an interaction session, and applying session-specific enhancement to users (e.g., content consumption devices) within the session.
  • the user interaction enhancement service can be designed to be capable of collecting (e.g., user) data related to any of the user interaction and context, wherein:
  • the (e.g., user) data may include (e.g., social) interaction data, which may include the user inputs responsive to the media content (e.g., consumption) and may be made public by the user (e.g., content consumption device) to be visible (e.g., transmitted) to the other users (e.g., content consumption devices) that may or will be consuming the same content (e.g., any of comments to a video, chat messages during a live event, posts on social media that may be linked to (e.g., associated with) a certain content).
  • the sources of (e.g., user) data may include, but are not limited to, the platform that may be streaming the content on the content consuming device (e.g., DTV), and related applications (e.g., social media, communication) on the content consuming device or other devices of the user.
  • the (e.g., user) data regarding a particular content may be generated while the user may be consuming the content, or before the content may be consumed (e.g., the user talked about a live event that he/she was expecting to be on air in two days), or after the content may be consumed (e.g., user writing a review of a movie or posting comments to a video).
  • the user context may include information of the user or the user viewing environment (e.g., any of: the user may be watching the content for the first time, the user may be watching with a child, status of devices used by the user during content consumption).
  • the (e.g., user) data may include user responsive data such as feedback to the content or response to the interaction data generated by other users (e.g., content consumption devices).
  • Examples may include any of user showing interest to a particular component of the content, user showing interest to another user (e.g., following/subscribing to another user’s social media), user reaction to the interaction data input (e.g., transmitted) by other users (e.g., content consumption devices) such as e.g., any of like, dislike, upvote, downvote to a comment made (e.g., transmitted) by another user (e.g., content consumption device).
  • the (e.g., user) data may include user explicit request or instruction on how to process the (e.g., social) interaction data (e.g., user-specified keywords to be blocked, user-specified interaction data sources).
  • user preference and/or expectation on interaction may be computed, and, for example, a user profile may be generated to store and maintain information of the user preference and/or expectation, wherein:
  • the user preference may include the topics/keywords that the user may be willing to see in the interaction (e.g., topics/keywords of interest), and/or the topics/keywords that the user may not be willing to see in the interaction (e.g., topics/keywords to be blacklisted).
  • the user preference may include the preferred targets (other users) that the user may be willing to interact with, or targets that the user may not be willing to interact with.
  • the user preference may include the preferred interaction mode (e.g., text, voice, video, emote).
  • depending on the user context (e.g., the user may be watching the content for the first time, the user may be watching with a child), the expectation on interaction may include specific restrictions on the interaction (e.g., any of spoiler-free, mild language usage, intensity and volume of interaction data, user will participate in interaction only when a secondary device or a related application is present).
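  • The preference and expectation information in the items above could, for example, be kept in a structure along the following lines; all field names are illustrative assumptions rather than defined elements of the user profile.

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class InteractionPreferences:
    topics_of_interest: Set[str] = field(default_factory=set)   # topics/keywords the user wants to see
    blocked_keywords: Set[str] = field(default_factory=set)     # topics/keywords to blacklist
    preferred_targets: Set[str] = field(default_factory=set)    # other users to interact with
    blocked_targets: Set[str] = field(default_factory=set)      # other users to avoid
    interaction_modes: List[str] = field(default_factory=lambda: ["text"])  # e.g. text, voice, video, emote
    restrictions: Set[str] = field(default_factory=set)         # e.g. "spoiler-free", "mild-language"

@dataclass
class UserProfile:
    user_id: str
    preferences: InteractionPreferences = field(default_factory=InteractionPreferences)
    context: dict = field(default_factory=dict)  # e.g. {"first_time_viewer": True, "with_child": False}

profile = UserProfile("userA")
profile.preferences.topics_of_interest.add("Bob")
profile.preferences.blocked_keywords.add("Team B")
profile.preferences.restrictions.add("spoiler-free")
```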
  • enabling and enhancing user interaction based on the computed user preferences and/or expectations on interaction may include any of the following: a. processing received (e.g., social) interaction data such as any of (i) filtering the (e.g., social) interaction data based on any of user preference and expectations (e.g., filtering spoiler interaction data if the user is watching the content for the first time), (ii) adjusting the interaction mode, and (iii) adjusting the intensity and volume of interaction. b. presenting (e.g., displaying) the received and processed interaction data to the user, where the (e.g., social) interaction data may include the data integrated from other platforms/applications other than the platform that may be streaming the content. c. performing session-based enhancement to a subset of users (e.g., content consumption devices), as described in the following items.
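  • Item (a)(iii) above, adjusting the intensity and volume of interaction, could be approximated by sampling the incoming comment stream down to a display budget; the budget and the scoring in the sketch below are assumptions.

```python
import heapq

MAX_VISIBLE_PER_MINUTE = 30   # assumed per-user display budget

def throttle_comments(comments, profile_keywords, limit=MAX_VISIBLE_PER_MINUTE):
    """comments: list of dicts with 'text' and 'likes'. Keep at most `limit`
    comments, preferring those that match the user's topics of interest and
    then the most supported ones."""
    def score(c):
        keyword_hit = any(k.lower() in c["text"].lower() for k in profile_keywords)
        return (1 if keyword_hit else 0, c.get("likes", 0))
    return heapq.nlargest(limit, comments, key=score)

stream = [{"text": "Bob with a great assist!", "likes": 12},
          {"text": "Halftime snacks?", "likes": 1}]
print(throttle_comments(stream, {"Bob"}, limit=1))
```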
  • session-based enhancement may be performed for a subset of users (e.g., content consumption devices) that may be identified (e.g., selected) based on any of (e.g., user) data and social interaction.
  • the subset of users may be determined based on the users sharing similar or the same preferences or expectations on the interaction (e.g., any of same topic of interest, same preferred interaction mode).
  • Requests may be sent to the subset of users (e.g., content consumption devices), proposing session-based enhancement to the interaction and/or viewing experience among the subset of users (e.g., content consumption devices).
  • the enhancement may include changing the interaction mode specific to this subset of users (e.g., content consumption devices) or include content modification/adjustment specific to this subset of users (e.g., content consumption devices), such as e.g., pausing the content in a case where the subset of users is discussing the content.
  • a response may be received, based on which, adapting the interaction and/or viewing experience of the subset of users (e.g., content consumption devices) may be performed.
  • FIG. 6 schematically illustrates a general overview of an encoding and decoding system according to one or more embodiments.
  • the system of FIG. 6 is configured to perform one or more functions of embodiments described herein and can have a pre-processing module 630 to prepare a received content (including one or more images or videos) for encoding by an encoding device 640.
  • Encoding device 640 packages the content in a form suitable for transmission and/or storage for recovery by a compatible decoding device 670.
  • the encoding device 640 provides a degree of compression, allowing the common space to be represented more efficiently (i.e., using less memory for storage and/or less bandwidth required for transmission).
  • the data is sent to a network interface 650, which may be typically implemented in any network interface, for instance present in a gateway.
  • the data can be then transmitted through a communication network, such as the internet.
  • Various other network types and components (e.g., wired networks, wireless networks, mobile cellular networks, broadband networks, local area networks, wide area networks, Wi-Fi networks, and/or the like) may also be used.
  • the data may be received via network interface 660, which may be implemented in a gateway, in an access point, in the receiver of an end user device, or in any device comprising communication receiving capabilities.
  • the data are sent to a decoding device 670.
  • Decoded data are then processed by device 680, which can also be in communication with sensors or user input data.
  • the decoder 670 and the device 680 may be integrated in a single device (e.g., a smartphone, a game console, a STB, a tablet, a computer, etc.).
  • a rendering device 690 may also be incorporated.
  • FIG. 7 is a diagram illustrating an example of a method 700 for enhancing an interaction service.
  • Method 700 may comprise, in step 710, establishing an interaction enhancement service for collecting a plurality of user data related to at least a user interaction while consuming content.
  • Method 700 may further comprise, in step 720, analyzing the user interaction and identifying user preferences based on the user interaction.
  • Method 700 may further comprise, in step 730, generating a user profile based on the user preferences.
  • a user interaction session may be established between at least two users by identifying common interests amongst the at least two users by analyzing the user preferences.
  • functions may be generated when the user consumes another content based on the user preferences established in the user profile.
  • a subset of users with common interests may be identified based on their user profiles and interaction session(s) may be established amongst the subset of users.
  • session-specific enhancements may be generated in a case where the subset of users are consuming another content and the session specific enhancements may be generated based on the user preferences in the user profiles of the subset of users.
  • the user data may include (e.g., social) interaction data.
  • (e.g., social) interaction data may comprise data inputted by at least one user responsive to the consumed content and the input may be made visible to the other plurality of users that may be consuming the same content.
  • the input may include at least one of a user comment related to a video, a chat message during a live event, and/or a post on a social media that may be linked to the content.
  • the user data may include streaming content on a content consuming device.
  • the content consuming device may be a DTV.
  • the user data may be collected from a social media user profile or other related social media applications.
  • the user data may be collected from a user device or a content consuming device in communication with the user device.
  • the user data may be collected based on comments generated by the users prior to consuming a content but relating to the content.
  • the user data may include data generated based on comments collected prior to consuming the content and the user data may include the viewing environment(s) of the user(s) and/or the user schedule(s) for viewing the content.
  • the user data may include whether each user is consuming the content alone or with other users.
  • FIG. 8 is a diagram illustrating another example of a method 800 for enhancing an interaction service.
  • Method 800 may comprise, in step 810, capturing a plurality of user responses generated in relation to consuming a media content.
  • Method 800 may comprise, in step 820, analyzing the responses to identify a common response pattern amongst at least two users.
  • Method 800 may further comprise, in step 830, establishing an interaction session amongst the at least two users.
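  • Step 820 of method 800, identifying a common response pattern among users, could be sketched as comparing per-user reaction histograms; the similarity measure and threshold below are illustrative assumptions, not the claimed analysis.

```python
from collections import Counter

SIMILARITY_THRESHOLD = 0.5   # assumed minimum overlap to pair users

def response_profile(responses):
    """responses: list of (scene_or_topic, reaction) tuples for one user."""
    return Counter(responses)

def similarity(p1, p2):
    """Jaccard-style overlap of the reaction patterns of two users."""
    shared = sum((p1 & p2).values())
    total = sum((p1 | p2).values())
    return shared / total if total else 0.0

def find_pairs(all_responses):
    users = list(all_responses)
    profiles = {u: response_profile(r) for u, r in all_responses.items()}
    return [(a, b) for i, a in enumerate(users) for b in users[i + 1:]
            if similarity(profiles[a], profiles[b]) >= SIMILARITY_THRESHOLD]

data = {"userA": [("goal", "cheer"), ("Bob", "like")],
        "userB": [("goal", "cheer"), ("Bob", "like"), ("ad", "skip")],
        "userC": [("ad", "skip")]}
print(find_pairs(data))   # [('userA', 'userB')]
```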
  • a user profile may be established for each user.
  • the user profile may include user preferences established at least based on the responses generated in relation to consuming the media content.
  • the user profile may include additional data relating to user preferences added by at least one user.
  • the user may be the owner of the user profile.
  • method 800 may further comprise generating session-specific enhancements when the users may be consuming new content based on the user preferences.
  • a subset of users may be identified to have common preferences and a session may be established for the subset of users.
  • method 800 may further comprise applying session-specific enhancement to users within the session.
  • data may be generated relating to at least one user to identify user preferences.
  • a subset of users may be identified to have common interests based on their user profile and an interaction session may be established for the subset.
  • FIG. 9 is a diagram illustrating an example of a method 900 for enhancing an interaction service.
  • Method 900 may comprise, in step 910, collecting user data for a plurality of users consuming at least one content.
  • Method 900 may further comprise, in step 920, analyzing the user data and identifying user preferences based on the user data.
  • Method 900 may further comprise, in step 930, establishing a user profile including the user preferences.
  • Method 900 may further comprise, in step 940, identifying common user preferences amongst at least two users.
  • Method 900 may further comprise, in step 950, notifying the at least two users of their common interests and establishing a session between the at least two users if a response is received from at least one of the two users that a session should be established.
  • the user data may include (e.g., social) interaction data.
  • the (e.g., social) interaction data may comprise data input by at least one user responsive to the consumed content and the input may be made visible to the other plurality of users that may be consuming the same content.
  • the input may include at least one of a user comment made to a video, a chat message during a live event, and/or posts on a social media that may be linked to the content.
  • the user data may include streaming content on a content consuming device.
  • the content consuming device may be a DTV.
  • the user data may be collected from a social media user profile or other related social media applications.
  • the user data may be collected from a user device or a content consuming device in communication with the user device.
  • the user data may be collected based on comments generated by the users prior to consuming a content but relating to the content.
  • the user data may include data generated based on comments collected prior to consuming the content and the user data may include the viewing environment(s) of the user(s) and/or the user schedule(s) for viewing the content.
  • the user data may include whether each user is consuming the content alone or with other users.
  • the user data may include the age of the user.
  • the user data may include the age of the user and of the other users consuming the content with the user.
  • the user data may include the number of times the content may have been consumed.
  • the user data may include user responsive data.
  • the user responsive data may include user feedback to the content and/or user response data generated by other users.
  • the user data may include user request(s) on how to process user interaction data.
  • the user request(s) may include blocking at least one of a word, a topic and/or a user.
  • the user profile may include user collected data and user added data.
  • the data in the user profile may include any of user interaction data, generated context relating to a content and/or user feedback and preferences and expectations.
  • the user preferences may include any of topics, keywords, and another user.
  • the user preferences may include topics, keywords, and another user that at least each user may be willing to interact with or topics, keywords and/or other users that each user may be unwilling to see or interact with.
  • the user preferences may include an interaction mode.
  • the interaction mode may include any of text, voice, video, emote and email.
  • the user preferences may include a restriction.
  • the restriction may only be observed if another condition is available.
  • the restriction may include any of language, availability of a secondary device, volume intensity, and/or availability of an application.
  • user viewing of future content may be enhanced by incorporating features based on the user profile.
  • future content to be consumed may be filtered based on the user profile or the user preferences.
  • the filtering may include spoiler information, users, interaction mode, and/or intensity of volume.
  • method 900 may further comprise rendering processed interaction data to each user, wherein the interaction data may include data integrated from other platforms and/or applications.
  • method 900 may further comprise storing the data that each user may desire to access at a future time, based on the user preferences in the user profile.
  • method 900 may further comprise generating for each user interaction data based on the user preferences to communicate with other users and/or user devices.
  • method 900 may further comprise dynamically updating each user profile based on new data and/or changed user preferences.
  • FIG. 10 is a diagram illustrating another example of a method 1000 for enhancing an interaction service.
  • Method 1000 may be implemented in a content consumption device and may comprise, in step 1010, collecting first social interaction data associated with a consumption of a media content on the content consumption device, wherein at least a part of the first social interaction data may be for transmission to other content consumption devices consuming the media content, wherein the first social interaction data may comprise any of (i) one or more first comments related to the media content and (ii) one or more first chat messages occurring during the consumption of the media content.
  • Method 1000 may further comprise, in step 1020, generating a user profile based on the collected first social interaction data.
  • Method 1000 may further comprise, in step 1030, receiving second social interaction data comprising any of (i) one or more second comments related to the media content and (ii) one or more second chat messages occurring during the consumption of the media content.
  • Method 1000 may further comprise, in step 1040, filtering the second social interaction data based on the user profile.
  • Method 1000 may further comprise, in step 1050, displaying the filtered second social interaction data on the content consumption device or transmitting the filtered second social interaction data to at least one other content consumption device consuming the media content.
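  • Method 1000 may be read as a client-side pipeline on the content consumption device: collect first social interaction data, derive a profile, then filter and display (or forward) second social interaction data. The keyword extraction and spoiler markers in the sketch below are illustrative assumptions, not the claimed method.

```python
import re
from collections import Counter

SPOILER_MARKERS = {"spoiler", "ending", "final score"}   # assumed marker terms

def derive_profile(first_data, top_k=5):
    """Step 1020 (sketch): build a profile of frequent keywords from the user's
    own comments and chat messages (the first social interaction data)."""
    words = re.findall(r"[A-Za-z']+", " ".join(first_data).lower())
    interests = [w for w, _ in Counter(w for w in words if len(w) > 3).most_common(top_k)]
    return {"interests": set(interests), "blocked": set(), "spoiler_free": True}

def filter_second_data(second_data, profile):
    """Step 1040 (sketch): drop blocked or spoiler items from the second social
    interaction data; keep the rest for display or forwarding (step 1050)."""
    kept = []
    for item in second_data:
        text = item.lower()
        if profile["spoiler_free"] and any(m in text for m in SPOILER_MARKERS):
            continue
        if any(b in text for b in profile["blocked"]):
            continue
        kept.append(item)
    return kept

first = ["Bob is playing great tonight", "love watching Bob's defense"]
second = ["Spoiler: final score leaked!", "Bob just scored again"]
profile = derive_profile(first)                 # step 1020
print(filter_second_data(second, profile))      # step 1040 -> ['Bob just scored again']
```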
  • embodiments are described herein by referring to any of a user profile and user preferences that may be obtained by processing the first social interaction data, based on which the second social interaction data may be filtered. Any technique for processing the first social interaction data, based on which second social interaction data may be filtered, may be applicable to embodiments described herein.
  • the second social interaction data may be received from any of a local user interface of the content consumption device and a user device associated with the content consumption device.
  • the second social interaction data may be received from any of the other content consumption devices consuming the media content.
  • the first comments may be collected prior to the consumption of the media content on the content consumption device.
  • the first comments may be collected during the consumption of the media content on the content consumption device.
  • the first social interaction data may further comprise one or more responses on one or more previously received comments from any of the other content consumption devices consuming the media content, and wherein the user profile may be further based on the one or more responses.
  • method 1000 may further comprise collecting one or more posts on a social media that may be linked to the media content, wherein the user profile may be further based on the collected one or more posts.
  • method 1000 may further comprise collecting user data from any of a social media user profile and a social media application, wherein the user profile may be further based on the collected user data.
  • method 1000 may further comprise collecting user schedule information associated with the content consumption device, wherein the user profile may be further based on the collected user schedule information.
  • method 1000 may further comprise collecting context information, wherein the user profile may be further based on the collected context information.
  • generating the user profile may comprise identifying user preferences based on the collected first social interaction data.
  • the user preferences may include any of topics of interest, keywords of interest, blocked topics, blocked keywords, an interaction mode, an interaction intensity, an interaction device, and an interaction application.
  • method 1000 may further comprise any of adjusting an interaction mode and adjusting an interaction intensity based on the user profile.
  • method 1000 may further comprise dynamically updating the user profile based on changing user preferences and adapting the filtering of the second social interaction data according to the updated user profile.
  • the at least one other content consumption device may be selected from the other content consumption devices consuming the media content, based on common interests between the content consumption device and the at least one other content consumption device.
  • common interests may be determined based on the first social interaction data and the second social interaction data.
  • method 1000 may further comprise establishing an interaction session between the content consumption device and the at least one other content consumption device.
  • method 1000 may further comprise generating session specific enhancements when another media content may be consumed by the content consumption device and the at least one other content consumption device.
  • the content consumption device may be a digital TV.
  • FIG. 11 is a diagram illustrating another example of a method 1100 for enhancing an interaction service.
  • Method 1100 may be implemented in a processing device such as e.g., a server.
  • Method 1100 may comprise, in step 1110, collecting social interaction data from a plurality of content consumption devices consuming a same media content, wherein the social interaction data may comprise any of (i) one or more comments related to the same media content, and (ii) one or more chat messages occurring during a consumption of the same media content.
  • Method 1100 may further comprise, in step 1120, selecting at least two content consumption devices from the plurality of content consumption devices consuming the same media content, wherein the at least two content consumption devices may be selected based on the collected social interaction data by identifying common interests between the at least two content consumption devices.
  • Method 1100 may further comprise, in step 1130, transmitting information to the at least two content consumption devices, indicating the identified common interests.
  • Method 1100 may further comprise, in step 1140, establishing a session between the at least two content consumption devices if a response is received from at least one of the at least two content consumption devices that the session should be established.
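  • Steps 1110-1140 of method 1100 could be sketched on the server side as grouping devices by overlapping interest keywords and establishing a session once at least one positive response is received; the data shapes and the notify callback below are assumptions.

```python
def identify_common_interests(collected):
    """collected: dict of device_id -> set of interest keywords derived from the
    collected social interaction data (steps 1110/1120). Returns candidate
    device pairs together with the interests they share."""
    pairs = []
    ids = list(collected)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            shared = collected[a] & collected[b]
            if shared:
                pairs.append((a, b, shared))
    return pairs

def establish_sessions(pairs, notify):
    """Steps 1130/1140 (sketch): notify both devices of the shared interests and
    establish the session if at least one of them responds positively."""
    sessions = []
    for a, b, shared in pairs:
        responses = [notify(a, shared), notify(b, shared)]
        if any(responses):
            sessions.append({"members": (a, b), "common_interests": shared})
    return sessions

collected = {"dtvA": {"bob", "team a"}, "dtvB": {"bob"}, "dtvC": {"team b"}}
accepting = {"dtvA"}                       # assumed responses for the demo
sessions = establish_sessions(identify_common_interests(collected),
                              notify=lambda dev, shared: dev in accepting)
print(sessions)   # one session between dtvA and dtvB around 'bob'
```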
  • the social interaction data may further comprise one or more responses on one or more previously received comments from any of the plurality of content consumption devices consuming the same media content, and the at least two content consumption devices may be further selected according to the one or more responses.
  • method 1100 may further comprise collecting one or more posts on a social media that may be linked to the same media content, and the at least two content consumption devices may be further selected according to the collected one or more posts.
  • method 1100 may further comprise collecting user schedule information associated with at least one content consumption device of the plurality of content consumption devices, and the at least two content consumption devices may be further selected according to the collected user schedule information.
  • method 1100 may further comprise collecting context information, wherein the at least two content consumption devices may be further selected according to the collected context information.
  • method 1100 may further comprise generating a plurality of user profiles respectively associated with the plurality of content consumption devices based on the collected social interaction data, wherein the common interests may be identified based on the plurality of user profiles.
  • method 1100 may further comprise identifying a plurality of user preferences respectively associated with the plurality of content consumption devices based on the collected social interaction data, wherein the common interests may be identified based on the plurality of user preferences.
  • the user preferences may include any of topics of interest, keywords of interest, blocked topics, blocked keywords, an interaction mode, an interaction intensity, an interaction device, and an interaction application.
  • method 1100 may further comprise generating session specific enhancements when another media content may be consumed by the at least two content consumption devices.
  • Any characteristic, variant or embodiment described for a method is compatible with an apparatus device comprising means for processing the disclosed method, with a device comprising a processor configured to process the disclosed method, with a computer program product comprising program code instructions and with a non-transitory computer-readable storage medium storing program instructions.
  • Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
  • processing platforms, computing systems, controllers, and other devices containing processors are noted. These devices may contain at least one Central Processing Unit (“CPU”) and memory.
  • an electrical system represents data bits that can cause a resulting transformation or reduction of the electrical signals and the maintenance of data bits at memory locations in a memory system to thereby reconfigure or otherwise alter the CPU's operation, as well as other processing of signals.
  • the memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, optical, or organic properties corresponding to or representative of the data bits. It should be understood that the representative embodiments are not limited to the above-mentioned platforms or CPUs and that other platforms and CPUs may support the provided methods.
  • the data bits may also be maintained on a computer readable medium including magnetic disks, optical disks, and any other volatile (e.g., Random Access Memory (“RAM”)) or non-volatile (e.g., Read-Only Memory (“ROM”)) mass storage system readable by the CPU.
  • the computer readable medium may include cooperating or interconnected computer readable medium, which exist exclusively on the processing system or are distributed among multiple interconnected processing systems that may be local or remote to the processing system. It is understood that the representative embodiments are not limited to the above-mentioned memories and that other platforms and memories may support the described methods.
  • any of the operations, processes, etc. described herein may be implemented as computer-readable instructions stored on a computer-readable medium.
  • the computer-readable instructions may be executed by a processor of a mobile unit, a network element, and/or any other computing device.
  • Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs); Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
  • examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a CD, a DVD, a digital tape, a computer memory, etc., and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
  • any two components so associated may also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated may also be viewed as being “operably couplable” to each other to achieve the desired functionality.
  • operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
  • the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”
  • the terms “any of” followed by a listing of a plurality of items and/or a plurality of categories of items, as used herein, are intended to include “any of,” “any combination of,” “any multiple of,” and/or “any combination of multiples of” the items and/or the categories of items, individually or in conjunction with other items and/or other categories of items.
  • the term “set” or “group” is intended to include any number of items, including zero.
  • the term “number” is intended to include any number, including zero.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

In an embodiment, (e.g., first social interaction) data may be collected based on a consumed content or in anticipation of consuming an upcoming content. For example, the (e.g., first social interaction) data may be associated with a consumption of a media content on the content consumption device, wherein at least a part of the first social interaction data may be for transmission to other content consumption devices consuming the media content. A user profile may be established (e.g., generated) based on the collected (e.g., first social interaction) data and e.g., analyzed for preferences. For example, second (e.g., social interaction) data may be received. For example, the second (e.g., social interaction) data may be filtered based on the user profile; and the filtered second (e.g., social interaction) data may be displayed on the content consumption device or transmitted to at least one other content consumption device consuming the media content.

Description

METHODS, ARCHITECTURES, APPARATUSES AND SYSTEMS DIRECTED TO DYNAMICALLY ENHANCE INTERACTION OF MULTIPLE USERS CONSUMING CONTENT
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Patent Application No. 63/253,582, filed October 8, 2021, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
[0002] The present disclosure generally relates to networking websites and more particularly to techniques for enhancing user interaction between users of a network or website simultaneously as pertaining to consumption of a content.
BACKGROUND
[0003] With the increasing availability of digital electronics and network devices, communication between individuals has become much easier. With the advent of social media and on-line communities, friend groups and haphazard strangers may join and play games, may share thoughts, and may attend meetings. Technologies allowing for interactive engagement may allow such virtual individual engagements to feel closer to real engagements. Video conferencing applications may be one such technology that may allow people in remote locations to communicate with each other through the use of audio and video media, in real time or otherwise. In addition to video conferencing, other similar technologies may allow the broadcast and dissemination of such viewings and interactions. A single content item can be consumed in such manners, simultaneously or otherwise, and be enjoyed during get-togethers, meetings and conferences, and other popular events. During many of these events or conferences, participant responses may be captured or disseminated live. This can include different scenarios, ranging from live streams of a sporting event to captured business meetings and technical conferences. Technology may allow secondary users and consumers of content to connect together in a social manner, e.g., when located remotely. [0004] Currently, a popular method to enable user interaction may be a video commenting feature that may be provided by some streaming platforms and video sharing services. The commenting feature may accompany the media content (e.g., a movie, a video clip, a live event streaming) and users may post their comments regarding the same media content. The comments may be posted by the users after watching the content or while watching the content. The management of comments may be performed by the service/content provider or the platform manager/moderator. This may not allow user interactions to be further customized according to user preferences and/or based on a particular user interaction. Embodiments described herein have been designed with the foregoing in mind.
SUMMARY
[0005] An apparatus, device and methods for collecting user data are described herein. A method for collecting user data may be implemented in a content consumption device. In an embodiment, (e.g., first social interaction) data may be collected based on a consumed content or in anticipation of consuming an upcoming content. For example, the (e.g., first social interaction) data may be associated with a consumption of a media content on the content consumption device, wherein at least a part of the first social interaction data may be for transmission to other content consumption devices consuming the media content. For example, the (e.g., first social interaction) data may include any of (i) one or more first comments related to the media content and (ii) one or more first chat messages occurring during the consumption of the media content. A user profile may be established (e.g., generated) based on the collected (e.g., first social interaction) data and e.g., analyzed for preferences. For example, second (e.g., social interaction) data may be received including any of (i) one or more second comments related to the media content and (ii) one or more second chat messages occurring during the consumption of the media content. For example, the second (e.g., social interaction) data may be filtered based on the user profile; and the filtered second (e.g., social interaction) data may be displayed on the content consumption device or transmitted to at least one other content consumption device consuming the media content.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The teachings of the present disclosure can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
[0007] FIG. 1 is a block diagram illustrating an example of a communication network configured for facilitating one or more interactive real-time communications between users and/or (e.g., content consumption) devices, according to one environment;
[0008] FIG. 2 is a diagram illustrating an example of a dynamic adaptation process;
[0009] FIG. 3 is a diagram illustrating an example of a flowchart for establishing an interaction session;
[0010] FIG. 4 is a diagram illustrating an example of a dynamic adaptation process;
[0011] FIG. 5 is a system diagram illustrating an example of deployment of an interaction enhancement service;
[0012] FIG. 6 schematically illustrates a general overview of an encoding and decoding system according to one or more embodiments;
[0013] FIG. 7 is a diagram illustrating an example of a method for enhancing an interaction service;
[0014] FIG. 8 is a diagram illustrating another example of a method for enhancing an interaction service;
[0015] FIG. 9 is a diagram illustrating another example of a method for enhancing an interaction service;
[0016] FIG. 10 is a diagram illustrating another example of a method for enhancing an interaction service; and
[0017] FIG. 11 is a diagram illustrating another example of a method for enhancing an interaction service. [0018] To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0019] FIG. 1 is a block diagram illustrating an example of a communication network 110 that may be coupled to a variety of (e.g., content consumption) devices 100 via different alternative means such as busses and cables (shown generally as 150). Throughout embodiments described herein, the terms “content consumption devices”, “client computer”, and “devices”, collectively devices 100, may be used interchangeably to refer to devices capable of consuming (e.g., any of rendering, displaying) any kind of content. The devices 100 are illustratively shown in FIG. 1 as desktop computers, laptops or mobile or smart devices like mobile phones. These are just a few examples of the devices that can be used, and it is understood that in alternate embodiments other devices can be added or substituted for these examples.
[0020] The network may be capable of facilitating one or more interactive real-time communications such as a teleconference between streaming devices and mini networks that may provide audio and/or video components. The network may contain at least one processor 120 that may be, for example, in one or more servers (shown as 120 as well). The server/processor may be remote or local and can be in processing communication with other processors/servers, via e.g., network interfaces (e.g., any of transmitter, receiver, transceiver). In alternate embodiments, one or more of these processors/servers can be part of a coupled or group of wireless or portable devices via a base station. In one example, server 120 may represent an instance among a large set of instances of application servers in a data center, cloud computing environment, or any other mass computing environment. This is illustrated by reference numerals 130 (e.g., network/cloud). There may also be thousands or millions of client computers. Together they will hereinafter be referenced as numerals 110, 130 or 110/130 interchangeably. [0021] The network 110/130 can also provide internet connectivity and include other components such as a digital television (DTV) as illustratively shown at 140 that may allow viewers to stream any of on-demand videos, online interactive media, over-the-top content, music, browse the internet, view photos, and the like. There may be technologies that may enable a user to mirror the content displayed on the screen of their personal device to the screen of a digital television (DTV) or the like, which can also provide such communication between one or more devices 100. Some examples of these technologies include Google Chromecast™ and Apple TV™. This technology may enable a user to share content from their personal devices onto a larger screen such that it can be more easily viewed by others. For example, displaying pictures or movies stored on a personal device onto the larger screen of a DTV can allow others to view the content more easily.
[0022] In one embodiment, at least one processor is provided in one of the devices, for example in the server computer 120, which may be configured to host a conferencing meeting and may transmit and receive any of video, image, and audio data to and from (e.g., each of) the client computers (through the cloud or the network when that is the case). Each of the client computers, in one embodiment, may also include their own computing device with at least a processor, processing units, graphics processing units (GPU), one or more buses, memory organized as volatile and/or nonvolatile storage, one or more data input devices, I/O interfaces (such as e.g., any of transmitter, receiver, transceiver) and output devices such as loudspeakers or a LINE-OUT jack and associated drivers. Each of the client computers may also include an integrated or separate display unit such as a computer screen, TV screen or other display, and may be any of mobile or stationary computers including desktop computers, laptops, netbooks, ultrabooks, tablet computers, smartphones, et cetera. In one embodiment, the network 130 may comprise the Internet.
[0023] In one embodiment, each (e.g., content consumption) device 100 that can be referenced as user or client computer 100, may have a particular profile and one or more users can have a profile that may reside in the device or client computer or can be located on the cloud under a user account. In an embodiment, the network through one or more servers may also maintain a list of accounts or profiles that may (e.g., each) be associated with one of the client computers and/or one or more users of the client computers. In one embodiment, (e.g., each of) the client computers can be used by an attendee of a conference session that may enable any of video (any of live, streaming and recorded images) and audio operations. Devices/client computers 100 can be (e.g., simultaneously) a presenting device or a recipient device (attendee) of a video conference session.
[0024] In recent years, technologies for creating or enhancing a viewing experience for users who may be consuming content in non-social settings (e.g., an individual or small group viewing a DTV) or socially suboptimal settings may be available. A socially suboptimal setting may be one that may lack social cues of someone who would otherwise enhance the experience if present. Such systems may inject or overlay social effects with the primary media content during synchronous or asynchronous viewing. However, valuable social effects may be lacking from current systems for several reasons.
[0025] In one scenario, users may be reluctant to share information due to privacy concerns (e.g., for effects that may utilize user-provided audio, a user may be concerned that a microphone may pick up unrelated conversation in a different room). In other scenarios users may be reluctant to share information that may interrupt the viewing experience (e.g., writing a chat message or selecting an emoticon may involve attention from the user that may distract from the primary content). In addition, users may be unavailable for viewing of a live event referenced as a media item (e.g., a live broadcast of a sporting event or new show). Media items may be part of a larger media content being consumed or be the entirety of the media content.
[0026] It should be noted that in some scenarios, users may not have viewed content that may be being viewed by other members of their social groups (e.g., asynchronous viewing).
[0027] Current technologies lack the ability to enable the creation/modification/addition of social elements to the content that may be consumed by the users via content consumption devices. For example, many general-purpose teleconferencing solutions (Skype, Zoom, Facetime etc.) implicitly pull in social elements from participants that may be viewing a shared media item (e.g., a user may play a video while sharing his/her screen). These solutions may include tools such as live chat that may serve to synchronously overlay information about the shared media item. More recently, solutions such as Netflix Party, Disney Plus GroupWatch, Facebook Live, Scener, Amazon Prime Video Watch Party and various applications for synchronous viewing of YouTube videos have been developed that incorporate different forms of text-based social cues but still do not address the control of cues. These social cues may include (e.g., traditional) text-based information such as live chat that users may engage in that may appear alongside displayed media content.
[0028] Graphical emoticons may be another form of text-based cues that may be selected and inserted during playback of the media item. An example of a system that supports insertion of emoticons is Facebook Live which allows users to select “reaction emojis” which appear layered with the display of a live-streamed media item. These reaction emojis may be preserved with the media item and made available when the media item may be replayed by other users of the system but still do not address the prior art shortcomings.
[0029] User’s content consumption experience may be accompanied by interactions between the (e.g., content consumption device of the) user and (e.g., other content consumption devices of) other users that may be consuming or may have consumed the same content. For example, a user may post a comment to a video and read comments posted by others, or multiple users may use text chat to share their thoughts on the content while watching at the same time. The interactions among users regarding the content consumption may be viewed as part of the viewing experience. Embodiments described herein provide (e.g., management) mechanisms enabling the user to better control the interaction with others, or tailor the interactions to better fit the user’s interest or preference.
[0030] Streaming platforms and video sharing services may enable user interaction through the commenting feature that may accompany the media content (e.g., a movie, a video clip, a live event streaming). In this manner, users may post their comments (e.g., which may be transmitted by content consumption devices) regarding the same media content. The comments may be posted by the users (e.g., transmitted by the content consumption devices) after watching (e.g., displaying) the content or while watching (e.g., displaying) the content. The comments could be organized in a separate section that may be displayed independently of the media content. For comments that may be input during the content consumption process, the display of such comments may be synchronized with the content. Moreover, a user may react to the other users’ comments through replying or rating (e.g., upvote or downvote). For the commenting features, the management of comments may be performed by the service/content provider or the platform manager/moderator. For example, the manager may block or mute a certain user that violates the terms of service. In some cases, users (e.g., content consumption devices) may be categorized into different tiers (e.g., VIP user, subscribed user) where only certain tiers of users may be allowed to post comments, and the privilege of different tiers of users may be dynamically adjusted by the manager or moderator. Some services may allow the user (e.g., content consumption device) to specify (e.g., indicate) keywords to be filtered so that comments containing the keywords will not be displayed to the user (e.g., by the content consumption device). The commenting feature may be applied to group-based content consumption services (e.g., watch party, teleparty). A user may join a group of family members or friends to form a group prior to the content consumption process, and group-wide interaction could be enabled (e.g., group chat window). However, because there is no ability to further customize user interactions or allow the user to manage such interactions according to preferences and/or content, user interactions may remain limited. The following two scenarios illustrate some common problems.
[0031] In a first scenario, User 1 may be watching a content. In this example, the content may be a live sports game between Team A and Team B. User 1 may be watching by himself, and he would like to interact with the other sports fans that may be also watching the live game to share his excitement and thoughts. He may start to read the real-time comment section and may post his comments. The live game may have attracted a huge number of viewers. As a result, the comment section may be filled with an overwhelming number of comments such that User 1 may find it difficult to keep up with what the others may be saying. As a fan of Team A, User 1 may find that he may not want to see the comments that are cheering for Team B. In addition, User 1’s favorite player in Team A may be Bob, so he may prefer to have more discussions with the other fans of Bob. User 1 may find from the comment section that several other viewers may also be talking about Bob, so he would like to have more discussions with them. Voice chatting with those viewers may be preferred as typing comments may be time consuming. However, those viewers may be strangers to User 1 and he may not be able to have voice chats with them.
[0032] In a second scenario, User X may be watching a suspense movie which may be based on the parallel universe theory. She may like this theory and would like to see what the other viewers of this movie may think about this topic. She may take out her smart phone to check on social media for more discussions on this topic (e.g., searching with “#ParallelUniverseMovie”). While she may be looking at her smart phone, she may miss an important scene that may explain the plot. Having difficulty understanding the plot, she may decide to turn on the synchronized commenting window to see if there may be any other viewer talking about this. But she may see a spoiler comment, which may make her unhappy. [0033] Embodiments described herein provide features and services that can enable management and control over these types of interactions by customizing the user interaction based on user context or preference or by allowing the user to manage or control the interaction. FIG. 2 is an illustration of an example providing customized user interaction techniques.
[0034] In many communication interactions, the input may come from a variety of sources. One of these sources may be a user interface of a content consumption device associated with the user, shown in FIG. 2 by reference numeral 200, or a user device associated with the content consumption device (providing input that may be automatically generated or otherwise), as shown by reference numeral 202. Other users (e.g., other content consumption devices) 206 may also respond or initiate. In one embodiment, a user interaction enhancement service 204 can also be provided to improve user-awareness and customizability of the users’ interactions that may accompany the media consumption process. The latter service may be capable of collecting and analyzing users’ content consumption and interaction activity to detect user preference, based on which customizing and processing user interaction to enhance the media consumption experience may be enabled.
[0035] In this scenario, the interaction enhancement service 204 can also be implemented at the user/client side to provide customized enhancement for the client user. The interaction enhancement service 204, at one user (e.g., content consumption device) may interact with another user (e.g., content consumption device) directly or through the counterpart service enabled at another user (e.g., content consumption device). The described service may also be implemented in a centralized manner, providing services to multiple users (e.g., content consumption devices). In this scenario, the service may support the customization of individual users (e.g., content consumption devices) and the coordination among multiple users (e.g., content consumption devices). Moreover, the described service may be implemented as a combination of the above, with, for example, the client-side service supporting user (e.g., content consumption device) customization and the central service supporting multi-user (e.g., content consumption devices) coordination.
[0036] Since the user interaction during content consumption may have the strongest impact on the viewing experience, many of the examples and embodiments described herein involve interaction during content consumption. Embodiments described herein are not limited to social user interactions during content consumption and may be applicable to social user interactions that may take place before or after the content consumption.
[0037] Interaction data in this disclosure may refer to any data associated with the media consumption, such as user inputs responsive to or related to the content or to other interaction data. The interaction data may be made visible (e.g., public, for transmission) to other users (e.g., content consumption devices) consuming the same content. Examples may include comments to a video, messages in a real-time chat window accompanying a live event, etc.
[0038] User interaction, as used in this disclosure, may refer to the interaction among users associated with the same media content. The interaction may take place among the users (e.g., content consumption devices) that may be consuming the content at the same time, or among users (e.g., content consumption devices) that may consume the same content at different times. [0039] In FIG. 2, the different steps of how the interaction can be further customized can be reviewed by looking at the steps of this embodiment.
[0040] Step 1, enumerated as S210, provides for an interaction-related user data collection step. In this step, the data collection process may include collecting data that could be used for computing user preference and enhancing user interaction. The data to be collected may include the interaction data from the user (e.g., content consumption device) and other users (e.g., content consumption devices), user’s feedback or reaction to the content or the other users’ interaction data, user context information, etc.
[0041] Step 2 or S220, provides for the user (e.g., preference analysis and) profile generation step. In this step, based on the collected data, the system may perform analysis to gain insights on how to improve user interaction, such as the user preference of an interaction topic and an interaction mode, user expectations on interaction frequency and data volume, etc. The user preference and/or expectations could be maintained, for example, in a user profile.
[0042] Step 3 or S230, provides for an interaction enablement and enhancement step. In this step, after the analysis, the interaction process may be customized, and the interaction data may be processed according to (e.g., the user preference captured in) the user profile.
[0043] Step 4 or S240 provides for a dynamic adaptation step. In this step, the user preference and/or context may change during the content consumption process. For example, the user profile may be dynamically updated, and interaction enhancement mechanisms may be adapted according to the (e.g., latest) updated user profile.
[0044] Finally step 5 or S250 provides for a session-based interaction enablement step. In this step, the user interaction process may involve multiple users (e.g., content consumption devices), and the interaction enhancement service may be applied to more than one user (e.g., content consumption device) jointly. For example, an interaction session may be established between the involved (e.g., selected) users (e.g., content consumption devices) to perform session-based enhancement services.
[0045] To ease understanding of how these steps S210 to S250 can be achieved using different embodiments, a more detailed description of some examples will now be provided with the understanding that alternate embodiments can also be achieved as appreciated by those skilled in the art.
I. Step S210 (1) - User Data Collection
[0046] To understand the user data collection process, a description is provided of what type of data is to be collected, where the data may be collected, and when to perform the data collection.
[0047] A) Data type to be collected - A variety of user data may be collected as the basis of e.g., user awareness and analyzing user preference, including the interaction data, user responses and other context information.
1) Interaction data
[0048] The interaction data to be collected may include both the interaction data generated by the user (e.g., content consumption device) for transmission to other users (e.g., content consumption devices) and the interaction data received from the other users (e.g., content consumption devices). The interaction data generated by the user (e.g., content consumption device) may indicate the user preference or interest regarding the content and the interaction accompanying the content consumption process. For example, the user (e.g., content consumption device) may have left (e.g., transmitted) a comment while watching (e.g., displaying, rendering) the semi-final of a sports game indicating the user (e.g., of the content consumption device) is a fan of a certain team or a certain player (e.g., “I hope Team A could be the champion of this season”, “I think Player X is the MVP”). The information may indicate the user’s interests, which may be used to filter future social interaction data to better suit the user’s interest.
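As a purely illustrative sketch (not part of the original disclosure) of how such collected interaction data might be represented, the following Python fragment defines a hypothetical record structure that distinguishes data generated by the user from data received from other users; the field names (user_id, content_id, direction, media_offset_s) are assumptions made only for this example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class InteractionRecord:
    """One piece of social interaction data gathered around a content consumption process."""
    user_id: str                 # author of the interaction data
    content_id: str              # media item the interaction relates to
    direction: str               # "generated" (by this user) or "received" (from another user)
    text: str                    # e.g., a comment such as "I think Player X is the MVP"
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    media_offset_s: Optional[float] = None  # playback position when the comment was made

# Example: a comment generated while watching the semi-final of a sports game
record = InteractionRecord(
    user_id="user-1",
    content_id="semifinal-2022",
    direction="generated",
    text="I hope Team A could be the champion of this season",
    media_offset_s=1830.0,
)
print(record.direction, record.text)
```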
2) Responsive data
[0049] The user’s reaction or feedback to the content or to the interaction data may provide supplemental information in identifying user preference or interest. For example, the user may show interest in a particular component/object in the content which may be indicated by the user gazing at that component/object for a while. The user following or subscribing to another user on social media may indicate the user’s interest in other users.
[0050] In another example, after seeing (e.g., receiving) a comment from another user (e.g., content consumption device), the user (e.g., of the content consumption device) may agree/disagree to the comment by putting (transmitting) an upvote/downvote to that comment. The action of showing agreement/disagreement to the social interaction data may indicate the user may have similar/opposite opinions with another user and thus help identify the likeminded users or the interaction data from such users. In embodiments described herein the term “social interaction data” will be used to refer to any kind of data representative of a social interaction between users of content consumption devices related to a consumption of a same media content, such as any of interaction data and responsive data as previously described.
3) Context information
[0051] In addition to the interaction data and responsive data, other information that may be relevant to the content consumption or the user interaction may be context information of the user that may be used to enhance the interaction and media consumption experience. For example, any of the user location information, language preference, and information of the device used for content consumption process may be collected and used for any of customizing and filtering the (e.g., social) interaction data. The user contact list (based on user IDs, emails, social networking contacts) and social map may be used to detect a potential interaction target that the user may be interested in. For example, the context information may include whether the user is consuming the content alone or with a pre-made group, or whether the user is consuming the content with a child.
[0052] In another example, the information of whether the user is watching a movie for the first time may help to determine if the user may desire a spoiler-free interaction with the other users. The information of whether the user is watching a show with their child may determine if additional filtering on language usage should be applied to the (e.g., social) interaction data. [0053] In yet another example, context information may indicate the device(s) the user may be using for any of media consumption and user interaction. For example, certain users may only want to interact via the secondary device (e.g., phone) rather than primary device (e.g., DTV). For example, users may only want to interact if they have their phone present with them and it is operating in a specified mode (e.g., interaction app is open, and interaction is enabled) while watching content on the primary device. Any of the presence and the status of a user (e.g., secondary) device may be monitored as (e.g., user) context information.
4) User’s instruction
[0054] The user may manually configure the interaction enhancement service by setting preferences and expectations e.g., via a user interface. For example, the service may be requested (e.g., configured) to filter out negative or irrelevant interaction data from the other users (e.g., content consumption devices), or to keep both positive and negative comments but throttle the comments to maintain an acceptable amount. The user may also specify (e.g., indicate, configure) what sources may be allowed for the data collection.
5) Data source
[0055] The social interaction data and other relevant information may be collected from a variety of sources (or specified by the user, if applicable). Many content providing or sharing platforms may provide a commenting feature where a user may post their comments. The comments as well as the user response to the comments may be collected as social interaction data. Social media and communication tools may be the source where the users may share and exchange their thoughts regarding the media content, which could be collected as social interaction data. User context information may be collected from the user content consumption history. For example, from the user’s movie watching history, it may be determined if the user has watched a certain movie and that information may be used to determine if a spoiler-free interaction may be preferred in a case where the user watches the movie in the future. The context information may be collected from the physical environment of content consumption and the devices within the environment. For example, it may be detected from sensing devices if the user is watching the content alone or with family members and/or with a child, then the interaction preference could be adjusted accordingly.
6) Timing of collection
[0056] Data collection may be (e.g., constantly) performed, including any of before, during, and after a content consumption process. Before the content consumption, social interaction data, such as any of the user’s posts and replies on their social media, may be collected to predict topics of interest for the user for enhancing the interaction of a future content consumption process. For example, the users may talk about how they are expecting to see the performance of a player in a live event which may be on air in two days. After the content consumption, the user’s feedback and responses to the content and/or to the interaction data may be collected for future reference. For example, the users may write a review of the movie or TV show that they just watched or post a comment to a video. During the content consumption process, the social interaction data that may be generated by the user (e.g., content consumption device) may be used to make dynamic adjustments to the interaction. The social interaction data received from the other users (e.g., content consumption devices) may be processed and integrated into the content to be displayed by the content consumption device to the user.
II. Step S220 (2) - User Preference Analysis and Profile Generation
[0057] Based on the collected data, the system may perform analysis to establish user awareness and gain insights on how to improve user interaction, such as e.g., any of the user preference of an interaction topic, an interaction mode, user expectations on interaction volume, etc. The user preference and expectations may include the following aspects:
A. Topics that the user may (not) be interested in e.g., topics of interest associated with a content consumption device
[0058] The user may be interested in certain topics related to the content and would like to interact with others around those topics. For example, the user may prefer to see comments on a specific team, a player, an actor/actress, a theme. For example, the user may prefer to avoid social interaction data related to a certain topic. In addition to examining user content consumption history, the topics of interest may be identified by extracting frequently appearing keywords from the interaction data and/or leveraging natural language processing techniques on any of the user interaction data and responsive data.
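As a minimal illustration of surfacing frequently appearing keywords as candidate topics of interest, the following Python sketch tallies word frequencies over a user's comments; the stop-word list and the minimum-count threshold are illustrative assumptions, and a deployment might instead rely on the natural language processing techniques noted above.

```python
import re
from collections import Counter

# Illustrative stop-word list; a real system would use a fuller list or an NLP toolkit.
STOP_WORDS = {"the", "a", "an", "is", "i", "to", "of", "and", "this", "that", "be", "could", "for"}

def candidate_topics(comments, min_count=2):
    """Return keywords that appear at least `min_count` times across the user's comments."""
    words = []
    for comment in comments:
        words += [w for w in re.findall(r"[a-z']+", comment.lower()) if w not in STOP_WORDS]
    return [word for word, count in Counter(words).most_common() if count >= min_count]

comments = [
    "I hope Team A could be the champion of this season",
    "Bob is playing great for Team A tonight",
    "Bob deserves the MVP",
]
print(candidate_topics(comments))  # e.g., ['team', 'bob']
```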
B. The emotion/attitude of interaction
[0059] The social interaction data may convey a certain emotional stance or attitude of the user who generated the social interaction data. The user may prefer receiving social interaction data conveying a certain attitude rather than others. For example, the user may prefer to see positive, optimistic, or cheerful comments rather than negative or aggressive comments. The emotion/attitude of social interaction may be identified with e.g., emotion detection algorithms. For example, based on user response to interaction data with certain emotion/attitude, the user preference may be determined.
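A minimal sketch of deriving an attitude preference from the user's responses to received interaction data is shown below; it assumes an upstream emotion-detection step has already labeled each received comment as positive or negative, which is an assumption made only for this illustration.

```python
def preferred_attitude(responses):
    """Infer a coarse attitude preference from the user's reactions to others' comments.

    `responses` is a list of (sentiment, reaction) pairs, where `sentiment` is the detected
    tone of the received comment ("positive" or "negative", e.g., from an emotion-detection
    algorithm) and `reaction` is the user's response ("upvote" or "downvote").
    """
    score = 0
    for sentiment, reaction in responses:
        weight = 1 if reaction == "upvote" else -1
        score += weight if sentiment == "positive" else -weight
    if score > 0:
        return "prefers positive/cheerful comments"
    if score < 0:
        return "prefers critical/negative comments"
    return "no clear attitude preference"

print(preferred_attitude([("positive", "upvote"), ("negative", "downvote"), ("positive", "upvote")]))
```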
C. The volume or intensity of interaction
[0060] The user may specify (e.g., configure via a user interface) a restriction on the volume of interaction data that could be received/displayed during the content consumption process. The restriction may not be a fixed value and may be dynamically adjusted during the content consumption process. For example, the user may be able to follow more comments when the content is at a slower pace but fewer or no comments when the pace of the content is increased. The allowed intensity of interaction may be estimated based on whether the user may be focusing on the content, which may be done by analyzing the desired level of concentration of the specific content (e.g., the climax of a movie, a scoring moment of a sports game) or tracking the user’s gaze.
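The following sketch illustrates, under assumed inputs (a content-pace estimate and a gaze-based focus flag), how an allowed comment rate might be adjusted dynamically; the scaling factors are arbitrary illustrative values, not part of the described service.

```python
def allowed_comment_rate(base_rate, content_pace, user_focused):
    """Illustrative throttle: fewer comments per minute when the content is fast-paced
    or the user is concentrating on the content (e.g., detected via gaze tracking)."""
    rate = base_rate / max(content_pace, 1.0)   # content_pace > 1.0 denotes a fast segment
    if user_focused:
        rate *= 0.5                             # halve the volume during climactic moments
    return max(int(rate), 0)

print(allowed_comment_rate(base_rate=30, content_pace=1.0, user_focused=False))  # 30
print(allowed_comment_rate(base_rate=30, content_pace=3.0, user_focused=True))   # 5
```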
D. Specific interaction target
[0061] The user may be willing to interact differently with different users. Such specific interaction targets may be specified (e.g., configured) by the user (e.g., via a user interface) or identified based on the user interaction data. For example, the user may be more willing to interact with their acquaintances or close contacts during the content consumption process, who may be identified from the user context information. If the user has responded positively to several comments from another user, the latter may be identified as a potential target that the user may like to be engaged with more often. For example, if multiple comments from the same person are manually blocked by the user or automatically blocked based on any other criteria, then the person may be blacklisted so that future comments from this person may not be displayed by the content consumption device to the user.
E. User category
[0062] Categorization of users could be viewed as a special case of specifying (e.g., configuring, identifying) an interaction target. The interaction process may be configured so that only the social interaction data from the same or a particular user category may be visible to (e.g., displayed by) the user (e.g., content consumption device). For example, users (e.g., content consumption devices) may be categorized based on their context information such as any of age, location, primary language they are using, first-time viewer, etc. For example, a user (e.g., content consumption device) can only display the comments from other users (e.g., content consumption devices) using the same language that the user (e.g., content consumption device) may be using and the comments in other languages may be filtered out (e.g., not displayed). In another example, only the comments from a first-time viewer may be visible to the other first-time viewers, which may help prevent spoilers in the social interaction data.
F. Interaction mode
[0063] The user may indicate (e.g., configure via a user interface) the preferred interaction mode, such as any of text, voice, and video. The preferred interaction mode may be specified (e.g., configured) by the user or detected by the system during the content consumption process. The interaction mode may be initialized as text messages or comments (default mode) and adjusted as the interaction atmosphere may change (e.g., the users feel they are getting closer socially, the discussions heat up). For example, during the interaction process, a pair of users may share highly similar comments and frequently upvote the comments from the other. An emotional resonance could be detected between them, which may indicate the preference of additional modality of interaction, such as switching from text chatting to audio/video chatting.
[0064] The above-mentioned user preferences may apply to the social interaction data received from the other users (e.g., content consumption devices). The user preference may also apply to the social interaction data generated by the user (e.g., content consumption device), such as keywords and/or emotion-based filtering.
[0065] In addition to the above-mentioned aspects, the user may also specify (e.g., configure) other configurations such as where the interaction data could be displayed or presented (e.g., as chat window on the primary screen, on a companion device).
[0066] For example, a user profile may be created to record the user preference and expectations that may be identified from the data analysis or configured by the user. The service may also provide pre-configured template profiles which the user may directly apply for interaction enhancement or apply with minor modifications. For example, a “friendly mode” profile may be pre-configured featuring mild language usage, non-aggressive messages, positive and encouraging attitude, etc.
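A possible, non-limiting representation of such a user profile and a pre-configured "friendly mode" template is sketched below in Python; all field names, default values, and the placeholder word list are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UserProfile:
    topics_of_interest: List[str] = field(default_factory=list)
    blocked_keywords: List[str] = field(default_factory=list)
    preferred_attitude: Optional[str] = None      # e.g., "positive"
    max_comments_per_minute: Optional[int] = None
    preferred_mode: str = "text"                  # "text" | "voice" | "video"
    preferred_targets: List[str] = field(default_factory=list)
    blacklisted_users: List[str] = field(default_factory=list)
    language: Optional[str] = None
    spoiler_free: bool = False

def friendly_mode_template() -> UserProfile:
    """Pre-configured 'friendly mode' template: mild language, non-aggressive, positive tone."""
    return UserProfile(
        blocked_keywords=["expletive1", "expletive2"],   # placeholder word list
        preferred_attitude="positive",
        max_comments_per_minute=20,
    )

# The user may apply the template directly or with minor modifications.
profile = friendly_mode_template()
profile.topics_of_interest.append("Bob")
profile.spoiler_free = True
```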
III. Step S230 (3) - Interaction Enablement and Enhancement
[0067] After the analysis, the system may customize the user interaction during the content consumption process according to the user profile, which may include any of the processing and presenting (e.g., displaying) of received (e.g., social) interaction data, managing and assisting the generation of (e.g., social) interaction data.
A. Processing received (e.g., social) interaction data
[0068] One procedure in interaction enhancement may comprise processing the received (e.g., social) interaction data. The processing of (e.g., social) interaction data may be performed any of before the content consumption process and during the content consumption process. Different types of operations may be applied to the received (e.g., social) interaction data (see the illustrative sketch following this list).
a. Filtering - The received (e.g., social) interaction data may be filtered based on e.g., the user profile (such as e.g., the user preference and expectations). For example, (e.g., social) interaction data containing keywords or topics that the user may not be willing to see or that may not belong to topics of interest may be removed or made invisible (e.g., not displayed) to the user. For example, a comment from a person that may have been red flagged or blacklisted by the user (e.g., configuration) may be blocked.
b. Compression - If the user requested (e.g., via user configuration) the amount of interaction data to be restricted, then the (e.g., social) interaction data volume may be compressed by methods such as any of throttling and combining comments with similar meaning. In embodiments described herein, the term “filtering” may be used to also cover the compression operation (e.g., reducing the volume of the interaction data).
c. Integration - There could be multiple sources of (e.g., social) interaction data. For example, (e.g., social) interaction data from multiple sources may be integrated in any of one location and one device such that the user may not need to check multiple locations. In embodiments described herein, the term “filtering” may be used to also cover the integration operation, e.g., in a case where interaction data are received from multiple sources, filtered and combined in a single piece of interaction data.
d. Transformation - Any of the format and the modality of the (e.g., social) interaction data may be transformed (e.g., adapted) according to the user preference. For example, the user may find it distracting to read the comments during the content consumption process, and the text comments may be transformed into audio comments (e.g., text to speech) such that they could be read to the user.
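The sketch below chains the operations, purely as an illustration: interaction data from multiple sources is first integrated into one time-ordered stream, then filtered against the profile, throttled in volume, and finally adapted in modality. The comment structure, profile fields, and rules are assumptions made for this example rather than a prescribed implementation.

```python
def integrate(*sources):
    """Combine interaction data from multiple sources into a single, time-ordered stream."""
    merged = [c for source in sources for c in source]
    return sorted(merged, key=lambda c: c["timestamp"])

def filter_comments(comments, profile):
    """Drop comments containing blocked keywords or coming from blacklisted users."""
    return [c for c in comments
            if c["author"] not in profile["blacklisted_users"]
            and not any(k.lower() in c["text"].lower() for k in profile["blocked_keywords"])]

def compress_comments(comments, max_count):
    """Throttle to a maximum number of comments (a real service might also merge near-duplicates)."""
    return comments[:max_count]

def transform(comment, mode="text"):
    """Adapt modality; a deployment might call a text-to-speech engine here (not shown)."""
    return {**comment, "modality": mode}

profile = {"blocked_keywords": ["spoiler"], "blacklisted_users": ["troll42"]}
platform = [{"author": "fanA", "text": "Great save!", "timestamp": 12.0}]
social = [{"author": "troll42", "text": "Huge spoiler ahead", "timestamp": 11.0}]

pipeline = compress_comments(filter_comments(integrate(platform, social), profile), max_count=50)
processed = [transform(c, mode="text") for c in pipeline]
print(processed)   # only fanA's comment remains, annotated with its modality
```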
B. Presenting received (e.g., social) interaction data
[0069] The processed (e.g., social) interaction data may be displayed (e.g., presented) to the user during and/or after the content consumption process in different formats.
a. Differentiated display (e.g., presentation) - The (e.g., social) interaction data may be categorized and presented (e.g., displayed) to the user in a differentiated manner. For example, if a comment is on (e.g., belongs to) a topic of interest, the font of the comment may be set larger than the others. If multiple comments are expressing similar meaning, they can be combined and displayed in a larger font indicating that this comment is echoed by many users. If a comment is from a close contact, the comment may be any of annotated and highlighted for easy recognition. In addition to using larger font size, an alternative way of differentiating comments may be to transform a highlighted comment into an audio comment that is read to the user, as compared to the rest of the comments that may be displayed to the user in text format.
b. Save for later - During the processing of (e.g., social) interaction data, some of the data may be filtered out and not displayed to the user during the content consumption process. The filtered-out data may be any of permanently removed and stored such that it could be displayed to the user at a later time. For example, due to the volume restriction, it may not be possible to display all the (e.g., social) interaction data during the content consumption process. In this case, only a portion of (e.g., social) interaction data may be displayed to the user in real-time and the rest of the data may be stored such that it may be accessible by the user later.
[0070] In a case where multiple users are associated with a single content consumption device and sharing the interaction enhancement service, the service may be configured to any of display the (e.g., social) interaction data on the common primary device, and on a (e.g., each) individual user companion device. This may be based in part on whether all of the users have the same or differing interaction preferences.
C. Managing interaction data generation
[0071] Similar to the processing of received (e.g., social) interaction data, the (e.g., social) interaction data generated by the user (e.g., content consumption device) may be processed before it is sent to or shared with the other users (e.g., content consumption devices) with any of filtering, compression, integration, and transformation. For example, the user may request to transform their voice commenting into text comments. In another example, the user may apply self-censoring on the generated (e.g., social) interaction data, such as blocking or replacing expletive words, blocking entire phrases with expletive words. A time delay may be applied before the generated (e.g., social) interaction data may be shared with (e.g., transmitted to) the others (e.g., content consumption devices) so that additional processing may be performed on the data.
[0072] Moreover, during the content consumption process or the interaction, special moments of the user’s emotion such as any of wow, excitement, frustration, and sigh may be captured by any of eye gaze tracking, facial expression, and voice recognition. This captured information can be converted to the current communication format of the streaming session. For example, if the system detects that the user is laughing, a smile icon or emoji may be inserted into the interaction data on behalf of the user. If the user gets angry at certain comments made by (e.g., received from) another user (e.g., content consumption device), the angry facial expression may be automatically converted to an angry icon/emoji and inserted in the chat message. [0073] The user may be assisted in generating the interaction data. For example, the user may want to stay involved in the interaction, but may find it distracting to constantly post comments while watching the content. The service may assist the user in generating comments and replying to others’ comments (e.g., by using a message bot) so that the user could stay connected without spending an excessive amount of time in the conversation. For example, when users are chatting about content with one another, the bot can be used to auto reply on behalf of a user. This could be controlled by user configuration (e.g., the bot may be enabled in a case where the user is detected as busy, e.g., doesn’t want to be bothered, etc.). The bot may be used as an assistant/helper to generate a draft reply for a user, which may be updated/tweaked before sending, e.g., by the user if he so chooses. This bot can take into account the computed user preferences and/or expectations when formulating (e.g., generating) interaction message data on behalf of a user. For the case where multiple users share the content consumption device and the interaction enhancement service, the users in the group may specify if the (e.g., social) interaction data generated by them (e.g., the content consumption device) will be shown (e.g., transmitted) to the other users (e.g., content consumption devices) as data originating from a single user or from multiple users (e.g., of the content consumption device).
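As an illustration of converting a captured emotional moment into the current chat format on behalf of the user, the following sketch maps detected emotions to emoji; the mapping table, the user identifier, and the chat-message structure are assumptions made only for this example.

```python
# Hypothetical mapping from a detected emotional moment (e.g., from facial-expression
# or voice recognition) to an emoji inserted into the chat on behalf of the user.
EMOTION_TO_EMOJI = {
    "laughing": "😄",
    "excited": "🤩",
    "angry": "😠",
    "frustrated": "😞",
    "wow": "😮",
}

def auto_insert_reaction(detected_emotion, chat_session, user_id="user-1"):
    """Convert a captured emotional moment into the session's current communication format."""
    emoji = EMOTION_TO_EMOJI.get(detected_emotion)
    if emoji is not None:
        chat_session.append({"author": user_id, "text": emoji, "auto_generated": True})
    return chat_session

session = [{"author": "user-2", "text": "Did you see that goal?!", "auto_generated": False}]
print(auto_insert_reaction("laughing", session))
```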
IV. Step S240 (4) - Dynamic Adaptation
[0074] The user preference or context may change during the content consumption process. In an embodiment, the interaction enhancement mechanisms may adapt to the changes. For example, the user may have no preference on the attitude of comments at the beginning of the content consumption process. Later, the user’s mood may change (e.g., the team that the user is supporting is losing). If the user’s mood turns bad, seeing negative comments may further upset the user, in which case the filter may be dynamically adjusted to remove negative comments.
[0075] To achieve dynamic adaptation, the data collection and e.g., any of the user profile and user preference analysis process may be continuously performed during the content consumption process so that the (e.g., social) interaction data may be processed based on the real-time updated user preference (e.g., profile). In the above example, the change of the user’s mood, or more generally, the change of user context may be detected as the user responsive data (e.g., bio-signals, emotional state) may be collected. Based on the updated context information, the user profile may be updated correspondingly, e.g., the preference for non-negative comments may be added. The updated profile may then be reflected by the processing of received (e.g., social) interaction data where a new (e.g., updated) filter may be applied to the (e.g., social) interaction data.
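A minimal sketch of such dynamic adaptation is given below, assuming a mood detector and pre-labeled comment sentiment are available (both are assumptions for illustration): a detected mood change updates the profile, and subsequent interaction data is filtered against the updated profile.

```python
def update_profile_on_context_change(profile, detected_mood):
    """Add or remove a 'non-negative comments only' preference as the user's mood changes."""
    if detected_mood == "bad":
        profile["attitude_filter"] = "positive_only"
    else:
        profile.pop("attitude_filter", None)
    return profile

def apply_attitude_filter(comments, profile):
    """Apply the (possibly updated) filter to newly received interaction data."""
    if profile.get("attitude_filter") == "positive_only":
        return [c for c in comments if c["sentiment"] != "negative"]
    return comments

profile = {}
comments = [{"text": "What a comeback!", "sentiment": "positive"},
            {"text": "Your team is hopeless", "sentiment": "negative"}]

profile = update_profile_on_context_change(profile, detected_mood="bad")
print(apply_attitude_filter(comments, profile))   # the negative comment is filtered out
```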
[0076] In another example, the user may be exploring some media content belonging to a genre which the user may not be familiar with. In that case, there may be little information in the user profile that may describe the user preference on interaction, such as whether the user may be interested in a certain topic mentioned in the content. A default profile may be applied at the beginning of the content consumption process. During the content consumption process, as the user may react to the content and the (e.g., social) interaction data, the user profile may be dynamically built up (e.g., updated), based on which the future (e.g., social) interaction data may be processed accordingly. For example, a minimum data collection period may be enforced to ensure that enough data is collected for analysis and for creating (e.g., updating) the user profile.
[0077] A user preference or profile can be temporarily modified due to (e.g., certain) events during the content consumption process, which may apply temporary changes to the (e.g., social) interaction data. For example, the user may post a question during the content consumption process, asking for answers from the other viewers. Then in a following period of time, all the comments that may be answering the question may be highlighted, or the comments not related to the question may be temporarily blocked.
[0078] An embodiment of the dynamic adaptation process is illustrated in FIG. 3. In FIG. 3, a timeline of (e.g., user) data collection is shown at 310. The (e.g., user) data collection may be performed (e.g., constantly) any of during the content consumption process (timestamp A to B, timestamp C to D) and outside the process (before timestamp A, timestamp B to C, after timestamp D). For example, more information may be gathered during the content consumption process since there might be a larger amount of interaction data (including generated data and received data) and responsive data during the process. For example, the user profile 320 may be updated more frequently during the content consumption process, and less frequently outside the content consumption process. After a content consumption process may be started, the service may generate (e.g., update) a user profile based on previously gathered user data as well as information of the content. Interaction enablement and enhancement may be performed, for example, based on the latest user profile. During the content consumption process, the user profile may be (e.g., constantly) updated based on the newly collected information, such that the interaction process of the user can be dynamically adjusted accordingly. Note that FIG. 3 only shows the interaction enhancement 330 applied during the content consumption process; the enhancement service may also be applied to the interaction process outside the content consumption process.
V. Step S250 (5) - Session-based Interaction Enhancement
[0079] Previously the interaction enhancement service was described from the perspective of individual users (e.g., content consumption devices). Since the interaction process may involve multiple users (e.g., content consumption devices), the interaction enhancement service may apply to more than one user (e.g., content consumption device) jointly in certain cases, such as to obtain consensus among multiple users (e.g., content consumption devices), or to apply additional functionality or limitation within a specific set of users (e.g., content consumption devices). An interaction session may be established between the involved users (e.g., content consumption devices) to perform session-based enhancement services. For example, based on any of the user data analysis and user profile, a specific subset of users (e.g., content consumption devices) may be identified (e.g., selected) as mutually desired targets for interaction, where an interaction session may be established among these users (e.g., content consumption devices). The interaction session may involve operations in addition to the above-mentioned steps. For example, a voice chatting session may be established among a subset of users (e.g., content consumption devices) in a case where voice chatting is the preferred interaction mode.
[0080] If some users get closer through mutually sympathetic comments or positive communications, the users (e.g., content consumption devices) may be identified as potential targets for further interaction enhancement where more capable or additional modality of communication functionality may be opened between these users (e.g., content consumption devices) such as e.g., from text chatting only mode to video/audio capable communication. An interaction session may be established to enable the additional communication functionality. For example, if the functionality is not supported by the current app, another capable app (Zoom, Skype, etc.) may be provided.
[0081] In addition, any of a competition and a cooperation mechanism may be applied to users (e.g., content consumption devices) in an interaction session. For example, the service may enable the users’ interaction activity to be evaluated by each other. A user’s comment or interaction activity may gain momentum or support from the other users; this user could then be considered “popular” in this session and may be recognized with a highlighted icon or identity (e.g., a visual crown, cheer audio).
[0082] In addition, session-specific content modification or adjustment may be applied to users (e.g., content consumption devices) within a session. For example, the content consumption process may be paused in a case where the users (e.g., content consumption devices) in the session are discussing (e.g., transmitting and receiving social interaction data) about the content.
[0083] FIG. 4 shows an example of a procedure for establishing an interaction session with the interaction enhancement service. In FIG. 4, user A (e.g., content consumption device) and user B (e.g., content consumption device) are designated by numerals 400 and 406, respectively, and their respective interaction session services by 402 and 404.
[0084] In FIG. 4 at S410 or Step 1 - a trigger may be detected to establish an interaction session. As described in the previous examples, a subset of (e.g., two or more) users (e.g., content consumption devices) may be identified (e.g., selected) by the interaction enhancement service (hosted locally at user A (e.g., content consumption device) 400 or centrally at a cloud server) where additional features or configurations may be applied to the interactions among the content consumption devices, such as advanced interaction mode, more closely connected interaction, competition or cooperation mechanism, etc. The trigger may be detected based on any of the user preferences and profile (e.g., updated in real-time during the content consumption process). The user (e.g., content consumption device) may also initiate the establishment of an interaction session by sending a command indicating a Session Establishment Request to the service (not represented).
[0085] In S420 or Step 2 - a Session Establishment Request message may be sent from the interaction enhancement service to user A (e.g., content consumption device) 400. The Session Establishment Request message may include information indicating a description of the session to be established, such as the specific feature of the session as compared to the baseline interaction process. The identities (e.g., identifiers) of the other users (e.g., content consumption devices) in the session may or may not be included in the Session Establishment Request message.
[0086] In S430 or Step 3 - user A (e.g., content consumption device) may respond by transmitting a Session Establishment Response message, indicating accepting (or declining) the request to establish the interaction session.
[0087] In S440 or Step 4 - a Session Establishment Request message may further be sent to each user (e.g., content consumption device) such as e.g., user B that may be involved in (e.g., selected for) the session to get the consensus of users (e.g., content consumption devices) in the session. If the target user (e.g., content consumption device) supports interaction enhancement service, the Session Establishment Request message may be processed by the corresponding service hosted at the target user (e.g., content consumption device). If not, the Session Establishment Request message may be directed to corresponding applications/services that may be enabling user interactions at the target user (e.g., content consumption device).
[0088] In S450 or Step 5 - the user(s) (e.g., content consumption devices) involved in the session may respond to the interaction enhancement service which initiated the session establishment by transmitting a Session Establishment Response message.
[0089] In S460 or Step 6 - after all users (e.g., content consumption devices) may have agreed on the establishment of the interaction session, the interaction enhancement service may send a notification to each user (e.g., content consumption device). If not all users (e.g., content consumption devices) agree on the establishment, then the session may be established among (e.g., only) the users (e.g., content consumption devices) that may have agreed.
[0090] For example, in S470 or Step 7 - an interaction session may be established among the selected set of users (e.g., content consumption devices), and session-specific interaction enhancement features or configurations may be applied to the (e.g., social) interaction data of these users (e.g., content consumption devices).
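The message exchange of FIG. 4 might be represented, purely as an illustrative sketch, by the following Python structures; the field names and the acceptance rule (at least two accepting users) are assumptions for this example and not requirements of the described procedure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SessionEstablishmentRequest:
    session_id: str
    description: str                      # specific feature vs. the baseline interaction process
    participants: List[str] = field(default_factory=list)   # optional identifiers of other users

@dataclass
class SessionEstablishmentResponse:
    session_id: str
    user_id: str
    accepted: bool

def establish_session(request, invited_users, responder):
    """Collect responses (Steps 2-5) and establish the session among users that accepted (Steps 6-7)."""
    responses = [responder(user, request) for user in invited_users]
    accepted = [r.user_id for r in responses if r.accepted]
    return accepted if len(accepted) >= 2 else []   # assumption: a session needs at least two users

request = SessionEstablishmentRequest("s-01", "voice chat for fans of Bob", ["userA", "userB", "userC"])
# Hypothetical responder: userC declines, the others accept.
responder = lambda user, req: SessionEstablishmentResponse(req.session_id, user, accepted=(user != "userC"))
print(establish_session(request, request.participants, responder))   # ['userA', 'userB']
```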
[0091] As can be appreciated by those skilled in the art, a number of embodiments can be used to achieve the techniques discussed. To ease understanding two of these embodiments will now be discussed in more detail with the understanding that others are alternatively achievable.
[0092] A first embodiment can be titled as the System Level Embodiment. For this embodiment, there may be several deployment options for the described interaction enhancement service. The interaction enhancement service may be deployed, for example, in a centralized manner, such as in a cloud server. The interaction enhancement service may be accessed by the client hosted on user (e.g., content consumption) devices. In another example, the interaction enhancement service may be deployed in a distributed manner where the interaction enhancement service may be hosted (e.g., implemented) at the user (e.g., content consumption) device. In the distributed deployment, the interaction enhancement services hosted on different user (e.g., content consumption) devices may coordinate with each other to achieve certain functionalities, such as the session-based enhancement as described herein. For example, the interaction enhancement service may be deployed in a hybrid manner where a portion of the functionality may be provided on user (e.g., content consumption) device(s) and the rest provided by the centralized server. The hybrid deployment may provide the flexibility of adjusting the footprint at the user side according to different implementation considerations, such as the capability of a device, latency requirements, privacy preservation, etc.
[0093] FIG. 5 illustrates an example of deployment of the interaction enhancement service. By adjusting the allocation of functionalities at the user (e.g., content consumption) device (client) and the centralized server, different deployment options may be achieved (e.g., implemented). For example, the client may provide limited functionality and the majority of the interaction enhancement service may be carried out at the cloud server (such as e.g., computationally intensive operations such as e.g., user data analysis) to achieve a light-weight deployment (from the device perspective) in a case where the (e.g., content consumption) device has limited computation capability. In another example, a distributed deployment may be preferred for privacy preservation where the functionalities may be mainly provided by the (e.g., content consumption) device, such as user data collection and processing.
[0094] In FIG. 5, a media (e.g., content) consumption device 510 at the user side may be hosting the interaction enhancement service client 517. Various types of multimedia consumption devices may support (e.g., include, implement) the described interaction enhancement functionality. Such content consumption devices may include but are not limited to a DTV, smart phone, tablet, laptop, HMD, etc. For example, the content consumption devices 510 may support (e.g., include, implement) the capabilities such as collecting user data from applications, computing user profile (e.g., preference), processing and presenting (e.g., displaying) interaction data. The user may interact with the service client to provide any of (e.g., social) interaction data and instructions (e.g., configuration information) to build user profile. [0095] The interaction enhancement service 502 may also leverage assistance from supporting capabilities provided by other related applications or services 520, such as any of monitoring user context and gathering user’s interaction data from various sources (e.g., search engine, social media), providing additional modality of interaction (e.g., voice chatting, video calls), obtaining social mining results, etc. These applications or services 520 may be located on the same media consumption device 510 where the interaction enhancement service (client) 517 may be hosted, or on other devices 525 that may interact with the media consumption device 510 to support the interaction process.
[0096] The interaction enhancement service client may interact with the server hosted in the cloud 501. The interaction between the client and server may differ depending on how the interaction enhancement service may be deployed. In the case where the interaction enhancement service is hosted at the user (e.g., content consumption) device, user (e.g., social interaction) data may be collected and processed locally at the (e.g., content consumption) device. For example, there will be little to no interaction between the client and the cloud server. In another example, if the data collection and processing is done at the server hosted in the cloud, user (e.g., social interaction) data that may be generated locally may be transmitted to the server. At the server, the user (e.g., social interaction) data from the client may be combined with the user (e.g., social interaction) data collected from the content/service provider (e.g., the interaction data from other users). The combined user (e.g., social interaction) data may be analyzed and processed, and then transmitted to the client (and other users) e.g., content consumption device(s).
[0097] The interaction enhancement service hosted at one user (e.g., content consumption) device 515 may interact with another user (e.g., content consumption) device 515 any of directly and through the centralized server. The interaction among (e.g., content consumption) devices may include the request and response messages exchanged during the establishment of interaction session. For example, the interaction enhancement service may not be enabled at all the users (e.g., content consumption devices) in an interaction session. In this case, a certain functionality of the interaction enhancement service may still be achieved (e.g., performed). For example, the service hosted on one (e.g., content consumption) device may directly interact with the applications/services hosted on a remote (e.g., content consumption) device to apply (e.g., perform the processing associated with) the enhancement.
[0098] A second embodiment can be labelled as a Protocol Embodiment. In this embodiment, the interaction enhancement service may be carried out via the use of an interaction enhancement protocol. This protocol may be supported by applications and services hosted on user (e.g., content consumption) devices (e.g., DTVs) or the centralized server, as well as other entities in the system that may interact with the service such as those shown in FIG. 5. The applications and/or services hosted by the content consumption devices and related entities can support the exchange of interaction enhancement protocol messages as described herein.
[0099] In one embodiment, an interaction enhancement message protocol can be realized as a client/server messaging protocol where users and/or their media consumption devices can function in the role of a client and/or a server to exchange interaction enhancement request and response messages with other entities in the system (e.g., other supporting applications/services/devices). For example, the information elements of any of the interaction enhancement request and response protocol messages can be encapsulated and carried within the payloads of existing client/server protocols such as HTTP or WebSockets.
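As an illustrative example (not a normative message format), the information elements of a request message may be serialized as JSON and carried in the payload of an HTTP POST using only the Python standard library; the endpoint URL and field names below are assumptions made for this sketch.

```python
import json
import urllib.request

# Illustrative request message carried in the payload of an HTTP POST.
message = {
    "message_type": "SessionEstablishmentRequest",   # e.g., a Session Establishment Request as described above
    "session_id": "s-01",
    "description": "voice chat for fans of Bob",
}

req = urllib.request.Request(
    url="https://interaction-enhancement.example.com/messages",   # hypothetical endpoint
    data=json.dumps(message).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(req)   # left commented out: the example endpoint does not exist
```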
[0100] In another embodiment, these information elements can be encapsulated and carried within lower-level protocols such as any of TCP and UDP e.g., without the use of higher layer protocols.
[0101] In another embodiment, the interaction enhancement service messages can be encapsulated and carried within publish/subscribe messaging protocols. For example, entities in the system can support message broker functionality. This broker functionality can be used by the devices to exchange the interaction enhancement service message with other entities in the system. This exchange can be facilitated by each entity subscribing to the message broker to receive messages from other entities. Likewise, each entity can publish a message to the message broker that may target other entities. The information elements of the interaction enhancement protocol messages can be encapsulated and carried within the payloads of existing publish/subscribe protocols such as message queuing telemetry transport (MQTT) or advanced message queuing protocol (AMQP). [0102] In another embodiment, the interaction enhancement service protocol may employ a combination of the aforementioned protocol types.
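To illustrate the publish/subscribe exchange pattern without depending on a particular broker implementation, the following toy in-process broker shows entities subscribing to a topic and receiving messages published by other entities; a real deployment would instead carry these messages over an existing protocol such as MQTT or AMQP as noted above, and the topic name used here is an assumption.

```python
from collections import defaultdict

class MessageBroker:
    """Toy in-process broker illustrating the publish/subscribe exchange pattern only."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive messages published on the given topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to every entity subscribed to the topic."""
        for callback in self._subscribers[topic]:
            callback(message)

broker = MessageBroker()
# A content consumption device subscribes to receive interaction enhancement messages.
broker.subscribe("interaction/session-01", lambda msg: print("device received:", msg))
# Another entity publishes a message targeting entities subscribed to that topic.
broker.publish("interaction/session-01", {"message_type": "SessionEstablishmentRequest"})
```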
[0103] Various types of request and response protocol messages can be supported by applications and/or services with interaction enhancement service capability. For example, any of request and response protocol messages of the interaction enhancement protocol may comprise information indicating a type of message which may include but may not be limited to the types of messages described in Table 1.
Table 1 - Interaction Enhancement Service Protocol Message Types
[0104] In embodiments described herein, an interaction enhancement service may be capable of any of:
a. Collecting and analyzing user data related to interaction (such as e.g., social interaction data) to any of identify user preference and generate a user profile;
b. Enabling and dynamically enhancing the interaction process of users based on any of their preference and profile to improve the user content consumption experience;
c. Identifying (e.g., selecting) specific subset of users (e.g., content consumption devices) to establish interaction session, and applying session-specific enhancement to users (e.g., content consumption devices) within the session.
[0105] In embodiments described herein, the user interaction enhancement service can be designed to be capable of collecting (e.g., user) data related to any of the user interaction and context, wherein:
a. The (e.g., user) data may include (e.g., social) interaction data, which may include the user inputs responsive to the media content (e.g., consumption) and may be made public by the user (e.g., content consumption device) to be visible (e.g., transmitted) to the other users (e.g., content consumption devices) that may be or will be consuming the same content (e.g., any of comments to a video, chat messages during a live event, posts on social media that may be linked to (e.g., associated with) a certain content).
b. The sources of (e.g., user) data may include, but are not limited to, the platform that may be streaming the content on the content consuming device (e.g., DTV) and related applications (e.g., social media, communication) on the content consuming device or other devices of the user.
c. The (e.g., user) data regarding a particular content may be generated while the user may be consuming the content, or before the content may be consumed (e.g., the user talked about a live event that he/she was expecting to be on air in two days), or after the content may be consumed (e.g., the user writing a review of a movie or posting comments to a video).
d. The user context may include information of the user or the user viewing environment (e.g., any of the user may be watching the content for the first time, the user may be watching with a child, status of devices used by the user during content consumption).
e. The (e.g., user) data may include user responsive data such as feedback to the content or responses to the interaction data generated by other users (e.g., content consumption devices). Examples may include any of the user showing interest in a particular component of the content, the user showing interest in another user (e.g., following/subscribing to another user’s social media), and the user reaction to the interaction data input (e.g., transmitted) by other users (e.g., content consumption devices) such as e.g., any of a like, dislike, upvote, or downvote to a comment made (e.g., transmitted) by another user (e.g., content consumption device).
f. The (e.g., user) data may include a user explicit request or instruction on how to process the (e.g., social) interaction data (e.g., user-specified keywords to be blocked, user-specified interaction data sources).
[0106] Based on the collected (e.g., user) data (including (e.g., social) interaction data, context and user feedback), user preference and/or expectation on interaction may be computed and, for example, a user profile may be generated to store and maintain information on the user preference and/or expectation, wherein:
a. The user preference may include the topics/keywords that the user may be willing to see in the interaction (e.g., topics/keywords of interest), and/or the topics/keywords that the user may not be willing to see in the interaction (e.g., topics/keywords to be blacklisted).
b. The user preference may include the preferred targets (other users) that the user may be willing to interact with, or targets that the user may not be willing to interact with.
c. The user preference may include the preferred interaction mode (e.g., text, voice, video, emote).
d. The user context (e.g., the user may be watching the content for the first time, the user may be watching with a child) or expectation on interaction may include a specific restriction on the interaction (e.g., any of spoiler-free, mild language usage, intensity and volume of interaction data, the user will participate in interaction only when a secondary device or a related application is present).
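By way of non-limiting illustration, the sketch below derives a simple user profile from collected interaction text by counting frequently used keywords. The UserProfile fields and the build_profile heuristic are assumptions introduced for illustration; an actual service may compute preferences with far richer analysis.

```python
# A minimal sketch of computing a user profile (preferences and expectations)
# from collected interaction data, assuming simple keyword counting.
import re
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    user_id: str
    topics_of_interest: list[str] = field(default_factory=list)
    blocked_keywords: list[str] = field(default_factory=list)
    preferred_targets: list[str] = field(default_factory=list)   # other users to interact with
    preferred_mode: str = "text"                                 # e.g., text, voice, video, emote
    restrictions: list[str] = field(default_factory=list)        # e.g., "spoiler-free", "mild-language"


def build_profile(user_id: str, texts: list[str], explicit_blocks: list[str],
                  top_k: int = 5) -> UserProfile:
    """Derive topics of interest from the words the user mentions most often."""
    words = Counter()
    for text in texts:
        words.update(w.lower() for w in re.findall(r"[a-zA-Z]{4,}", text))
    topics = [w for w, _ in words.most_common(top_k)]
    return UserProfile(user_id=user_id,
                       topics_of_interest=topics,
                       blocked_keywords=list(explicit_blocks))


profile = build_profile("user-1",
                        texts=["Great defense tonight", "That defense line is unreal"],
                        explicit_blocks=["spoiler"])
print(profile.topics_of_interest)   # e.g., ['defense', 'great', 'tonight', ...]
```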
[0107] In an embodiment, enabling and enhancing user interaction based on the computed user preferences and/or expectations on interaction may include any of the following:
a. processing received (e.g., social) interaction data, such as any of (i) filtering the (e.g., social) interaction data based on any of user preferences and expectations (e.g., filtering spoiler interaction data if the user is watching the content for the first time), (ii) adjusting the interaction mode, and (iii) adjusting the intensity and volume of interaction;
b. presenting (e.g., displaying) the received and processed interaction data to the user, where the (e.g., social) interaction data may include data integrated from platforms/applications other than the platform that may be streaming the content;
c. recording and storing the processed or un-processed data so that the user may access the data later;
d. assisting the user in generating (e.g., social) interaction data (e.g., via a message bot) and managing the generated (e.g., social) interaction data;
e. dynamically updating the user profile based on the user's changing preferences and/or context and adapting the interaction enhancement mechanisms according to the (e.g., latest) updated user profile.
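By way of non-limiting illustration, the sketch below shows processing step (a) above: filtering received social interaction data against blocked keywords and a spoiler-free restriction. The filter_interaction_data function and its spoiler heuristic are assumptions introduced for illustration only.

```python
# A minimal sketch of filtering received social interaction data against user
# preferences (blocked keywords) and a spoiler-free restriction. The spoiler
# heuristic is an assumption used purely for illustration.
def filter_interaction_data(items: list[str],
                            blocked_keywords: list[str],
                            spoiler_free: bool = False) -> list[str]:
    """Return only the interaction items the user is willing to see."""
    kept = []
    for text in items:
        lowered = text.lower()
        # Drop items containing blacklisted keywords.
        if any(bad.lower() in lowered for bad in blocked_keywords):
            continue
        # Rough spoiler heuristic, assumed for this sketch only: drop items
        # mentioning "spoiler" when the user is watching for the first time.
        if spoiler_free and "spoiler" in lowered:
            continue
        kept.append(text)
    return kept


incoming = ["spoiler: the champion loses!", "That defense line is unreal", "Boring ref"]
visible = filter_interaction_data(incoming, blocked_keywords=["ref"], spoiler_free=True)
print(visible)   # ['That defense line is unreal']
```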
[0108] In an embodiment, session-based enhancement may be performed for a subset of users (e.g., content consumption devices) that may be identified (e.g., selected) based on any of (e.g., user) data and social interaction.
[0109] The subset of users (e.g., content consumption devices) may be determined based on the users sharing the same or similar preferences or expectations on the interaction (e.g., any of same topic of interest, same preferred interaction mode).
[0110] Requests may be sent to the subset of users (e.g., content consumption devices), proposing session-based enhancement to the interaction and/or viewing experience among the subset of users (e.g., content consumption devices).
[0111] The enhancement may include changing the interaction mode specific to this subset of users (e.g., content consumption devices) or content modification/adjustment specific to this subset of users (e.g., content consumption devices), such as e.g., pausing the content in a case where the subset of users is discussing the content.
[0112] A response may be received, based on which the interaction and/or viewing experience of the subset of users (e.g., content consumption devices) may be adapted.
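By way of non-limiting illustration, the sketch below groups users (e.g., content consumption devices) by a shared topic of interest and establishes a session once at least one positive response is received. The select_subsets and establish_session helpers, and the grouping key, are assumptions introduced for illustration.

```python
# A minimal sketch of session-based enhancement: group devices by a shared
# topic, propose a session to each group, and establish it once at least one
# invited device responds positively.
from collections import defaultdict


def select_subsets(topics_by_user: dict[str, set[str]]) -> dict[str, list[str]]:
    """Map each topic to the users who share it (candidate session subsets)."""
    subsets: dict[str, list[str]] = defaultdict(list)
    for user, topics in topics_by_user.items():
        for topic in topics:
            subsets[topic].append(user)
    return {topic: users for topic, users in subsets.items() if len(users) >= 2}


def establish_session(users: list[str], responses: dict[str, bool]) -> bool:
    """Establish the session if at least one invited user responds positively."""
    return any(responses.get(u, False) for u in users)


topics_by_user = {
    "device-1": {"defense", "coach"},
    "device-2": {"defense"},
    "device-3": {"halftime-show"},
}
candidates = select_subsets(topics_by_user)           # {'defense': ['device-1', 'device-2']}
ok = establish_session(candidates["defense"], {"device-1": True})
print(candidates, ok)                                  # ... True
```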
[0113] FIG. 6 schematically illustrates a general overview of an encoding and decoding system according to one or more embodiments. The system of FIG. 6 is configured to perform one or more functions of embodiments described herein and can have a pre-processing module 630 to prepare a received content (including one or more images or videos) for encoding by an encoding device 640. Encoding device 640 packages the content in a form suitable for transmission and/or storage for recovery by a compatible decoding device 670. In general, though not strictly required, the encoding device 640 provides a degree of compression, allowing the common space to be represented more efficiently (i.e., using less memory for storage and/or less bandwidth required for transmission). After being encoded, the data is sent to a network interface 650, which may typically be implemented in any network interface, for instance one present in a gateway. The data can then be transmitted through a communication network, such as the internet. Various other network types and components (e.g., wired networks, wireless networks, mobile cellular networks, broadband networks, local area networks, wide area networks, Wi-Fi networks, and/or the like) may be used for such transmission, and any other communication network may be foreseen. The data may then be received via a network interface 660, which may be implemented in a gateway, in an access point, in the receiver of an end user device, or in any device comprising communication receiving capabilities. After reception, the data are sent to a decoding device 670. Decoded data are then processed by a device 680 that can also be in communication with sensors or user input data. The decoder 670 and the device 680 may be integrated in a single device (e.g., a smartphone, a game console, a STB, a tablet, a computer, etc.). In another embodiment, a rendering device 690 may also be incorporated.
[0114] FIG. 7 is a diagram illustrating an example of a method 700 for enhancing an interaction service. Method 700 may comprise, in step 710, establishing an interaction enhancement service for collecting a plurality of user data related to at least a user interaction while consuming content. Method 700 may further comprise, in step 720, analyzing the user interaction and identifying user preferences based on the user interaction. Method 700 may further comprise, in step 730, generating a user profile based on the user preferences.
[0115] For example, a user interaction session may be established between at least two users by analyzing the user preferences and identifying common interests amongst the at least two users.
[0116] For example, functions may be generated when the user consumes another content based on the user preferences established in the user profile.
[0117] For example, a subset of users with common interests may be identified based on their user profiles and interaction session(s) may be established amongst the subset of users.
[0118] For example, session-specific enhancements may be generated in a case where the subset of users are consuming another content and the session specific enhancements may be generated based on the user preferences in the user profiles of the subset of users.
[0119] For example, the user data may include (e.g., social) interaction data.
[0120] For example, (e.g., social) interaction data may comprise data inputted by at least one user responsive to the consumed content and the input may be made visible to the other plurality of users that may be consuming the same content.
[0121] For example, the input may include at least one of a user comment related to a video, a chat message during a live event, and/or a post on a social media that may be linked to the content.
[0122] For example, the user data may include streaming content on a content consuming device.
[0123] For example, the content consuming device may be a DTV.
[0124] For example, the user data may be collected from a social media user profile or other related social media applications.
[0125] For example, the user data may be collected from a user device or a content consuming device in communication with the user device.
[0126] For example, the user data may be collected based on comments generated by the users prior to consuming a content but relating to the content.
[0127] For example, the user data may include data generated based on comments collected prior to consuming the content and the user data may include the viewing environment(s) of the user(s) and/or the user schedule(s) for viewing the content.
[0128] For example, the user data may include whether each user is consuming the content alone or with other users.
[0129] FIG. 8 is a diagram illustrating another example of a method 800 for enhancing an interaction service. Method 800 may comprise, in step 810, capturing a plurality of user responses generated in relation to consuming a media content. Method 800 may comprise, in step 820, analyzing the responses to identify a common response pattern amongst at least two users. Method 800 may further comprise, in step 830, establishing an interaction session amongst the at least two users.
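By way of non-limiting illustration, the sketch below shows one way to perform steps 820 and 830 of method 800: comparing captured responses and pairing users whose responses share at least one word. The token-overlap rule in common_response_pattern is an assumption standing in for whatever response-pattern analysis an implementation may actually use.

```python
# A minimal sketch of identifying a common response pattern amongst users and
# deciding which pairs to place in an interaction session.
from itertools import combinations


def common_response_pattern(responses: dict[str, str]) -> list[tuple[str, str, set[str]]]:
    """Return user pairs sharing at least one response word, with the shared words."""
    tokens = {user: {w for w in text.lower().split() if len(w) > 3}
              for user, text in responses.items()}
    pairs = []
    for (u1, t1), (u2, t2) in combinations(tokens.items(), 2):
        shared = t1 & t2
        if shared:
            pairs.append((u1, u2, shared))
    return pairs


responses = {
    "user-a": "incredible comeback in the last minute",
    "user-b": "what a comeback",
    "user-c": "the ads were too long",
}
for u1, u2, shared in common_response_pattern(responses):
    print(f"establish session between {u1} and {u2} (shared: {shared})")
```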
[0130] For example, a user profile may be established for each user.
[0131] For example, the user profile may include user preferences established at least based on the responses generated in relation to consuming the media content.
[0132] For example, the user profile may include additional data relating to user preferences added by at least one user.
[0133] For example, the user may be the owner of the user profile.
[0134] For example, method 800 may further comprise generating session-specific enhancements when the users may be consuming new content based on the user preferences.
[0135] For example, a subset of users may be identified to have common preferences and a session may be established for the subset of users.
[0136] For example, method 800 may further comprise applying session-specific enhancement to users within the session.
[0137] For example, data may be generated relating to at least one user to identify user preferences.
[0138] For example, a subset of users may be identified to have common interests based on their user profile and an interaction session may be established for the subset.
[0139] FIG. 9 is a diagram illustrating an example of a method 900 for enhancing an interaction service. Method 900 may comprise, in step 910, collecting user data for a plurality of users consuming at least one content. Method 900 may further comprise, in step 920, analyzing the user data and identifying user preferences based on the user data. Method 900 may further comprise, in step 930, establishing a user profile including the user preferences. Method 900 may further comprise, in step 940, identifying common user preferences amongst at least two users. Method 900 may further comprise, in step 950, notifying the at least two users of their common interests and establishing a session between the at least two users if a response is received from at least one of the two users that a session should be established.
[0140] For example, the user data may include (e.g., social) interaction data.
[0141] For example, the (e.g., social) interaction data may comprise data input by at least one user responsive to the consumed content and the input may be made visible to the other plurality of users that may be consuming the same content.
[0142] For example, the input may include at least one of a user comment made to a video, a chat message during a live event, and/or posts on a social media that may be linked to the content.
[0143] For example, the user data may include streaming content on a content consuming device.
[0144] For example, the content consuming device may be a DTV.
[0145] For example, the user data may be collected from a social media user profile or other related social media applications.
[0146] For example, the user data may be collected from a user device or a content consuming device in communication with the user device.
[0147] For example, the user data may be collected based on comments generated by the users prior to consuming a content but relating to the content.
[0148] For example, the user data may include data generated based on comments collected prior to consuming the content and the user data may include the viewing environment(s) of the user(s) and/or the user schedule(s) for viewing the content.
[0149] For example, the user data may include whether each user is consuming the content alone or with other users.
[0150] For example, the user data may include the age of the user.
[0151] For example, the user data may include the age of the user and of the other users consuming the content with the user.
[0152] For example, the user data may include the number of times the content may have been consumed.
[0153] For example, the user data may include user responsive data.
[0154] For example, the user responsive data may include user feedback to the content and/or user response data generated by other users.
[0155] For example, the user data may include user request(s) on how to process user interaction data.
[0156] For example, the user request(s) may include blocking at least one of a word, a topic and/or a user.
[0157] For example, the user profile may include user collected data and user added data.
[0158] For example, the data in the user profile may include any of user interaction data, generated context relating to a content, user feedback, and preferences and expectations.
[0159] For example, the user preferences may include any of topics, keywords, and another user.
[0160] For example, the user preferences may include topics, keywords, and other users that each user may be willing to see or interact with, or topics, keywords and/or other users that each user may be unwilling to see or interact with.
[0161] For example, the user preferences may include an interaction mode.
[0162] For example, the interaction mode may include any of text, voice, video, emote, and email.
[0163] For example, the user preferences may include a restriction.
[0164] For example, the restriction may only be observed if another condition is available.
[0165] For example, the restriction may include any of language, availability of a secondary device, volume intensity, and/or availability of an application.
[0166] For example, user viewing of future content may be enhanced by incorporating features based on the user profile.
[0167] For example, future content to be consumed may be filtered based on the user profile or the user preferences.
[0168] For example, the filtering may apply to spoiler information, users, interaction modes, and/or volume intensity.
[0169] For example, method 900 may further comprise rendering processed interaction data to each user, wherein the interaction data may include data integrated from other platforms and/or applications.
[0170] For example, method 900 may further comprise storing the data that each user may desire to access at a future time based on the user preferences in the user profile.
[0171] For example, method 900 may further comprise generating, for each user, interaction data based on the user preferences to communicate with other users and/or user devices.
[0172] For example, method 900 may further comprise dynamically updating each user profile based on new data and/or changed user preferences.
[0173] FIG. 10 is a diagram illustrating another example of a method 1000 for enhancing an interaction service. Method 1000 may be implemented in a content consumption device and may comprise, in step 1010, collecting first social interaction data associated with a consumption of a media content on the content consumption device, wherein at least a part of the first social interaction data may be for transmission to other content consumption devices consuming the media content, wherein the first social interaction data may comprise any of (i) one or more first comments related to the media content and (ii) one or more first chat messages occurring during the consumption of the media content. Method 1000 may further comprise, in step 1020, generating a user profile based on the collected first social interaction data. Method 1000 may further comprise, in step 1030, receiving second social interaction data comprising any of (i) one or more second comments related to the media content and (ii) one or more second chat messages occurring during the consumption of the media content. Method 1000 may further comprise, in step 1040, filtering the second social interaction data based on the user profile. Method 1000 may further comprise, in step 1050, displaying the filtered second social interaction data on the content consumption device or transmitting the filtered second social interaction data to at least one other content consumption device consuming the media content.
[0174] For the sake of simplicity, embodiments are described herein by referring to any of a user profile and user preferences that may be obtained by processing the first social interaction data, based on which the second social interaction data may be filtered. Any technique for processing the first social interaction data, based on which second social interaction data may be filtered, may be applicable to embodiments described herein.
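By way of non-limiting illustration, the sketch below runs the steps of method 1000 on a single content consumption device. The ContentConsumptionDevice class and its helper methods are assumptions introduced for illustration; in particular, step 1020 is reduced here to recording explicitly blocked keywords rather than a full profile generation.

```python
# A minimal sketch of method 1000 on a content consumption device: collect
# first social interaction data, build a (simplified) profile, filter second
# social interaction data against it, then display or forward the result.
class ContentConsumptionDevice:
    def __init__(self, device_id: str):
        self.device_id = device_id
        self.first_data: list[str] = []
        self.blocked_keywords: list[str] = []

    def collect(self, comment_or_chat: str) -> None:                      # step 1010
        self.first_data.append(comment_or_chat)

    def generate_profile(self, explicit_blocks: list[str]) -> None:       # step 1020 (simplified)
        self.blocked_keywords = [w.lower() for w in explicit_blocks]

    def filter_incoming(self, second_data: list[str]) -> list[str]:       # steps 1030-1040
        return [t for t in second_data
                if not any(b in t.lower() for b in self.blocked_keywords)]

    def display_or_transmit(self, filtered: list[str]) -> None:           # step 1050
        for text in filtered:
            print(f"[{self.device_id}] {text}")


device = ContentConsumptionDevice("dtv-1")
device.collect("Loving this match so far")
device.generate_profile(explicit_blocks=["spoiler"])
device.display_or_transmit(device.filter_incoming(
    ["spoiler: final score leaked", "Great atmosphere in the stadium"]))
```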
[0175] For example, the second social interaction data may be received from any of a local user interface of the content consumption device and a user device associated with the content consumption device.
[0176] For example, the second social interaction data may be received from any of the other content consumption devices consuming the media content.
[0177] For example, the first comments may be collected prior to the consumption of the media content on the content consumption device.
[0178] For example, the first comments may be collected during the consumption of the media content on the content consumption device.
[0179] For example, the first social interaction data may further comprise one or more responses on one or more previously received comments from any of the other content consumption devices consuming the media content, and wherein the user profile may be further based on the one or more responses.
[0180] For example, method 1000 may further comprise collecting one or more posts on a social media that may be linked to the media content, wherein the user profile may be further based on the collected one or more posts.
[0181] For example, method 1000 may further comprise collecting user data from any of a social media user profile and a social media application, wherein the user profile may be further based on the collected user data.
[0182] For example, method 1000 may further comprise collecting user schedule information associated with the content consumption device, wherein the user profile may be further based on the collected user schedule information.
[0183] For example, method 1000 may further comprise collecting context information, wherein the user profile may be further based on the collected context information.
[0184] For example, generating the user profile may comprise identifying user preferences based on the collected first social interaction data.
[0185] For example, the user preferences may include any of topics of interest, keywords of interest, blocked topics, blocked keywords, an interaction mode, an interaction intensity, an interaction device, and an interaction application.
[0186] For example, method 1000 may further comprise any of adjusting an interaction mode and adjusting an interaction intensity based on the user profile.
[0187] For example, method 1000 may further comprise dynamically updating the user profile based on changing user preferences and adapting the filtering of the second social interaction data according to the updated user profile.
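By way of non-limiting illustration, the sketch below updates a user profile with newly learned preferences and re-applies the filtering with the updated profile. The dictionary-based profile and the merge rule in update_profile are assumptions introduced for illustration.

```python
# A minimal sketch of dynamically updating a user profile as preferences
# change, then re-filtering second social interaction data with the updated
# profile. The update rule (merging newly blocked keywords and topics) is an
# assumption for illustration.
from typing import Iterable


def update_profile(profile: dict, new_blocked: Iterable[str] = (),
                   new_topics: Iterable[str] = ()) -> dict:
    """Return a copy of the profile with newly learned preferences merged in."""
    updated = dict(profile)
    updated["blocked_keywords"] = sorted(set(profile.get("blocked_keywords", [])) | set(new_blocked))
    updated["topics_of_interest"] = sorted(set(profile.get("topics_of_interest", [])) | set(new_topics))
    return updated


profile = {"blocked_keywords": ["spoiler"], "topics_of_interest": ["defense"]}
profile = update_profile(profile, new_blocked=["politics"], new_topics=["coach"])

incoming = ["the coach looked furious", "politics aside, great game"]
visible = [t for t in incoming
           if not any(b in t.lower() for b in profile["blocked_keywords"])]
print(visible)   # ['the coach looked furious']
```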
[0188] For example, the at least one other content consumption device may be selected from the other content consumption devices consuming the media content, based on common interests between the content consumption device and the at least one other content consumption device.
[0189] For example, common interests may be determined based on the first social interaction data and the second social interaction data.
[0190] For example, method 1000 may further comprise establishing an interaction session between the content consumption device and the at least one other content consumption device.
[0191] For example, method 1000 may further comprise generating session specific enhancements when another media content may be consumed by the content consumption device and the at least one other content consumption device.
[0192] For example, the content consumption device may be a digital TV.
[0193] FIG. 11 is a diagram illustrating another example of a method 1100 for enhancing an interaction service. Method 1100 may be implemented in a processing device such as e.g., a server. Method 1100 may comprise, in step 1110, collecting social interaction data from a plurality of content consumption devices consuming a same media content, wherein the social interaction data may comprise any of (i) one or more comments related to the same media content, and (ii) one or more chat messages occurring during a consumption of the same media content. Method 1100 may further comprise, in step 1120, selecting at least two content consumption devices from the plurality of content consumption devices consuming the same media content, wherein the at least two content consumption devices may be selected based on the collected social interaction data by identifying common interests between the at least two content consumption devices. Method 1100 may further comprise, in step 1130, transmitting information to the at least two content consumption devices, indicating the identified common interests. Method 1100 may further comprise, in step 1140, establishing a session between the at least two content consumption devices if a response is received from at least one of the at least two content consumption devices that the session should be established.
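By way of non-limiting illustration, the sketch below runs steps 1110-1140 of method 1100 on a server: collecting social interaction data per content consumption device, identifying device pairs with common interests, notifying them, and establishing a session once a positive response is received. The identify_common_interests word-overlap heuristic and the notify placeholder are assumptions introduced for illustration.

```python
# A minimal sketch of method 1100 on a server. Interest extraction and the
# notify/response plumbing are illustrative stand-ins only.
from itertools import combinations


def identify_common_interests(data_by_device: dict[str, list[str]]) -> dict[tuple[str, str], set[str]]:
    """Pair devices whose collected interaction text shares at least one word."""
    interests = {dev: {w.lower() for text in texts for w in text.split() if len(w) > 3}
                 for dev, texts in data_by_device.items()}
    return {(d1, d2): interests[d1] & interests[d2]
            for d1, d2 in combinations(interests, 2)
            if interests[d1] & interests[d2]}


def notify(device_id: str, common: set[str]) -> bool:
    # Placeholder transport: assume the device accepts the proposal.
    print(f"notify {device_id}: common interests {sorted(common)}")
    return True


data_by_device = {
    "dtv-1": ["amazing choreography tonight"],
    "dtv-2": ["that choreography deserves an award"],
    "dtv-3": ["when is the next episode"],
}
for (d1, d2), common in identify_common_interests(data_by_device).items():
    if notify(d1, common) or notify(d2, common):      # step 1140: one positive response suffices
        print(f"session established between {d1} and {d2}")
```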
[0194] For example, the social interaction data may further comprise one or more responses on one or more previously received comments from any of the plurality of content consumption devices consuming the same media content, and the at least two content consumption devices may be further selected according to the one or more responses.
[0195] For example, method 1100 may further comprise collecting one or more posts on a social media that may be linked to the same media content, and the at least two content consumption devices may be further selected according to the collected one or more posts.
[0196] For example, method 1100 may further comprise collecting user schedule information associated with at least one content consumption device of the plurality of content consumption devices, and the at least two content consumption devices may be further selected according to the collected user schedule information.
[0197] For example, method 1100 may further comprise collecting context information, wherein the at least two content consumption devices may be further selected according to the collected context information.
[0198] For example, method 1100 may further comprise generating a plurality of user profiles respectively associated with the plurality of content consumption devices based on the collected social interaction data, wherein the common interests may be identified based on the plurality of user profiles.
[0199] For example, method 1100 may further comprise identifying a plurality of user preferences respectively associated with the plurality of content consumption devices based on the collected social interaction data, wherein the common interests may be identified based on the plurality of user preferences.
[0200] For example, the user preferences may include any of topics of interest, keywords of interest, blocked topics, blocked keywords, an interaction mode, an interaction intensity, an interaction device, and an interaction application.
[0201] For example, method 1100 may further comprise generating session specific enhancements when another media content may be consumed by the at least two content consumption devices.
Conclusion
[0202] While not explicitly described, the present embodiments may be employed in any combination or sub-combination. For example, the present principles are not limited to the described variants, and any arrangement of variants and embodiments may be used.
[0203] Any characteristic, variant or embodiment described for a method is compatible with an apparatus device comprising means for processing the disclosed method, with a device comprising a processor configured to process the disclosed method, with a computer program product comprising program code instructions and with a non-transitory computer-readable storage medium storing program instructions.
[0204] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer readable medium for execution by a computer or processor. Examples of non-transitory computer-readable storage media include, but are not limited to, a read only memory (ROM), random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
[0205] Moreover, in the embodiments described above, processing platforms, computing systems, controllers, and other devices containing processors are noted. These devices may contain at least one Central Processing Unit ("CPU") and memory. In accordance with the practices of persons skilled in the art of computer programming, reference to acts and symbolic representations of operations or instructions may be performed by the various CPUs and memories. Such acts and operations or instructions may be referred to as being "executed," "computer executed" or "CPU executed."
[0206] One of ordinary skill in the art will appreciate that the acts and symbolically represented operations or instructions include the manipulation of electrical signals by the CPU. An electrical system represents data bits that can cause a resulting transformation or reduction of the electrical signals and the maintenance of data bits at memory locations in a memory system to thereby reconfigure or otherwise alter the CPU's operation, as well as other processing of signals. The memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, optical, or organic properties corresponding to or representative of the data bits. It should be understood that the representative embodiments are not limited to the above-mentioned platforms or CPUs and that other platforms and CPUs may support the provided methods.
[0207] The data bits may also be maintained on a computer readable medium including magnetic disks, optical disks, and any other volatile (e.g., Random Access Memory ("RAM")) or non-volatile (e.g., Read-Only Memory ("ROM")) mass storage system readable by the CPU. The computer readable medium may include cooperating or interconnected computer readable medium, which exist exclusively on the processing system or are distributed among multiple interconnected processing systems that may be local or remote to the processing system. It is understood that the representative embodiments are not limited to the above-mentioned memories and that other platforms and memories may support the described methods.
[0208] In an illustrative embodiment, any of the operations, processes, etc. described herein may be implemented as computer-readable instructions stored on a computer-readable medium. The computer-readable instructions may be executed by a processor of a mobile unit, a network element, and/or any other computing device.
[0209] There is little distinction left between hardware and software implementations of aspects of systems. The use of hardware or software is generally (e.g., but not always, in that in certain contexts the choice between hardware and software may become significant) a design choice representing cost vs. efficiency tradeoffs. There may be various vehicles by which processes and/or systems and/or other technologies described herein may be effected (e.g., hardware, software, and/or firmware), and the preferred vehicle may vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle. If flexibility is paramount, the implementer may opt for a mainly software implementation. Alternatively, the implementer may opt for some combination of hardware, software, and/or firmware.
[0210] The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples may be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Suitable processors include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.
[0211] Although features and elements are provided above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. The present disclosure is not to be limited in terms of the particular embodiments described in this application, which are intended as illustrations of various aspects. Many modifications and variations may be made without departing from its spirit and scope, as will be apparent to those skilled in the art. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly provided as such. Functionally equivalent methods and apparatuses within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from the foregoing descriptions. Such modifications and variations are intended to fall within the scope of the appended claims. The present disclosure is to be limited only by the terms of the appended claims, along with the full scope of equivalents to which such claims are entitled. It is to be understood that this disclosure is not limited to particular methods or systems.
[0212] In certain representative embodiments, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), and/or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, may be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein may be distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a CD, a DVD, a digital tape, a computer memory, etc., and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
[0213] The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely examples, and that in fact many other architectures may be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively "associated" such that the desired functionality may be achieved. Hence, any two components herein combined to achieve a particular functionality may be seen as "associated with" each other such that the desired functionality is achieved, irrespective of architectures or intermediate components. Likewise, any two components so associated may also be viewed as being "operably connected", or "operably coupled", to each other to achieve the desired functionality, and any two components capable of being so associated may also be viewed as being "operably couplable" to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
[0214] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
[0215] It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, where only one item is intended, the term "single" or similar language may be used. As an aid to understanding, the following appended claims and/or the descriptions herein may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to embodiments containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., "a" and/or "an" should be interpreted to mean "at least one" or "one or more"). The same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, means at least two recitations, or two or more recitations).
[0216] Furthermore, in those instances where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" will be understood to include the possibilities of "A" or "B" or "A and B." Further, the terms "any of" followed by a listing of a plurality of items and/or a plurality of categories of items, as used herein, are intended to include "any of," "any combination of," "any multiple of," and/or "any combination of multiples of" the items and/or the categories of items, individually or in conjunction with other items and/or other categories of items. Moreover, as used herein, the term "set" or “group” is intended to include any number of items, including zero. Additionally, as used herein, the term "number" is intended to include any number, including zero.
[0217] In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.
[0218] Moreover, the claims should not be read as limited to the provided order or elements unless stated to that effect. In addition, use of the terms "means for" in any claim is intended to invoke 35 U.S.C. §112, ¶ 6 or means-plus-function claim format, and any claim without the terms "means for" is not so intended.

Claims

1. A method implemented in a content consumption device, the method comprising: collecting first social interaction data associated with a consumption of a media content on the content consumption device, wherein at least a part of the first social interaction data is for transmission to other content consumption devices consuming the media content, wherein the first social interaction data comprise any of (i) one or more first comments related to the media content and (ii) one or more first chat messages occurring during the consumption of the media content; generating a user profile based on the collected first social interaction data; receiving second social interaction data comprising any of (i) one or more second comments related to the media content and (ii) one or more second chat messages occurring during the consumption of the media content; filtering the second social interaction data based on the user profile; and displaying the filtered second social interaction data on the content consumption device or transmitting the filtered second social interaction data to at least one other content consumption device consuming the media content.
2. The method according to claim 1, wherein the second social interaction data are received from any of a local user interface of the content consumption device and a user device associated with the content consumption device.
3. The method according to claim 1, wherein the second social interaction data are received from any of the other content consumption devices consuming the media content.
4. The method according to any of claims 1 to 3, wherein said first comments were collected prior to the consumption of the media content on the content consumption device.
5. The method according to any of claims 1 to 3, wherein said first comments were collected during the consumption of the media content on the content consumption device.
6. The method according to any of claims 1 to 5, wherein the first social interaction data further comprise one or more responses on one or more previously received comments from any of the other content consumption devices consuming the media content, and wherein the user profile is further based on the one or more responses.
7. The method according to any of claims 1 to 6, comprising collecting one or more posts on a social media that are linked to the media content, wherein the user profile is further based on the collected one or more posts.
8. The method according to any of claims 1 to 7, comprising collecting user data from any of a social media user profile and a social media application, wherein the user profile is further based on the collected user data.
9. The method according to any of claims 1 to 8, comprising collecting user schedule information associated with the content consumption device, wherein the user profile is further based on the collected user schedule information.
10. The method according to any of claims 1 to 9, comprising collecting context information, wherein the user profile is further based on the collected context information.
11. The method according to any of claims 1 to 10, wherein generating the user profile comprises identifying user preferences based on the collected first social interaction data.
12. The method according to claim 11, wherein the user preferences include any of topics of interest, keywords of interest, blocked topics, blocked keywords, an interaction mode, an interaction intensity, an interaction device, and an interaction application.
13. The method according to any of claims 1 to 12, comprising any of adjusting an interaction mode and adjusting an interaction intensity based on the user profile.
14. The method according to any of claims 1 to 13, comprising dynamically updating the user profile based on changing user preferences and adapting the filtering of the second social interaction data according to the updated user profile.
15. The method according to any of claims 3 to 14, wherein the at least one other content consumption device is selected from the other content consumption devices consuming the media content, based on common interests between the content consumption device and the at least one other content consumption device.
16. The method according to claim 15, wherein the common interests are determined based on the first social interaction data and the second social interaction data.
17. The method according to any of claims 1 to 16, comprising establishing an interaction session between the content consumption device and the at least one other content consumption device.
18. The method according to claim 17, comprising generating session specific enhancements when another media content is consumed by the content consumption device and the at least one other content consumption device.
19. The method according to any of claims 1 to 18, wherein the content consumption device is a digital TV.
20. An apparatus comprising circuitry, including any of a transmitter, a receiver, a processor and a memory, configured to carry out the method according to any of claims 1 to 19.
21. A method comprising: collecting social interaction data from a plurality of content consumption devices consuming a same media content, wherein the social interaction data comprise any of (i) one or more comments related to the same media content, and (ii) one or more chat messages occurring during a consumption of the same media content; selecting at least two content consumption devices from the plurality of content consumption devices consuming the same media content, wherein the at least two content consumption devices are selected based on the collected social interaction data by identifying common interests between the at least two content consumption devices; transmitting information to the at least two content consumption devices, indicating the identified common interests; and establishing a session between said at least two content consumption devices if a response is received from at least one of said at least two content consumption devices that the session should be established.
22. The method according to claim 21, wherein the social interaction data further comprise one or more responses on one or more previously received comments from any of the plurality of content consumption devices consuming the same media content, and wherein the at least two content consumption devices are further selected according to the one or more responses.
23. The method according to any of claims 21 to 22, comprising collecting one or more posts on a social media that are linked to the same media content, wherein the at least two content consumption devices are further selected according to the collected one or more posts.
24. The method according to any of claims 21 to 23, comprising collecting user schedule information associated with at least one content consumption device of the plurality of content consumption devices, wherein the at least two content consumption devices are further selected according to the collected user schedule information.
25. The method according to any of claims 21 to 24, comprising collecting context information, wherein the at least two content consumption devices are further selected according to the collected context information.
26. The method according to any of claims 21 to 25, comprising generating a plurality of user profiles respectively associated with the plurality of content consumption devices based on the collected social interaction data, wherein the common interests are identified based on the plurality of user profiles.
27. The method according to any of claims 21 to 26, comprising identifying a plurality of user preferences respectively associated with the plurality of content consumption devices based on the collected social interaction data, wherein the common interests are identified based on the plurality of user preferences.
28. The method according to claim 27, wherein the user preferences include any of topics of interest, keywords of interest, blocked topics, blocked keywords, an interaction mode, an interaction intensity, an interaction device, and an interaction application.
29. The method according to any of claims 26 to 28, comprising generating session specific enhancements when another media content is consumed by the at least two content consumption devices.
30. An apparatus comprising circuitry, including any of a transmitter, a receiver, a processor and a memory, configured to carry out the method according to any of claims 21 to 29.
PCT/US2022/045610 2021-10-08 2022-10-04 Methods, architectures, apparatuses and systems directed to dynamically enhance interaction of multiple users consuming content WO2023059586A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163253582P 2021-10-08 2021-10-08
US63/253,582 2021-10-08

Publications (1)

Publication Number Publication Date
WO2023059586A1 true WO2023059586A1 (en) 2023-04-13

Family

ID=84360104

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/045610 WO2023059586A1 (en) 2021-10-08 2022-10-04 Methods, architectures, apparatuses and systems directed to dynamically enhance interaction of multiple users consuming content

Country Status (1)

Country Link
WO (1) WO2023059586A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003030547A1 (en) * 2001-09-12 2003-04-10 Opentv, Inc. A method and apparatus for disconnected chat room lurking in an interactive television environment
US20060107314A1 (en) * 2004-11-12 2006-05-18 Cataldi John M Content management system and method
US20090083383A1 (en) * 2007-09-26 2009-03-26 Microsoft Corporation Dynamic instant comments
US20120150698A1 (en) * 2010-12-10 2012-06-14 Mcclements Iv James Burns Media content clip identification and combination architecture
US20140075317A1 (en) * 2012-09-07 2014-03-13 Barstow Systems Llc Digital content presentation and interaction
WO2015148693A1 (en) * 2014-03-26 2015-10-01 Publicover Mark W Computerized method and system for providing customized entertainment content
US20200351562A1 (en) * 2016-12-29 2020-11-05 Dressbot Inc System and method for multi-user digital interactive experience

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHIAO-FANG HSU ET AL: "Ranking Comments on the Social Web", COMPUTATIONAL SCIENCE AND ENGINEERING, 2009. CSE '09. INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 29 August 2009 (2009-08-29), pages 90 - 97, XP031544391, ISBN: 978-1-4244-5334-4 *

Similar Documents

Publication Publication Date Title
US20210051034A1 (en) System for integrating multiple im networks and social networking websites
US8554848B2 (en) Collective asynchronous media review
US9686512B2 (en) Multi-user interactive virtual environment including broadcast content and enhanced social layer content
US20180205797A1 (en) Generating an activity sequence for a teleconference session
US8751572B1 (en) Multi-user chat search and access to chat archive
US8676908B2 (en) Method and system for seamless interaction and content sharing across multiple networks
US9876827B2 (en) Social network collaboration space
US11310463B2 (en) System and method for providing and interacting with coordinated presentations
US10904179B2 (en) System and method for voice networking
WO2019164708A1 (en) Automatic method and system for identifying consensus and resources
US11803579B2 (en) Apparatus, systems and methods for providing conversational assistance
CN113711618B (en) Authoring comments including hyperlinks referencing typing of video content
CN113597626B (en) Real-time meeting information in calendar view
CN112291503B (en) Interaction method and device and electronic equipment
JP2008199584A (en) Interactive communication method between communication terminals, and interactive server and tv network
US10523899B2 (en) System and method for providing and interacting with coordinated presentations
CN110598143B (en) Method, related device and system for displaying instant communication content
US20110258017A1 (en) Interpretation of a trending term to develop a media content channel
CN103401854A (en) Social network service-based television content sharing method
US10257140B1 (en) Content sharing to represent user communications in real-time collaboration sessions
CN112528052A (en) Multimedia content output method, device, electronic equipment and storage medium
US10504277B1 (en) Communicating within a VR environment
US9531822B1 (en) System and method for ranking conversations
WO2023059586A1 (en) Methods, architectures, apparatuses and systems directed to dynamically enhance interaction of multiple users consuming content
Schatz et al. “What Are You Viewing?” Exploring the Pervasive Social TV Experience

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22806030

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22806030

Country of ref document: EP

Kind code of ref document: A1