WO2023055363A1 - System, method and computer-readable medium for rendering a streaming - Google Patents

System, method and computer-readable medium for rendering a streaming Download PDF

Info

Publication number
WO2023055363A1
Authority
WO
WIPO (PCT)
Prior art keywords
streaming
user terminal
environment parameter
mode
rendering
Prior art date
Application number
PCT/US2021/052775
Other languages
French (fr)
Inventor
Yung Chi Hsu
Chung Chiang HSU
Shao Yuan Wu
Ming-Che Cheng
Original Assignee
17Live Japan Inc.
17Live (Usa) Corp.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 17Live Japan Inc., 17Live (Usa) Corp. filed Critical 17Live Japan Inc.
Priority to JP2022516347A priority Critical patent/JP7406713B2/en
Priority to PCT/US2021/052775 priority patent/WO2023055363A1/en
Priority to US17/880,707 priority patent/US11870828B2/en
Publication of WO2023055363A1 publication Critical patent/WO2023055363A1/en
Priority to US18/523,168 priority patent/US20240098125A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/4425Monitoring of client processing errors or hardware failure
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/458Scheduling content for creating a personalised stream, e.g. by combining a locally stored advertisement with an incoming stream; Updating operations, e.g. for OS modules ; time-related management operations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4621Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen

Definitions

  • the present disclosure relates to a system, a method and a computer-readable medium for rendering a streaming.
  • Live streaming refers to online streaming media or live video simultaneously recorded and broadcast in real-time. Live streaming encompasses a wide variety of topics, from social media to video games to professional sports.
  • a method is a method for rendering a streaming on a user terminal being executed by one or a plurality of computers, and includes: rendering the streaming in a first mode, receiving an environment parameter of the user terminal, receiving a timing when the user terminal closes the streaming, determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming, receiving an updated environment parameter of the user terminal, and rendering the streaming in a second mode if the updated environment parameter meets the threshold value.
  • the second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
  • a system is a system for rendering a streaming on a user terminal that includes one or a plurality of processors, and the one or plurality of computer processors execute a machine-readable instruction to perform: rendering the streaming in a first mode, receiving an environment parameter of the user terminal, receiving a timing when the user terminal closes the streaming, determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming, receiving an updated environment parameter of the user terminal, and rendering the streaming in a second mode if the updated environment parameter meets the threshold value.
  • the second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
  • a computer-readable medium is a non-transitory computer-readable medium including a program for rendering a streaming on a user terminal, and the program causes one or a plurality of computers to execute: rendering the streaming in a first mode, receiving an environment parameter of the user terminal, receiving a timing when the user terminal closes the streaming, determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming, receiving an updated environment parameter of the user terminal, and rendering the streaming in a second mode if the updated environment parameter meets the threshold value.
  • the second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
  • FIG. 1 shows a schematic configuration of a communication system in accordance with some embodiments of the present disclosure.
  • FIG. 2 shows an exemplary functional configuration of a communication system in accordance with some embodiments of the present disclosure.
  • FIG. 3 shows an exemplary sequence chart illustrating an operation of a communication system in accordance with some embodiments of the present disclosure.
  • FIG. 4 shows a flowchart illustrating a process in accordance with some embodiments of the present disclosure.
  • a streaming watched by a user (such as a viewer) on the display of a user terminal is the result of processing or rendering various data objects.
  • Some of the data objects may exist on the user terminal (ex., may have been downloaded along with the application used to watch the live streaming) and some of the data objects may be received by the user terminal through a network.
  • the data objects may include a streaming data or a live video/ audio data from another user (such as a streamer) and other objects to perform functions such as gaming, special effects, gift or avatars.
  • For a live streaming provider, which may be the provider of the application through which the live streaming is watched by viewers, it is important to make sure that the viewers enjoy the streaming, or stay in the chat room, as long as possible. It is also important to prevent the viewers from leaving the streaming or the chat room due to environment or device factors such as poor network quality or an overloaded/overburdened device, which may cause delay, lag, or freezing and jeopardize the viewing experience.
  • the present disclosure provides systems, methods and computer-readable mediums that can dynamically or adaptively adjust the data objects used to render the streaming, according to user behaviors or preferences in various conditions, to optimize the viewing experience.
  • FIG. 1 shows a schematic configuration of a communication system 1 according to some embodiments of the present disclosure.
  • the communication system 1 provides a live streaming service with interaction via a content.
  • content refers to a digital content that can be played on a computer device.
  • the communication system 1 enables a user to participate in real-time interaction with other users on-line.
  • the communication system 1 includes a plurality of user terminals 10, a backend server 30, and a streaming server 40.
  • the user terminals 10, the backend server 30 and the streaming server 40 are connected via a network 90, which may be the Internet, for example.
  • the backend server 30 may be a server for synchronizing interaction between the user terminals and/ or the streaming server 40.
  • the backend server 30 may be referred to as the backend server of an application (APP) provider.
  • the streaming server 40 is a server for handling or providing streaming data or video data.
  • the streaming server 40 may be a server from a content delivery network (CDN) provider.
  • CDN content delivery network
  • the backend server 30 and the streaming server 40 may be independent servers.
  • the backend server 30 and the streaming server 40 may be integrated into one server.
  • the user terminals 10 are client devices for the live streaming.
  • the user terminal 10 may be referred to as viewer, streamer, anchor, podcaster, audience, listener or the like.
  • Each of the user terminal 10, the backend server 30, and the streaming server 40 is an example of an information-processing device.
  • the streaming may be live streaming or video replay.
  • the streaming may be audio streaming and/or video streaming.
  • the streaming may include contents such as online shopping, talk shows, talent shows, entertainment events, sports events, music videos, movies, comedy, concerts or the like.
  • FIG. 2 shows an exemplary functional configuration of the communication system 1.
  • the user terminal 10 includes a UI unit 102, a storage unit 104, a user behavior tracker 106, an environment condition tracker 108, a controller 110, a renderer 112, a decoder 114, and a display 116.
  • each of the above components can be viewed as a processing unit or a processor.
  • the UI unit 102 is the interface through which a user of the user terminal 10 operates or plays an APP, which may be an APP providing streaming service in some embodiments.
  • the user behavior tracker 106 is configured to monitor or track behaviors or actions of the user terminal 10 and deliver the results to the controller 110. For example, the actions may include participating in/ opening a streaming or leaving/ closing a streaming on the APP.
  • the storage unit 104 is configured to store a program of the APP, which includes instructions or data objects necessary for the APP to run on the user terminal 10.
  • the storage unit 104 may be constituted with a DRAM or the like, for example.
  • the storage unit 104 is constituted with a magnetic disk, a flash memory, or the like, for example.
  • the storage unit 104 stores various kinds of programs including an operating system, various kinds of data, and the like.
  • the environment condition tracker 108 is configured to monitor or track the environment condition under which the APP is operated, and deliver the results to the controller 110.
  • the environment condition tracker 108 may detect various environment parameters that are related to the operation/ playback of the streaming and thus the viewing experience of the user.
  • the environment parameters may include a CPU usage rate of the user terminal 10, a memory usage rate of the user terminal 10, a time duration or a number of times a freezing/ lag happens during the streaming, a length of time during which the number of frames per second (FPS) with which the streaming is being played is below a predetermined value, and network quality parameters that indicate the quality of the network 90.
  • the network quality parameters may include an application programming interface (API) response time, a transmission control protocol (TCP) connection time, a domain name system (DNS) lookup time, a secure sockets layer (SSL) handshake time, and a downstream bandwidth regarding the streaming service through the network 90.
  • API application programming interface
  • TCP transmission control protocol
  • DNS domain name system
  • SSL secure sockets layer
  • the controller 110 receives the user behavior data and the environment parameters from the user behavior tracker 106 and the environment condition tracker 108, and determines how to render or present the subsequent streaming. For example, the controller may determine a rendering mode based on the user behavior data and the environment parameters, access the storage unit 104 and/or the streaming server 40 for the corresponding data objects, and instruct the renderer 112 to render the streaming.
  • the controller 110 is configured as a CPU, a GPU, or the like, reads various programs that may be part of an APP and are stored in the storage unit 104 or the like into a main memory (not shown here), and executes various kinds of commands or machine-readable instructions included in the programs.
  • the decoder 114 is configured to convert streaming data from the streaming server 40 into video data or frame images for the renderer 112 to render the streaming.
  • the streaming data may be provided to the streaming server 40 by another user who could be referred to as a streamer, a broadcaster or an anchor.
  • the streaming server 40 may receive a streaming media from a streamer and convert it into versions with different resolutions such as 360p, 480p and 720p.
  • Different versions or grades of streaming data may be stored in the streaming server 40 with different uniform resource locators (URLs). In some embodiments, those URLs are assigned by the backend server 30 and transferred to the user terminal 10 by the backend server 30.
  • the decoder 114 may access a URL for a certain grade of streaming data according to the rendering mode determined by the controller 110.
  • the renderer 112 may be configured to perform: receiving instructions regarding the rendering mode from the controller 110; receiving the data objects corresponding to the rendering mode from the storage unit 104; receiving the streaming data (which could be referred to as another data object) corresponding to the rendering mode from the decoder 114; and rendering the streaming media on the display 116.
  • the display 116 could be or include a screen on which the streaming media is enjoyed by the user of the user terminal 10.
  • Fig. 3 shows an exemplary sequence chart illustrating an operation of the communication system 1 according to some embodiments of the present disclosure.
  • step S1 the controller 110 instructs the renderer 112 to render a streaming or a streaming media in a first mode, which may, for example, follow an action of the user to participate in or open a streaming on an APP.
  • step S2 the renderer 112 receives data objects that correspond to the first mode from the storage unit 104.
  • step S3 the renderer 112 receives streaming data or video data (which may be from another user) that corresponds to the first mode from the decoder 114.
  • step S4 the rendered streaming is shown on the display 116.
  • the first mode indicates a higher-performance mode, which requires the renderer 112 to include more or higher-grade data objects from the storage unit 104 and/or to acquire a higher-resolution version of streaming data from the decoder 114 for the streaming rendering.
  • step S5 the controller 110 receives various environment parameters from the environment condition tracker 108.
  • step S6 the user behavior tracker 106 monitors the behavior of the user through the UI unit 102.
  • step S7 the user behavior tracker 106 detects a timing the user closes or turns off the streaming and reports to the controller 110.
  • step S8 the controller 110 determines a threshold value for each of the environment parameters based on the timing the user closes the streaming.
  • the threshold values may be used by the controller 110 to compare with subsequent monitored environment parameters for determining a subsequent rendering mode.
  • the threshold value of an environment parameter may be determined by a predetermined offset from a received value of the environment parameter at the timing the user closes the streaming. For example, for the parameter of CPU usage rate of the user terminal, if the received CPU usage rate at the timing the user closes the streaming is N1%, the threshold value for the CPU usage rate may be determined to be (N1-T1)%, wherein T1 is a predetermined offset. In some embodiments, T1 could be from 2.5 to 5. For another example, for the parameter of memory usage rate of the user terminal, if the received memory usage rate at the timing the user closes the streaming is N2%, the threshold value for the memory usage rate may be determined to be (N2-T2)%, wherein T2 is a predetermined offset. In some embodiments, T2 could be from 2.5 to 5.
  • the environment parameters may include a number of times a freezing or a lag occurs during rendering the streaming in the first mode.
  • a freezing or a lag indicates a pause, stop or delay of the streaming content or the whole user terminal for a period of time such as, for example, 2 to 5 seconds.
  • a threshold value of (N3-T3) times may be determined, wherein T3 is a predetermined value which could be, for example, 2, 3 or 5.
  • the environment parameters may include a length of time during which the number of frames per second (FPS) with which the streaming is being rendered is below a specified value. For example, if within a specified time period (for example, 3, 5, or 10 mins) before the timing the user closes the streaming, the FPS is below a specified value (for example, 30 frames) for N4 seconds, a threshold value of (N4-T4) seconds may be determined, wherein T4 is a predetermined value which could be, for example, 2 to 5.
  • a specified time period for example, 3, 5, or 10 mins
  • N4-T4 a threshold value of (N4-T4) seconds
  • the environment parameters may include a network quality parameter whose value is determined by quality factors such as API response time, TCP connection time, DNS lookup time, SSL Handshake time, and Downstream bandwidth.
  • quality factors such as API response time, TCP connection time, DNS lookup time, SSL Handshake time, and Downstream bandwidth.
  • a score for each of the above factors may be determined according to Table 1 as below, and a value of the network quality parameter may be an average of the scores of the factors which are taken into account. Depending on the actual application or practice, all or some of the factors could be taken into account for determining the network quality parameter.
  • some quality factors may have higher weights than the others when calculating the network quality parameter.
  • a threshold value of (N5+T5) may be determined, wherein T5 is a predetermined value which could be, for example, 5 to 10.
  • the threshold value (N5+T5) may indicate a tighter criterion for subsequent streaming rendering to switch to a lower-performance or a less-demanding mode (such as the second mode) before the network quality parameter drops to the value of N5.
  • step S9 the threshold values determined in step S8 are stored in the storage unit 104.
  • step S10 the controller 110 again receives the environment parameters (or updated environment parameters) from the environment condition tracker 108.
  • step S11 the controller 110 reads the threshold values of the environment parameters stored in the storage unit 104.
  • step S12 the controller 110 compares the threshold values with the environment parameters to see if any environment parameter meets or reaches its threshold value. In some embodiments, if any one of the environment parameters meets its threshold value, the controller 110 will determine to render the streaming in a second mode and instruct the renderer 112 to act accordingly in step S13, which may, for example, follow an action of the user to re-participate in or re-open a streaming on the APP.
  • if no environment parameter meets its threshold value in step S12, the controller 110 determines to keep the first mode rendering and instructs the renderer 112 to act accordingly, and the flow may go back to step S1, which may, for example, follow an action of the user to re-participate in or re-open a streaming on the APP.
  • step S14 the renderer 112 receives data objects that correspond to the second mode from the storage unit 104.
  • step S15 the renderer 112 receives streaming data or video data (which may be from another user) that corresponds to the second mode from the decoder 114.
  • step S16 the rendered streaming is shown on the display 116.
  • the second mode indicates a lower-performance mode, which requires the renderer 112 to include fewer or lower-grade data objects (compared with the first mode) from the storage unit 104 and/or to acquire a lower-resolution or a downgraded version of streaming data (compared with the first mode) from the decoder 114 for the streaming rendering.
  • the second mode instructed by the controller 110 will include fewer gifts, special effects, game functions, avatars, or animations for rendering compared with the first mode.
  • Rendering the streaming with fewer gifts, special effects, game functions, avatars, or animations may relieve or alleviate the user terminal's burden regarding the CPU usage rate and the memory usage rate, may reduce the number of times a freezing or a lag may happen, or may reduce the length of time the FPS is below a preferred or satisfying value.
  • This rendering mode adaptation may prevent the user from closing or leaving the streaming due to unsmooth rendering and may improve the user experience.
  • the second mode instructed by the controller 110 may include a downgraded version of video data from another user (for example, 360p or 480p) for rendering compared with the video data used in the first mode (for example, 720p). Rendering the streaming with a downgraded version of video data may relieve or alleviate the user terminal's burden regarding the network connection condition. This rendering mode adaptation may prevent the user from closing or leaving the streaming due to unsmooth rendering and improve the user experience.
  • Fig. 4 is a flowchart illustrating a process in accordance with some embodiments of the present disclosure. Fig. 4 shows how the threshold values for the environment parameters may be dynamically updated with respect to each user terminal.
  • step S400 the streaming is being rendered, which could be in the first mode, the second mode or any default mode.
  • step S402 the environment parameters are monitored, for example, by the environment condition tracker 108.
  • step S404 a close of the streaming is detected, for example, by the user behavior tracker 106.
  • step S406 the viewing time of the streaming is compared with a predetermined time period VI, which may be performed by the controller 110, for example. If the viewing time is greater than or equal to VI, then the flow goes to step S408, wherein the threshold values for all parameters are kept unchanged. In this situation, the user is judged to have left the streaming for a reason not related to the monitored environment parameters, and therefore there is no need to update or tighten the threshold values of the environment parameters, which will be used for determining the rendering mode in subsequent streaming viewing. For example, a viewing time greater than the predetermined time period VI may indicate that the user has already been satisfied with the streaming. In some embodiments, the predetermined time period VI may be greater than 30 mins or greater than 60 mins.
  • step S406 If the viewing time is found to be less than the predetermined time period VI in step S406, the close of the streaming may be viewed as related to the environment parameters and the flow goes to step S410.
  • step S410 the monitored environment parameters are checked, by the controller 110, for example, to see if the values are within their respective safe zones. If all environment parameters are within their safe zones, the flow goes to step S408, wherein the threshold values for all parameters are kept unchanged. If any environment parameter is outside its safe zone, the flow goes to step S412.
  • a safe zone is a range of the corresponding environment parameter that is considered to be unlikely to cause the user terminal to be overburdened by the streaming rendering. That is, if a detected environment parameter is in its safe zone when the streaming is closed, that environment parameter will not be considered as the reason for a possibly bad viewing experience that results in the streaming close, and hence there is no need to update or tighten the threshold value of the environment parameter, which will be used for determining the rendering mode for subsequent streaming.
  • the range for each safe zone may be defined according to practical application. An example is shown in Table 2 as below.
  • step S412 environment parameters that are found to be outside of their respective safe zones will be given updated thresholds. Examples of methods of threshold updating are given in the description regarding step S8 in Fig. 3, and similar methods can be applied in step S412.
  • the threshold value of the environment parameter CPU usage rate for that specific user terminal may be updated to 75%, which is (80-5)%.
  • the rendering mode will be switched (for example, switched to the second mode described above) to incorporate fewer data objects (such as gifts, game functions, avatars or special effects) or a downgraded version of a data object (such as a streaming data or video data from another user) to alleviate or relieve the user terminal's burden and to keep a satisfying viewing experience.
  • data objects such as gifts, game functions, avatars or special effects
  • a downgraded version of a data object such as a streaming data or video data from another user
  • step S408 or step S412 the flow may go back to step S400 for subsequent streaming rendering, which may, for example, follow when the user terminal initiates streaming next time.
  • Embodiments of the present disclosure disclose a method, a system, and a computer-readable medium for dynamically or adaptively switching the rendering mode for streaming on a user terminal, based on monitored environment parameters of that user terminal, to ensure a satisfying viewing experience on that specific user terminal.
  • the monitored environment parameters are compared with their respective threshold values to determine whether it is necessary to switch the rendering mode to relieve the user terminal's burden for a smooth rendering.
  • the setting of threshold values of the environment parameters for each user could be very different, because they are set according to the relation or correlation between each user's behavior and the monitored environment parameters of the user terminal of that user.
  • the threshold values of the network quality parameters for user A and user B may be set to be 75 (70+5) and 65 (60+5).
  • the threshold values for each user terminal are dynamically adjusted continuously as described above, according to each user's behavior or preference.
  • a study time period is a time period during which a user's closing of streaming will not be used instantly to determine or update the threshold value.
  • the study time period allows the system or the APP to learn the behavior pattern of the user (or user terminal).
  • the APP may catch or calculate the correlation between a behavior of the user (such as closing the streaming) and various environment parameters. Therefore, the concerning level or tolerance level of the user regarding each environment parameter can be figured out to determine a priority or a tightening level of threshold setting for the various environment parameters.
  • the study time period may be a predetermined time period which could be, for example, 1 week or 1 month. In some embodiments, the study time period may be a variable time period until the user terminal finishes XI times of streaming viewing, wherein XI could be, for example, 5 to 10 times.
  • a threshold for the network quality parameter may be set before setting the threshold values for other environment parameters. This mechanism may prevent the situation of unnecessarily downgrading the streaming (for example, from a first mode to a second mode) due to variations of other environment parameters which are not concern points for that user.
  • the threshold values of the environment parameters could be relieved or loosened, by the controller 110, for example, when some conditions are met. For example, when an environment parameter meets its threshold value and the streaming is switched to a lower-performance rendering mode accordingly, there may be an option on the APP providing the streaming for the user to execute to return to the normal/ default or higher-performance rendering mode, regardless of the possibly deteriorated viewing experience due to the environment parameter meeting or exceeding its threshold value.
  • the threshold of that environment parameter may be set looser for subsequent streaming rendering to cater to that user's personal preference.
  • the threshold value may be loosened from 70% to 75% if the user consecutively executes the option to return to the higher-performance rendering mode every time the rendering mode is downgraded because the CPU usage rate reaches the original threshold value 70%.
  • the processing and procedures described in the present disclosure may be realized by software, hardware, or any combination of these in addition to what was explicitly described.
  • the processing and procedures described in the specification may be realized by implementing a logic corresponding to the processing and procedures in a medium such as an integrated circuit, a volatile memory, a non-volatile memory, a non-transitory computer-readable medium and a magnetic disk.
  • the processing and procedures described in the specification can be implemented as a computer program corresponding to the processing and procedures, and can be executed by various kinds of computers.
  • the factors, sub-scores, scores and weights may include a decay factor that causes the strength of the particular data or actions to decay with time, such that more recent data or actions are more relevant when calculating the factors, sub-scores, scores and weights.
  • the factors, sub-score, scores and weights may be continuously updated based on continued tracking of the data or actions. Any type of process or algorithm may be employed for assigning, combining, averaging, and so forth the score for each factor and the weights assigned to the factors and scores.
  • the highlight detection unit 35 may determine factors, sub-scores, scores and weights using machine-learning algorithms trained on historical data, historical actions and past user terminal responses, or data collected from user terminals by exposing them to various options and measuring responses.
  • the factors, subscores, scores and weights may be decided in any suitable manner.
  • system or method described in the above embodiments may be integrated into programs stored in a computer-readable non-transitory medium such as a solid state memory device, an optical disk storage device, or a magnetic disk storage device.
  • programs may be downloaded from a server via the Internet and be executed by processors.

Abstract

The present disclosure relates to a system, a method and a computer-readable medium for rendering a streaming on a user terminal. The method includes rendering the streaming in a first mode, receiving an environment parameter of the user terminal, receiving a timing when the user terminal closes the streaming, determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming, receiving an updated environment parameter of the user terminal, and rendering the streaming in a second mode if the updated environment parameter meets the threshold value. The second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering. The present disclosure can customize the rendering mode for each user and maximize the satisfaction of viewing streaming for each user.

Description

PCT Patent Application
SYSTEM, METHOD AND COMPUTER-READABLE MEDIUM FOR RENDERING A STREAMING
Field of the Invention
[0001] The present disclosure relates to a system, a method and a computer-readable medium for rendering a streaming.
Description of the Prior Art
[0002] Live streaming refers to online streaming media or live video simultaneously recorded and broadcast in real-time. Live streaming encompasses a wide variety of topics, from social media to video games to professional sports.
[0003] User interaction via chat rooms forms a major component of live streaming. Conventionally, to boost the motivation of viewers to participate in the live streaming, the application or platform on which the live streaming is viewed provides functions such as gift sending or gaming to improve the interaction between the viewers and the streamers (or broadcasters).
Summary of the Invention
[0004] A method according to one embodiment of the present disclosure is a method for rendering a streaming on a user terminal being executed by one or a plurality of computers, and includes: rendering the streaming in a first mode, receiving an environment parameter of the user terminal, receiving a timing when the user terminal closes the streaming, determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming, receiving an updated environment parameter of the user terminal, and rendering the streaming in a second mode if the updated environment parameter meets the threshold value. The second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
[0005] A system according to one embodiment of the present disclosure is a system for rendering a streaming on a user terminal that includes one or a plurality of processors, and the one or plurality of computer processors execute a machine-readable instruction to perform: rendering the streaming in a first mode, receiving an environment parameter of the user terminal, receiving a timing when the user terminal closes the streaming, determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming, receiving an updated environment parameter of the user terminal, and rendering the streaming in a second mode if the updated environment parameter meets the threshold value. The second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
[0006] A computer-readable medium according to one embodiment of the present disclosure is a non-transitory computer-readable medium including a program for rendering a streaming on a user terminal, and the program causes one or a plurality of computers to execute: rendering the streaming in a first mode, receiving an environment parameter of the user terminal, receiving a timing when the user terminal closes the streaming, determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming, receiving an updated environment parameter of the user terminal, and rendering the streaming in a second mode if the updated environment parameter meets the threshold value. The second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
Brief description of the drawings
[0007] FIG. 1 shows a schematic configuration of a communication system in accordance with some embodiments of the present disclosure.
[0008] FIG. 2 shows an exemplary functional configuration of a communication system in accordance with some embodiments of the present disclosure.
[0009] Fig. 3 shows an exemplary sequence chart illustrating an operation of a communication system in accordance with some embodiments of the present disclosure.
[0010] Fig. 4 shows a flowchart illustrating a process in accordance with some embodiments of the present disclosure.
Detailed Description
[0011] A streaming watched by a user (such as a viewer) on the display of a user terminal (such as a smartphone) is the result of processing or rendering various data objects. Some of the data objects may exist on the user terminal (ex., may have been downloaded along with the application used to watch the live streaming) and some of the data objects may be received by the user terminal through a network. For example, in a live streaming chat room, the data objects may include a streaming data or a live video/ audio data from another user (such as a streamer) and other objects to perform functions such as gaming, special effects, gift or avatars.
[0012] For a live streaming provider, which may be the provider of the application through which the live streaming is watched by viewers, it is important to make sure that the viewers enjoy the streaming, or stay in the chat room, as long as possible. And it is important to prevent the viewers from leaving the streaming or the chat room due to environment or device factors such as poor network quality or overloaded/ overburdened device, which may cause delay, lag, or freezing and jeopardize the viewing experience.
[0013] Therefore, how to guarantee a smooth viewing experience in various environment or device conditions is crucial. The present disclosure provides systems, methods and computer-readable mediums that can dynamically or adaptively adjust the data objects used to render the streaming, according to user behaviors or preferences in various conditions, to optimize the viewing experience.
[0014] FIG. 1 shows a schematic configuration of a communication system 1 according to some embodiments of the present disclosure. The communication system 1 provides a live streaming service with interaction via a content. Here, the term "content" refers to a digital content that can be played on a computer device. In other words, the communication system 1 enables a user to participate in real-time interaction with other users on-line. The communication system 1 includes a plurality of user terminals 10, a backend server 30, and a streaming server 40. The user terminals 10, the backend server 30 and the streaming server 40 are connected via a network 90, which may be the Internet, for example. The backend server 30 may be a server for synchronizing interaction between the user terminals and/ or the streaming server 40. In some embodiments, the backend server 30 may be referred to as the backend server of an application (APP) provider. The streaming server 40 is a server for handling or providing streaming data or video data. In some embodiments, the streaming server 40 may be a server from a content delivery network (CDN) provider. In some embodiments, the backend server 30 and the streaming server 40 may be independent servers. In some embodiments, the backend server 30 and the streaming server 40 may be integrated into one server. The user terminals 10 are client devices for the live streaming. In some embodiments, the user terminal 10 may be referred to as viewer, streamer, anchor, podcaster, audience, listener or the like. Each of the user terminal 10, the backend server 30, and the streaming server 40 is an example of an information-processing device. In some embodiments, the streaming may be live streaming or video replay. In some embodiments, the streaming may be audio streaming and/or video streaming. In some embodiments, the streaming may include contents such as online shopping, talk shows, talent shows, entertainment events, sports events, music videos, movies, comedy, concerts or the like.
[0015] FIG. 2 shows an exemplary functional configuration of the communication system 1. In this embodiment, the user terminal 10 includes a UI unit 102, a storage unit 104, a user behavior tracker 106, an environment condition tracker 108, a controller 110, a renderer 112, a decoder 114, and a display 116. In some embodiments, each of the above components can be viewed as a processing unit or a processor.
[0016] The UI unit 102 is the interface through which a user of the user terminal 10 operates or plays an APP, which may be an APP providing streaming service in some embodiments. The user behavior tracker 106 is configured to monitor or track behaviors or actions of the user terminal 10 and deliver the results to the controller 110. For example, the actions may include participating in/opening a streaming or leaving/closing a streaming on the APP.
[0017] The storage unit 104 is configured to store a program of the APP, which includes instructions or data objects necessary for the APP to run on the user terminal 10. The storage unit 104 may be constituted with a DRAM or the like, for example. In some embodiments, the storage unit 104 is constituted with a magnetic disk, a flash memory, or the like, for example. The storage unit 104 stores various kinds of programs including an operating system, various kinds of data, and the like.
[0018] The environment condition tracker 108 is configured to monitor or track the environment condition under which the APP is operated, and deliver the results to the controller 110. The environment condition tracker 108 may detect various environment parameters that are related to the operation/playback of the streaming and thus the viewing experience of the user. In some embodiments, the environment parameters may include a CPU usage rate of the user terminal 10, a memory usage rate of the user terminal 10, a time duration or a number of times a freezing/lag happens during the streaming, a length of time during which the number of frames per second (FPS) with which the streaming is being played is below a predetermined value, and network quality parameters that indicate the quality of the network 90. For example, the network quality parameters may include an application programming interface (API) response time, a transmission control protocol (TCP) connection time, a domain name system (DNS) lookup time, a secure sockets layer (SSL) handshake time, and a downstream bandwidth regarding the streaming service through the network 90.
[0019] The controller 110 receives the user behavior data and the environment parameters from the user behavior tracker 106 and the environment condition tracker 108, and determines how to render or present the subsequent streaming. For example, the controller may determine a rendering mode based on the user behavior data and the environment parameters, access the storage unit 104 and/or the streaming server 40 for the corresponding data objects, and instruct the renderer 112 to render the streaming.
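For illustration only, the parameters listed above can be pictured as one snapshot that the environment condition tracker 108 hands to the controller 110. The following Python sketch is an assumption about how such a snapshot might be structured; the field names are not taken from the disclosure.
```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class EnvironmentSnapshot:
    """Hypothetical snapshot handed from the environment condition tracker 108 to the controller 110."""
    cpu_usage_pct: float                 # CPU usage rate of the user terminal 10 (%)
    memory_usage_pct: float              # memory usage rate of the user terminal 10 (%)
    freeze_count: int                    # number of times a freezing/lag happened during the streaming
    low_fps_seconds: float               # time (s) the FPS stayed below a predetermined value
    network_factors: Dict[str, float] = field(default_factory=dict)
    # e.g. API response time, TCP connection time, DNS lookup time,
    # SSL handshake time, downstream bandwidth
```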
[0020] In some embodiments, the controller 110 is configured as a CPU, a GPU, or the like, reads various programs that may be part of an APP and are stored in the storage unit 104 or the like into a main memory (not shown here), and executes various kinds of commands or machine-readable instructions included in the programs.
[0021] The decoder 114 is configured to convert streaming data from the streaming server 40 into video data or frame images for the renderer 112 to render the streaming. The streaming data may be provided to the streaming server 40 by another user who could be referred to as a streamer, a broadcaster or an anchor. There may be various versions or grades of the streaming data from one streamer. For example, the streaming server 40 may receive a streaming media from a streamer and convert it into versions with different resolutions such as 360p, 480p and 720p. Different versions or grades of streaming data may be stored in the streaming server 40 with different uniform resource locators (URLs). In some embodiments, those URLs are assigned by the backend server 30 and transferred to the user terminal 10 by the backend server 30. The decoder 114 may access a URL for a certain grade of streaming data according to the rendering mode determined by the controller 110.
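A minimal sketch of how the decoder 114 might resolve the stream grade for the mode chosen by the controller 110 is shown below. The mode names, resolution choices and URLs are hypothetical; the disclosure only states that per-grade URLs are assigned by the backend server 30 and delivered to the user terminal 10.
```python
# Hypothetical per-grade URLs (in practice assigned by the backend server 30).
STREAM_URLS = {
    "360p": "https://cdn.example.com/live/room123/360p.m3u8",
    "480p": "https://cdn.example.com/live/room123/480p.m3u8",
    "720p": "https://cdn.example.com/live/room123/720p.m3u8",
}

# Assumed mapping from rendering mode to stream grade (first mode = higher resolution).
MODE_TO_GRADE = {"first": "720p", "second": "480p"}

def select_stream_url(rendering_mode: str) -> str:
    """Return the URL of the stream grade matching the rendering mode from the controller 110."""
    return STREAM_URLS[MODE_TO_GRADE[rendering_mode]]
```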
[0022] The renderer 112 may be configured to perform: receiving instructions regarding the rendering mode from the controller 110; receiving the data objects corresponding to the rendering mode from the storage unit 104; receiving the streaming data (which could be referred to as another data object) corresponding to the rendering mode from the decoder 114; and rendering the streaming media on the display 116. The display 116 could be or include a screen on which the streaming media is enjoyed by the user of the user terminal 10.
[0023] Fig. 3 shows an exemplary sequence chart illustrating an operation of the communication system 1 according to some embodiments of the present disclosure.
[0024] In step S1, the controller 110 instructs the renderer 112 to render a streaming or a streaming media in a first mode, which may, for example, follow an action of the user to participate in or open a streaming on an APP. In step S2, the renderer 112 receives data objects that correspond to the first mode from the storage unit 104. In step S3, the renderer 112 receives streaming data or video data (which may be from another user) that corresponds to the first mode from the decoder 114. In step S4, the rendered streaming is shown on the display 116.
[0025] In some embodiments, the first mode indicates a higher-performance mode, which requires the renderer 112 to include more or higher-grade data objects from the storage unit 104 and/or to acquire a higher-resolution version of streaming data from the decoder 114 for the streaming rendering.
[0026] In step S5, the controller 110 receives various environment parameters from the environment condition tracker 108. In step S6, the user behavior tracker 106 monitors the behavior of the user through the UI unit 102. In step S7, the user behavior tracker 106 detects a timing the user closes or turns off the streaming and reports to the controller 110.
[0027] In step S8, the controller 110 determines a threshold value for each of the environment parameters based on the timing the user closes the streaming. The threshold values may be used by the controller 110 to compare with subsequent monitored environment parameters for determining a subsequent rendering mode.
[0028] In some embodiments, the threshold value of an environment parameter may be determined by a predetermined offset from a received value of the environment parameter at the timing the user closes the streaming. For example, for the parameter of CPU usage rate of the user terminal, if the received CPU usage rate at the timing the user closes the streaming is N1%, the threshold value for the CPU usage rate may be determined to be (N1-T1)%, wherein T1 is a predetermined offset. In some embodiments, T1 could be from 2.5 to 5. For another example, for the parameter of memory usage rate of the user terminal, if the received memory usage rate at the timing the user closes the streaming is N2%, the threshold value for the memory usage rate may be determined to be (N2-T2)%, wherein T2 is a predetermined offset. In some embodiments, T2 could be from 2.5 to 5.
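A minimal sketch of this offset-based threshold rule, assuming offsets T1 and T2 in the 2.5 to 5 range mentioned above; the function name and the memory-usage sample value are illustrative only.
```python
def threshold_from_close(value_at_close_pct: float, offset: float) -> float:
    """Set the threshold 'offset' percentage points below the value observed
    when the user closed the streaming (e.g. N1% -> (N1 - T1)%)."""
    return value_at_close_pct - offset

# Example: CPU usage was 80% and memory usage 65% (assumed) at the close timing, with T1 = T2 = 5.
cpu_threshold = threshold_from_close(80.0, 5.0)      # 75.0 %
memory_threshold = threshold_from_close(65.0, 5.0)   # 60.0 %
```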
[0029] In some embodiments, the environment parameters may include a number of times a freezing or a lag occurs during rendering the streaming in the first mode. A freezing or a lag indicates a pause, stop or delay of the streaming content or the whole user terminal for a period of time such as, for example, 2 to 5 seconds. For example, if within a specified time period (for example, 3, 5, or 10 mins) before the timing the user closes the streaming, the number of times a freezing or a lag is detected is N3, a threshold value of (N3-T3) times may be determined, wherein T3 is a predetermined value which could be, for example, 2, 3 or 5.
[0030] In some embodiments, the environment parameters may include a length of time during which the number of frames per second (FPS) with which the streaming is being rendered is below a specified value. For example, if within a specified time period (for example, 3, 5, or 10 mins) before the timing the user closes the streaming, the FPS is below a specified value (for example, 30 frames per second) for N4 seconds, a threshold value of (N4-T4) seconds may be determined, wherein T4 is a predetermined value which could be, for example, 2 to 5.
[0031] In some embodiments, the environment parameters may include a network quality parameter whose value is determined by quality factors such as API response time, TCP connection time, DNS lookup time, SSL Handshake time, and Downstream bandwidth. For example, a score for each of the above factors may be determined according to Table 1 as below, and a value of the network quality parameter may be an average of the scores of the factors which are taken into account. Depending on the actual application or practice, all or some of the factors could be taken into account for determining the network quality parameter. In some embodiments, some quality factors may have higher weights than the others when calculating the network quality parameter.
Table 1 [image not reproduced: scores assigned to each network quality factor, i.e. API response time, TCP connection time, DNS lookup time, SSL handshake time, and downstream bandwidth]
[0032] In some embodiments, when in the vicinity of the timing the user closes the streaming, if the value of the network quality parameter is N5, a threshold value of (N5+T5) may be determined, wherein T5 is a predetermined value which could be, for example, 5 to 10. The threshold value (N5+T5) may indicate a tighter criterion for subsequent streaming rendering to switch to a lower-performance or a less-demanding mode (such as the second mode) before the network quality parameter drops to the value of N5.
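The sketch below illustrates the idea of paragraphs [0031] and [0032]: score each network factor, combine the scores (optionally weighted), and set the threshold at (N5 + T5). Because the score mapping of Table 1 is not reproduced in this text, the per-factor scores used here are placeholders.
```python
def network_quality(scores, weights=None):
    """Weighted average of per-factor scores (API response time, TCP connection time,
    DNS lookup time, SSL handshake time, downstream bandwidth); equal weights by default."""
    if weights is None:
        weights = {name: 1.0 for name in scores}
    total_weight = sum(weights[name] for name in scores)
    return sum(scores[name] * weights[name] for name in scores) / total_weight

# Placeholder per-factor scores near the timing the user closed the streaming.
scores_at_close = {"api_response": 70, "tcp_connect": 80, "dns_lookup": 75,
                   "ssl_handshake": 65, "downstream_bw": 60}
n5 = network_quality(scores_at_close)
t5 = 5                                  # predetermined value, 5 to 10 per the text
network_threshold = n5 + t5             # switch modes before quality drops all the way to N5
```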
[0033] In step S9, the threshold values determined in step S8 are stored in the storage unit 104. In step S10, the controller 110 again receives the environment parameters (or updated environment parameters) from the environment condition tracker 108. In step S11, the controller 110 reads the threshold values of the environment parameters stored in the storage unit 104.
[0034] In step S12, the controller 110 compares the threshold values with the environment parameters to see if any environment parameter meets or reaches its threshold value. In some embodiments, if any one of the environment parameters meets its threshold value, the controller 110 will determine to render the streaming in a second mode and instruct the renderer 112 to act accordingly in step S13, which may, for example, follow an action of the user to re-participate in or re-open a streaming on the APP. If no environment parameter meets its threshold value in step S12, the controller 110 determines to keep the first mode rendering and instructs the renderer 112 to act accordingly, and the flow may go back to step S1, which may, for example, follow an action of the user to re-participate in or re-open a streaming on the APP.
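A minimal sketch of the comparison in step S12, assuming the thresholds from step S8 have been stored per parameter. The direction of each comparison is an assumption: resource-type parameters are treated as "met" when they rise to their tightened thresholds, while the network quality parameter is treated as "met" when it falls to its (N5 + T5) threshold. The threshold values shown are illustrative.
```python
# Illustrative thresholds stored in step S9 for this user terminal.
thresholds = {
    "cpu_usage_pct": 75.0,
    "memory_usage_pct": 60.0,
    "freeze_count": 3,
    "low_fps_seconds": 8.0,
    "network_quality": 70.0,   # (N5 + T5): falling to this value triggers the switch
}

def choose_rendering_mode(updated_params: dict, thresholds: dict) -> str:
    """Step S12: return 'second' if any updated environment parameter meets its threshold, else 'first'."""
    for name, limit in thresholds.items():
        value = updated_params.get(name)
        if value is None:
            continue
        if name == "network_quality":
            if value <= limit:           # network quality has dropped to the tightened threshold
                return "second"
        elif value >= limit:             # resource pressure has reached the tightened threshold
            return "second"
    return "first"
```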
[0035] In step S14, the renderer 112 receives data objects that correspond to the second mode from the storage unit 104. In step S15, the renderer 112 receives streaming data or video data (which may be from another user) that corresponds to the second mode from the decoder 114. In step S16, the rendered streaming is shown on the display 116.
[0036] In some embodiments, the second mode indicates a lower-performance mode, which requires the renderer 112 to include fewer or lower-grade data objects (compared with the first mode) from the storage unit 104 and/or to acquire a lower-resolution or a downgraded version of streaming data (compared with the first mode) from the decoder 114 for the streaming rendering.
[0037] In some embodiments, if the environment parameters found to meet their threshold values include the CPU usage rate, the memory usage rate, the number of times a freezing or a lag occurs during rendering the streaming in the first mode, or the length of time during which the FPS with which the streaming is rendered in the first mode is below a predetermined value, the second mode instructed by the controller 110 will include fewer gifts, special effects, game functions, avatars, or animations for rendering compared with the first mode. Rendering the streaming with fewer gifts, special effects, game functions, avatars, or animations may relieve or alleviate the user terminal's burden regarding the CPU usage rate and the memory usage rate, may reduce the number of times a freezing or a lag may happen, or may reduce the length of time the FPS is below a preferred or satisfying value. This rendering mode adaptation may prevent the user from closing or leaving the streaming due to unsmooth rendering and may improve the user experience.
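As an illustration of the "fewer data objects" branch described above, the sketch below selects which optional data-object categories the renderer keeps in the second mode. The category names follow the examples in this paragraph; the function, the exact objects dropped and the mode names are assumptions, not the disclosure's implementation.
```python
# Optional data objects rendered in the first (higher-performance) mode.
FIRST_MODE_OBJECTS = {"gifts", "special_effects", "game_functions", "avatars", "animations"}

def data_objects_for_mode(mode: str) -> set:
    """Illustrative selection: the second mode keeps fewer optional objects to lower
    CPU/memory load, reduce freezes, and shorten the time spent below the preferred FPS."""
    if mode == "second":
        # Assumed trimming; which objects are dropped is a design choice.
        return FIRST_MODE_OBJECTS - {"special_effects", "animations", "game_functions"}
    return FIRST_MODE_OBJECTS
```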
[0038] In some embodiments, if the environment parameters found to meet their threshold values include the network quality parameter determined by the API response time, the TCP connection time, the DNS lookup time, the SSL handshake time and/or the downstream bandwidth, the second mode instructed by the controller 110 may include a downgraded version of video data from another user (for example, 360p or 480p) for rendering compared with the video data used in the first mode (for example, 720p). Rendering the streaming with a downgraded version of video data may relieve or alleviate the user terminal's burden regarding the network connection condition. This rendering mode adaptation may prevent the user from closing or leaving the streaming due to unsmooth rendering and improve the user experience.
[0039] Fig. 4 is a flowchart illustrating a process in accordance with some embodiments of the present disclosure. Fig. 4 shows how the threshold values for the environment parameters may be dynamically updated with respect to each user terminal.
[0040] In step S400, the streaming is being rendered, which could be in the first mode, the second mode or any default mode. In step S402, the environment parameters are monitored, for example, by the environment condition tracker 108. In step S404, a close of the streaming is detected, for example, by the user behavior tracker 106.
[0041] In step S406, the viewing time of the streaming is compared with a predetermined time period VI, which may be performed by the controller 110, for example. If the viewing time is greater than or equal to VI, then the flow goes to step S408, wherein the threshold values for all parameters are kept unchanged. In this situation, the user is judged to have left the streaming for a reason not related to the monitored environment parameters, and therefore there is no need to update or tighten the threshold values of the environment parameters, which will be used for determining the rendering mode in subsequent streaming viewing. For example, a viewing time greater than the predetermined time period VI may indicate that the user has already been satisfied with the streaming. In some embodiments, the predetermined time period VI may be greater than 30 mins or greater than 60 mins.
[0042] If the viewing time is found to be less than the predetermined time period V1 in step S406, the close of the streaming may be regarded as related to the environment parameters, and the flow goes to step S410.
[0043] In step S410, the monitored environment parameters are checked, by the controller 110, for example, to see whether their values are within their respective safe zones. If all environment parameters are within their safe zones, the flow goes to step S408, wherein the threshold values for all parameters are kept unchanged. If any environment parameter falls outside of its safe zone, the flow goes to step S412.
[0044] A safe zone is a range of the corresponding environment parameter within which the parameter is considered unlikely to overburden the user terminal for the streaming rendering. That is, if a detected environment parameter is in its safe zone when the streaming is closed, that environment parameter will not be considered the reason for a possibly bad viewing experience that resulted in the streaming being closed, and hence there is no need to update or tighten the threshold value of that environment parameter, which will be used for determining the rendering mode for subsequent streaming. The range for each safe zone may be defined according to the practical application. An example is shown in Table 2 below.
Table 2: example safe zones for the environment parameters
[0045] In step S412, environment parameters that are found to be outside of their respective safe zones will be given updated thresholds. Examples of methods of threshold updating are given in the description regarding step S8 in Fig. 3, and similar methods can be applied in step S412.
[0046] For an environment parameter that is outside of its safe zone, it is likely that the user closed the streaming because that specific environment parameter reached a value that impairs or deteriorates the viewing experience for that user. For example, the viewing experience may be impaired when the CPU usage rate reaches 80%, which may happen when the user concurrently operates various applications. Therefore, an updated, tighter threshold of that environment parameter for that specific user is needed to prevent the user from leaving a streaming for the same reason in subsequent streaming viewing. For example, the threshold value of the environment parameter CPU usage rate for that specific user terminal may be updated to 75%, that is, (80 - 5)%. In this way, the next time the user is viewing a streaming and the CPU usage rate climbs to 75%, the rendering mode will be switched (for example, to the second mode described above) to incorporate fewer data objects (such as gifts, game functions, avatars or special effects) or a downgraded version of a data object (such as streaming data or video data from another user), to relieve the user terminal's burden and keep the viewing experience satisfactory.
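A compact sketch of the threshold-update flow of Fig. 4 (steps S406 to S412) is given below. The value of V1, the safe zones and the per-parameter offsets are placeholders chosen for illustration; the disclosure does not fix them.

```python
# Minimal sketch of the Fig. 4 flow (steps S406-S412), run when a close of the
# streaming is detected. V1, the safe zones and the tightening offsets below
# are illustrative placeholders, not values taken from the disclosure.

V1_SECONDS = 30 * 60  # predetermined time period V1 (e.g., 30 minutes)

# Hypothetical safe zones (cf. Table 2) expressed as (low, high) ranges.
SAFE_ZONES = {"cpu_usage_pct": (0.0, 60.0), "network_quality": (80.0, 100.0)}
# Tightening direction depends on the parameter: a lower threshold for
# load-type parameters, a higher threshold for quality-type parameters.
TIGHTEN_OFFSET = {"cpu_usage_pct": -5.0, "network_quality": +5.0}


def on_streaming_closed(viewing_time_s: float, monitored: dict, thresholds: dict) -> dict:
    """Return the per-parameter thresholds, tightened only when the close looks
    related to an environment parameter leaving its safe zone."""
    if viewing_time_s >= V1_SECONDS:            # S406 -> S408: keep all thresholds
        return thresholds
    updated = dict(thresholds)
    for name, value in monitored.items():       # S410: safe-zone check per parameter
        low, high = SAFE_ZONES.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):          # S412: tighten by a predetermined offset
            updated[name] = value + TIGHTEN_OFFSET.get(name, 0.0)
    return updated


# Example: CPU usage was 80% at close after a short viewing time; the threshold
# for this user terminal becomes 75% for subsequent streaming.
print(on_streaming_closed(10 * 60, {"cpu_usage_pct": 80.0}, {"cpu_usage_pct": 85.0}))
```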
[0047] After step S408 or step S412, the flow may go back to step S400 for subsequent streaming rendering, which may take place, for example, the next time the user terminal initiates streaming.
[0048] Embodiments of the present disclosure disclose a method, a system, and a computer-readable medium for dynamically or adaptively switching the rendering mode for streaming on a user terminal, based on monitored environment parameters of that user terminal, to ensure a satisfying viewing experience on that specific user terminal. The monitored environment parameters are compared with their respective threshold values to determine whether it is necessary to switch the rendering mode so as to relieve the user terminal's burden and keep the rendering smooth. The threshold values of the environment parameters may differ significantly from user to user, because they are set according to the correlation between each user's behavior and the monitored environment parameters of that user's terminal.
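A high-level sketch of the resulting per-terminal loop (monitor, compare against the user-specific thresholds, switch mode) is given below; the function names, the trigger direction per parameter and the polling interval are assumptions, not details fixed by the disclosure.

```python
# Sketch of the overall per-terminal loop implied by the disclosure: monitor the
# environment parameters, compare them with the user-specific thresholds, and
# switch the rendering mode when any threshold is met. Names and the polling
# interval are illustrative assumptions.

import time


def meets_threshold(name: str, value: float, threshold: float) -> bool:
    # Assumed convention: quality-type parameters trigger when they drop to the
    # threshold; load-type parameters trigger when they climb to it.
    return value <= threshold if name == "network_quality" else value >= threshold


def monitor_and_render(get_environment, thresholds: dict, render, poll_s: float = 1.0):
    while True:
        params = get_environment()          # e.g., from the environment condition tracker 108
        triggered = any(
            meets_threshold(n, v, thresholds[n]) for n, v in params.items() if n in thresholds
        )
        # Simplified: the mode is re-evaluated every poll; the disclosure does
        # not fix when or whether to switch back to the first mode.
        render("second" if triggered else "first")   # e.g., performed by the renderer 112
        time.sleep(poll_s)
```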
[0049] Different users may have different tolerance levels regarding different environment parameters. For example, if user A always turns off the streaming when the network quality parameter deteriorates to 70 and user B always turns off the streaming when the network quality parameter deteriorates to 60, the threshold values of the network quality parameter for user A and user B may be set to 75 (70 + 5) and 65 (60 + 5), respectively. The threshold values for each user terminal are continuously and dynamically adjusted as described above, according to each user's behavior or preference. By dynamically switching the rendering mode based on the threshold values of the various environment parameters, which are customized for each user terminal, the present disclosure can effectively maximize each user's satisfaction with streaming viewing.
[0050] In some embodiments, there may be a study time period before the threshold values of the environment parameters are set. A study time period is a time period during which a user's closing of the streaming is not used immediately to determine or update the threshold values. For example, during an initial stage of a user viewing streaming in the APP, the study time period allows the system or the APP to learn the behavior pattern of the user (or user terminal). During the learning process, over several rounds of streaming viewing, the APP may capture or calculate the correlation between a behavior of the user (such as closing the streaming) and the various environment parameters. In this way, the user's level of concern or tolerance regarding each environment parameter can be determined, which in turn determines a priority or a tightening level for setting the thresholds of the various environment parameters. In some embodiments, the study time period may be a predetermined time period, for example, 1 week or 1 month. In some embodiments, the study time period may be a variable time period that lasts until the user terminal finishes X1 sessions of streaming viewing, wherein X1 could be, for example, 5 to 10.
[0051] For example, if, during the study time period, user A is found to be more affected by the network quality parameter, that is, the closing behavior is highly correlated with a lower value of the network quality parameter and is less correlated with the other environment parameters, then a threshold for the network quality parameter may be set before the threshold values for the other environment parameters are set. This mechanism may prevent the streaming from being unnecessarily downgraded (for example, from the first mode to the second mode) due to variations of other environment parameters that are not points of concern for that user.
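One way such a correlation could be estimated is sketched below, using a Pearson correlation between a binary early-close label and each recorded parameter value. This is only an assumed realization of the study-period idea; the disclosure does not prescribe a specific correlation measure, and the session records shown are toy values for illustration.

```python
# Sketch: estimate, per user, how strongly each environment parameter co-varies
# with closing the streaming early, and set a threshold first for the most
# correlated parameter. The correlation measure is one possible choice.

from math import sqrt


def pearson(xs: list, ys: list) -> float:
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0


def rank_parameters(sessions: list) -> list:
    """sessions: one record per viewing, e.g.
    {"closed_early": 1, "network_quality": 62, "cpu_usage": 45}"""
    labels = [s["closed_early"] for s in sessions]
    names = [k for k in sessions[0] if k != "closed_early"]
    corr = {n: abs(pearson([s[n] for s in sessions], labels)) for n in names}
    # The parameter with the highest correlation gets its threshold set first.
    return sorted(names, key=lambda n: corr[n], reverse=True)


sessions = [
    {"closed_early": 1, "network_quality": 62, "cpu_usage": 45},
    {"closed_early": 0, "network_quality": 85, "cpu_usage": 70},
    {"closed_early": 1, "network_quality": 58, "cpu_usage": 50},
    {"closed_early": 0, "network_quality": 90, "cpu_usage": 65},
]
print(rank_parameters(sessions))  # e.g., ['network_quality', 'cpu_usage']
```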
[0052] In some embodiments, there may be a mechanism by which the threshold values of the environment parameters can be relaxed or loosened, by the controller 110, for example, when certain conditions are met. For example, when an environment parameter meets its threshold value and the streaming is accordingly switched to a lower-performance rendering mode, the APP providing the streaming may offer an option that the user can execute to return to the normal/default or higher-performance rendering mode, regardless of the possibly deteriorated viewing experience caused by the environment parameter meeting or exceeding its threshold value.
[0053] In some embodiments, if a user consecutively (for example, 3 or 5 times in a row) executes the above option to insist on a higher-performance rendering mode with respect to a specific environment parameter (even though that environment parameter already meets its threshold value), then the threshold of that environment parameter may be loosened for subsequent streaming rendering to cater to that user's personal preference. For example, in the case where the environment parameter is the CPU usage rate, the threshold value may be loosened from 70% to 75% if the user executes the option to return to the higher-performance rendering mode every time the rendering mode is downgraded because the CPU usage rate reaches the original threshold value of 70%.
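A minimal sketch of this loosening mechanism follows, assuming a per-parameter streak counter kept on the user terminal; the streak length and the relax step are illustrative, not values fixed by the disclosure.

```python
# Sketch of the loosening mechanism: if the user overrides the downgrade for
# the same environment parameter several times in a row, relax that
# parameter's threshold. The streak length (3) and the relax step are
# illustrative assumptions.

CONSECUTIVE_OVERRIDES_NEEDED = 3
RELAX_STEP = {"cpu_usage_pct": +5.0}  # e.g., 70% -> 75% for the CPU usage rate

override_streak: dict = {}


def on_user_override(param: str, thresholds: dict) -> dict:
    """Called when the user opts back into the higher-performance mode while
    `param` is at or beyond its threshold."""
    override_streak[param] = override_streak.get(param, 0) + 1
    if override_streak[param] >= CONSECUTIVE_OVERRIDES_NEEDED:
        thresholds[param] = thresholds[param] + RELAX_STEP.get(param, 0.0)
        override_streak[param] = 0
    return thresholds


def on_downgrade_accepted(param: str) -> None:
    """Reset the streak when the user keeps the lower-performance mode."""
    override_streak[param] = 0
```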
[0054] The processing and procedures described in the present disclosure may be realized by software, hardware, or any combination of these, in addition to what was explicitly described. For example, the processing and procedures described in the specification may be realized by implementing logic corresponding to the processing and procedures in a medium such as an integrated circuit, a volatile memory, a non-volatile memory, a non-transitory computer-readable medium or a magnetic disk. Further, the processing and procedures described in the specification can be implemented as a computer program corresponding to the processing and procedures, and can be executed by various kinds of computers.
[0055] In some embodiments, the factors, sub-scores, scores and weights may include a decay factor that causes the strength of particular data or actions to decay with time, such that more recent data or actions are more relevant when calculating the factors, sub-scores, scores and weights. The factors, sub-scores, scores and weights may be continuously updated based on continued tracking of the data or actions. Any suitable process or algorithm may be employed for assigning, combining, or averaging the score for each factor and the weights assigned to the factors and scores. In particular embodiments, the factors, sub-scores, scores and weights may be determined using machine-learning algorithms trained on historical data, historical actions and past user terminal responses, or on data collected from user terminals by exposing them to various options and measuring responses. In some embodiments, the factors, sub-scores, scores and weights may be decided in any suitable manner.
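The decay factor mentioned above is commonly realized as an exponential time-decay weight; a hedged sketch follows. The half-life value is an assumption and the disclosure does not mandate this particular form.

```python
# Sketch of one possible decay factor: weight each recorded action by an
# exponential function of its age so that recent behavior dominates the
# factors, sub-scores, scores and weights. The half-life is an assumption.

import time
from typing import List, Optional, Tuple

HALF_LIFE_DAYS = 7.0  # illustrative half-life


def decayed_weight(event_timestamp: float, now: Optional[float] = None) -> float:
    now = time.time() if now is None else now
    age_days = max(0.0, (now - event_timestamp) / 86400.0)
    return 0.5 ** (age_days / HALF_LIFE_DAYS)


def decayed_score(events: List[Tuple[float, float]]) -> float:
    """events: (timestamp, raw_score) pairs -> time-decayed weighted average."""
    weights = [decayed_weight(ts) for ts, _ in events]
    total = sum(weights)
    return sum(w * s for w, (_, s) in zip(weights, events)) / total if total else 0.0
```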
[0056] Furthermore, the system or method described in the above embodiments may be integrated into programs stored in a computer-readable non-transitory medium such as a solid state memory device, an optical disk storage device, or a magnetic disk storage device. Alternatively, the programs may be downloaded from a server via the Internet and be executed by processors.
[0057] Although technical content and features of the present invention are described above, a person having ordinary knowledge in the technical field of the present invention may still make many variations and modifications without departing from the teaching and disclosure of the present invention. Therefore, the scope of the present invention is not limited to the embodiments already disclosed, but includes other variations and modifications that do not depart from the present invention, and is defined by the scope of the appended claims.
Description of reference numerals
1 Communication system
10 User terminal
102 UI unit
104 Storage Unit
106 User Behavior Tracker
108 Environment Condition Tracker
110 Controller
112 Renderer
114 Decoder
116 Display
30 Backend server
40 Streaming server
90 Network

Claims

CLAIMS

We Claim:
1. A method for rendering a streaming on a user terminal, comprising: rendering the streaming in a first mode; receiving an environment parameter of the user terminal; receiving a timing when the user terminal closes the streaming; determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming; receiving an updated environment parameter of the user terminal; and rendering the streaming in a second mode if the updated environment parameter meets the threshold value; wherein the second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
2. The method according to claim 1, wherein the threshold value of the environment parameter is determined by a predetermined offset from a value of the environment parameter at the timing the user terminal closes the streaming.
3. The method according to claim 2, wherein the environment parameter is a CPU usage rate of the user terminal or a memory usage rate of the user terminal.
4. The method according to claim 2, wherein the environment parameter is a number of times a freezing or a lag occurs during rendering the streaming in the first mode within a specified time period before the timing the user closes the streaming.
5. The method according to claim 2, wherein the environment parameter is a length of time during which a number of frames per second with which the streaming is rendered in the first mode is below a predetermined value.
6. The method according to claim 2, wherein the environment parameter is a network quality parameter determined by quality factors including an API response time, a TCP connection time, a DNS lookup time, an SSL handshake time, or a downstream bandwidth.
7. The method according to claim 1, wherein the data objects include a gift, a special effect, a game function, an avatar, an animation, or a video data from another user.
8. The method according to claim 3, 4 or 5, wherein the data objects include gifts, special effects, game functions, avatars, or animations, and the second mode includes fewer data objects than the first mode for the rendering.
9. The method according to claim 6, wherein the data objects include video data from another user, and the second mode includes a downgraded version of the video data from another user for the rendering.
10. The method according to claim 6, wherein a score for each quality factor is defined according to a performance grade of the quality factor, and a value of the network quality parameter is determined to be an average of the scores of the quality factors.
11. The method according to claim 1, wherein the determining a threshold value of the environment parameter comprises: defining a safe zone for the environment parameter; determining if the environment parameter is within the safe zone at the timing the user terminal closes the streaming; keeping the threshold value unchanged if the environment parameter is within the safe zone at the timing the user terminal closes the streaming; and tightening the threshold value if the environment parameter is outside of the safe zone at the timing the user terminal closes the streaming.
12. The method according to claim 1, further comprising: receiving a plurality of environment parameters of the user terminal in a study time period; receiving a plurality of timings when the user terminal closes the streaming in the study time period; calculating a correlation between the user terminal closing the streaming and each environment parameter; and determining a threshold value of an environment parameter with the highest correlation with the user terminal closing the streaming before determining threshold values of the rest of the environment parameters.
13. A system for rendering a streaming on a user terminal, comprising one or a plurality of processors, wherein the one or plurality of processors execute a machine-readable instruction to perform: rendering the streaming in a first mode; receiving an environment parameter of the user terminal; receiving a timing when the user terminal closes the streaming; determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming; receiving an updated environment parameter of the user terminal; and rendering the streaming in a second mode if the updated environment parameter meets the threshold value; wherein the second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
14. The system according to claim 13, wherein the threshold value of the environment parameter is determined by a predetermined offset from a value of the environment parameter at the timing the user terminal closes the streaming.
15. The system according to claim 13, wherein the environment parameter is a CPU usage rate of the user terminal, a memory usage rate of the user terminal, a number of times a freezing or a lag occurs during rendering the streaming in the first mode within a specified time period before the timing the user closes the streaming, a length of time during which a number of frames per second with which the streaming is rendered in the first mode is below a predetermined value, or a network quality parameter determined by quality factors including an API response time, a TCP connection time, a DNS lookup time, an SSL handshake time, or a downstream bandwidth.
16. The system according to claim 13, wherein the data objects include a gift, a special effect, a game function, an avatar, an animation, or a video data from another user.
17. The system according to claim 13, wherein the determining a threshold value of the environment parameter comprises: defining a safe zone for the environment parameter; determining if the environment parameter is within the safe zone at the timing the user terminal closes the streaming; keeping the threshold value unchanged if the environment parameter is within the safe zone at the timing the user terminal closes the streaming; and tightening the threshold value if the environment parameter is outside of the safe zone at the timing the user terminal closes the streaming.
18. The system according to claim 13, wherein the one or plurality of processors execute the machine-readable instruction to further perform: receiving a plurality of environment parameters of the user terminal in a study time period; receiving a plurality of timings when the user terminal closes the streaming in the study time period; calculating a correlation between the user terminal closing the streaming and each environment parameter; and determining a threshold value of an environment parameter with the highest correlation with the user terminal closing the streaming before determining threshold values of the rest of the environment parameters.
19. A non-transitory computer-readable medium including a program for rendering a streaming on a user terminal, wherein the program causes one or a plurality of computers to execute: rendering the streaming in a first mode; receiving an environment parameter of the user terminal; receiving a timing when the user terminal closes the streaming; determining a threshold value of the environment parameter based on the timing the user terminal closes the streaming; receiving an updated environment parameter of the user terminal; and rendering the streaming in a second mode if the updated environment parameter meets the threshold value; wherein the second mode includes fewer data objects than the first mode or includes a downgraded version of a data object in the first mode for the rendering.
PCT/US2021/052775 2021-09-30 2021-09-30 System, method and computer-readable medium for rendering a streaming WO2023055363A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2022516347A JP7406713B2 (en) 2021-09-30 2021-09-30 Systems, methods, and computer-readable media for rendering streaming
PCT/US2021/052775 WO2023055363A1 (en) 2021-09-30 2021-09-30 System, method and computer-readable medium for rendering a streaming
US17/880,707 US11870828B2 (en) 2021-09-30 2022-08-04 System, method and computer-readable medium for rendering a streaming
US18/523,168 US20240098125A1 (en) 2021-09-30 2023-11-29 System, method and computer-readable medium for rendering a streaming

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2021/052775 WO2023055363A1 (en) 2021-09-30 2021-09-30 System, method and computer-readable medium for rendering a streaming

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/052777 Continuation-In-Part WO2023055364A1 (en) 2021-09-30 2021-09-30 System, method and computer-readable medium for determining a cache ttl

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/880,707 Continuation-In-Part US11870828B2 (en) 2021-09-30 2022-08-04 System, method and computer-readable medium for rendering a streaming

Publications (1)

Publication Number Publication Date
WO2023055363A1 true WO2023055363A1 (en) 2023-04-06

Family

ID=85783377

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/052775 WO2023055363A1 (en) 2021-09-30 2021-09-30 System, method and computer-readable medium for rendering a streaming

Country Status (2)

Country Link
JP (1) JP7406713B2 (en)
WO (1) WO2023055363A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110307900A1 (en) * 2010-06-14 2011-12-15 Microsoft Corporation Changing streaming media quality level based on current device resource usage
US20170060538A1 (en) * 2015-08-28 2017-03-02 International Business Machines Corporation Fusion recommendation for performance management in streams
US20190373036A1 (en) * 2018-05-31 2019-12-05 Microsoft Technology Licensing, Llc Modifying content streaming based on device parameters

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3517631B2 (en) * 2000-05-08 2004-04-12 株式会社リコー Digest video storage method and digest video storage device
JP2003333569A (en) * 2002-05-13 2003-11-21 Sony Corp File format, information processing system, information processing apparatus and method, recording medium, and program
US9300647B2 (en) * 2014-01-15 2016-03-29 Sonos, Inc. Software application and zones
JP2016036103A (en) * 2014-08-04 2016-03-17 富士通株式会社 Image distribution server and image distribution method
US10315108B2 (en) * 2015-08-19 2019-06-11 Sony Interactive Entertainment America Llc Local application quick start with cloud transitioning


Also Published As

Publication number Publication date
JP2023547287A (en) 2023-11-10
JP7406713B2 (en) 2023-12-28

Similar Documents

Publication Publication Date Title
CN111587137B (en) Detecting and compensating for display lag in gaming systems
US10904639B1 (en) Server-side fragment insertion and delivery
US10986414B1 (en) Resource management for video playback and chat
US11870828B2 (en) System, method and computer-readable medium for rendering a streaming
US11463784B2 (en) Coordination of media content delivery to multiple media players
US9060207B2 (en) Adaptive video streaming over a content delivery network
US20200099732A1 (en) Catching up to the live playhead in live streaming
US20120102184A1 (en) Apparatus and method for adaptive streaming of content with user-initiated quality adjustments
US10476922B2 (en) Multi-deterministic dynamic linear content streaming
US20210368223A1 (en) Method and apparatus for adjusting timestamp of live streaming video
WO2012055022A1 (en) Delivery quality of experience (qoe) in a computer network
CN113457123B (en) Interaction method and device based on cloud game, electronic equipment and readable storage medium
CA2888218A1 (en) Playback stall avoidance in adaptive media streaming
CN111083514B (en) Live broadcast method and device, electronic equipment and storage medium
WO2015031258A1 (en) Http streaming client adaptation algorithm based on proportional-integral control
US11910071B2 (en) Presenting media items on a playing device
US20130046856A1 (en) Event-triggered streaming of windowed video content
KR20150027262A (en) Provision of a personalized media content
WO2023055363A1 (en) System, method and computer-readable medium for rendering a streaming
TWI798849B (en) System, method and computer-readable medium for rendering a streaming
CN108322787A (en) Video stream distributing method, device and electronic equipment
Staelens et al. On the impact of video stalling and video quality in the case of camera switching during adaptive streaming of sports content
US20150026711A1 (en) Method and apparatus for video content distribution
JP2012508918A (en) Methods, systems, and devices for aggregating multimedia assets and later provisioning to client devices
Jackson et al. A user study of Netflix streaming

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2022516347

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21959619

Country of ref document: EP

Kind code of ref document: A1