US20160259453A1 - Dynamic adjustment of cloud game data streams to output device and network quality - Google Patents


Info

Publication number
US20160259453A1
Authority
US
United States
Prior art keywords
data; output; readable medium; computer readable; transitory computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US14/641,121
Inventor
Roelof Roderick Colenbrander
David Perry
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment LLC
Original Assignee
Sony Interactive Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment America LLC filed Critical Sony Interactive Entertainment America LLC
Priority to US14/641,121
Assigned to SONY COMPUTER ENTERTAINMENT AMERICA LLC reassignment SONY COMPUTER ENTERTAINMENT AMERICA LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PERRY, DAVID, COLENBRANDER, Roelof Roderick
Assigned to SONY INTERACTIVE ENTERTAINMENT AMERICA LLC reassignment SONY INTERACTIVE ENTERTAINMENT AMERICA LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: SONY COMPUTER ENTERTAINMENT AMERICA LLC
Publication of US20160259453A1
Assigned to Sony Interactive Entertainment LLC reassignment Sony Interactive Entertainment LLC MERGER AND CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: SONY INTERACTIVE ENTERTAINMENT AMERICA LLC, Sony Interactive Entertainment LLC

Classifications

    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • H04N21/26208: Content or additional data distribution scheduling, the scheduling operation being performed under constraints
    • A63F13/23: Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/30: Interconnection arrangements between game servers and game devices; interconnection arrangements between game devices; interconnection arrangements between game servers
    • A63F13/323: Interconnection arrangements using local area network [LAN] connections between game devices with different hardware characteristics, e.g. hand-held game devices connectable to game consoles or arcade machines
    • A63F13/335: Interconnection arrangements using wide area network [WAN] connections using Internet
    • A63F13/355: Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
    • A63F13/358: Adapting the game course according to the network or server load, e.g. for reducing latency due to different connection speeds between clients
    • A63F13/5375: Controlling the output signals using indicators for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
    • A63F13/77: Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
    • H04N21/2343: Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/266: Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/4312: Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4781: End-user applications: games

Abstract

Aspects of the present disclosure relate to systems and methods for the dynamic adjustment of data streamed over a network and then displayed on an output device. In particular, aspects of the present disclosure relate to systems and methods for determining the display capabilities of an output device and then formatting two or more streams of data configured for display on that device, wherein the user interface and video data are contained in separate streams, such that the content is displayed in a manner consistent with the display capabilities of the output device. The output can then be adjusted or prioritized based on device orientation and network quality.

Description

    FIELD
  • The present disclosure relates to the dynamic adjustment of data streamed over a network. In particular, aspects of the present disclosure relate to systems and methods for determining the display capabilities of an output device and then formatting two or more streams of data configured for display on the output device.
  • BACKGROUND
  • With the increasing prevalence of digital streaming services and various cloud-based computing solutions, the ability to quickly and accurately transfer large amounts of data between remote devices is critical. Currently, data streaming services do not take into account the specifications of the device on which the data is meant to be displayed. As a result, scaling and pixel variation between devices can create undesirable display scenarios: for example, text may be unreadable on a smaller screen, or the aspect ratio of a display may not be suitable for a certain piece of media.
  • Additionally, digital streaming services and cloud-based computing solutions may experience limitations in the quality and bandwidth of networks established or used during the transfer of data between remote devices when utilizing applications that are sensitive to latencies, such as video games. These limitations may lead to delays in the data transmission and can thus cause latency, which typically creates inconsistencies during the use of an application. While client devices will attempt to achieve the lowest latency through a variety of methods, inevitably, each client device will experience a different amount of latency due to differences in factors such as the decode speed of transmitted data, render rates, input polling, or even the client's network connection. In some forms of media, the latent display of specific media such as text or user interface (UI) data may be more detrimental to the enjoyment of the user, causing user frustration and perhaps even abandonment of a title or the platform altogether.
  • Furthermore, on the wide variety of media playing devices available to consumers, the orientation of the display may rapidly change depending on the needs of the user. Certain device orientations may make viewing a media title that would otherwise be normally viewable in an alternative orientation nearly impossible. Accordingly, there is a need in the art for alternative means of adjusting and displaying media streamed over a network that take into account network latency, display orientation, and the specific display capabilities of a wide variety of devices.
  • SUMMARY
  • In accordance with certain aspects of the present disclosure, a non-transitory computer readable medium may contain computer readable instructions embodied therein. The instructions may be configured to implement a method when executed. The method may include determining the display capabilities of an output device. Two or more data streams configured for display on the output device may be established and formatted. The data streams may include separate streams for video data and user interface (UI) data. The data streams may then be delivered to an output device.
  • In accordance with certain implementations of the present disclosure, a method for streaming data to an output device may include determining the display capabilities of the output device. Two or more data streams configured for display on the output device may be established and formatted. The data streams may contain separate information for video data and user interface (UI) data. The data streams may then be delivered to the output device.
  • In accordance with certain implementations of the present disclosure, a computing system may include at least one processor unit, and at least one memory unit coupled to the at least one processor unit. The at least one processor unit and the at least one memory unit may be configured to perform a method. The method may include determining the display capabilities of an output device. Two or more data streams configured for display on the output device may be established and formatted. The data streams may contain separate information for video data and user interface (UI) data. The data streams may then be delivered to the output device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The teachings of the present disclosure can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
  • FIG. 1 is an example of a screen shot of a portion of a media title displayed on an output device derived from data streamed over a network, and is intended to illustrate various components of displaying a title.
  • FIG. 2 is a flow diagram of an example system in accordance with certain aspects of the present disclosure.
  • FIG. 3 is a block diagram of an example system in accordance with certain aspects of the present disclosure.
  • FIG. 4 is a flow diagram of an example asset management technique from the server side in accordance with certain aspects of the present disclosure.
  • FIG. 5 is a flow diagram of an example asset management technique from the output device side in accordance with certain aspects of the present disclosure.
  • DETAILED DESCRIPTION
  • Although the following detailed description contains many specific details for the purposes of illustration, anyone of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the invention. Accordingly, the illustrative implementations of the present disclosure described below are set forth without any loss of generality to, and without imposing limitations upon, the claimed invention.
  • Introduction
  • Aspects of the present disclosure relate to systems and methods for the dynamic adjustment of UI and video streaming data in response to varying device types, orientations, and network quality.
  • In accordance with certain aspects, a client device configured to operate on a network may provide a user with a list of one or more digital assets that can be borrowed from a providing user. The user may then be able to request the use of an asset that can be borrowed from a providing user. The user may then receive certain rights, such as access to an application or application features, from the providing user. Alternative embodiments provide a method in which a providing user may grant asset rights to another user without first receiving a request.
  • Implementation Details
  • In FIG. 1, an example is provided of a screen shot of a media title displayed on an output device derived from data streamed over a network 10. This example is intended to illustrate the various components of a streamed title. For example, on the output device 12, various portions of the display may vary in quality and clarity depending on the orientation of the device or the quality of the network connection. For example, video data such as 14 may be clear when displayed independent of the device upon which it is displayed. However, UI data such as text 16 and map data 18 may be difficult to display properly on a smaller screen, or may be rendered unreadable or unusable when formatted to fit a smaller display.
  • Turning now to FIG. 2, an illustrative example is provided of how a media title may be displayed on various devices from data streamed over a network 100. In particular, FIG. 2 depicts an example process flow for providing varying output devices with several data streams in accordance with certain implementations of the present disclosure. It is noted that the example method of providing an application asset in FIG. 2 may have one or more aspects in common with the methods indicated in FIGS. 4 and 5. It is also important to note that the example of FIG. 2 is only a simplified example, for purposes of illustrating certain aspects of how data streams may be provided in accordance with the present disclosure.
  • At the outset, it is important to note that in this example, a data stream is provided to an output device that is requesting a stream from a server containing media information, but data may be requested from and delivered by alternative sources. In this example, as shown in FIG. 2, a user utilizing an output device 103 requests the stream of a media title from a server 102. The server receives this request 182 and determines the capabilities 186 of the output device 103 from a list of compatible devices. Relevant capabilities may include parameters such as screen size in pixels, physical screen dimensions, screen resolution, color format (e.g., 8-bit, 15-bit, 16-bit, 24-bit, 30-bit, 36-bit, or 48-bit color), and the like.
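The capability lookup described above can be sketched as a simple table keyed by device identifier. This is a minimal illustration, not the patent's implementation; the device names, field names, and values are all hypothetical.

```python
# Sketch of a server-side capability lookup against a list of compatible
# devices. Device identifiers and capability fields are illustrative.

KNOWN_DEVICES = {
    "tablet-a": {"width": 2048, "height": 1536, "color_depth": 24, "max_fps": 60},
    "phone-b":  {"width": 1334, "height": 750,  "color_depth": 24, "max_fps": 60},
    "tv-c":     {"width": 3840, "height": 2160, "color_depth": 30, "max_fps": 60},
}

def lookup_capabilities(device_id):
    """Return the stored display capabilities for a device, or None if
    unknown (in which case the server would query the device directly)."""
    return KNOWN_DEVICES.get(device_id)

caps = lookup_capabilities("tablet-a")
```

An unknown device returning `None` corresponds to the alternative flow described below, in which the server requests the information from the output device itself.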
  • Additional examples of parameters which may be of use include, but are not limited to, the following: pixel aspect ratio (a pixel is typically assumed to be square, with the same width and height, but this is not necessarily true, especially for old SD television signals); color space information (RGB, YUV, etc.); gamma correction information; screen orientation; display backlight level (‘brightness level’); and refresh rate of the display. With respect to refresh rate, it may be useful to know the maximum refresh rate (e.g., 60 Hz), but the device may also support lower or even variable refresh rates.
  • Some of these parameters could change over time. For example, for handheld devices such as tablets and smartphones, orientation may change. It is also possible for the display capabilities to change when streaming to one device that is locally connected to a different display device. For example, a user who is streaming video to a device such as a tablet or phone may suddenly connect the device to a 4K display. This can change some of the capabilities.
  • There may also be some other relevant graphics processing unit (GPU) parameters. For example, some GPUs (especially on embedded platforms) have support for video overlays. Specifically, a frame may have multiple drawing layers, which are combined (‘composited’) before the frame is output to the display. This can allow for independent rendering of video and UI. For example, the video stream could be rendered on a different layer than the UI. The UI and video streams can have different update frequencies, and neither has to ‘wait’ for the other. For example, digital televisions use overlays to layer a system menu on top of the ‘TV image’.
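The layered compositing idea above can be illustrated with a minimal sketch: each layer updates at its own rate, and layers are combined only when a frame is output. The class and variable names are illustrative, not taken from any GPU API.

```python
# Minimal compositing sketch: the video and UI layers update independently
# and are combined back-to-front only at output time.

class Layer:
    def __init__(self, name):
        self.name = name
        self.content = None
        self.version = 0

    def update(self, content):
        """Update this layer without touching (or waiting for) other layers."""
        self.content = content
        self.version += 1

def composite(layers):
    """Combine layers back-to-front into one output frame description."""
    return [(layer.name, layer.content) for layer in layers
            if layer.content is not None]

video, ui = Layer("video"), Layer("ui")
video.update("frame-1")   # video advances at its own rate...
video.update("frame-2")
ui.update("hud-1")        # ...while the UI layer updated only once
frame = composite([video, ui])
```

The point of the sketch is that `video.version` and `ui.version` diverge freely: the video layer advanced twice while the UI layer advanced once, yet both appear in the composited frame.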
  • In alternative embodiments, the server may request the device information 183 from the output device when the display capabilities are unknown, and the output device may in turn deliver that information 184 to the server. For example, the output device may include one or more sensors, e.g., an inertial sensor and/or user-facing camera, that allow the output device to sense the orientation of a display screen relative to a user. The display screen may be rectangular but have a non-square aspect ratio (ratio of width to height). The server 102 may request that the output device indicate whether it is in a horizontal or vertical orientation relative to the user so that the data to be streamed may be best formatted for presentation at that orientation.
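The request/response exchange described above, in which the server asks the output device for its capabilities and current orientation, might look like the following. The message format is entirely hypothetical; the patent does not specify a wire format.

```python
# Hypothetical capability-exchange messages between server and output
# device, including the sensed screen orientation. JSON is an assumption.
import json

def build_capability_request():
    """Server-side: ask the device for fields the server cannot look up."""
    return json.dumps({"type": "capability_request",
                       "fields": ["resolution", "orientation", "color_format"]})

def build_capability_response(width, height, landscape):
    """Device-side: report resolution and orientation (e.g., from an
    inertial sensor) back to the server."""
    return json.dumps({"type": "capability_response",
                       "resolution": [width, height],
                       "orientation": "landscape" if landscape else "portrait"})

reply = json.loads(build_capability_response(1920, 1080, landscape=True))
```

The server would then format the streams for the reported orientation, and the device could send a fresh `capability_response` whenever the orientation changes.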
  • Information regarding device capabilities (e.g., whether the device supports 480p, 720p . . . ) gives the server information on what is technically possible. It would also be useful for the server to know certain current settings of the device, such as orientation and output resolution. Other information, such as color depth, may be less important, e.g., because it can be adjusted at the client device, or because the server would dictate this information for compression efficiency reasons or to reduce required bandwidth.
  • The server 102 may also request information about the quality of the network 100 between the server and the output device 103. Such information may include parameters such as network latency, available network bandwidth, packet loss, or network protocol. This information can be used in formatting the data streams 188. For example, some network protocols, e.g., the User Datagram Protocol (UDP), do not guarantee delivery. If the server 102 knows that the network 100 uses UDP, the streams 188 can be configured to include Forward Error Correction (FEC) packets that the output device 103 can use to recover lost data packets.
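The FEC idea can be illustrated with the simplest possible scheme: one XOR parity packet per group of equal-length packets, which lets the receiver reconstruct any single lost packet without retransmission. This is a minimal sketch for illustration; production streaming systems typically use stronger codes such as Reed-Solomon.

```python
# Illustrative XOR-parity FEC over a group of equal-length packets.

def make_parity(packets):
    """Server-side: XOR all packets in the group into one parity packet."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            parity[i] ^= b
    return bytes(parity)

def recover(received, parity):
    """Receiver-side: XOR the surviving packets with the parity packet
    to reconstruct the single missing packet."""
    missing = bytearray(parity)
    for pkt in received:
        for i, b in enumerate(pkt):
            missing[i] ^= b
    return bytes(missing)

group = [b"abcd", b"efgh", b"ijkl"]
parity = make_parity(group)
# Suppose the second packet is lost in transit; it can be rebuilt:
rebuilt = recover([group[0], group[2]], parity)
```

The trade-off is the one described in the surrounding text: the parity packets consume extra bandwidth, so the server would only add them when it knows the transport (e.g., UDP) does not guarantee delivery.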
  • Streaming technology often uses FEC techniques. Before starting streaming to a client device, a server sets a bandwidth budget, e.g., 8 Mbps for the stream. To set the bandwidth budget, it is useful for the server to pick some client device settings, such as video resolution, frame rate, and compression settings.
  • During streaming, the server can monitor the network and, based on parameters such as bandwidth and packet loss, adjust the video resolution, frame rate, and compression settings accordingly. Similar settings can be adjusted for audio streaming, which may become more important. Knowing the audio settings allows the server to optimize the audio stream for, e.g., a six-channel surround sound setup (e.g., 5.1 surround sound) or a stereo setup.
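The monitoring-and-adjustment loop above amounts to stepping up or down a quality ladder. The following sketch shows one plausible decision step; the ladder entries, loss thresholds, and the 8 Mbps default budget are illustrative assumptions, not values from the disclosure.

```python
# Sketch of an adaptive-quality decision step: degrade one rung on packet
# loss, recover one rung when measured bandwidth has headroom.
# Ladder values and thresholds are illustrative.

LADDER = [(1080, 60), (720, 60), (720, 30), (540, 30)]  # (lines, fps)

def adjust(level, loss_pct, measured_mbps, budget_mbps=8.0):
    """Return the new ladder index given current network measurements."""
    if loss_pct > 2.0 and level < len(LADDER) - 1:
        return level + 1          # significant loss: step quality down
    if loss_pct < 0.5 and measured_mbps < budget_mbps * 0.7 and level > 0:
        return level - 1          # clean link with headroom: step back up
    return level                  # otherwise hold steady

level = 1                                                # start at 720p60
level = adjust(level, loss_pct=5.0, measured_mbps=7.5)   # loss -> 720p30
```

Calling `adjust` once per monitoring interval, and applying the chosen `LADDER[level]` to the encoder, gives the behavior the text describes: resolution, frame rate, and compression track network conditions without exceeding the bandwidth budget.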
  • Some streaming technology reduces frame rate or resolution somewhat crudely. For example, some video game streaming systems capture video from HDMI, so if a game is generating 60 fps and the streaming rate needs to be reduced to 30 fps, the streaming server simply throws frames away. In addition, if resolution needs to be reduced, the server can just scale the video frame received from the game. By way of example, the game may be generating video at 720p, and if the server 102 needs to stream at 540p, the frame is just scaled in software.
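The two crude reductions just described, frame dropping and software scaling, reduce to a few lines. This sketch only computes which frames survive and what the scaled dimensions are; the actual pixel resampling is left to whatever scaler the server uses.

```python
# Crude server-side reduction as described above: drop frames to halve
# the rate, and scale remaining frames (720 -> 540 is a 0.75 factor).

def decimate(frames, keep_every=2):
    """Keep every Nth frame, e.g. 60 fps -> 30 fps with keep_every=2."""
    return frames[::keep_every]

def scale_resolution(width, height, target_height):
    """Compute scaled dimensions preserving the aspect ratio."""
    factor = target_height / height
    return round(width * factor), target_height

frames_60fps = list(range(60))          # one second of captured frames
frames_30fps = decimate(frames_60fps)   # half the frames survive
new_size = scale_resolution(1280, 720, 540)
```

As the following paragraph notes, letting the application itself render at the lower rate or resolution is usually better than this after-the-fact decimation, since the game can adjust its output natively and save power doing so.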
  • Servers for some applications, such as online video games, often run the application software on different hardware than the hardware used to handle streaming of video to the client devices. The streaming server and application hardware may be located in a data center and connected by a local area network. According to aspects of the present disclosure, formatting the video data could be done by the application that generates the video frames used for the video stream. By way of example, and not by way of limitation, the streaming server could notify the hardware running the application that it needs to reduce frame rate or display resolution. The application can then adjust the frame rate or resolution or otherwise adjust the video formatting for the output device 103. The application that generates the video frames can often do a better job of adjusting frame rate, display resolution, or other video formatting parameters. In addition, formatting the video this way could even save a little electrical power (a lower resolution or frame rate is less intensive for a server and hence uses less power).
  • Once the device display capability information is determined, the server may create separate streams for the video and UI information 188. This step is performed so that the UI and video data can be formatted separately on the output device 103, allowing for the separate scaling of UI and video data to ensure that all aspects of the streamed title are clear and usable. These formatted streams are delivered 190 to the output device 103, where they are received and displayed 140 in a manner consistent with the display capabilities of the output device. In alternative embodiments, the output device may prioritize the display of the streams 150. By way of example, and not by way of limitation, if each aspect of the UI data is contained in a separate stream, the output device may prioritize the display of the streams in the order of (text data, video data, map data, etc.) so that certain aspects of the display maintain clarity during times of network latency while others become less defined.
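The prioritization just described can be sketched as a budgeted scheduler: streams are refreshed in priority order until the available bandwidth for the cycle is spent, so the text stream stays sharp while lower-priority streams degrade. The priority ordering, stream names, and costs are illustrative.

```python
# Sketch of stream prioritization under constrained bandwidth: refresh
# streams highest-priority first until the per-cycle budget runs out.

PRIORITY = {"text": 0, "video": 1, "map": 2}   # lower value = higher priority

def schedule(streams, budget):
    """Choose which (name, cost) streams to refresh this cycle."""
    chosen = []
    for name, cost in sorted(streams, key=lambda s: PRIORITY[s[0]]):
        if cost <= budget:
            chosen.append(name)
            budget -= cost
    return chosen

# Under congestion only 5 units are available: text and map still fit,
# while the expensive video stream is skipped this cycle.
picked = schedule([("video", 6), ("text", 1), ("map", 2)], budget=5)
```

With a full budget all three streams refresh every cycle; as latency or loss shrinks the budget, the scheduler naturally sheds the lower-priority streams first, which is exactly the degradation order the text proposes.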
  • As an example, a ‘game’ stream for an online video game may be a compressed video stream. The UI stream for the game may not necessarily be a video stream but, as noted above, can be text, bitmap data, vectored graphics, and other types of data. These types of data may be shared with the output device 103. In some implementations, the output device 103 may already have some of the data locally, e.g., as part of the data 236′ in the memory 232′ or cache 244′. For example, the fonts used to render text may already be available on the client. In some implementations, the output device 103 may be able to configure the UI based on user settings. In such an implementation the user could, e.g., override bitmaps or fonts with whatever data is already stored on the output device 103 to create something like a game ‘mod’, as is often used in massively multiplayer online games (MMOs).
  • The illustration in FIG. 2 also provides an example of a second user requesting to stream a media title from a server, and is provided to show that the data may be streamed to various devices which each have their own respective data stream 140′ such that the data is received and displayed in a manner consistent with the display capabilities of a multitude of output devices.
  • It is emphasized that the example depicted in FIG. 2 is provided for purposes of illustration only, in order to highlight certain aspects of the present disclosure. In practice, implementations of the present disclosure may factor in additional or alternative considerations not depicted by the example of FIG. 2, and may be more complex than the simplified scheme depicted in FIG. 2.
  • Certain implementations of aspects of the present disclosure include systems configured for the dynamic adjustment of streamed content. By way of example, and not by way of limitation, FIG. 3 depicts a distributed computing system that includes three devices 102, 103, and 104, which are configured to transfer data over a network in accordance with certain aspects of the present disclosure. In certain implementations, the device 102 may be configured to execute instructions that have one or more aspects in common with those described with respect to FIG. 2 and/or FIG. 4. In certain implementations, the devices 103 and 104 may be configured to execute instructions that have one or more aspects in common with one or more of those described above with respect to FIG. 2 or below with respect to FIG. 5. Each of the devices 102, 103, and 104 may be configured with suitable software and/or hardware to implement various aspects of the methods described herein. Each of the devices 102, 103, and 104 may be a server, an embedded system, mobile phone, personal computer, laptop computer, tablet computer, portable game device, workstation, game console, wearable device such as a smart watch, “dongle” device, and the like. As used herein, the term “dongle device” refers to a device that plugs into other equipment, such as a computer or television, to add functionality to the equipment. Such functionality may include copy protection, audio, video, games, data, or other services that are available only when the dongle is attached. Examples of dongle devices include, but are not limited to, internet content streaming devices and infrared remote control adapters available for “smart” mobile phones. The output devices 103, 104 have certain components in common with the server 102. In FIG. 3, components of the server 102 are denoted by reference numerals without primes, corresponding components of output device 103 are denoted by the same reference numerals with primes (′), and corresponding components of output device 104 are denoted by the same reference numerals with double primes (″).
  • In accordance with certain implementations, the device 102 may be a server configured to format and provide media streaming data, and each of the devices 103 and 104 may be an output device utilized by a user who wishes to display a media title. The server 102 may be configured to create and format media streams 188 and deliver the data 190 to an output device 103 or 104 over a network 260, e.g., using an internet connection or a local area network connection. It is noted that the network 260 need not be an internet connection. In some implementations streaming may take place from one device at a given location to another device in that location via a local area network.
  • Furthermore, even if the network 260 is a wide area network, it may be implemented by technology other than the internet, such as a cable network.
  • The data streams 188 may be separately formatted for respective devices 103 or 104 even if the media title requested is the same, as the device capabilities 186 of the respective output devices may vary. By way of example, and not by way of limitation, the data streams 188 may include streams for video data, UI data, text, inventory data, map data, or audio data. In some implementations certain types of data streams may be compressed to reduce the number of bits of data that need to be transmitted in the stream. In particular, video data and audio data are commonly compressed prior to being transmitted.
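The per-device formatting and selective compression described above can be sketched as follows. This is an illustration only: `zlib` stands in for a real video/audio codec, and the stream types, capability fields, and dictionary layout are hypothetical:

```python
import zlib

# Stream types that are commonly compressed before transmission.
COMPRESSED_TYPES = {"video", "audio"}

def format_streams(media, capabilities):
    """Produce per-device streams for the same media title.

    `media` maps a stream type to raw bytes. Video and audio payloads
    are compressed (zlib here as a stand-in for a real codec) to reduce
    the number of bits transmitted, while text or UI data may be sent
    uncompressed; each stream is tagged with the target device's
    resolution so it can be displayed consistently with that device's
    capabilities.
    """
    streams = {}
    for kind, payload in media.items():
        compressed = kind in COMPRESSED_TYPES
        streams[kind] = {
            "data": zlib.compress(payload) if compressed else payload,
            "resolution": capabilities.get("resolution"),
            "compressed": compressed,
        }
    return streams
```

The same `media` input would yield differently tagged streams for a phone reporting `480x320` and a television reporting `1920x1080`, since formatting depends on the reported capabilities.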
  • Each of the devices 102, 103, and 104 may include one or more processor units 231, 231′, 231″ which may be configured according to well-known architectures, such as, e.g., single-core, dual-core, quad-core, multi-core, processor-coprocessor, cell processor, and the like. Each of the devices 102, 103, and 104 may also include one or more memory units 232, 232′, 232″ (e.g., RAM, DRAM, ROM, and the like). The processor unit 231, 231′, 231″ may execute one or more programs 233, 233′, 233″ which may be stored in the memory 232, 232′, 232″, and the processor 231, 231′, 231″ may be operatively coupled to the memory 232, 232′, 232″, e.g., by accessing the memory via a data bus 250, 250′, 250″. The memory unit 232, 232′, 232″ may include data 236, 236′, 236″ and the processor unit 231, 231′, 231″ may utilize the data 236, 236′, 236″ in implementing the program 233, 233′, 233″. The data 236, 236′, 236″ for each of the devices 102, 103, and 104 may include, e.g., a request for streaming data 120 transmitted from an output device 103 or 104 to the server 102, and a specially formatted set of data streams 188 for delivery 140 from the server 102 to an output device 103 or 104 according to various aspects of the present disclosure. The program 233, 233′, 233″ may optionally include instructions that, when executed by a processor, perform one or more operations associated with requesting streaming data for a media title 120, determining the capabilities of the output device from which the request was received 186, creating and formatting two or more video streams which respectively contain video and UI data 188 for display on the output device 103 or 104, or receiving and displaying data streams formatted 140 to best utilize the display capabilities of the output device 103 or 104; e.g., a method having one or more features in common with the methods of FIGS. 2, 4, and/or 5.
For example, the program 233, 233′, 233″ of the server 102 may include instructions that, when executed by the processor 231, 231′, 231″, cause the server to format and deliver streaming data to the at least one recipient device 103 or 104, in accordance with aspects of the server side of the method depicted in FIG. 2 and/or the sending of streaming data. In alternative embodiments, the program 233, 233′, 233″ of the server 102 may include instructions that, when executed by the processor 231, 231′, 231″, cause the server to request display capability information from the at least one recipient device 103 or 104, in accordance with aspects of the server side of the method depicted in FIG. 2 and/or the sending of streaming data. The program 233, 233′, 233″ of the output device 103 or 104 may include instructions that, when executed by the processor 231, 231′, 231″, cause the output device to request streaming data 120 that can then be provided by the server 102 in accordance with aspects of the output device side of the method depicted in FIG. 2 and/or the sending of streaming data. In alternative embodiments, the program 233, 233′, 233″ of the output device 103 or 104 may include instructions that, when executed by the processor 231, 231′, 231″, cause the output device to deliver display capability information 184 to the server 102 in accordance with aspects of the output device side of the method depicted in FIG. 2 and/or the sending of streaming data.
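The server-side and output-device-side roles described above can be sketched as a minimal exchange. This is an illustration only, not the patented programs; the class names, method names, and string-valued "streams" are hypothetical stand-ins:

```python
class OutputDevice:
    """Stand-in for an output device 103 or 104: it reports display
    capability information and receives formatted streams."""

    def __init__(self, capabilities):
        self.capabilities = capabilities
        self.received = []

    def report_capabilities(self):
        # Deliver display capability information (184) to the server.
        return dict(self.capabilities)

    def receive(self, streams):
        # Receive the formatted data streams for display (140).
        self.received.append(streams)


class StreamingServer:
    """Stand-in for the server 102: it determines the device's
    capabilities (186), formats separate video and UI streams (188),
    and delivers them (190)."""

    def handle_request(self, device, title):
        caps = device.report_capabilities()
        streams = {
            "video": f"{title}-video@{caps['resolution']}",
            "ui": f"{title}-ui@{caps['resolution']}",
        }
        device.receive(streams)
        return streams
```

A television reporting `1920x1080` and a phone reporting `480x320` would thus receive differently formatted streams for the same requested title.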
  • Each of the devices 102, 103, and 104 may also include well-known support circuits 240, 240′, 240″, such as input/output (I/O) circuits 241, 241′, 241″ (which in the case of output devices 103, 104 may be coupled to a controller 245′, 245″), power supplies (P/S) 242, 242′, 242″, a clock (CLK) 243, 243′, 243″, and cache 244, 244′, 244″, which may communicate with other components of the system, e.g., via the bus 250, 250′, 250″. Each of the devices 102, 103, and 104 may optionally include a mass storage device 234, 234′, 234″ such as a disk drive, CD-ROM drive, tape drive, flash memory, or the like, and the mass storage device 234, 234′, 234″ may store programs and/or data. Each of the devices 102, 103, and 104 may also optionally include a display unit 237, 237′, 237″. The display unit 237, 237′, 237″ may be in the form of a cathode ray tube (CRT), flat panel screen, touch screen, or other device that displays text, numerals, graphical symbols, or other visual objects. Each of the devices 102, 103, and 104 may also include a user interface 206, 206′, 206″ to facilitate interaction between the device 102, 103, or 104 and a user. The user interface 206, 206′, 206″ may include a keyboard, mouse, light pen, game control pad, touch interface, or other device. The user interface may also include an audio I/O device, such as a speaker and/or microphone.
  • A user may interact with any of the computer systems through the user interface 206, 206′, 206″. By way of example, the server 102 may be a cloud gaming server, and the output device 103 or 104 may be a cloud gaming client, and a video game user may interact with a video game executed by the server 102 and streamed to the output device 104 through the user interface 206, 206′, 206″. Portions of the user interface 206, 206′, 206″ may include a graphical user interface (GUI) that can be displayed on the display unit 237, 237′, 237″ in order to facilitate user interaction with the system 102, 103, or 104. The system 102, 103, or 104 may include a network interface 239, 239′, 239″, configured to enable the use of Wi-Fi, an Ethernet port, or other communication methods. The network interface 239, 239′, 239″ may incorporate suitable hardware, software, firmware or some combination thereof to facilitate communication via a telecommunications network 260, and may support data transport using an unreliable protocol in accordance with certain aspects of the present disclosure. The network interface 239, 239′, 239″ may be configured to implement wired or wireless communication over local area networks and wide area networks such as the Internet.
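Transport over an unreliable protocol, as mentioned above, can be sketched with UDP datagrams. This is an illustration only: the header layout (a 1-byte stream identifier plus a 4-byte sequence number) is a hypothetical scheme, not one specified by the disclosure:

```python
import socket

def send_packet(sock, addr, stream_id, seq, payload):
    """Send one datagram tagged with a 1-byte stream id and a 4-byte
    sequence number, so the receiver can demultiplex the separate data
    streams and detect lost or reordered datagrams."""
    header = stream_id.to_bytes(1, "big") + seq.to_bytes(4, "big")
    sock.sendto(header + payload, addr)

def parse_packet(datagram):
    """Split a received datagram back into (stream_id, seq, payload)."""
    return datagram[0], int.from_bytes(datagram[1:5], "big"), datagram[5:]

# Loopback demonstration: one UI-stream datagram sent over UDP.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_packet(tx, rx.getsockname(), 2, 7, b"ui-bytes")
print(parse_packet(rx.recv(2048)))
tx.close()
rx.close()
```

An unreliable transport suits latency-sensitive video, since a lost frame is better skipped than retransmitted late.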
  • As shown in FIG. 4, a set of server side instructions 370 may be implemented, e.g., by the server 102. The server instructions 370 may be formed on a nontransitory computer readable medium such as the memory 232, 232′, 232″ or the mass storage device 234, 234′, 234″. The server side instructions 370 may also be part of the process control program 233, 233′, 233″. As indicated at 372, the server instructions 370 may include instructions for requesting output device information, optionally including output device display capabilities 183, to be received from one or more client devices 103 or 104 over a network 260. The instructions 370 may include instructions 374 for formatting two or more streams of content data 188 based on the capabilities of the output device. Thereafter, at 376 the instructions may include instructions for delivering 190 the data streams to the output device 103 or 104 and subsequently, at 378, may include instructions for delivering data stream formatting or priority information to the respective output device.
  • As shown in FIG. 5, a set of output device instructions 480 may be implemented, e.g., by the output device 103 or 104. The output device instructions 480 may be formed on a nontransitory computer readable medium such as the memory 232, 232′, 232″ or the mass storage device 234, 234′, 234″. The output device instructions 480 may also be part of the process control program 233, 233′, 233″. As indicated at 482, the instructions 480 may include instructions for delivering output device information, optionally including output device display capabilities 183, to be delivered by one or more client devices 103 or 104 over a network 260. The instructions 480 may include instructions 484 for receiving 140 two or more streams of content data 188 based on the capabilities of the output device. Thereafter, at 486 the instructions may include instructions for prioritizing 150 the display of the data streams on the output device 103 or 104 and subsequently, at 488, may include instructions for formatting text or other user interface (UI) data such that the streamed content is displayed in a manner consistent with the display capabilities of the output device 103 or 104.
  • The above components may be implemented in hardware, software, firmware, or some combination thereof.
  • Although the examples described above assume a game stream and a UI stream, which are combined on the client side, additional streams, for example a video chat stream, may be included. Such additional streams may come peer-to-peer from other users' client devices and could be combined by the client devices. In some implementations, the program 233 on the server 102 may be aware that there is a video chat stream between output devices 103 and 104 and could reserve resources for handling this stream. For latency reasons the server 102 could coordinate a peer-to-peer (p2p) session to let the client devices 103, 104 combine the video chat stream with the other streams. There may also be types of local content which could be composited by the client devices.
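The client-side combination of streams described above can be sketched as layer compositing. This is an illustration only; representing layers as region-to-content mappings is a hypothetical simplification:

```python
def composite_layers(game_frame, ui_layer, extra_layers=()):
    """Combine the game video frame with the UI layer and any extra
    streams (e.g. a peer-to-peer video chat window) on the client.

    Each layer maps a screen region to content; later layers draw on
    top of earlier ones, so UI and chat overlay the game video.
    """
    canvas = dict(game_frame)
    canvas.update(ui_layer)
    for layer in extra_layers:
        canvas.update(layer)
    return canvas

# Example: a chat window received peer-to-peer overlays the game video.
game = {"full": "gameplay-frame"}
ui = {"top-left": "health-bar"}
chat = {"bottom-right": "peer-video"}
print(composite_layers(game, ui, [chat]))
```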
  • Aspects of the present disclosure allow data of different formats that are displayed together to be formatted and streamed so that the combined content is optimally displayed on different devices.
  • While the above is a complete description of the preferred embodiment of the present invention, it is possible to use various alternatives, modifications and equivalents. Therefore, the scope of the present invention should be determined not with reference to the above description but should, instead, be determined with reference to the appended claims, along with their full scope of equivalents. Any feature described herein, whether preferred or not, may be combined with any other feature described herein, whether preferred or not. In the claims that follow, the indefinite article “a”, or “an” refers to a quantity of one or more of the item following the article, except where expressly stated otherwise. The appended claims are not to be interpreted as including means-plus-function limitations, unless such a limitation is explicitly recited in a given claim using the phrase “means for.”

Claims (31)

What is claimed is:
1. A non-transitory computer readable medium having processor-executable instructions embodied therein, wherein execution of the instructions by a processor causes the processor to implement a method, the method comprising:
a) determining the display capabilities of an output device;
b) formatting two or more data streams configured for display on the output device such that the content is displayed in a manner consistent with the display capabilities of the output device, wherein the two or more data streams include a data stream for user interface data and a separate data stream for video data; and
c) delivering the two or more streams of content data to the output device.
2. The non-transitory computer readable medium of claim 1, wherein the two or more data streams include a data stream containing compressed data.
3. The non-transitory computer readable medium of claim 1, wherein the two or more streams of content include a data stream containing data to be displayed as text.
4. The non-transitory computer readable medium of claim 1, wherein the two or more streams of content include a data stream containing data to be displayed as a heads-up-display (HUD).
5. The non-transitory computer readable medium of claim 1, wherein the two or more data streams include a data stream containing data to be displayed as inventory information.
6. The non-transitory computer readable medium of claim 1, wherein the two or more data streams include a data stream containing data to be displayed as map information.
7. The non-transitory computer readable medium of claim 1, wherein the display of one data stream is prioritized over the display of another.
8. The non-transitory computer readable medium of claim 7, wherein the prioritization of display is determined by the quality of the network connection.
9. The non-transitory computer readable medium of claim 1, wherein the display instructions are delivered by a server to a client device platform used in combination with the output device.
10. The non-transitory computer readable medium of claim 9, wherein the client device platform is a gaming console.
11. The non-transitory computer readable medium of claim 9, wherein the client device platform is a computer.
12. The non-transitory computer readable medium of claim 9, wherein the client device platform is a cellular phone.
13. The non-transitory computer readable medium of claim 9, wherein the client device platform is a tablet.
14. The non-transitory computer readable medium of claim 9, wherein the client device platform is a hand-held computing device.
15. The non-transitory computer readable medium of claim 9, wherein the client device platform is a set top box.
16. The non-transitory computer readable medium of claim 9, wherein the client device platform is a telephonic system.
17. The non-transitory computer readable medium of claim 9, wherein the client device platform is a dongle device.
18. The non-transitory computer readable medium of claim 9, wherein the client device is configured to query the output device to determine the display capabilities of the output device and deliver this information to the server.
19. The non-transitory computer readable medium of claim 1, wherein the display capability information also includes a picture orientation of the output device.
20. The non-transitory computer readable medium of claim 1, wherein the content data includes a video portion of a video game.
21. The non-transitory computer readable medium of claim 1, wherein the content data is a stream of a computer program session running on a remote machine.
22. The non-transitory computer readable medium of claim 1, wherein the output device is a television.
23. The non-transitory computer readable medium of claim 1, wherein the output device is a projector.
24. The non-transitory computer readable medium of claim 1, wherein the output device and the client device platform are the same device.
25. The non-transitory computer readable medium of claim 1, wherein b) includes formatting the video data with an application used to generate video frames for the data stream for video data.
26. The non-transitory computer readable medium of claim 1, wherein the two or more streams include a game stream and a user interface stream.
27. The non-transitory computer readable medium of claim 26, wherein the two or more streams include a game stream, a user interface stream and one or more additional streams.
28. The non-transitory computer readable medium of claim 26, wherein the two or more streams include a game stream, a user interface stream and a video chat stream between a first client device and a second client device.
29. The non-transitory computer readable medium of claim 26, wherein the two or more streams include a game stream, a user interface stream and a video chat stream between a first client device and a second client device, wherein the method further comprises coordinating a peer-to-peer session between the first and second client devices to let the first and second client devices combine the video chat stream with the game stream and the user interface stream.
30. On a server configured to operate on a network, a method, comprising:
a) determining the display capabilities of an output device;
b) formatting two or more data streams configured for display on the output device such that the content is displayed in a manner consistent with the display capabilities of the output device, wherein the two or more data streams include a data stream for user interface data and a separate data stream for video data.
31. A system comprising:
a processor, and
a memory coupled to the processor;
wherein the processor is configured to perform a method, the method comprising:
a) determining the display capabilities of an output device;
b) formatting two or more data streams configured for display on the output device such that the content is displayed in a manner consistent with the display capabilities of the output device, wherein the two or more data streams include a data stream for user interface data and a separate data stream for video data; and
c) delivering the two or more streams of content data to the output device.
US14/641,121 2015-03-06 2015-03-06 Dynamic adjustment of cloud game data streams to output device and network quality Pending US20160259453A1 (en)


Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US14/641,121 US20160259453A1 (en) 2015-03-06 2015-03-06 Dynamic adjustment of cloud game data streams to output device and network quality
TW105102720A TWI629086B (en) 2015-03-06 2016-01-28 Dynamic adjustment of cloud game data streams to output device and network quality
CN201680012884.XA CN107409237B (en) 2015-03-06 2016-03-04 Method, medium and system for dynamically adjusting cloud game data stream
CN202010485731.5A CN111711840A (en) 2015-03-06 2016-03-04 Cloud game data flow dynamic adjustment for output device and network quality
PCT/US2016/021053 WO2016144820A1 (en) 2015-03-06 2016-03-04 Dynamic adjustment of cloud game data streams to output device and network quality
JP2017544292A JP6563024B2 (en) 2015-03-06 2016-03-04 Dynamic adjustment of cloud game data stream and network characteristics to output devices
EP16762259.6A EP3266198A4 (en) 2015-03-06 2016-03-04 Dynamic adjustment of cloud game data streams to output device and network quality

Publications (1)

Publication Number Publication Date
US20160259453A1 true US20160259453A1 (en) 2016-09-08

Family

ID=56849851

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/641,121 Pending US20160259453A1 (en) 2015-03-06 2015-03-06 Dynamic adjustment of cloud game data streams to output device and network quality

Country Status (6)

Country Link
US (1) US20160259453A1 (en)
EP (1) EP3266198A4 (en)
JP (1) JP6563024B2 (en)
CN (2) CN111711840A (en)
TW (1) TWI629086B (en)
WO (1) WO2016144820A1 (en)


Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080168493A1 (en) * 2007-01-08 2008-07-10 James Jeffrey Allen Mixing User-Specified Graphics with Video Streams
US7458894B2 (en) * 2004-09-15 2008-12-02 Microsoft Corporation Online gaming spectator system
US20090118017A1 (en) * 2002-12-10 2009-05-07 Onlive, Inc. Hosting and broadcasting virtual events using streaming interactive video
US20090119738A1 (en) * 2002-12-10 2009-05-07 Onlive, Inc. System for recursive recombination of streaming interactive video
US20090119737A1 (en) * 2002-12-10 2009-05-07 Onlive, Inc. System for collaborative conferencing using streaming interactive video
US20100066804A1 (en) * 2008-09-16 2010-03-18 Wham! Inc. Real time video communications system
US20100071010A1 (en) * 2006-09-06 2010-03-18 Amimon Ltd Method, device and system of generating a clock signal corresponding to a wireless video transmission
US20100167816A1 (en) * 2002-12-10 2010-07-01 Perlman Stephen G System and Method for Multi-Stream Video Compression
US20100166065A1 (en) * 2002-12-10 2010-07-01 Perlman Stephen G System and Method for Compressing Video Based on Latency Measurements and Other Feedback
US20120249736A1 (en) * 2010-01-07 2012-10-04 Thomson Licensing A Corporation System and method for providing optimal display of video content
US20130139091A1 (en) * 2003-07-28 2013-05-30 Limelight Networks, Inc. Rich content download
US20130337916A1 (en) * 2012-06-19 2013-12-19 Microsoft Corporation Companion gaming experience supporting near-real-time gameplay data
US20140040959A1 (en) * 2012-08-03 2014-02-06 Ozgur Oyman Device orientation capability exchange signaling and server adaptation of multimedia content in response to device orientation
US20140155154A1 (en) * 2012-11-30 2014-06-05 Applifier Oy System and method for sharing gameplay experiences
US20150150070A1 (en) * 2013-11-26 2015-05-28 At&T Intellectual Property I, Lp Method and apparatus for providing media content
US20160212054A1 (en) * 2015-01-20 2016-07-21 Microsoft Technology Licensing, Llc Multiple Protocol Media Streaming

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8840475B2 (en) 2002-12-10 2014-09-23 Ol2, Inc. Method for user session transitioning among streaming interactive video servers
JP4567646B2 (en) * 2006-09-25 2010-10-20 シャープ株式会社 Video / audio playback portable terminal, video / audio distribution terminal, and system
US7962580B2 (en) * 2007-12-13 2011-06-14 Highwinds Holdings, Inc. Content delivery network
US8613673B2 (en) 2008-12-15 2013-12-24 Sony Computer Entertainment America Llc Intelligent game loading
US20100030808A1 (en) * 2008-07-31 2010-02-04 Nortel Networks Limited Multimedia architecture for audio and visual content
US9582238B2 (en) * 2009-12-14 2017-02-28 Qualcomm Incorporated Decomposed multi-stream (DMS) techniques for video display systems
JP5471667B2 (en) * 2010-03-19 2014-04-16 日本電気株式会社 Client and image display system
JP5520190B2 (en) * 2010-10-20 2014-06-11 株式会社ソニー・コンピュータエンタテインメント Image processing system, image processing method, moving image transmitting apparatus, moving image receiving apparatus, program, and information storage medium
JP5076132B1 (en) * 2011-05-25 2012-11-21 株式会社スクウェア・エニックス・ホールディングス Drawing control apparatus, control method therefor, program, recording medium, drawing server, and drawing system
CN103455505B (en) * 2012-05-31 2017-06-27 华为技术有限公司 A kind of media acquisition methods, apparatus and system
WO2014069771A1 (en) * 2012-10-30 2014-05-08 에스케이플래닛 주식회사 Method for providing cloud streaming-based game, and system and apparatus for same
CA2798066A1 (en) * 2012-12-07 2014-06-07 Kabushiki Kaisha Square Enix Holdings (Also Trading As Square Enix Holdings Co., Ltd.) Method and system of creating and encoding video game screen images for transmission over a network
CA2831587A1 (en) 2013-02-06 2014-08-06 Kabushiki Kaisha Square Enix Holdings (Also Trading As Square Enix Holdings Co., Ltd.) Information processing apparatus, control method, program, and storage medium
CN105359063B (en) * 2013-06-09 2018-08-17 索尼电脑娱乐公司 Utilize the head-mounted display of tracking
US20160127508A1 (en) 2013-06-17 2016-05-05 Square Enix Holdings Co., Ltd. Image processing apparatus, image processing system, image processing method and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Head-up display. dota2.gamepedia.com. Online. 2013-11-22. Accessed via the Internet. Accessed 2017-01-08. <URL: http://wayback.archive.org/web/20131122153039/http://dota2.gamepedia.com/Head-up_display> *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160184712A1 (en) * 2014-12-31 2016-06-30 Sony Computer Entertainment America Llc Game State Save, Transfer and Resume for Cloud Gaming
US9795879B2 (en) * 2014-12-31 2017-10-24 Sony Interactive Entertainment America Llc Game state save, transfer and resume for cloud gaming
US20180043256A1 (en) * 2014-12-31 2018-02-15 Sony Interactive Entertainment America Llc Game State Save, Transfer and Resume for Cloud Gaming
US10512841B2 (en) * 2014-12-31 2019-12-24 Sony Interactive Entertainment America Llc Game state save, transfer and resume for cloud gaming
US20170087475A1 (en) * 2015-09-30 2017-03-30 Sony Interactive Entertainment America Llc Systems and Methods for Providing Time-Shifted Intelligently Synchronized Game Video
US10549203B2 (en) * 2015-09-30 2020-02-04 Sony Interactive Entertainment America Llc Systems and methods for providing time-shifted intelligently synchronized game video
US20180191908A1 (en) * 2016-12-30 2018-07-05 Akamai Technologies, Inc. Collecting and correlating microphone data from multiple co-located clients, and constructing 3D sound profile of a room
US10291783B2 (en) * 2016-12-30 2019-05-14 Akamai Technologies, Inc. Collecting and correlating microphone data from multiple co-located clients, and constructing 3D sound profile of a room
US20180300839A1 (en) * 2017-04-17 2018-10-18 Intel Corporation Power-based and target-based graphics quality adjustment
US10402932B2 (en) * 2017-04-17 2019-09-03 Intel Corporation Power-based and target-based graphics quality adjustment
US10909653B2 (en) 2017-04-17 2021-02-02 Intel Corporation Power-based and target-based graphics quality adjustment

Also Published As

Publication number Publication date
EP3266198A4 (en) 2019-01-16
EP3266198A1 (en) 2018-01-10
CN107409237B (en) 2020-06-26
TW201642942A (en) 2016-12-16
CN111711840A (en) 2020-09-25
JP6563024B2 (en) 2019-08-21
JP2018514013A (en) 2018-05-31
WO2016144820A1 (en) 2016-09-15
TWI629086B (en) 2018-07-11
CN107409237A (en) 2017-11-28


Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY COMPUTER ENTERTAINMENT AMERICA LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:COLENBRANDER, ROELOF RODERICK;PERRY, DAVID;SIGNING DATES FROM 20150306 TO 20150311;REEL/FRAME:035363/0913

AS Assignment

Owner name: SONY INTERACTIVE ENTERTAINMENT AMERICA LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:SONY COMPUTER ENTERTAINMENT AMERICA LLC;REEL/FRAME:038630/0154

Effective date: 20160331

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: SONY INTERACTIVE ENTERTAINMENT LLC, CALIFORNIA

Free format text: MERGER AND CHANGE OF NAME;ASSIGNORS:SONY INTERACTIVE ENTERTAINMENT AMERICA LLC;SONY INTERACTIVE ENTERTAINMENT LLC;REEL/FRAME:049537/0497

Effective date: 20180323

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STCB Information on status: application discontinuation

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED