EP3528450B1 - Decoding complexity for mobile multimedia streaming - Google Patents

Decoding complexity for mobile multimedia streaming

Info

Publication number
EP3528450B1
Authority
EP
European Patent Office
Prior art keywords
media content
decoding complexity
wtru
decoding
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP18211297.9A
Other languages
German (de)
French (fr)
Other versions
EP3528450A1 (en)
Inventor
Osama Lotfallah
Eduardo Asbun
Hang Liu
Yuriy Reznik
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vid Scale Inc
Original Assignee
Vid Scale Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vid Scale Inc filed Critical Vid Scale Inc
Publication of EP3528450A1
Application granted
Publication of EP3528450B1

Classifications

    • H04L65/765 Media network packet handling intermediate
    • H04L65/80 Responding to QoS
    • H04L65/612 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio, for unicast
    • H04L65/613 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio, for the control of the source by the destination
    • H04L65/70 Media network packetisation
    • H04L65/756 Media network packet handling adapting media to device capabilities
    • H04L65/762 Media network packet handling at the source
    • H04L65/764 Media network packet handling at the destination
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04N21/2662 Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • H04N21/4621 Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen
    • H04W52/0251 Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity
    • H04W52/223 TPC being performed according to specific parameters taking into account previous information or commands predicting future states of the transmission
    • H04B17/373 Predicting channel quality or other radio frequency [RF] parameters
    • H04W52/367 Power values between minimum and maximum limits, e.g. dynamic range

Definitions

  • Multimedia content may be sent from a content provider to a mobile device using a wireless communications network.
  • the content may be provided by streaming the content such that the content may be received at the mobile device and/or provided to an end user of the mobile device.
  • Mobile multimedia streaming may allow the mobile device to begin playback of the media content before the entire media file has been received at the device.
  • Streaming the multimedia content to a mobile device may be challenging due to the variability of the available bandwidth and the potential demand on the battery during the period of the multimedia experience. Due to the variability of the available bandwidth on the wireless communication network, the power consumed by the radio interface of the mobile device for receiving the multimedia content may be unknown or difficult to determine. Once the multimedia content is received at the mobile device, the decoding and playback of the content may consume additional power that may also be unknown or difficult to determine.
  • WO2014/011622 A2, filed before the priority date of the present application but published after it, discloses a method for power-aware video decoding and streaming. The method includes a mobile device receiving a media file from a video server including complexity information, from which a first complexity level for requesting a video segment may be determined by the mobile device based on said complexity information and a power metric for the mobile device.
  • US2009/0304085 A1 discloses an encoder, and a method in said encoder, to adaptively alter video deblocking complexity in a video encoding engine for generating a stream of encoded video data, which reduces the effects of blocking distortion on the encoded video data.
  • the deblocking filter is characterized by a level of deblocking complexity, which may depend on the strength and granularity of the deblocking filter applied to the encoded video data.
  • a resource manager is coupled to the deblocking filter and is configured to adaptively alter the deblocking complexity in order to alter the overall computational complexity of the encoder.
  • WO2009/149100 A1 discloses a client-side stream switching method, where the client-side stream switching enables substantially uninterrupted transmission of a highest compatible bit rate of a stream of media to a client via a network connection.
  • the client may include one or more buffers for receiving the stream of media. Attributes including the buffer activity and a bandwidth of the network connection may be monitored by a streaming module to determine an alternative bit rate of the stream of media.
  • the stream of media may be transitioned from the first bit rate to the alternative bit rate without an interruption of the stream of media to provide the client with the highest compatible bit rate based on the monitored attributes.
  • WO2009/009166 A1 discloses an intelligent power-aware downloading for mobile communication devices.
  • a mobile communication device (100) includes intelligent, power-aware download capability for downloading or streaming content. Music, video and other media content is made available by the content provider with multiple content qualities. The mobile communication device (100) monitors its own power availability status, determines a desired content quality, and downloads the media content with the desired content quality.
  • a WTRU may request multimedia content from a content providing device.
  • the content providing device may receive the request for the multimedia content and may determine a decoding complexity associated with the multimedia content.
  • the decoding complexity may be based on data that indicates an amount of power used to receive, decode, and/or display the multimedia content at reference devices, such as WTRUs.
  • the decoding complexity may indicate a minimum decoding complexity value, a maximum decoding complexity value, and/or an average decoding complexity value for receiving, decoding, and/or displaying multimedia content at reference devices.
  • the multimedia content may include video content (e.g., video streams, video files, etc.), images, audio content, and/or other forms of multimedia content.
  • the multimedia content may be comprised of different types of content. For example, where the media content includes video content, the media content may include a standard-definition version and a high-definition version of the media content.
  • the multimedia content may be segmented. Each of the different types and/or segments of media content may have a corresponding decoding complexity.
  • the content providing device may send an indication of the decoding complexity to the WTRU or other network device.
  • the indication of the decoding complexity may be included in a media file or protocol.
  • the WTRU, or another network device, may use the decoding complexity to determine whether to receive the requested media content. For example, the WTRU may compare the decoding complexity with the amount of power resources available at the WTRU. Where multiple types or segments of content are included in the requested multimedia content, the WTRU, or other network device, may use the decoding complexity to select one or more types or segments of content.
  • the content providing device may also, or alternatively, determine whether to provide the requested media content, or one or more types or segments thereof, based on the associated decoding complexity.
  • the content providing device may know various WTRU device characteristics and may use them to determine whether to provide the requested media content, or one or more types or segments thereof.
  • the WTRU device characteristics may include the WTRU type, available power of the WTRU, and/or the power consumption configurations of the WTRU.
  • the invention is defined by a method according to independent claim 1 and a device according to independent claim 2. Further embodiments are set out in the dependent claims.
  • a content provider may provide multimedia content to wireless transmit/receive units (WTRUs) over a wireless communication network.
  • the amount of power consumed by a WTRU during a multimedia experience may be predicted based on the type of multimedia content and/or the WTRU characteristics. Measurements may be taken by WTRUs that have received, decoded, and/or displayed the multimedia content. These measurements may be used to predict the amount of power that may be consumed when receiving, decoding, and/or displaying the multimedia content.
  • a power consumption descriptor may be used to indicate the amount of power consumed by the WTRU during the multimedia experience.
  • the power consumption descriptor may indicate the amount of power consumed by the WTRU when receiving, decoding, and/or displaying the multimedia content.
  • the power consumption descriptor may be referred to as a decoding complexity.
  • the power consumption descriptor may be included in protocols, file formats, and/or the like.
  • the power consumption descriptor may be based on a media stream or streams and/or a WTRU type or types.
  • Power consumption at a WTRU may be affected by streaming multimedia content in a communications system.
  • the multimedia content may include video streams, images, audio, and/or other forms of multimedia data transmitted using the communication system.
  • a video stream may be coded into various sub-streams by varying the video resolution, frame rate, and/or quantization parameter.
  • the sub-streams may include H.264 streams, for example.
  • Audio streams may also be coded into various sub-streams by varying the number of channels (e.g., 5.1, 2.0 (stereo), 1.0 (mono), etc.), different codecs, and/or codec extensions (e.g., MP3, advanced audio coding (AAC), high efficiency (HE)-AAC, etc.).
  • Power may be consumed at the WTRU for multimedia data processing and/or display.
  • the power consumed by the wireless interface may account for a portion (e.g., about 15% to 25%) of the overall power consumption.
  • the wireless interface may be the wireless module capable of receiving a signal and/or decoding the signal into digital data that may be passed to a multimedia decoding module.
  • the wireless interface may communicate via Wi-Fi, 3G, 4G, and/or the like.
  • the display of the WTRU may account for a portion of the aggregate power consumed for video playback.
  • the display may account for the largest portion (e.g., between about 38% and 68%) of power consumed by components within the WTRU.
  • the processor may account for a portion of the power consumed for video playback.
  • the processor may account for the largest portion of power consumption for video playback after the display.
  • a call module may account for additional power that may be consumed for phone calls.
  • the call module may be a GSM module or other module that may be used for processing call data.
  • the power consumed by the call module may be static or dynamic.
  • Multimedia content may be pre-fetched or may be streamed for display at the WTRU.
  • the multimedia content may be pre-fetched at a WTRU when the content is fully received at the WTRU before the WTRU may begin displaying a portion of the content.
  • the multimedia content may be streamed by being sent to the WTRU in segments that may be displayed at the WTRU before the multimedia content is fully received. By streaming the multimedia content, a portion of the content may be viewed prior to receiving the entirety of the multimedia content.
  • Multimedia content may be transmitted in segments using adaptive hypertext transfer protocol (HTTP) streaming. The periods between segments may be long enough for the air interface module to go to a sleep mode or idle mode in between transmissions, as sketched below.
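  • As an illustration of this segment-by-segment traffic pattern, consider the following minimal sketch (the URL template and timing are hypothetical, not from the specification): segments are fetched one at a time, leaving idle gaps between requests.

    import time
    import urllib.request

    # Hypothetical segment URL template; in adaptive HTTP streaming each
    # segment is a separate HTTP resource.
    SEGMENT_URL = "https://example.com/media/seg_{:04d}.m4s"

    def fetch_segments(n_segments: int, segment_duration_s: float) -> None:
        for i in range(n_segments):
            data = urllib.request.urlopen(SEGMENT_URL.format(i)).read()
            # ... hand `data` to the playback buffer/decoder ...
            # No traffic is exchanged until the next request, so the air
            # interface may drop to a DRX or idle state during this gap.
            time.sleep(segment_duration_s)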
  • FIG. 1 is a diagram that depicts example communication states that may be used by a WTRU in a wireless communication system.
  • the communication states may include connected states for transmitting and/or receiving transmissions at a WTRU and/or idle states that may be used between transmissions.
  • the power consumed by the WTRU may depend on transition between the communication states. For example, the WTRU may save power by transitioning from a connected state to a power saving connected state and/or an idle state, as shown in FIG. 1 .
  • the example communication states may include radio resource control (RRC) states, such as the RRC connected state 102 and/or the RRC idle state 104.
  • the RRC states may be states of an Evolved Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (E-UTRA) network of a long term evolution (LTE) system, for example.
  • the RRC connected state 102 may be an E-UTRA RRC connected state and/or the RRC idle state 104 may be an E-UTRA RRC idle state.
  • the RRC connected state 102 may be used to transfer data (e.g., unicast data) to and/or from a WTRU.
  • the RRC connected state 102 may be used, for network controlled mobility, to monitor control channels associated with shared data channels, and/or the like.
  • the RRC idle state 104 may be used to monitor a paging channel to detect incoming calls, acquire system information, and/or perform logging of available measurements.
  • the RRC connected state 102 may use a higher power level than the RRC idle state 104.
  • the WTRU may transition between the RRC connected state 102 and the RRC idle state 104 at 106.
  • the WTRU may transition from the RRC connected state 102 to the RRC idle state 104 after a period of time has elapsed without receiving transmissions in the RRC connected state 102.
  • the WTRU may transition to the RRC idle state 104 to preserve battery power, as the RRC idle state 104 may use less power resources than the RRC connected state 102.
  • the WTRU may transition to the RRC connected state 102, for example, when data transmissions are received for being processed in the RRC connected state 102.
  • the WTRU may establish a connection in the RRC connected state 102 and/or release the resources used for the RRC connected state 102 when the WTRU transitions to the idle state 104.
  • the RRC connected state 102 may be subdivided into a continuous reception state, a short discontinuous reception (DRX) state, a long DRX state, and/or the like.
  • the short DRX state and/or the long DRX state may be power saving connected states that may use less power than the continuous reception state.
  • a WTRU may transition to the continuous reception state, for example, when the WTRU is promoted from the RRC idle state 104.
  • the continuous reception state may be used by the WTRU to transmit and/or receive data in the communication system.
  • the WTRU may transition from the continuous reception state to a DRX state when it is waiting for data in the RRC connected state 102.
  • the communication device may transition from the continuous reception state to the short DRX state.
  • the communication device may transition from the short DRX state to the long DRX state after a period of time has elapsed without receiving data in the RRC connected state 102.
  • the long DRX state may be an extended DRX state that may allow the WTRU to be in a DRX mode for a longer period of time prior to transitioning into idle mode.
  • the transition between states may be handled using inactivity timers.
  • the WTRU may transition from the continuous reception state to the short DRX state upon expiration of an inactivity timer T1.
  • the inactivity timer T1 may begin when the WTRU is no longer transmitting and/or receiving data.
  • the WTRU may transition from the short DRX state to the long DRX state upon expiration of an inactivity timer T2.
  • the WTRU may transition from the long DRX state to the RRC idle state 104 upon expiration of an inactivity timer T3.
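  • The timer-driven transitions above may be summarized as a small state machine. The following sketch is illustrative only; the timer values are placeholders, since actual values are configured by the network.

    from enum import Enum, auto

    class RrcState(Enum):
        CONTINUOUS_RECEPTION = auto()
        SHORT_DRX = auto()
        LONG_DRX = auto()
        RRC_IDLE = auto()

    # Placeholder inactivity timers (seconds); real values are network-configured.
    T1, T2, T3 = 1.0, 5.0, 30.0

    def state_after_inactivity(idle_s: float) -> RrcState:
        """State reached after idle_s seconds with no data activity.

        T1 runs from the end of activity, T2 from entering short DRX, and
        T3 from entering long DRX, so the thresholds accumulate. Any new
        data activity would return the WTRU to continuous reception.
        """
        if idle_s < T1:
            return RrcState.CONTINUOUS_RECEPTION
        if idle_s < T1 + T2:
            return RrcState.SHORT_DRX
        if idle_s < T1 + T2 + T3:
            return RrcState.LONG_DRX
        return RrcState.RRC_IDLE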
  • a WTRU may transfer states for connectivity to other radio technologies.
  • the WTRU may transfer states for connectivity to a UMTS, global system for mobile communications (GSM), or other radio technologies for example.
  • the WTRU may perform a handover between the RRC connected state 102 and a UMTS connected state.
  • the WTRU may perform a handover between the RRC connected state 102 and a GSM/general packet radio service (GPRS) packet connected state.
  • the UMTS states 110 may include connected states and/or idle states.
  • the connected states may include a CELL dedicated channel (CELL_DCH) state 112, a CELL forward access channel (CELL_FACH) state 114, and/or a CELL paging channel (CELL_PCH)/URA_PCH state 116.
  • the WTRU may be allocated dedicated transport channels in the downlink and/or the uplink direction.
  • the WTRU may transmit user data through shared channels with other WTRUs.
  • the shared channels may be low-speed channels, which may be less than 15 kbps, for example.
  • the WTRU may remain in connected mode for paging and/or may transfer to FACH.
  • the CELL_FACH state 114 and/or the CELL_PCH/URA_PCH state 116 may be a power saving connected state in UMTS, as they may use less power resources than the CELL_DCH state 112.
  • the UMTS idle states may include a UTRA idle state 120.
  • the WTRU may transition between the UMTS connected states 112, 114, 116 and the UTRA idle state 120 at 118. At 118, the WTRU may establish a connection with one state and/or release the resources for the other state.
  • the WTRU may perform reselection from a UMTS state 110 to an RRC state. For example, the WTRU may perform reselection from a UMTS connected state 112, 114, 116 to the RRC idle state 104 at 122. At 124, the WTRU may perform reselection between the UTRA idle state 120 and the RRC idle state 104.
  • the GSM/GPRS packet states 128 may include connected states and/or idle states.
  • the connected states may include a GSM connected state 130 and/or a GPRS packet transfer mode state 132.
  • the WTRU may have uplink and/or downlink radio resources allocated to it and may transmit and/or receive data.
  • the idle states may include the GSM idle state and/or the GPRS packet idle state 136.
  • the WTRU may transition between the GSM connected state 130 and the GSM idle state 136 or the GPRS packet transfer mode state 132 and the GPRS packet idle state 136.
  • the WTRU may establish a connection with one state and/or release the resources for the other state.
  • the WTRU may perform reselection from the RRC connected state 102 to the GSM idle state/GPRS packet idle state 136.
  • the WTRU may perform cell change over (CCO) reselection from a GSM/GPRS packet connected state 130, 132 to the RRC idle state 104 at 138.
  • the WTRU may perform CCO reselection from the RRC idle state 104 to the GSM idle/GPRS packet idle state 136.
  • the WTRU may perform CCO reselection from the GSM idle/GPRS packet idle state 136 to the RRC idle state 104 at 144.
  • any connected state and/or idle state may be implemented.
  • the connected states may be implemented to enable the WTRU to transmit and/or receive data.
  • the power saving connected states and/or the idle states may be implemented to reduce power consumption at the WTRU.
  • FIG. 2 is a graph that depicts an example of streaming video content 202 from a content provider (e.g., Hulu®, YouTube®, Netflix®, etc.).
  • the content provider may provide the video content 202 from one or more content providing devices that may be capable of storing and/or transmitting video content.
  • the video content 202 may be streamed as ad-sponsored video streaming.
  • the video content 202 may be provided at a bitrate between about 4.0 Mbps and about 1.4 Mbps.
  • the average bitrate may be about 850 Kbps.
  • the video content 202 may be transmitted and/or received in intervals.
  • the intervals of video transmission 204, 206 may be followed by intervals of inactivity 208, 210.
  • the intervals of transmission 204, 206 may be about seventy-five seconds.
  • the intervals of inactivity 208, 210 may be about seventy-five seconds.
  • the WTRU may put the transmit/receive module into a power saving state. For example, the WTRU may transition to a short DRX mode, a long DRX mode, or a CELL_FACH state during intervals of inactivity 208, 210. If the intervals of inactivity are longer than the inactivity timer for transitioning into the idle state, the WTRU may transition from a connected state in intervals of transmission 204, 206 to idle mode in intervals of inactivity 208, 210.
  • a WTRU may have a static nature of treating intervals of multimedia traffic. For example, a WTRU may assume video segments are transmitted between the same inactivity timers.
  • FIG. 3 is a graph that depicts an example of streaming video content 302 from a content provider (e.g., Hulu®, YouTube®, Netflix®, etc.).
  • the video content 302 may be provided at a bitrate between about 4.5 Mbps and about 0.3 Mbps.
  • the average bitrate may be about 602 Kbps.
  • the video content 302 may be split into a number N of chunks to allow or enable a WTRU to transition from a connected state into a power saving state in a connected mode or an idle mode.
  • the WTRU may transition from a DCH state to a FACH state to reduce or save energy.
  • the reduction in energy may be by about eighty percent, for example.
  • the WTRU may send a message (e.g., an RRC message) to transition into an idle state and/or release radio resources after a chunk is received.
  • the transition into the idle state and/or the release of radio resources may be performed prior to waiting for expiration of an inactivity timer for transitioning into the idle state. This early transition into idle state and/or release of resources may be referred to as fast dormancy.
  • a WTRU may use dynamic cache management to cache chunks of the video content. Power may be saved by using dynamic cache management to enable the WTRU to download content in chunks as fast as possible and/or releasing network resources, which may be referred to as fast dormancy.
  • the WTRU may turn off the transmit/receive module to save power.
  • the WTRU may close the connection (e.g., a transmission control protocol (TCP) connection) when the cache may be full.
  • Power consumption models may be used for decoding media content in WTRUs.
  • the power consumed for hardware accelerated decoding of media content at WTRUs may be determined as described herein.
  • the media content may include video content, such as H.264 video content for example, images, audio content, and/or other types of media content.
  • Variables c_s, c_t, and/or c_q may be modeling parameters, which may be constants, used to obtain an accurate model.
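  • Equation (1) itself is not reproduced here. One plausible form, assuming c_s, c_t, and c_q act as exponents weighting the spatial resolution s, frame rate t, and quantization step size q relative to those of the full stream (an assumed interpretation; only the parameter names appear in the surrounding text), is:

    \[ P_{dec}(s,t,q) \approx P_{max}\left(\tfrac{s}{s_{max}}\right)^{c_s} \left(\tfrac{t}{t_{max}}\right)^{c_t} \left(\tfrac{q_{min}}{q}\right)^{c_q} \qquad (1) \]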
  • the power consumption of the display may be modeled as shown in Equation (2) below:

    \[ P_{display}(i) = \sum_{x,y} \left( \alpha\,R(x,y,i) + \beta\,G(x,y,i) + \gamma\,B(x,y,i) \right) \qquad (2) \]

    where variable i may be an instant in time, x and y may be coordinates of the pixel (e.g., row and column coordinates), and R, G, and B may be the red, green, and blue components, respectively, of a pixel displayed on a screen. For example, R(x,y,i) may be the value of a pixel (x,y) at an instant i.
  • Variables ⁇ , ⁇ , and ⁇ may be modeling parameters that may be determined to obtain an accurate model.
  • display power consumption may be constant (e.g., ~250 mW).
  • The power model parameters obtained using Equation (1) may be applied to power-rate optimized adaptive video streaming, where a maximum quality Q may be searched subject to rate and power constraints, as shown in Equation (3) below:

    \[ \max_{s,t,q}\; Q(s,t,q) \quad \text{subject to} \quad R(s,t,q) \le R, \;\; P(s,t,q) \le P \qquad (3) \]
  • Mobile streaming of multimedia content may be performed based on the power consumption level associated with the multimedia data.
  • Mobile multimedia streaming may be performed to preserve power consumption at the WTRU.
  • Streaming protocols and/or file formats may include a decoding complexity that may be used to predict the amount of power consumption that may be caused by or due to receiving, decoding, and/or displaying of the underlying media streams.
  • the decoding complexity may include power consumption descriptors that may indicate an amount of power that may be consumed when receiving, decoding, and/or displaying multimedia content.
  • FIG. 4 is a flow diagram that illustrates an example process 400 for transmitting media content based on a decoding complexity.
  • the multimedia content may include video streams, images, audio, and/or other forms of multimedia data transmitted using the communication system.
  • One or more portions of the process 400 may be performed by content providing devices, such as a content providing server or servers for example.
  • the content providing device may receive data that may indicate an amount of power for receiving, decoding, and/or displaying media content at WTRUs.
  • the data may be actual data that may be measured by WTRUs that have received, decoded, and/or displayed the media content.
  • the data measured by the WTRUs may be stored at the content providing device and/or a remote location that may be accessible to the content providing device.
  • the content providing device may receive a request for the media content at 404.
  • the request may be received from a WTRU or other network entity.
  • the content providing device may determine a decoding complexity associated with the requested media content.
  • the decoding complexity may indicate the amount of power used to receive, decode, and/or display the media content at one or more WTRUs.
  • the decoding complexity may be an absolute value or a relative value. For example, an absolute value of a decoding complexity may be followed by relative values that may indicate an increase or decrease in the decoding complexity.
  • the decoding complexity may be based on the data received at 402. For example, the decoding complexity may indicate a minimum amount of power, a maximum amount of power, and/or an average amount of power used by WTRUs that have received, decoded, and/or displayed the media content.
  • the WTRUs that have received, decoded, and/or displayed the media content may include the WTRU requesting the media content.
  • the decoding complexity may include separate representations for receiving the media content, decoding the media content, and/or displaying the media content.
  • the separate decoding complexity representations for receiving, decoding, and/or displaying the media content may allow WTRUs to tailor the complexity information based on its current state. For example, display power may be scaled to account for user brightness settings at the WTRU.
  • the decoding complexity may indicate a minimum amount of power, a maximum amount of power, and/or an average amount of power used by the WTRUs for decoding each type of media content.
  • the decoding complexity may be different for different types of media content. For example, standard definition video content may have a lower decoding complexity and may use less power than high definition video content.
  • Media content may be encoded using different video encoding profiles.
  • a media content type may use a baseline profile, a main profile, a high profile, etc. Each profile may use a different set of encoding/decoding tools.
  • Lower profiles (e.g., the baseline profile) may use simpler coding tools and may have a lower decoding complexity.
  • Media content may be encoded using the same resolution (e.g., full HD, 1080p) and may use different video encoding profiles.
  • Media content may be encoded with different coding tools.
  • Media content may be encoded with or without coding tools that may increase decoding complexity, such as context-adaptive binary arithmetic coding (CABAC) as opposed to variable-length coding (VLC).
  • Media content may be encoded using the same profile, but with/without coding tools. For a given profile, some coding tools may be optional. If the coding tools are not used, the WTRU may see a lower decoding complexity.
  • Media content may be encoded using the same profile, but using different coding tools.
  • the content providing device may select the media content for transmission at 408.
  • the media content may be selected at 408 based on the decoding complexity.
  • the content providing device may send an indication of the decoding complexity associated with the requested media content to the WTRU.
  • the content providing device may receive an indication from the WTRU of whether the WTRU would like to receive the media content based on the associated decoding complexity.
  • the content providing device may transmit the selected media content at 410.
  • the content providing device may send an indication of one or more of the available types of content and/or the decoding complexity associated with each type.
  • the content providing device may receive an indication from the WTRU of a preferred type of media content based on the decoding complexity associated therewith, and may select this content at 408 for transmission.
  • the content providing device may pre-filter the types of media content that may be sent to the WTRU based on the decoding complexity.
  • the content providing device may know the WTRU type, available power of the WTRU, and/or the power consumption configurations of the WTRU.
  • the WTRU type, available power of the WTRU, and/or the power consumption configurations of the WTRU may be received from the WTRU or may be stored at the content providing device or other remote location and may be looked up based on an identifier associated with the WTRU.
  • the WTRU identifier, the WTRU type, the WTRU power configurations, and/or the available power of the WTRU may be included in the request for media content received at 404.
  • the content providing device may use the device type, available power, and/or the power consumption configurations to determine whether the WTRU can process certain types of media content.
  • the content providing device may provide the decoding complexity of media content that may be relevant to a device based on the device configurations and/or device specifications of the WTRUs that have provided the decoding complexity feedback. For example, when a WTRU requests media content from a content providing device, the request may include an identifier of the WTRU. The identifier may be a unique identifier or one or more device configurations or specifications associated with the device. The content providing device may determine the relevant decoding complexity information based on the WTRU identification information provided by the WTRU and may provide decoding complexity information that may be tailored to the WTRU.
  • the content providing device may provide the decoding complexity information along with device identification information that may identify devices to which the decoding complexity information may be relevant. For example, when a WTRU requests media content from a content providing device, the content providing device may provide the decoding complexity for the media content, along with one or more identifiers that may indicate the devices to which the decoding complexity may be relevant. The content providing device may provide different versions of the decoding complexity such that the WTRU may determine which version of the decoding complexity is relevant, or most relevant, to the WTRU.
  • the content providing device may autonomously select the media content at 408 based on the decoding complexity.
  • the content providing device may use the device identifier and/or the WTRU configurations and/or specifications (e.g., device type, make, model, etc.) to select the media content to be transmitted at 408.
  • the device configurations and/or specifications may be available in the HTTP header of a request received from the WTRU, for example.
  • the content providing device may determine whether the WTRU has enough power available for receiving, decoding and/or displaying the requested media content.
  • the content providing device may determine whether the decoding complexity of the media content is within an acceptable range. When there are multiple types of the requested media content, the content providing device may select the type of content that is appropriate for transmission to the WTRU based on the WTRU type, the WTRU power configurations, and/or the available power of the WTRU.
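  • As a concrete sketch of this selection step (the function and field names below are hypothetical, not taken from the claims), the content providing device might filter the candidate content types against the WTRU's power budget:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class ContentType:
        name: str                  # e.g., "standard definition" or "high definition"
        avg_complexity_mw: float   # average decoding complexity from reference WTRUs

    def select_content(candidates: List[ContentType],
                       available_power_mw: float) -> Optional[ContentType]:
        """Pick the highest-complexity (highest-quality) type within the budget."""
        affordable = [c for c in candidates
                      if c.avg_complexity_mw <= available_power_mw]
        if not affordable:
            return None  # nothing within the acceptable range; decline the request
        return max(affordable, key=lambda c: c.avg_complexity_mw)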
  • Another network device, such as a Node B for example, may select the media content at 408 based on the decoding complexity.
  • the network device may use the device identifier and/or the WTRU configurations and/or specifications (e.g., device type, make, model, etc.) to select the media content to be transmitted at 408.
  • the device configurations and/or specifications may be available in the HTTP header of a request received from the WTRU, for example.
  • the network device may determine whether the WTRU has enough power available for receiving, decoding and/or displaying the requested media content.
  • the network device may determine whether the decoding complexity of the media content is within an acceptable range.
  • the network device may select the type of content that is appropriate for transmission to the WTRU based on the WTRU type, the WTRU power configurations, network configurations (e.g., bandwidth), and/or the available power of the WTRU.
  • FIG. 5 is a flow diagram that illustrates an example process 500 for selecting multimedia content based on a decoding complexity.
  • the WTRU may request media content from a content providing device.
  • the WTRU may receive a decoding complexity associated with the requested media content.
  • the decoding complexity may indicate the amount of power that may be used to receive, decode, and/or display the media content at the WTRU.
  • the decoding complexity may indicate power metrics for the power used by other WTRUs that have received, decoded, and/or displayed the media content. For example, the decoding complexity may indicate a minimum amount of power, a maximum amount of power, and/or an average amount of power used by other WTRUs that have received, decoded, and/or displayed the media content.
  • the WTRU may determine whether to receive the media content at 506 based on the decoding complexity. For example, the WTRU may determine its current battery power and/or whether it has enough battery power available for decoding the requested content. The WTRU may determine whether the decoding complexity is within an acceptable range based on the current battery power. When there are different types of media content available at the content providing device, the WTRU may receive an indication of the decoding complexity for the different types of media content. For example, the WTRU may receive a minimum amount of power, a maximum amount of power, and/or an average amount of power used by the WTRUs for receiving, decoding, and/or displaying each type of media content. The WTRU may select a preferred type of media content based on the decoding complexity associated with each type of media content.
  • the WTRU may send an indication at 508 to indicate whether it wants to receive the requested media content.
  • the WTRU may also indicate the type of media content that it would like to receive.
  • the indication may be sent at 508 in response to the decoding complexity received from the content providing device. If the WTRU indicates at 508 that it wants to receive the media content and/or a selected type of media content, the WTRU may receive the media content in response to the indication.
  • the media content may be streamed via a communication network and/or stored at the WTRU for playback. If the WTRU indicates at 508 that it does not want to receive the media content, the media content may not be transmitted to the WTRU, or the WTRU may ignore the transmission.
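  • A minimal sketch of the WTRU-side decision at 506 might look as follows (the energy model and reserve policy are assumptions for illustration, not part of the claims):

    def should_receive(avg_complexity_mw: float,
                       playback_hours: float,
                       battery_mwh: float,
                       reserve_fraction: float = 0.2) -> bool:
        """Accept the content only if the predicted energy cost leaves a reserve.

        Predicted energy is approximated as the average decoding complexity
        (in mW) multiplied by the playback duration (in hours).
        """
        predicted_mwh = avg_complexity_mw * playback_hours
        return predicted_mwh <= battery_mwh * (1.0 - reserve_fraction)

    # Example: a 545 mW average complexity for a 2-hour video against a
    # battery with 10,000 mWh remaining -> 1,090 mWh needed, so accept.
    ok = should_receive(545.0, 2.0, 10_000.0)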
  • FIG. 6 is a flow diagram that illustrates an example process 600 for measuring decoding complexity and/or sending decoding complexity feedback information that indicates the decoding complexity.
  • One or more portions of the process 600 may be performed by a WTRU.
  • the WTRU may request media content from a content providing device.
  • the WTRU may receive the requested media content.
  • the media content may be streamed via a communication network and/or stored at the WTRU for playback.
  • the WTRU may decode and/or display the media content at 606.
  • the WTRU may measure the decoding complexity at 608. For example, the WTRU may measure the amount of power used while receiving, decoding, and/or displaying the media content.
  • the WTRU may measure the maximum value, minimum value, and/or average value of power used by the WTRU.
  • the average value may be calculated as the minimum value plus the maximum value divided by two (e.g., (minValue + maxValue) / 2) or by calculating the average power value used over a period of time (e.g., the period for receiving, decoding, and/or displaying the media content or a segment thereof). If the media content is segmented at the WTRU, the WTRU may perform the measurements at 608 for each segment of the media content.
  • the WTRU may send decoding complexity feedback information to a remote source at 610.
  • the WTRU may send the decoding complexity feedback information to the content providing device and/or another network device for storage.
  • the decoding complexity feedback information may be aggregated from multiple devices, such as in the form of a database for example.
  • the decoding complexity feedback information may inform the content providing device of the actual decoding complexity for receiving, decoding, and/or displaying the multimedia content.
  • a separate decoding complexity representation may be used by WTRUs to report decoding complexity for receiving media content, decoding media content, and/or displaying media content.
  • the decoding complexity feedback information may be stored and/or used to send the decoding complexity to other receiving devices.
  • the decoding complexity feedback information may also include a time at which the decoding complexity was measured, a duration for which the decoding complexity was measured, and/or device configurations and/or specifications for the device measuring the decoding complexity.
  • the device configurations and/or specifications may include a device type, make, model, release date, OS version, device drivers, player agent, available power, power consumption configurations, and/or the like.
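  • Assembled together, the feedback described above might be serialized as follows (the field names and schema are hypothetical; the specification does not define a wire format):

    import json
    import time

    feedback = {
        "content_id": "example-clip-0001",   # placeholder identifier
        "measured_at": int(time.time()),      # time at which complexity was measured
        "duration_s": 120,                    # duration of the measurement
        "power_mw": {"min": 310, "max": 980, "avg": 545},
        "device": {                           # configurations/specifications
            "type": "smartphone", "make": "ExampleCo", "model": "X1",
            "os_version": "1.0", "player_agent": "example-player/0.9",
        },
    }
    payload = json.dumps(feedback)  # sent to the content providing device at 610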
  • the decoding complexity feedback information may be used to provide the decoding complexity to WTRUs.
  • the decoding complexity may be included in a file format or protocol.
  • the decoding complexity may be included in a media presentation descriptor (MPD) of a dynamic adaptive streaming over HTTP (DASH), a session description protocol (SDP) of an IP multimedia subsystem (IMS), a real time streaming protocol (RTSP), a file format such as a 3GP file format, and/or the like.
  • a streaming protocol may include a value that indicates the decoding complexity.
  • the protocol structure may include an abstract called DecodingComplexity and/or may include values that indicate the decoding complexity associated with media content being streamed via the protocol.
  • Provided below is an example of an abstract function called DecodingComplexity that may be used in a streaming protocol:
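  • (The original listing is not preserved in this text; the sketch below reconstructs it from the field descriptions that follow, using Python as the notation.)

    from dataclasses import dataclass

    @dataclass
    class DecodingComplexity:
        decodingAbsolute: bool  # True: values are absolute; False: relative to prior values
        minValue: int           # minimum decoding complexity value
        maxValue: int           # maximum decoding complexity value
        avrValue: int           # average decoding complexity value
        units: str              # units of the values, e.g., "msec", "MIPS", or "mW"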
  • the DecodingComplexity function may include one or more decoding complexity values, a value indicating whether the decoding complexity value is an absolute value or a relative value, and/or a units value that may indicate the units in which the decoding complexity values are measured.
  • the value indicating whether the decoding complexity value is an absolute value or a relative value may be a boolean value that may be used to identify whether one or more of the reported values minValue, maxValue, and/or avrValue are absolute or relative values. If the decoding complexity values are relative values, the values may be expressed as a relative increase or decrease from previous values.
  • the decoding complexity may include the values minValue, maxValue, and/or avrValue.
  • the decoding complexity may represent a variability of the amount of power and/or time it may take a WTRU to receive, decode, and/or display media content, such as a video stream, a video frame, a video file, an audio file, images, and/or other types of media content.
  • the decoding complexity may be based on values measured by a reference device.
  • the reference device may be another WTRU.
  • the reference device may be similarly configured to the device requesting the media content.
  • the reference device may have the same device type, make, model, release date, OS version, device drivers, player agent, available power, power consumption configurations, and/or the like.
  • One or more of the configurations of the reference device may also be different, but may be relevant for predicting the amount of power that may be used by the device requesting the media content.
  • the device configurations may be included in metadata.
  • the metadata about the configurations of the reference device may be included in a user defined box within a file format, such as a 3GP file format for example, or may be sent in an outbound signaling protocol, such as an email or an HTTP websocket for example.
  • the decoding complexity may be used to select the appropriate video stream.
  • the avrValue may be used to select the appropriate video stream.
  • the avrValue may be used when video streams may include a small number of frames or shorter videos.
  • Video streams that include a small number of frames or shorter videos may include videos less than a minute (e.g., 30 seconds) or a number of minutes.
  • the minValue and/or the maxValue may be used to capture variability of frame decoding time.
  • the minValue and/or the maxValue may be used with a larger number of video frames or longer videos.
  • the longer videos may be longer than an hour (e.g., 2 hours), for example.
  • the avrValue may be estimated as (minValue + maxValue)/2 or may be calculated as an average value across each of the known reference devices or a subset of the reference devices (e.g., reference devices with similar configurations).
  • the decoding complexity function may include the units in which the values are measured.
  • the decoding complexity function may include a string value for the units that may indicate the units for the minValue, maxValue, and/or avrValue.
  • the units value may be in milliseconds (msec), million instructions per second (MIPS), milliwatts (mW), and/or the like.
  • the maximum number of characters for the units value may be a fixed number (e.g., 8 characters) or a variable number. While an example is provided for the abstract decoding complexity function above, the decoding complexity may be modified and/or implemented for each streaming protocol.
  • the decoding complexity may be included in the moving picture experts group (MPEG) DASH protocol.
  • the media presentation description (MPD) in the MPEG DASH protocol may be used to indicate the decoding complexity.
  • decoding complexity descriptors may be included in the MPEG DASH schema at a period, an adaptation set, a representation, and/or the like.
  • the adaptation set may include the set of different types of media files stored at a content providing device.
  • the representation may include a media file in the adaptation set.
  • a representation may include a different version of the same media content. For example, different representations of the same video clip may be encoded at different bit rates and/or different resolutions.
  • a multimedia content may include a presentation.
  • a period may be a chapter in the presentation.
  • ads may be displayed between periods, for example.
  • Each period may be partitioned into segments (e.g., two to ten second segments).
  • the decoding complexity may be included at the period level to inform the WTRU, or another device, about the minimum and/or maximum expected power resources that may be used to receive, decode, and/or display a representation of an adaptation set(s).
  • the decoding complexity values may be absolute values or may be relative values that may reference a decoding complexity in a prior period.
  • the absolute value may be included in the first period in a presentation, for example.
  • Subsequent periods may include a value relative to the first period.
  • the relative values may reference the absolute value.
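  • A minimal reconstruction of such a period-level XML scheme, assuming the decoding complexity attributes extend the standard MPD period type (the original listing is not preserved here), might be:

    <!-- Illustrative reconstruction; attribute names follow the text below. -->
    <xs:complexType name="PeriodType">
      <!-- ... existing MPD period attributes ... -->
      <xs:attribute name="minDecodingComplexity" type="xs:string"/>
      <xs:attribute name="maxDecodingComplexity" type="xs:string"/>
    </xs:complexType>
    <!-- Example attribute value using comma-separated sub-strings: "A,300,mW"
         (absolute/relative flag, value, units), per the formatting below. -->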
  • the XML scheme includes a minDecodingComplexity and a maxDecodingComplexity as a string value "xs:string."
  • a similar XML scheme may be implemented that includes other forms of decoding complexity, such as average decoding complexity and/or the like.
  • the string representing minimum decoding complexity and/or the string representing the maximum decoding complexity may be formatted having sub-strings.
  • the sub-strings may be separated by commas or other indicators to show the separation between sub-strings.
  • the substrings may indicate whether the decoding complexity is an absolute or relative value, the decoding complexity value, and/or the units of each decoding complexity value.
  • While the decoding complexity is represented as a string value, the decoding complexity may be represented as one or more integer values that may indicate whether the decoding complexity is an absolute or relative value, the decoding complexity value, and/or the units of each decoding complexity value.
  • the decoding complexity may be included at the adaptation set level. This may provide a finer level of adaptation than including the decoding complexity at the period level.
  • the decoding complexity may include the minimum, maximum, and/or average expected processing that may be used to receive, decode, and/or display a representation of the adaptation set.
  • the decoding complexity at the adaptation set level may include absolute or relative values. The relative values may refer to a prior relative or absolute value. For example, the first adaptation set within a period may include absolute values for the decoding complexity and/or subsequent adaptation set(s) may include absolute or relative values for the decoding complexity.
  • the adaptation set may include a minimum decoding complexity, a maximum decoding complexity, and/or an average decoding complexity.
  • Provided below is an example of an XML scheme for the adaptation set level that includes the minimum decoding complexity, the maximum decoding complexity, and the average decoding complexity as an "xs:string" value.
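  • (An illustrative reconstruction; the original listing is not preserved here.)

    <xs:complexType name="AdaptationSetType">
      <!-- ... existing MPD adaptation set attributes ... -->
      <xs:attribute name="minDecodingComplexity" type="xs:string"/>
      <xs:attribute name="maxDecodingComplexity" type="xs:string"/>
      <xs:attribute name="avrDecodingComplexity" type="xs:string"/>
    </xs:complexType>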
  • the decoding complexity may be included at the representation level.
  • the representation level may provide a finer level of adaptation compared to the period and/or the adaptation set level.
  • the minimum, maximum, and/or average expected consumption of power resources that may be used to receive, decode, and/or display a segment or sub-representation may be included at the representation level.
  • the decoding complexity at the representation level may include absolute or relative values.
  • the relative values may be relative to a prior absolute or relative value.
  • the first representation within an adaptation set may include an absolute value.
  • Subsequent representations may include absolute or relative values.
  • the decoding complexity at the representation level may include a minimum decoding complexity, a maximum decoding complexity, and/or an average decoding complexity, each of which may be included as an "xs:string" value.
  • the string representing the minimum decoding complexity, the maximum decoding complexity, and/or the average decoding complexity may be formatted having sub-strings, as described herein. While the decoding complexity is represented as a string value, the decoding complexity may be represented as one or more integer values, as described herein.
  • the decoding complexity may also be included at the sub-representation level in a similar manner as the representation level.
  • a sub-representation may be used to provide a version of a representation to enable a video player to fast-forward and/or rewind efficiently.
  • the sub-representation may be of a lower quality than the version of the representation.
  • Provided below is an example of an XML scheme for the sub-representation level that includes the minimum decoding complexity, the maximum decoding complexity, and the average decoding complexity as an "xs:string" value.
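  • A hedged sketch of such a scheme follows; the decoding complexity attribute names are assumptions, while "level" and "bandwidth" are standard sub-representation attributes:
        <xs:complexType name="SubRepresentationType">
          <xs:complexContent>
            <xs:extension base="RepresentationBaseType">
              <xs:attribute name="level" type="xs:unsignedInt"/>
              <xs:attribute name="bandwidth" type="xs:unsignedInt"/>
              <!-- hypothetical min/max/avg decoding complexity as xs:string -->
              <xs:attribute name="minDecodingComplexity" type="xs:string"/>
              <xs:attribute name="maxDecodingComplexity" type="xs:string"/>
              <xs:attribute name="avrDecodingComplexity" type="xs:string"/>
            </xs:extension>
          </xs:complexContent>
        </xs:complexType>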
  • While the decoding complexity is represented as a string value, the decoding complexity may be represented as one or more integer values, as described herein.
  • the decoding complexity, and/or other attributes, may be signaled in MPEG DASH using generic descriptors.
  • the generic descriptors may be declared at the adaptation set level, the representation level, and/or the sub-representation level.
  • the generic descriptors may indicate whether the decoding complexity is an essential or supplemental element of the MPEG DASH protocol.
  • Essential protocol elements of the MPEG DASH protocol may be supported by each compliant video player.
  • Supplemental protocol elements may include information that may or may not be understood by each compliant video player.
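  • For illustration, a decoding complexity descriptor might be declared at the adaptation set level as sketched below; the schemeIdUri and value format are assumptions, not a registered scheme:
        <AdaptationSet mimeType="video/mp4">
          <!-- supplemental: a player that does not understand the scheme may ignore it -->
          <SupplementalProperty
              schemeIdUri="urn:example:dash:decoding-complexity:2013"
              value="absolute,450,700,550,mW"/>
          <Representation id="v0" bandwidth="1000000"/>
        </AdaptationSet>
  • Declaring the same scheme with an EssentialProperty descriptor instead would mark the decoding complexity as an essential element, so a player that does not understand the scheme would discard the containing element.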
  • the decoding complexity may be included as a parameter to a media streaming format.
  • the media streaming format may be the session description protocol (SDP) for a session initiation protocol, a real-time streaming protocol (RTSP), and/or the like.
  • the decoding complexity may include the minimum, maximum, and/or average expected processing resources that may be used to receive, decode, and/or display the media stream.
  • the decoding complexity may include absolute or relative values.
  • the first media component within the SDP may include absolute values.
  • Subsequent media components may include absolute or relative values.
  • the relative values may be relative to the values in the first media component within the SDP or may be relative to the previous media component.
  • SDP may implement a decoding complexity attribute.
  • the decoding complexity attribute "a" may be implemented as follows: a=DecodingComplexity:<decodingAbsolute> <minDecodingComplexity> <maxDecodingComplexity> <avrDecodingComplexity> <decodingUnits>.
  • the decoding complexity attribute may include an indication of whether the decoding complexity is an absolute or relative value, a minimum decoding complexity, a maximum decoding complexity, an average decoding complexity, and/or the units for which the decoding complexity may be measured.
  • the decoding complexity attribute may be sent during an initial offer and/or answer negotiation, such as a negotiation between the WTRU and the content providing device for example.
  • the decoding complexity attribute may be sent during an ongoing video streaming session to modify one or more SDP attributes.
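  • Following the syntax above, the attribute might appear in an SDP offer as sketched here; the media description and all values are invented for illustration, with '1' marking the values as absolute and 'mW' as the units:
        m=video 49170 RTP/AVP 98
        a=rtpmap:98 H264/90000
        a=DecodingComplexity:1 450 700 550 mW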
  • the decoding complexity may be included in real-time transport control protocol (RTCP) messages.
  • the decoding complexity may be included in an RTCP extension in an RTCP message.
  • Content providing devices may send the decoding complexity in sender reports to one or more receiving devices, such as a WTRU or other network device for example.
  • the receiving devices may be reference devices that may send decoding complexity feedback information to the content providing devices to inform the content providing devices of the actual decoding complexity associated with selected multimedia content.
  • the receiving devices may include the actual decoding complexity in RTCP messages sent to the content providing devices.
  • the decoding complexity feedback information may inform the content providing devices of the actual decoding complexity of selected media content at the receiving devices on which the media content has been received, decoded, and/or displayed.
  • the decoding complexity feedback information may be stored at the content providing devices and/or used by the content providing devices to provide the decoding complexity to other receiving devices.
  • the decoding complexity may be sent to other receiving devices during the initial SDP offer/answer negotiation, for example.
  • a codec control message (CCM) may be used to convey control messages that may carry feedback information from receiving devices to content providing devices.
  • the CCM may be used with the real-time transport protocol (RTP) audio-visual profile with feedback (AVPF) to convey the control messages.
  • the control messages may include an intra-frame request and/or a temporary maximum bit rate.
  • FIG. 7 is a diagram that depicts an example message 700 that may be used for sending decoding complexity feedback information.
  • the message 700 may include a source identifier 702 of the message 700.
  • the source identifier 702 may indicate the source from which the message 700 may be transmitted.
  • the message 700 may include feedback control information 706.
  • the feedback control information 706 may include the decoding complexity feedback information.
  • the feedback control information 706 may be indicated using a binary representation.
  • a byte (e.g., the first byte) of the feedback control information 706 may indicate whether the decoding complexity feedback information is indicated as an absolute (e.g., '1') or relative (e.g., '0') measurement.
  • a first message 700 may carry absolute measurements. Subsequent messages may carry relative measurements.
  • the feedback control information 706 may include a minValue, maxValue, and/or avrValue of the decoding complexity.
  • the minValue, maxValue, and/or avrValue may follow (e.g., respectively) the byte indicating whether the decoding complexity feedback information is indicated as an absolute or relative value.
  • the minValue, maxValue, and/or avrValue may be 32-bit values.
  • the feedback control information 706 (e.g., in the last bytes) may be assigned to the string of the units of measurement.
  • the length of the RTCP packet may be used to determine the last character and/or the length of the string of the units.
  • the message 700 may be in an RTCP format.
  • the message 700 may include a header that may include version (V) field 708, a padding bit (P) field 710, a feedback message type (FMT) field 712, a payload type (PT) field 714, and/or a length field 716.
  • the V field 708 may include a number of bits (e.g., 2 bits) that may identify the RTP version.
  • the P field 710 may include a number of bits (e.g., 1 bit) that may indicate whether the packet includes additional padding octets at the end that may be included in the length field 716.
  • the FMT field 712 may indicate the type of feedback message for the message 700.
  • the message 700 may be interpreted relative to the FMT field 712.
  • An FMT type of fifteen may indicate an application-layer feedback message, for example.
  • the PT field 714 may include a number of bits (e.g., 8 bits) that may identify the packet type.
  • the PT field 714 having a value of '206' may indicate that the packet type is an RTCP feedback message, for example.
  • the length field 716 may include a number of bits (e.g., 16 bits) that may indicate the length of the packet (e.g., in 32-bit words minus one), and may include the header and/or any padding bits.
  • the source identifier 702 may be a synchronization source (SSRC) packet sender field.
  • the source identifier field 702 may include a number of bits (e.g., 32 bits) that may indicate the source of the message 700.
  • the message 700 may include a media source field 704.
  • the media source field 704 may be an SSRC media source field.
  • the media source field 704 may be an identifier that may uniquely identify the source of the media. While the message 700 includes a number of fields, additional fields and/or a subset of the described fields may be implemented.
  • the decoding complexity may be indicated in media files.
  • decoding complexity descriptors may be included in metadata within the media files.
  • Decoding complexity descriptors may be included within stored media files.
  • the decoding complexity descriptors may indicate the amount of power that may be used to decode the media content in the media files.
  • the decoding complexity descriptors may be based on actual data measured for receiving, decoding, and/or displaying the media files.
  • the decoding complexity descriptors may indicate a minimum value, a maximum value, and/or an average value of power used to receive, decode, and/or display the media file or portions (e.g., video frames, audio segments, etc.) thereof.
  • the decoding complexity descriptors may indicate a minimum amount of time, a maximum amount of time, and/or an average amount of time to receive, decode, and/or display a media file or portions thereof.
  • the decoding complexity descriptors may indicate a minimum amount of computing resources, a maximum amount of computing resources, and/or an average amount of computing resources to receive, decode, and/or display a media file or portions thereof.
  • the decoding complexity descriptors may include absolute values or relative values. The relative values may indicate an increase or decrease value from a value of a prior relative decoding complexity descriptor.
  • the decoding complexity descriptors may include the units for the decoding complexity values. Storing the media files with the decoding complexity may allow content providing devices to use the media files within the underlying streaming protocols.
  • the decoding complexity descriptors may indicate the decoding complexity of one or more segments of a segmented media file. Each segment may have its own decoding complexity descriptor, or a decoding complexity descriptor may indicate a decoding complexity for a group of segments. If a relative value is used to indicate the decoding complexity of a media file segment, the value may be relative to a decoding complexity value of a prior segment of the media file. For example, the first decoding complexity descriptor of a media file may include an absolute value, while other decoding complexity descriptors may include a relative value.
  • the media files may be in a 3GP file format.
  • the 3GP media file may be configured for DASH streaming.
  • the 3GP file format may be structurally based on the ISO media file format.
  • the ISO media file format may be an object-oriented format.
  • the objects in the ISO file format may be referred to as boxes.
  • the boxes may include the media data.
  • the boxes may include video data.
  • the media data may include movie data.
  • the movie data may be included in movie fragment boxes (moofs).
  • the decoding complexity may be included in sample index data, a fragment header, and/or fragment decoding data.
  • the decoding complexity may be indicated in sample index data using a segment index box (sidx).
  • the sidx may provide a compact index of a track or sub-segment within the media segment to which it may apply.
  • the sub-segment may be a media stream within the media segment, for example.
  • the sidx may refer to other sidxs in a media segment which may refer to other sub-segments. Each sidx may document how a segment may be divided into one or more sub-segments.
  • the decoding complexity data for each sub-segment may be included in a sidx as shown below.
  • the sidx box may include a minimum, maximum, and/or average decoding complexity.
  • the decoding complexity in the sidx box may include absolute or relative values.
  • the relative values may be relative to a prior absolute or relative value.
  • the first decoding complexity in a sidx box may include an absolute value.
  • Subsequent decoding complexity values may include absolute or relative values.
  • the sidx box may include the units of the decoding complexity.
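  • A sketch of such a sidx box, in the class syntax of the ISO media file format, is given below; the standard fields are abbreviated and the decoding complexity fields are hypothetical additions reflecting the bullets above:
        aligned(8) class SegmentIndexBox extends FullBox('sidx', version, 0) {
           unsigned int(32) reference_ID;
           unsigned int(32) timescale;
           ...                               // earliest_presentation_time, first_offset
           unsigned int(16) reference_count;
           for (i = 1; i <= reference_count; i++) {
              ...                            // reference_type, referenced_size, etc.
              // hypothetical per-sub-segment decoding complexity fields:
              unsigned int(8)  decoding_absolute;   // 1 = absolute, 0 = relative
              unsigned int(32) min_decoding_complexity;
              unsigned int(32) max_decoding_complexity;
              unsigned int(32) avr_decoding_complexity;
              string           decoding_complexity_units;
           }
        }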
  • the decoding complexity data for each sub-segment may be included in a track fragment header (tfhd) box.
  • the tfhd box may provide basic information about each movie fragment of an underlying media track.
  • the decoding complexity may be included in the header information for each fragment as shown below.
  • the tfhd box may include a minimum, maximum, and/or average decoding complexity.
  • the decoding complexity in the tfhd box may include absolute or relative values.
  • the relative values may be relative to a prior absolute or relative value.
  • the first decoding complexity in a tfhd box may include an absolute value.
  • Subsequent decoding complexity values may include absolute or relative values.
  • the tfhd box may include the units of the decoding complexity.
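  • A sketch of such a tfhd box under the same assumptions; the decoding complexity fields are hypothetical additions after the standard track_ID and optional fields:
        aligned(8) class TrackFragmentHeaderBox extends FullBox('tfhd', 0, tf_flags) {
           unsigned int(32) track_ID;
           ...                               // standard optional fields elided
           // hypothetical per-fragment decoding complexity fields:
           unsigned int(8)  decoding_absolute;      // 1 = absolute, 0 = relative
           unsigned int(32) min_decoding_complexity;
           unsigned int(32) max_decoding_complexity;
           unsigned int(32) avr_decoding_complexity;
           string           decoding_complexity_units;
        }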
  • the decoding complexity may be included in a track fragment base media decode time (tfdt) box.
  • the tfdt box may provide the decode time of a sample in the track fragment.
  • the sample in which the decoding time may be included may be the first sample of the track fragment.
  • the tfdt box may be used when performing random access in a file.
  • the decoding complexity for each fragment or sub-sample may be included as shown in the class below.
  • the tfdt box may include a minimum, maximum, and/or average decoding complexity.
  • the decoding complexity in the tfdt box may include absolute or relative values.
  • the relative values may be relative to a prior absolute or relative value.
  • the first decoding complexity in a tfdt box may include an absolute value.
  • Subsequent decoding complexity values may include absolute or relative values.
  • the tfdt box may include the units of the decoding complexity.
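  • A sketch of such a tfdt class; the baseMediaDecodeTime fields are standard, and the decoding complexity fields are hypothetical additions:
        aligned(8) class TrackFragmentBaseMediaDecodeTimeBox
              extends FullBox('tfdt', version, 0) {
           if (version == 1) {
              unsigned int(64) baseMediaDecodeTime;
           } else {
              unsigned int(32) baseMediaDecodeTime;
           }
           // hypothetical decoding complexity for the fragment or sub-sample:
           unsigned int(8)  decoding_absolute;      // 1 = absolute, 0 = relative
           unsigned int(32) min_decoding_complexity;
           unsigned int(32) max_decoding_complexity;
           unsigned int(32) avr_decoding_complexity;
           string           decoding_complexity_units;
        }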
  • the decoding complexity may be included in real-time transport protocol (RTP) and/or real-time transport control protocol (RTCP) streaming.
  • Hint tracks may carry transport specific information that may be used by a content providing device to create the RTP and/or RTCP packets.
  • the decoding complexity may be included in a hint media header (hmhd) box.
  • the decoding complexity may be included for a whole video hint track, which may provide a coarse approach for track selection by the WTRU or other receiving device of the media content.
  • An example of an hmhd box including the decoding complexity is shown below.
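  • A sketch of such an hmhd box; the first five fields are the standard hint media header fields, and the whole-track decoding complexity fields are hypothetical additions:
        aligned(8) class HintMediaHeaderBox extends FullBox('hmhd', 0, 0) {
           unsigned int(16) maxPDUsize;
           unsigned int(16) avgPDUsize;
           unsigned int(32) maxbitrate;
           unsigned int(32) avgbitrate;
           unsigned int(32) reserved;
           // hypothetical whole-track decoding complexity fields:
           unsigned int(8)  decoding_absolute;      // 1 = absolute, 0 = relative
           unsigned int(32) min_decoding_complexity;
           unsigned int(32) max_decoding_complexity;
           unsigned int(32) avr_decoding_complexity;
           string           decoding_complexity_units;
        }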
  • the decoding complexity feedback may be included in quality of experience (QoE) reporting.
  • decoding complexity descriptors may be included within an event reporting service that may be sent from the WTRU.
  • the actual WTRU decoding complexity feedback may be used by the content providing device to attach or update the decoding complexity descriptors.
  • the decoding complexity descriptors may be included within a file format, such as a 3GP file format for example, or within one or more streaming protocols.
  • the QoE reporting may include DASH QoE reporting.
  • QoE reporting may be optional.
  • the QoE reporting may use an average throughput metric, an initial playout delay metric, and/or a buffer level metric.
  • TABLE I (sketched after this list of metrics) shows an example for including the decoding complexity feedback as a metric in QoE reports.
  • the decoding complexity feedback may indicate the decoding complexity for receiving, decoding, and/or displaying media content at a WTRU.
  • the decoding complexity may be in terms of absolute or relative value.
  • the absolute or relative value may be a Boolean value that may indicate if the reported values are in absolute or relative metrics (e.g., '1' for absolute and/or '0' for relative).
  • the decoding complexity may include a minimum, maximum, and/or average decoding complexity value measured during a measurement interval.
  • the minimum, maximum, and/or average decoding complexity value may be an integer value, a character value, and/or the like.
  • the decoding complexity feedback may indicate the units for the reported decoding complexity values.
  • the decoding complexity feedback may indicate a start time and/or a duration of the measurement interval.
  • the start time may include a real time value at which the decoding complexity values were measured.
  • the duration of the measurement interval may include a period of time over which the decoding complexity values were measured.
  • the duration may include an integer value that indicates the time (e.g., in milliseconds, seconds, etc.) over which the measurements were performed.
  • the decoding complexity feedback may include the device configurations or device specifications that may identify the WTRU at which the measurements are performed.
  • decoding complexity feedback may include the device type, make, model, release date, OS version, device drivers, player agent, available power, power consumption configurations, and/or the like of the WTRU at which the measurements were performed for receiving, decoding, and/or displaying the media content.
  • the decoding complexity feedback may include a unique identifier that indicates the device from which the decoding complexity feedback is being received.
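  • A sketch of TABLE I, reconstructed from the metrics described above; the key names are assumptions:
        Key          Type       Description
        -----------  ---------  ------------------------------------------------
        absolute     Boolean    '1' if the reported values are absolute, '0' if relative
        minValue     Integer    minimum decoding complexity measured in the interval
        maxValue     Integer    maximum decoding complexity measured in the interval
        avrValue     Integer    average decoding complexity measured in the interval
        units        String     units of the reported values (e.g., milliwatts)
        tStart       Real-Time  start time of the measurement interval
        duration     Integer    duration of the measurement interval (e.g., in milliseconds)
        deviceInfo   String     device type, make, model, OS version, and/or player agent of the reporting WTRU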
  • the RTCP protocol may allow WTRUs to send WTRU reports that may be extended to include decoding complexity reports from the WTRUs in an application-specific (APP) RTCP type.
  • the RTCP extension may be used by WTRUs to report QoE metrics.
  • FIG. 8A is a diagram that depicts an example communications system 800 in which one or more disclosed embodiments may be implemented.
  • the communications system 800 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users.
  • the communications system 800 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth.
  • the communications system 800 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and/or the like.
  • the communications system 800 may include wireless transmit/receive units (WTRUs) 802a, 802b, 802c, and/or 802d (which generally or collectively may be referred to as WTRU 802), a radio access network (RAN) 803/804/805, a core network 806/807/809, a public switched telephone network (PSTN) 808, the Internet 810, and other networks 812, though any number of WTRUs, base stations, networks, and/or network elements may be used.
  • WTRUs 802a, 802b, 802c, and/or 802d may be any type of device configured to operate and/or communicate in a wireless environment.
  • the WTRUs 802a, 802b, 802c, and/or 802d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and/or the like.
  • the communications system 800 may also include a base station 814a and/or a base station 814b.
  • Each of the base stations 814a, 814b may be any type of device configured to wirelessly interface with at least one of the WTRUs 802a, 802b, 802c, and/or 802d to facilitate access to one or more communication networks, such as the core network 806/807/809, the Internet 810, and/or the networks 812.
  • the base stations 814a and/or 814b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and/or the like. While the base stations 814a, 814b are each depicted as a single element, the base stations 814a, 814b may include any number of interconnected base stations and/or network elements.
  • the base station 814a may be part of the RAN 803/804/805, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc.
  • the base station 814a and/or the base station 814b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown).
  • the cell may further be divided into cell sectors.
  • the cell associated with the base station 814a may be divided into three sectors.
  • the base station 814a may include three transceivers, e.g., one for each sector of the cell.
  • the base station 814a may employ multiple-input multiple output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
  • the base stations 814a and/or 814b may communicate with one or more of the WTRUs 802a, 802b, 802c, and/or 802d over an air interface 815/816/817, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.).
  • the air interface 815/816/817 may be established using any suitable radio access technology (RAT).
  • the communications system 800 may be a multiple access system and/or may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and/or the like.
  • the base station 814a in the RAN 803/804/805 and the WTRUs 802a, 802b, and/or 802c may implement a radio technology such as UMTS Terrestrial Radio Access (UTRA), which may establish the air interface 815/816/817 using wideband CDMA (WCDMA).
  • WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+).
  • HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
  • the base station 814a and the WTRUs 802a, 802b, and/or 802c may implement a radio technology such as E-UTRA, which may establish the air interface 815/816/817 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
  • the base station 814a and the WTRUs 802a, 802b, and/or 802c may implement radio technologies such as IEEE 802.16 (e.g., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and/or the like.
  • the base station 814b in FIG. 8A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and/or the like.
  • the base station 814b and the WTRUs 802c, 802d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN).
  • the base station 814b and the WTRUs 802c, 802d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN).
  • the base station 814b and the WTRUs 802c, 802d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell.
  • the base station 814b may have a direct connection to the Internet 810.
  • the base station 814b may not be required to access the Internet 810 via the core network 806/807/809.
  • the RAN 803/804/805 may be in communication with the core network 806/807/809, which may be any type of network configured to provide content, such as voice, data, applications, media, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 802a, 802b, 802c, and/or 802d.
  • the core network 806/807/809 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication.
  • Although not shown in FIG. 8A, the RAN 803/804/805 and/or the core network 806/807/809 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 803/804/805 or a different RAT.
  • the core network 806/807/809 may also be in communication with another RAN (not shown) employing a GSM radio technology.
  • the core network 806/807/809 may serve as a gateway for the WTRUs 802a, 802b, 802c, and/or 802d to access the PSTN 808, the Internet 810, and/or other networks 812.
  • the PSTN 808 may include circuit-switched telephone networks that provide plain old telephone service (POTS).
  • POTS plain old telephone service
  • the Internet 810 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite.
  • the networks 812 may include wired and/or wireless communications networks owned and/or operated by other service providers.
  • the networks 812 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 803/804/805 or a different RAT.
  • the WTRUs 802a, 802b, 802c, and/or 802d in the communications system 800 may include multi-mode capabilities, e.g., the WTRUs 802a, 802b, 802c, and/or 802d may include multiple transceivers for communicating with different wireless networks over different wireless links.
  • the WTRU 802c shown in FIG. 8A may be configured to communicate with the base station 814a, which may employ a cellular-based radio technology, and with the base station 814b, which may employ an IEEE (e.g., IEEE 802.11) radio technology.
  • FIG. 8B depicts a system diagram of an example WTRU 802.
  • the WTRU 802 may include a processor 818, a transceiver 820, a transmit/receive element 822, a speaker/microphone 824, a keypad 826, a display/touchpad 828, non-removable memory 830, removable memory 832, a power source 834, a global positioning system (GPS) chipset 836, and/or other peripherals 838.
  • the WTRU 802 may include any sub-combination of the foregoing elements.
  • the base stations 814a and 814b, and/or the nodes that base stations 814a and 814b may represent, such as, but not limited to, a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved home node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include some or all of the elements depicted in FIG. 8B and described herein.
  • a content providing device may also include some or all of the elements depicted in FIG. 8B and described herein.
  • the content providing device may use the non-removable memory 830 and/or removable memory 832 to store WTRU identifiers, WTRU configuration information, decoding complexity information, and/or other information described herein.
  • the processor 818 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like.
  • the processor 818 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 802 to operate in a wireless environment and/or operate to process video content.
  • the processor 818 may be coupled to the transceiver 820, which may be coupled to the transmit/receive element 822. While FIG. 8B depicts the processor 818 and the transceiver 820 as separate components, the processor 818 and the transceiver 820 may be integrated together in an electronic package or chip.
  • the transmit/receive element 822 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 814a) over the air interface 815/816/817.
  • the transmit/receive element 822 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 822 may be an emitter/detector configured to transmit and/or receive IR, UV, and/or visible light signals, for example.
  • the transmit/receive element 822 may be configured to transmit and receive RF and light signals.
  • the transmit/receive element 822 may be configured to transmit and/or receive any combination of wireless signals.
  • the WTRU 802 may include any number of transmit/receive elements 822. More specifically, the WTRU 802 may employ MIMO technology. Thus, the WTRU 802 may include two or more transmit/receive elements 822 ( e.g ., multiple antennas) for transmitting and/or receiving wireless signals over the air interface 815/816/817.
  • the transceiver 820 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 822 and/or to demodulate the signals that are received by the transmit/receive element 822.
  • the WTRU 802 may have multi-mode capabilities.
  • the transceiver 820 may include multiple transceivers for enabling the WTRU 802 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example.
  • the processor 818 of the WTRU 802 may be coupled to, and may receive user input data from, the speaker/microphone 824, the keypad 826, and/or the display/touchpad 828 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 818 may output user data to the speaker/microphone 824.
  • the processor 818 may access information from, and/or store data in, any type of suitable memory, such as the non-removable memory 830 and/or the removable memory 832.
  • the non-removable memory 830 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 832 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and/or the like.
  • the processor 818 may access information from, and/or store data in, memory that is not physically located on the WTRU 802, such as on a server or a home computer (not shown).
  • the processor 818 may receive power from the power source 834, and may be configured to distribute and/or control the power to the other components in the WTRU 802.
  • the power source 834 may be any suitable device for powering the WTRU 802.
  • the power source 834 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and/or the like.
  • the processor 818 may also be coupled to the GPS chipset 836, which may be configured to provide location information (e.g. , longitude and latitude) regarding the current location of the WTRU 802.
  • the WTRU 802 may receive location information over the air interface 815/816/817 from a base station (e.g., base stations 814a, 814b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations.
  • the WTRU 802 may acquire location information by way of any suitable location-determination method.
  • the processor 818 may further be coupled to other peripherals 838, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 838 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth ® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and/or the like.
  • FIG. 8C depicts a system diagram of the RAN 803 and the core network 806 according to an embodiment.
  • the RAN 803 may employ a UTRA radio technology to communicate with the WTRUs 802a, 802b, and/or 802c over the air interface 815.
  • the RAN 803 may also be in communication with the core network 806.
  • the RAN 803 may include Node-Bs 840a, 840b, and/or 840c, which may each include one or more transceivers for communicating with the WTRUs 802a, 802b, and/or 802c over the air interface 815.
  • the Node-Bs 840a, 840b, and/or 840c may each be associated with a particular cell (not shown) within the RAN 803.
  • the RAN 803 may also include RNCs 842a and/or 842b.
  • the RAN 803 may include any number of Node-Bs and RNCs.
  • the Node-Bs 840a and/or 840b may be in communication with the RNC 842a.
  • the Node-B 840c may be in communication with the RNC 842b.
  • the Node-Bs 840a, 840b, and/or 840c may communicate with the respective RNCs 842a, 842b via an Iub interface.
  • the RNCs 842a, 842b may be in communication with one another via an Iur interface.
  • Each of the RNCs 842a, 842b may be configured to control the respective Node-Bs 840a, 840b, and/or 840c to which it is connected.
  • Each of the RNCs 842a, 842b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and/or the like.
  • the core network 806 shown in FIG. 8C may include a media gateway (MGW) 844, a mobile switching center (MSC) 846, a serving GPRS support node (SGSN) 848, and/or a gateway GPRS support node (GGSN) 850. While each of the foregoing elements is depicted as part of the core network 806, any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • MGW media gateway
  • MSC mobile switching center
  • SGSN serving GPRS support node
  • GGSN gateway GPRS support node
  • the RNC 842a in the RAN 803 may be connected to the MSC 846 in the core network 806 via an IuCS interface.
  • the MSC 846 may be connected to the MGW 844.
  • the MSC 846 and the MGW 844 may provide the WTRUs 802a, 802b, and/or 802c with access to circuit-switched networks, such as the PSTN 808, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and traditional land-line communications devices.
  • the RNC 842a in the RAN 803 may be connected to the SGSN 848 in the core network 806 via an IuPS interface.
  • the SGSN 848 may be connected to the GGSN 850.
  • the SGSN 848 and the GGSN 850 may provide the WTRUs 802a, 802b, and/or 802c with access to packet-switched networks, such as the Internet 810, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and IP-enabled devices.
  • the core network 806 may also be connected to the networks 812, which may include other wired and/or wireless networks that may be owned and/or operated by other service providers.
  • FIG. 8D depicts a system diagram of the RAN 804 and the core network 807 according to an embodiment.
  • the RAN 804 may employ an E-UTRA radio technology to communicate with the WTRUs 802a, 802b, and/or 802c over the air interface 816.
  • the RAN 804 may also be in communication with the core network 807.
  • the RAN 804 may include eNode-Bs 860a, 860b, and/or 860c, though the RAN 804 may include any number of eNode-Bs.
  • the eNode-Bs 860a, 860b, and/or 860c may each include one or more transceivers for communicating with the WTRUs 802a, 802b, and/or 802c over the air interface 816.
  • the eNode-Bs 860a, 860b, and/or 860c may implement MIMO technology.
  • the eNode-B 860a, for example, may use multiple antennas to transmit wireless signals to, and/or receive wireless signals from, the WTRU 802a.
  • Each of the eNode-Bs 860a, 860b, and/or 860c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and/or the like. As shown in FIG. 8D , the eNode-Bs 860a, 860b, and/or 860c may communicate with one another over an X2 interface.
  • the core network 807 shown in FIG. 8D may include a mobility management entity (MME) 862, a serving gateway 864, and a packet data network (PDN) gateway 866. While each of the foregoing elements is depicted as part of the core network 807, any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • the MME 862 may be connected to each of the eNode-Bs 860a, 860b, and/or 860c in the RAN 804 via an S1 interface and may serve as a control node.
  • the MME 862 may be responsible for authenticating users of the WTRUs 802a, 802b, and/or 802c, bearer activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 802a, 802b, and/or 802c, and/or the like.
  • the MME 862 may provide a control plane function for switching between the RAN 804 and other RANs (not shown) that may employ other radio technologies, such as GSM or WCDMA.
  • the serving gateway 864 may be connected to each of the eNode-Bs 860a, 860b, and/or 860c in the RAN 804 via the S1 interface.
  • the serving gateway 864 may generally route and/or forward user data packets to/from the WTRUs 802a, 802b, and/or 802c.
  • the serving gateway 864 may perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 802a, 802b, and/or 802c, managing and storing contexts of the WTRUs 802a, 802b, and/or 802c, and/or the like.
  • the serving gateway 864 may be connected to the PDN gateway 866, which may provide the WTRUs 802a, 802b, and/or 802c with access to packet-switched networks, such as the Internet 810, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and IP-enabled devices.
  • the core network 807 may facilitate communications with other networks.
  • the core network 807 may provide the WTRUs 802a, 802b, and/or 802c with access to circuit-switched networks, such as the PSTN 808, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and traditional land-line communications devices.
  • the core network 807 may include, or may communicate with, an IP gateway (e.g ., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 807 and the PSTN 808.
  • the core network 807 may provide the WTRUs 802a, 802b, and/or 802c with access to the networks 812, which may include other wired and/or wireless networks that may be owned and/or operated by other service providers.
  • FIG. 8E depicts a system diagram of the RAN 805 and the core network 809.
  • the RAN 805 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 802a, 802b, and/or 802c over the air interface 817.
  • the communication links between the different functional entities of the WTRUs 802a, 802b, and/or 802c, the RAN 805, and the core network 809 may be defined as reference points.
  • the RAN 805 may include base stations 880a, 880b, and/or 880c, and an ASN gateway 882, though the RAN 805 may include any number of base stations and/or ASN gateways.
  • the base stations 880a, 880b, and/or 880c may each be associated with a particular cell (not shown) in the RAN 805 and/or may each include one or more transceivers for communicating with the WTRUs 802a, 802b, and/or 802c over the air interface 817.
  • the base stations 880a, 880b, and/or 880c may implement MIMO technology.
  • the base station 880a may use multiple antennas to transmit wireless signals to, and/or receive wireless signals from, the WTRU 802a.
  • the base stations 880a, 880b, and/or 880c may provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and/or the like.
  • the ASN gateway 882 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 809, and/or the like.
  • the air interface 817 between the WTRUs 802a, 802b, and/or 802c and the RAN 805 may be defined as an R1 reference point that implements the IEEE 802.16 specification.
  • Each of the WTRUs 802a, 802b, and/or 802c may establish a logical interface (not shown) with the core network 809.
  • the logical interface between the WTRUs 802a, 802b, and/or 802c and the core network 809 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.
  • the communication link between each of the base stations 880a, 880b, and/or 880c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and/or the transfer of data between base stations.
  • the communication link between the base stations 880a, 880b, and/or 880c and the ASN gateway 882 may be defined as an R6 reference point.
  • the R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 802a, 802b, and/or 802c.
  • the RAN 805 may be connected to the core network 809.
  • the communication link between the RAN 805 and the core network 809 may be defined as an R3 reference point that may include protocols for facilitating data transfer and mobility management capabilities, for example.
  • the core network 809 may include a mobile IP home agent (MIP-HA) 884, an authentication, authorization, accounting (AAA) server 886, and/or a gateway 888. While each of the foregoing elements is depicted as part of the core network 809, any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • the MIP-HA 884 may be responsible for IP address management, and may enable the WTRUs 802a, 802b, and/or 802c to roam between different ASNs and/or different core networks.
  • the MIP-HA 884 may provide the WTRUs 802a, 802b, and/or 802c with access to packet-switched networks, such as the Internet 810, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and IP-enabled devices.
  • the AAA server 886 may be responsible for user authentication and for supporting user services.
  • the gateway 888 may facilitate interworking with other networks.
  • the gateway 888 may provide the WTRUs 802a, 802b, and/or 802c with access to circuit-switched networks, such as the PSTN 808, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and traditional land-line communications devices.
  • the gateway 888 may provide the WTRUs 802a, 802b, and/or 802c with access to the networks 812, which may include other wired and/or wireless networks that may be owned and/or operated by other service providers.
  • the RAN 805 may be connected to other ASNs and/or the core network 809 may be connected to other core networks.
  • the communication link between the RAN 805 the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 802a, 802b, and/or 802c between the RAN 805 and the other ASNs.
  • the communication link between the core network 809 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.
  • Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and/or optical media such as CD-ROM disks and digital versatile disks (DVDs).
  • a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, content providing device, and/or any host computer.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • BACKGROUND
  • Multimedia content may be sent from a content provider to a mobile device using a wireless communications network. The content may be provided by streaming the content such that the content may be received at the mobile device and/or provided to an end user of the mobile device. Mobile multimedia streaming may allow the mobile device to begin playback of the media content before the entire media file has been received at the device.
  • Streaming the multimedia content to a mobile device may be challenging due to the variability of the available bandwidth and the potential demand on the battery during the period of the multimedia experience. Due to the variability of the available bandwidth on the wireless communication network, the power consumed by the radio interface of the mobile device for receiving the multimedia content may be unknown or difficult to determine. Once the multimedia content is received at the mobile device, the decoding and playback of the content may consume additional power that may also be unknown or difficult to determine.
  • Due to the difficulty in determining the power consumed by the mobile device during the multimedia experience, unnecessary resources are being used by the mobile device and the network to transmit and/or display media data.
  • WO2014/011622 A2, filed before the priority date of the present application but published after, discloses a method for power aware video decoding and streaming, which method includes a mobile device receiving a media file from a video server including complexity information from which a first complexity level for requesting a video segment may be determined by the mobile device based on said complexity information and a power metric for the mobile device.
  • US2009/0304085 A1 discloses an encoder and a method in said encoder to adaptively alter video deblocking complexity in a video encoding engine for generating a stream of encoded video data, and reduces the effects of blocking distortion on the encoded video data. The deblocking filter is characterized by a level of deblocking complexity which may depend on the strength and granularity of the deblocking filter applied to the encoded video data. A resource manager is coupled to the deblocking filter and is configured to adaptively alter the deblocking complexity in order to alter the overall computational complexity of the encoder.
  • WO2009/149100 A1 discloses a client side stream switching method, where the client side stream switching enables substantially uninterrupted transmission of a highest compatible bit rate of a stream of media to a client via a network connection. The client may include one or more buffers for receiving the stream of media. Attributes including the buffer activity and a bandwidth of the network connection may be monitored by a streaming module to determine an alternative bit rate of the stream of media. The stream of media may be transitioned from the first bit rate to the alternative bit rate without an interruption of the stream of media to provide the client with the highest compatible bit rate based on the monitored attributes.
  • WO2009/009166 A1 discloses an intelligent power-aware downloading for mobile communication devices. A mobile communication device (100) includes intelligent, power-aware download capability for downloading or streaming content. Music, video and other media content is made available by the content provider with multiple content qualities. The mobile communication device (100) monitors its own power availability status, determines a desired content quality, and downloads the media content with the desired content quality.
  • SUMMARY
  • Systems, methods, and apparatuses are described herein for predicting power consumption for receiving, decoding, and/or displaying multimedia content at a wireless transmit/receive unit (WTRU). A WTRU may request multimedia content from a content providing device. The content providing device may receive the request for the multimedia content and may determine a decoding complexity associated with the multimedia content. The decoding complexity may be based on data that indicates an amount of power used to receive, decode, and/or display the multimedia content at reference devices, such as WTRUs. For example, the decoding complexity may indicate a minimum decoding complexity value, a maximum decoding complexity value, and/or an average decoding complexity value for receiving, decoding, and/or displaying multimedia content at reference devices.
  • The multimedia content may include video content (e.g., video streams, video files, etc.), images, audio content, and/or other forms of multimedia content. The multimedia content may be comprised of different types of content. For example, where the media content includes video content, the media content may include a standard version and a high definition version of the media content. The multimedia content may be segmented. Each of the different types and/or segments of media content may have a corresponding decoding complexity.
  • The content providing device may send an indication of the decoding complexity to the WTRU or other network device. The indication of the decoding complexity may be included in a media file or protocol. The WTRU, or another network device, may use the decoding complexity to determine whether to receive the requested media content. For example, the WTRU may compare the decoding complexity with the amount of available power resources available at the WTRU. Where multiple types or segments of content are included in the requested multimedia content, the WTRU, or other network device, may use the decoding complexity to select one or more types or segments of content.
  • The content providing device may also, or alternatively, determine whether to provide the requested media content, or one or more types or segments thereof, based on the associated decoding complexity. The content providing device may know various WTRU device characteristics and may use them to determine whether to provide the requested media content, or one or more types or segments thereof. The WTRU device characteristics may include the WTRU type, available power of the WTRU, and/or the power consumption configurations of the WTRU.
  • The invention is defined by a method according to independent claim 1 and a device according to independent claim 2. Further embodiments are set out in the dependent claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more detailed understanding of the embodiments disclosed herein may be had from the following description, given by way of example in conjunction with the accompanying drawings.
    • FIG. 1 is a diagram that depicts example communication states that may be used by a wireless transmit/receive unit (WTRU) in a wireless communication system.
    • FIG. 2 is a graph that depicts an example for streaming video content from a content provider.
    • FIG. 3 is another graph that depicts an example for streaming video content from a content provider.
    • FIG. 4 is a flow diagram that illustrates an example process for transmitting media content based on a decoding complexity.
    • FIG. 5 is a flow diagram that illustrates an example process for selecting multimedia content based on a decoding complexity.
    • FIG. 6 is a flow diagram that illustrates an example process for measuring decoding complexity and/or sending decoding complexity feedback information that indicates the decoding complexity.
    • FIG. 7 is a diagram that depicts an example message that may be used for sending decoding complexity feedback information.
    • FIG. 8A depicts a system diagram of an example communications system in which one or more disclosed embodiments may be implemented.
    • FIG. 8B depicts a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in FIG. 8A.
    • FIG. 8C depicts a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in FIG. 8A.
    • FIG. 8D depicts a system diagram of another example radio access network and another example core network that may be used within the communications system illustrated in FIG. 8A.
    • FIG. 8E depicts a system diagram of another example radio access network and another example core network that may be used within the communications system illustrated in FIG. 8A.
    DETAILED DESCRIPTION
  • A content provider may provide multimedia content to wireless transmit/receive units (WTRUs) over a wireless communication network. The amount of power consumed by a WTRU during a multimedia experience may be predicted based on the type of multimedia content and/or the WTRU characteristics. Measurements may be taken by WTRUs that have received, decoded, and/or displayed the multimedia content. These measurements may be used to predict the amount of power that may be consumed when receiving, decoding, and/or displaying the multimedia content.
  • A power consumption descriptor may be used to indicate the amount of power consumed by the WTRU during the multimedia experience. For example, the power consumption descriptor may indicate the amount of power consumed by the WTRU when receiving, decoding, and/or displaying the multimedia content. The power consumption descriptor may be referred to as a decoding complexity. The power consumption descriptor may be included in protocols, file formats, and/or the like. The power consumption descriptor may be based on a media stream or streams and/or a WTRU type or types.
  • Power consumption at a WTRU may be affected by streaming multimedia content in a communications system. The multimedia content may include video streams, images, audio, and/or other forms of multimedia data transmitted using the communication system. A video stream may be coded into various sub-streams by varying the video resolution, frame rate, and/or quantization parameter. The sub-streams may include H.264 streams, for example. Audio streams may also be coded into various sub-streams by varying the number of channels (e.g., 5.1, 2.0 (stereo), 1.0 (mono), etc.), different codecs, and/or codec extensions (e.g., MP3, advanced audio coding (AAC), high efficiency (HE)-AAC, etc.).
  • Power may be consumed at the WTRU for multimedia data processing and/or display. The power consumed by the wireless interface may account for a portion (e.g., about 15% to 25%) of the overall power consumption. The wireless interface may be the wireless module capable of receiving a signal and/or decoding the signal into digital data that may be passed to a multimedia decoding module. The wireless interface may communicate via Wi-Fi, 3G, 4G, and/or the like. The display of the WTRU may account for a portion of the aggregate power consumed for video playback. The display may account for the largest portion (e.g., between about 38% and 68%) of power consumed by components within the WTRU. The processor may account for a portion of the power consumed for video playback. The processor may account for the largest portion of power consumption for video playback after the display.
  • A call module may account for additional power that may be consumed for phone calls. The call module may be a GSM module or other module that may be used for processing call data. The power consumed by the call module may be static or dynamic.
  • Multimedia content may be pre-fetched or may be streamed for display at the WTRU. The multimedia content may be pre-fetched at a WTRU when the content is fully received at the WTRU before the WTRU may begin displaying a portion of the content. The multimedia content may be streamed by being sent to the WTRU in segments that may be displayed at the WTRU before the multimedia content is fully received. By streaming the multimedia content, a portion of the content may be viewed prior to receiving the entirety of the multimedia content. Multimedia content may be transmitted in segments using adaptive hypertext transfer protocol (HTTP) streaming. The periods between segments may be long enough for the air interface module to go to a sleep mode or idle mode in between transmissions.
  • FIG. 1 is a diagram that depicts example communication states that may be used by a WTRU in a wireless communication system. The communication states may include connected states for transmitting and/or receiving transmissions at a WTRU and/or idle states that may be used between transmissions. The power consumed by the WTRU may depend on transition between the communication states. For example, the WTRU may save power by transitioning from a connected state to a power saving connected state and/or an idle state, as shown in FIG. 1.
  • The example communication states may include radio resource control (RRC) states, such as the RRC connected state 102 and/or the RRC idle state 104. When Evolved Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (E-UTRA) is implemented as a radio technology, such as in a long term evolution (LTE) communication system for example, the RRC connected state 102 may be an E-UTRA RRC connected state and/or the RRC idle state 104 may be an E-UTRA RRC idle state. The RRC connected state 102 may be used to transfer data (e.g., unicast data) to and/or from a WTRU. The RRC connected state 102 may be used, for network controlled mobility, to monitor control channels associated with shared data channels, and/or the like. The RRC idle state 104 may be used to monitor a paging channel to detect incoming calls, acquire system information, and/or perform logging of available measurements. The RRC connected state 102 may use a higher power level than the RRC idle state 104.
  • The WTRU may transition between the RRC connected state 102 and the RRC idle state 104 at 106. For example, the WTRU may transition from the RRC connected state 102 to the RRC idle state 104 after a period of time has elapsed without receiving transmissions in the RRC connected state 102. The WTRU may transition to the RRC idle state 104 to preserve battery power, as the RRC idle state 104 may use less power resources than the RRC connected state 102. The WTRU may transition to the RRC connected state 102, for example, when data transmissions are received for processing in the RRC connected state 102. When the WTRU transitions between the RRC connected state 102 and the RRC idle state 104 at 106, the WTRU may establish a connection in the RRC connected state 102 and/or release the resources used for the RRC connected state 102 when the WTRU transitions to the idle state 104.
  • The RRC connected state 102 may be subdivided into a continuous reception state, a short discontinuous reception (DRX) state, a long DRX state, and/or the like. The short DRX state and/or the long DRX state may be power saving connected states that may use less power than the continuous reception state. A WTRU may transition to the continuous reception state, for example, when the WTRU is promoted from the RRC idle state 104. The continuous reception state may be used by the WTRU to transmit and/or receive data in the communication system. The WTRU may transition from the continuous reception state to a DRX state when it is waiting for data in the RRC connected state 102. For example, the communication device may transition from the continuous reception state to the short DRX state. The communication device may transition from the short DRX state to the long DRX state after a period of time has elapsed without receiving data in the RRC connected state 102. The long DRX state may be an extended DRX state that may allow the WTRU to be in a DRX mode for a longer period of time prior to transitioning into idle mode.
  • The transition between states may be handled using inactivity timers. The WTRU may transition from the continuous reception state to the short DRX state upon expiration of an inactivity timer T1. The inactivity timer T1 may begin when the WTRU is no longer transmitting and/or receiving data. The WTRU may transition from the short DRX state to the long DRX state upon expiration of an inactivity timer T2. The WTRU may transition from the long DRX state to the RRC idle state 104 upon expiration of an inactivity timer T3.
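  • As an illustration of the timer-driven transitions above, the following minimal Python sketch steps a WTRU through the continuous reception, short DRX, long DRX, and idle states as inactivity accumulates. The timer durations used for T1, T2, and T3 are hypothetical placeholders, not values from any specification.
      # States in the order a WTRU may fall through them while inactive.
      STATES = ["CONTINUOUS_RX", "SHORT_DRX", "LONG_DRX", "RRC_IDLE"]
      # Hypothetical T1, T2, T3 inactivity timer durations, in seconds.
      TIMERS = {"CONTINUOUS_RX": 0.1, "SHORT_DRX": 0.5, "LONG_DRX": 10.0}

      def next_state(state, idle_seconds):
          """Return the state a WTRU would occupy after idling for idle_seconds."""
          while state != "RRC_IDLE" and idle_seconds >= TIMERS[state]:
              idle_seconds -= TIMERS[state]
              state = STATES[STATES.index(state) + 1]
          return state

      print(next_state("CONTINUOUS_RX", 0.3))   # SHORT_DRX (T1 expired, T2 has not)
      print(next_state("CONTINUOUS_RX", 60.0))  # RRC_IDLE (T1, T2, and T3 all expired)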
  • A WTRU may transfer states for connectivity to other radio technologies. The WTRU may transfer states for connectivity to a UMTS, global system for mobile communications (GSM), or other radio technologies, for example. At 108, the WTRU may perform a handover between the RRC connected state 102 and a UMTS connected state. At 126, the WTRU may perform a handover between the RRC connected state 102 and a GSM/general packet radio service (GPRS) packet connected state.
  • The UMTS states 110 may include connected states and/or idle states. The connected states may include a CELL dedicated channel (DCH) state 112, a CELL forward access channel (FACH) state 114, and/or a CELL paging channel (PCH)/URA PCH state 116. In the CELL_DCH state 112, the WTRU may be allocated dedicated transport channels in the downlink and/or the uplink direction. In the CELL_FACH state 114, the WTRU may transmit user data through shared channels with other WTRUs. The shared channels may be low-speed channels, which may be less than 15 kbps for example. In the CELL_PCH/URA_PCH state 116, the WTRU may remain in connected mode for paging and/or may transfer to FACH. The CELL_FACH state 114 and/or the CELL_PCH/URA_PCH state 116 may be power saving connected states in UMTS, as they may use less power resources than the CELL_DCH state 112. The UMTS idle states may include a UTRA idle state 120. The WTRU may transition between the UMTS connected states 112, 114, 116 and the UTRA idle state 120 at 118. At 118, the WTRU may establish a connection with one state and/or release the resources for the other state.
  • The WTRU may perform reselection from a UMTS state 110 to an RRC state. For example, the WTRU may perform reselection from a UMTS connected state 112, 114, 116 to the RRC idle state 104 at 122. At 124, the WTRU may perform reselection between the UTRA idle state 120 and the RRC idle state 104.
  • The GSM/GPRS packet states 128 may include connected states and/or idle states. The connected states may include a GSM connected state 130 and/or a GPRS packet transfer mode state 132. In the GSM connected state 130 and/or the GPRS packet transfer mode 132, the WTRU may have uplink and/or downlink radio resources allocated to it and may transmit and/or receive data. The idle states may include the GSM idle state and/or the GPRS packet idle state 136. At 134, the WTRU may transition between the GSM connected state 130 and the GSM idle state 136 or the GPRS packet transfer mode state 132 and the GPRS packet idle state 136. The WTRU may establish a connection with one state and/or release the resources for the other state.
  • At 140, the WTRU may perform reselection from the RRC connected state 102 to the GSM idle state/GPRS packet idle state 136. The WTRU may perform cell change over (CCO) reselection from a GSM/GPRS packet connected state 130, 132 to the RRC idle state 104 at 138. At 142, the WTRU may perform CCO reselection from the RRC idle state 104 to the GSM idle/GPRS packet idle state 136. The WTRU may perform CCO reselection from the GSM idle/GPRS packet idle state 136 to the RRC idle state 104 at 144.
  • While various connected states and idle states are provided as examples in FIG. 1, any connected state and/or idle state may be implemented. The connected states may be implemented to enable the WTRU to transmit and/or receive data. The power saving connected states and/or the idle states may be implemented to reduce power consumption at the WTRU.
  • FIG. 2 is a graph that depicts an example for streaming video content 202 from a content provider (e.g., Hulu®, Youtube®, Netflix®, etc.). The content provider may provide the video content 202 from one or more content providing devices that may be capable of storing and/or transmitting video content. The video content 202 may be streamed as ad-sponsored video streaming. The video content 202 may be provided at a bitrate between about 4.0 Mbps and about 1.4 Mbps. The average bitrate may be about 850 Kbps.
  • The video content 202 may be transmitted and/or received in intervals. The intervals of video transmission 204, 206 may be followed by intervals of inactivity 208, 210. The intervals of transmission 204, 206 may be about seventy-five seconds. The intervals of inactivity 208, 210 may be about seventy-five seconds. During these intervals of inactivity 208, 210, the WTRU may put the transmit/receive module into a power saving state. For example, the WTRU may transition to a short DRX mode, a long DRX mode, or a CELL_FACH state during intervals of inactivity 208, 210. If the intervals of inactivity are longer than the inactivity timer for transitioning into the idle state, the WTRU may transition from a connected state in intervals of transmission 204, 206 to idle mode in intervals of inactivity 208, 210.
  • A WTRU may have a static nature of treating intervals of multimedia traffic. For example, a WTRU may assume video segments are transmitted between the same inactivity timers. FIG. 3 is a graph that depicts an example for streaming video content 302 from a content provider (e.g., Hulu®, Youtube®, Netflix®, etc.). The video content 302 may be provided at a bitrate between about 4.5 Mbps and about 0.3 Mbps. The average bitrate may be about 602 Kbps. The video content 302 may be split into N chunks to allow or enable a WTRU to transition from a connected state into a power saving state in a connected mode or an idle mode. For example, the WTRU may transition from a DCH state to a FACH state to reduce or save energy. The reduction in energy may be by about eighty percent, for example. The WTRU may send a message (e.g., an RRC message) to transition into an idle state and/or release radio resources after a chunk is received. The transition into the idle state and/or the release of radio resources may be performed prior to waiting for expiration of an inactivity timer for transitioning into the idle state. This early transition into idle state and/or release of resources may be referred to as fast dormancy.
  • A WTRU may use dynamic cache management to cache chunks of the video content. Power may be saved by using dynamic cache management to enable the WTRU to download content in chunks as fast as possible and/or releasing network resources, which may be referred to as fast dormancy. The WTRU may turn off the transmit/receive module to save power. The WTRU may close the connection (e.g., transmission control protocol (TCP) connection) when the cache may be full.
  • Power consumption models may be used for decoding media content in WTRUs. The power consumed for hardware accelerated decoding of media content at WTRUs may be determined as described herein. The media content may include video content, such as H.264 video content for example, images, audio content, and/or other types of media content. The decoding of video content may be modeled as a product of three exponential functions of the main H.264 encoding parameters, as shown in Equation (1):
    P(s, t, q) = P_max · (s/s_max)^{c_s} · (t/t_max)^{c_t} · (q/q_min)^{-c_q}     (1)
    where s may be the resolution, t may be the frame rate, q may be the quantization parameter, and Pmax = P(smax, tmax, qmin). Variables cs, ct, and cq may be modeling parameters (constants) that may be used to obtain an accurate model. For example, the modeling parameters cs=0.4, ct=0.25, and/or cq=0.0 may be used for Equation (1).
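  • A worked example of Equation (1) is sketched below in Python, using the example modeling parameters above (cs=0.4, ct=0.25, cq=0.0). The reference point (smax, tmax, qmin), the Pmax value, and the requested stream parameters are hypothetical assumptions chosen for illustration, not measured data.
      def decoding_power(s, t, q, s_max=1920 * 1080, t_max=60.0, q_min=20.0,
                         p_max=1000.0, c_s=0.4, c_t=0.25, c_q=0.0):
          """Predicted decoding power for resolution s (pixels), frame rate t
          (frames per second), and quantization parameter q, per Equation (1)."""
          return p_max * (s / s_max) ** c_s * (t / t_max) ** c_t * (q / q_min) ** -c_q

      # A hypothetical 720p30 stream relative to a 1080p60, QP 20 reference:
      print(decoding_power(1280 * 720, 30.0, 26.0))  # ~608 (same units as p_max)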
  • The power consumption of the display may be modeled as shown in Equation (2) below:
    P_display(i) = Σ_(x,y) [ α·R(x,y,i) + β·G(x,y,i) + γ·B(x,y,i) ]     (2)
    where variable i may be an instant time, x and y may be coordinates of the pixel (e.g., row and column coordinates). R, G, and B may be the red, green, and blue components, respectively, of a pixel displayed on a screen. For example, R(x,y,i) may be the value of a pixel (x,y) at an instant i. Variables α, β, and γ may be modeling parameters that may be determined to obtain an accurate model. For example, a display on a WTRU may use α=1.02, β=1.91, and/or γ=2.43 in Equation (2) to determine the power consumption of the display. For some WTRUs, display power consumption may be constant (e.g., ≈ 250 mW).
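  • The display model of Equation (2) may be illustrated with a short Python sketch. The modeling parameters are the example values given above (α=1.02, β=1.91, γ=2.43); the 2x2 frame is a hypothetical stand-in for a full video frame.
      ALPHA, BETA, GAMMA = 1.02, 1.91, 2.43  # example weights from the text

      def display_power(frame):
          """Sum the weighted R, G, B components over every pixel of a frame,
          per Equation (2); frame is a list of rows of (R, G, B) tuples."""
          return sum(ALPHA * r + BETA * g + GAMMA * b
                     for row in frame for (r, g, b) in row)

      frame = [[(255, 0, 0), (0, 255, 0)],
               [(0, 0, 255), (255, 255, 255)]]
      print(display_power(frame))  # relative display power for this frame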
  • The power model parameters obtained using Equation (1) may be applied to power-rate optimized adaptive video streaming, where a maximum quality (Q) may be searched subject to rate R and power constraints P, as shown in Equation (3) below:
    (s*, t*, q*) = argmax_{s,t,q} Q(s, t, q)   subject to   R(s, t, q) ≤ R,  P(s, t, q) ≤ P     (3)
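  • A minimal sketch of the constrained search in Equation (3) is given below: among candidate streams, the highest-quality one whose bitrate and predicted decoding power both satisfy the constraints is selected. The candidate table and its numbers are hypothetical.
      # (name, quality score Q, bitrate R in kbps, predicted power P in mW)
      candidates = [("1080p", 95, 4000, 1000),
                    ("720p",  85, 1800,  610),
                    ("480p",  70,  850,  400)]

      def select_stream(rate_budget_kbps, power_budget_mw):
          """Maximize quality subject to rate and power constraints."""
          feasible = [c for c in candidates
                      if c[2] <= rate_budget_kbps and c[3] <= power_budget_mw]
          return max(feasible, key=lambda c: c[1], default=None)

      print(select_stream(2000, 700))  # ('720p', 85, 1800, 610)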
  • Mobile streaming of multimedia content may be performed based on the power consumption level associated with the multimedia data. Mobile multimedia streaming may be performed to preserve power consumption at the WTRU. Streaming protocols and/or file formats may include a decoding complexity that may be used to predict the amount of power consumption that may be caused by or due to receiving, decoding, and/or displaying of the underlying media streams. For example, the decoding complexity may include power consumption descriptors that may indicate an amount of power that may be consumed when receiving, decoding, and/or displaying multimedia content.
  • FIG. 4 is a flow diagram that illustrates an example process 400 for transmitting media content based on a decoding complexity. The multimedia content may include video streams, images, audio, and/or other forms of multimedia data transmitted using the communication system. One or more portions of the process 400 may be performed by content providing devices, such as a content providing server or servers for example. As shown at 402, the content providing device may receive data that may indicate an amount of power for receiving, decoding, and/or displaying media content at WTRUs. The data may be actual data that may be measured by WTRUs that have received, decoded, and/or displayed the media content. The data measured by the WTRUs may be stored at the content providing device and/or a remote location that may be accessible to the content providing device.
  • The content providing device may receive a request for the media content at 404. The request may be received from a WTRU or other network entity. At 406, the content providing device may determine a decoding complexity associated with the requested media content. The decoding complexity may indicate the amount of power used to receive, decode, and/or display the media content at one or more WTRUs. The decoding complexity may be an absolute value or a relative value. For example, an absolute value of a decoding complexity may be followed by relative values that may indicate an increase or decrease in the decoding complexity. The decoding complexity may be based on the data received at 402. For example, the decoding complexity may indicate a minimum amount of power, a maximum amount of power, and/or an average amount of power used by WTRUs that have received, decoded, and/or displayed the media content. The WTRUs that have received, decoded, and/or displayed the media content may include the WTRU requesting the media content.
  • The decoding complexity may include separate representations for receiving the media content, decoding the media content, and/or displaying the media content. The separate decoding complexity representations for receiving, decoding, and/or displaying the media content may allow a WTRU to tailor the complexity information based on its current state. For example, display power may be scaled to account for user brightness settings at the WTRU.
  • Different types of the requested media content may be available for transmission to the WTRU. For example, where the media content includes video content, standard definition and high definition video content may be available. The decoding complexity may indicate a minimum amount of power, a maximum amount of power, and/or an average amount of power used by the WTRUs for decoding each type of media content. The decoding complexity may be different for different types of media content. For example, standard definition video content may have a lower decoding complexity and may use less power than high definition video content.
  • Media content may be encoded using different video encoding profiles. For example, in H.264, a media content type may use a baseline profile, a main profile, a high profile, etc. Each profile may use a different set of encoding/decoding tools. Lower profiles (e.g., baseline profile) may be used to limit complexity and/or power consumption at the WTRU. Media content may be encoded using the same resolution (e.g., full HD - 1080p) and may use different video encoding profiles.
  • Media content may be encoded with different coding tools. Media content may be encoded with/without coding tools that may increase decoding complexity. For example, context-adaptive binary arithmetic coding (CABAC) may be more complex than variable-length coding (VLC). Media content may be encoded using the same profile, but with/without coding tools. For a given profile, some coding tools may be optional. If the coding tools are not used, the WTRU may see a lower decoding complexity. Media content may be encoded using the same profile, but using different coding tools.
  • The content providing device may select the media content for transmission at 408. The media content may be selected at 408 based on the decoding complexity. In order to select the media content, the content providing device may send an indication of the decoding complexity associated with the requested media content to the WTRU. The content providing device may receive an indication from the WTRU of whether the WTRU would like to receive the media content based on the associated decoding complexity. The content providing device may transmit the selected media content at 410.
  • When there are different types of media content available at the content providing device, the content providing device may send an indication of one or more of the available types of content and/or the decoding complexity associated with each type. The content providing device may receive an indication from the WTRU of a preferred type of media content based on the decoding complexity associated therewith, and may select this content at 408 for transmission. The content providing device may pre-filter the types of media content that may be sent to the WTRU based on the decoding complexity. The content providing device may know the WTRU type, available power of the WTRU, and/or the power consumption configurations of the WTRU. The WTRU type, available power of the WTRU, and/or the power consumption configurations of the WTRU may be received from the WTRU or may be stored at the content providing device or other remote location and may be looked up based on an identifier associated with the WTRU. The WTRU identifier, the WTRU type, the WTRU power configurations, and/or the available power of the WTRU may be included in the request for media content received at 404. The content providing device may use the device type, available power, and/or the power consumption configurations to determine whether the WTRU can process certain types of media content.
  • The content providing device may provide the decoding complexity of media content that may be relevant to a device based on the device configurations and/or device specifications of the WTRUs that have provided the decoding complexity feedback. For example, when a WTRU requests media content from a content providing device, the request may include an identifier of the WTRU. The identifier may be a unique identifier or one or more device configurations or specifications associated with the device. The content providing device may determine the relevant decoding complexity information based on the WTRU identification information provided by the WTRU and may provide decoding complexity information that may be tailored to the WTRU.
  • The content providing device may provide the decoding complexity information along with device identification information that may identify devices to which the decoding complexity information may be relevant. For example, when a WTRU requests media content from a content providing device, the content providing device may provide the decoding complexity for the media content, along with one or more identifiers that may indicate the devices to which the decoding complexity may be relevant. The content providing device may provide different versions of the decoding complexity such that the WTRU may determine which version of the decoding complexity is relevant, or most relevant, to the WTRU.
  • The content providing device may autonomously select the media content at 408 based on the decoding complexity. The content providing device may use the device identifier and/or the WTRU configurations and/or specifications (e.g., device type, make, model, etc.) to select the media content to be transmitted at 408. The device configurations and/or specifications may be available in the HTTP header of a request received from the WTRU, for example. The content providing device may determine whether the WTRU has enough power available for receiving, decoding and/or displaying the requested media content. The content providing device may determine whether the decoding complexity of the media content is within an acceptable range. When there are multiple types of the requested media content, the content providing device may select the type of content that is appropriate for transmission to the WTRU based on the WTRU type, the WTRU power configurations, and/or the available power of the WTRU.
  • Another network device, such as a Node B for example, may select the media content at 408 based on the decoding complexity. The network device may use the device identifier and/or the WTRU configurations and/or specifications (e.g., device type, make, model, etc.) to select the media content to be transmitted at 408. The device configurations and/or specifications may be available in the HTTP header of a request received from the WTRU, for example. The network device may determine whether the WTRU has enough power available for receiving, decoding and/or displaying the requested media content. The network device may determine whether the decoding complexity of the media content is within an acceptable range. When there are multiple types of the requested media content, the network device may select the type of content that is appropriate for transmission to the WTRU based on the WTRU type, the WTRU power configurations, network configurations (e.g., bandwidth), and/or the available power of the WTRU.
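  • The selection logic at the content providing device or network device may be sketched as follows. This is a minimal Python illustration; the feedback table, device identifiers, and power figures are hypothetical, and a real deployment would draw them from the aggregated decoding complexity feedback described herein.
      # (device type, content type) -> average measured power (mW); hypothetical
      feedback_db = {("model-x", "hd"): 950,
                     ("model-x", "sd"): 480}

      def filter_content(device_type, available_power_mw, content_types):
          """Keep only content types whose reported decoding complexity fits
          within the WTRU's available power budget."""
          return [ct for ct in content_types
                  if feedback_db.get((device_type, ct), float("inf"))
                  <= available_power_mw]

      print(filter_content("model-x", 600, ["hd", "sd"]))  # ['sd']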
  • FIG. 5 is a flow diagram that illustrates an example process 500 for selecting multimedia content based on a decoding complexity. One or more portions of the process 500 may be performed by a WTRU or other network device. As shown at 502, the WTRU may request media content from a content providing device. At 504, the WTRU may receive a decoding complexity associated with the requested media content. The decoding complexity may indicate the amount of power that may be used to receive, decode, and/or display the media content at the WTRU. The decoding complexity may indicate power metrics for the power used by other WTRUs that have received, decoded, and/or displayed the media content. For example, the decoding complexity may indicate a minimum amount of power, a maximum amount of power, and/or an average amount of power used by other WTRUs that have received, decoded, and/or displayed the media content.
  • The WTRU may determine whether to receive the media content at 506 based on the decoding complexity. For example, the WTRU may determine its current battery power and/or whether it has enough battery power available for decoding the requested content. The WTRU may determine whether the decoding complexity is within an acceptable range based on the current battery power. When there are different types of media content available at the content providing device, the WTRU may receive an indication of the decoding complexity for the different types of media content. For example, the WTRU may receive a minimum amount of power, a maximum amount of power, and/or an average amount of power used by the WTRUs for receiving, decoding, and/or displaying each type of media content. The WTRU may select a preferred type of media content based on the decoding complexity associated with each type of media content.
  • The WTRU may send an indication at 508 to indicate whether it wants to receive the requested media content. The WTRU may also indicate the type of media content that it would like to receive. The indication may be sent at 508 in response to the decoding complexity received from the content providing device. If the WTRU indicates at 508 that it wants to receive the media content and/or a selected type of media content, the WTRU may receive the media content in response to the indication. The media content may be streamed via a communication network and/or stored at the WTRU for playback. If the WTRU indicates at 508 that it does not want to receive the media content, the media content may not be transmitted to the WTRU, or the WTRU may ignore the transmission.
  • FIG. 6 is a flow diagram that illustrates an example process 600 for measuring decoding complexity and/or sending decoding complexity feedback information that indicates the decoding complexity. One or more portions of the process 600 may be performed by a WTRU. As shown at 602, the WTRU may request media content from a content providing device. At 604, the WTRU may receive the requested media content. The media content may be streamed via a communication network and/or stored at the WTRU for playback. The WTRU may decode and/or display the media content at 606. The WTRU may measure the decoding complexity at 608. For example, the WTRU may measure the amount of power used while receiving, decoding, and/or displaying the media content. The WTRU may measure the maximum value, minimum value, and/or average value of power used by the WTRU. The average value may be calculated by the minimum value plus the maximum value divided by two (e.g., (minValue + maxValue) / 2) or by calculating the average power value used over a period of time (e.g., the period for receiving, decoding, and/or displaying the media content or a segment thereof). If the media content is segmented at the WTRU, the WTRU may perform the measurements at 608 for each segment of the media content.
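  • The per-segment measurement step at 608 may be sketched as below. The power readings are hypothetical samples; on a real WTRU they would come from a platform-specific battery or power-management counter.
      def measure_segment(samples_mw):
          """Derive min/max/average decoding complexity values from power
          readings (mW) taken while one segment is received, decoded, and
          displayed."""
          min_v, max_v = min(samples_mw), max(samples_mw)
          avg_over_time = sum(samples_mw) / len(samples_mw)  # average over the period
          avg_simple = (min_v + max_v) / 2                   # (minValue + maxValue) / 2
          return {"minValue": min_v, "maxValue": max_v,
                  "avrValue": avg_over_time, "avrValueSimple": avg_simple,
                  "units": "mW"}

      print(measure_segment([420, 515, 610, 480]))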
  • The WTRU may send decoding complexity feedback information to a remote source at 610. For example, the WTRU may send the decoding complexity feedback information to the content providing device and/or another network device for storage. The decoding complexity feedback information may be aggregated from multiple devices, such as in the form of a database for example. The decoding complexity feedback information may inform the content providing device of the actual decoding complexity for receiving, decoding, and/or displaying the multimedia content. A separate decoding complexity representation may be used by WTRUs to report decoding complexity for receiving media content, decoding media content, and/or displaying media content. The decoding complexity feedback information may be stored and/or used to send the decoding complexity to other receiving devices.
  • The decoding complexity feedback information may also include a time at which the decoding complexity was measured, a duration for which the decoding complexity was measured, and/or device configurations and/or specifications for the device measuring the decoding complexity. The device configurations and/or specifications may include a device type, make, model, release date, OS version, device drivers, player agent, available power, power consumption configurations, and/or the like. The decoding complexity feedback information may be used to provide the decoding complexity to WTRUs.
  • The decoding complexity may be included in a file format or protocol. For example, the decoding complexity may be included in a media presentation descriptor (MPD) of a dynamic adaptive streaming over HTTP (DASH), a session description protocol (SDP) of an IP multimedia subsystem (IMS), a real time streaming protocol (RTSP), a file format such as a 3GP file format, and/or the like.
  • A streaming protocol may include a value that indicates the decoding complexity. For example, the protocol structure may include an abstract function called DecodingComplexity and/or may include values that indicate the decoding complexity associated with media content being streamed via the protocol. Provided below is an example of an abstract function called DecodingComplexity that may be used in a streaming protocol:
 struct DecodingComplexity {
      bool Absolute;
      int minValue, maxValue, avrValue;
      char Units[8];
 };
  • As shown above, the DecodingComplexity function may include one or more decoding complexity values, a value indicating whether the decoding complexity value is an absolute value or a relative value, and/or a units value that may indicate the units in which the decoding complexity values are measured. The value indicating whether the decoding complexity value is an absolute value or a relative value may be a boolean value that may be used to identify if one or more of the reported values minValue, maxValue, and/or avrValue may be an absolute or relative value. If the decoding complexity values are relative values, the values may be expressed as a relative increase or decrease from previous values. The decoding complexity may include the values minValue, maxValue, and/or avrValue. The decoding complexity may represent a variability of the amount of power and/or time it may take a WTRU to receive, decode, and/or display media content, such as a video stream, a video frame, a video file, an audio file, images, and/or other types of media content.
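  • A receiver resolving the Absolute flag might proceed as in the sketch below, where relative reports are applied as deltas to the most recent absolute values. The dict-based message format is a hypothetical mirror of the DecodingComplexity structure above.
      def apply_report(state, report):
          """Return updated absolute values; state is None before the first
          (absolute) report is received."""
          values = {k: report[k] for k in ("minValue", "maxValue", "avrValue")}
          if report["Absolute"] or state is None:
              return values
          return {k: state[k] + values[k] for k in values}  # apply relative deltas

      state = apply_report(None, {"Absolute": True, "minValue": 300,
                                  "maxValue": 700, "avrValue": 500})
      state = apply_report(state, {"Absolute": False, "minValue": -20,
                                   "maxValue": 50, "avrValue": 10})
      print(state)  # {'minValue': 280, 'maxValue': 750, 'avrValue': 510}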
  • The decoding complexity may be based on values measured by a reference device. The reference device may be another WTRU. The reference device may be similarly configured to the device requesting the media content. For example, the reference device may have the same device type, make, model, release date, OS version, device drivers, player agent, available power, power consumption configurations, and/or the like. One or more of the configurations of the reference device may also be different, but may be relevant for predicting the amount of power that may be used by the device requesting the media content. The device configurations may be included in metadata. The metadata about the configurations of the reference device may be included in a user defined box within a file format, such as a 3GP file format for example, or may be sent in an outbound signaling protocol, such as an email or an HTTP websocket for example.
  • When the media content includes video streams, the decoding complexity may be used to select the appropriate video stream. For example, the avrValue may be used to select the appropriate video stream. The avrValue may be used when video streams may include a small number of frames or shorter videos. Video streams that include a small number of frames or shorter videos may include videos less than a minute (e.g., 30 seconds) or a number of minutes. The minValue and/or the maxValue may be used to capture variability of frame decoding time. The minValue and/or the maxValue may be used with a larger number of video frames or longer videos. Longer videos or videos with a larger number of frames may be larger than the shorter videos. For example, the longer videos may be longer than an hour (e.g., 2 hours). The avrValue may be estimated as (minValue + maxValue)/2 or may be calculated as an average value of each of the known reference devices or a subset of the reference devices (e.g., reference devices with similar configurations).
  • The decoding complexity function may include the units in which the values are measured. For example, the decoding complexity function may include a string value for the units that may indicate the units for the minValue, maxValue, and/or avrValue. The units value may be in milliseconds (msec), million instructions per second (MIPS), milliwatts (mW), and/or the like. The maximum number of characters for the units value may be a fixed number (e.g., 8 characters) or a variable number. While an example is provided for the abstract decoding complexity function above, the decoding complexity may be modified and/or implemented for each streaming protocol.
  • The decoding complexity may be included in the moving picture experts group (MPEG) DASH protocol. The media presentation description (MPD) in the MPEG DASH protocol may be used to indicate the decoding complexity. For example, decoding complexity descriptors may be included in the MPEG DASH schema at a period, an adaptation set, a representation, and/or the like. The adaptation set may include the set of different types of media files stored at a content providing device. The representation may include a media file in the adaptation set. A representation may include a different version of the same media content. For example, different representations of the same video clip may be encoded at different bit rates and/or different resolutions. A multimedia content may include a presentation (e.g., a video clip or movie) that may be partitioned into periods. For example, a period may be a chapter in the presentation. In between periods, ads may be displayed. Each period may be partitioned into segments (e.g., two to ten second segments).
  • The decoding complexity may be included at the period level to inform the WTRU, or another device, about the minimum and/or maximum expected power resources that may be used to receive, decode, and/or display a representation of an adaptation set(s). The decoding complexity values may be absolute values or may be relative values that may reference a decoding complexity in a prior period. The absolute value may be included in the first period in a presentation, for example. Subsequent periods may include a value relative to the first period. The relative values may reference the absolute value.
  • Provided below is an example of an extensible markup language (XML) scheme for the period. The XML scheme includes a minDecodingComplexity and a maxDecodingComplexity as a string value "xs:string." A similar XML scheme may be implemented that includes other forms of decoding complexity, such as average decoding complexity and/or the like.
  •  <!-- Period of a presentation -->
      <xs:complexType name="PeriodType">
        <xs:sequence>
          <xs:element name="BaseURL" type="BaseURLType" minOccurs="0" maxOccurs="unbounded"/>
          <xs:element name="SegmentBase" type="SegmentBaseType" minOccurs="0"/>
          <xs:element name="SegmentList" type="SegmentListType" minOccurs="0"/>
          <xs:element name="SegmentTemplate" type="SegmentTemplateType" minOccurs="0"/>
          <xs:element name="AdaptationSet" type="AdaptationSetType" minOccurs="0" maxOccurs="unbounded"/>
          <xs:any namespace="##other" processContents="lax" minOccurs="0" maxOccurs="unbounded"/>
        </xs:sequence>
        <xs:attribute ref="xlink:href"/>
        <xs:attribute ref="xlink:actuate" default="onRequest"/>
        <xs:attribute name="id" type="xs:string"/>
        <xs:attribute name="start" type="xs:duration"/>
        <xs:attribute name="duration" type="xs:duration"/>
        <xs:attribute name="bitStreamSwitching" type="xs:boolean" default="false"/>
        <xs:attribute name="minDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
        <xs:attribute name="maxDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
        <xs:anyAttribute namespace="##other" processContents="lax"/>
      </xs:complexType>
  • The string representing minimum decoding complexity and/or the string representing the maximum decoding complexity may be formatted having sub-strings. The sub-strings may be separated by commas or other indicators to show the separation between sub-strings. The sub-strings may indicate whether the decoding complexity is an absolute or relative value, the decoding complexity value, and/or the units of each decoding complexity value. For example, the sub-strings may indicate that the decoding complexity is an absolute value, the value of the decoding complexity, and/or the units in a comma-delimited list (e.g., minDecodingComplexity=1, 100, MIPS). While the decoding complexity is represented as a string value, the decoding complexity may be represented as one or more integer values that may indicate whether the decoding complexity is an absolute or relative value, the decoding complexity value, and/or the units of each decoding complexity value.
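  • Parsing the comma-delimited form might look like the following Python sketch, assuming the three-field order shown in the example above (absolute flag, value, units).
      def parse_decoding_complexity(attr):
          """Parse e.g. minDecodingComplexity="1, 100, MIPS" into its parts."""
          absolute, value, units = (p.strip() for p in attr.split(","))
          return {"absolute": absolute == "1", "value": int(value),
                  "units": units}

      print(parse_decoding_complexity("1, 100, MIPS"))
      # {'absolute': True, 'value': 100, 'units': 'MIPS'}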
  • The decoding complexity may be included at the adaptation set level. This may provide a finer level of adaptation than including the decoding complexity at the period level. At the adaptation set level, the decoding complexity may include the minimum, maximum, and/or average expected processing that may be used to receive, decode, and/or display a representation of the adaptation set. The decoding complexity at the adaptation set level may include absolute or relative values. The relative values may refer to a prior relative or absolute value. For example, the first adaptation set within a period may include absolute values for the decoding complexity and/or subsequent adaptation set(s) may include absolute or relative values for the decoding complexity.
  • The adaptation set may include a minimum decoding complexity, a maximum decoding complexity, and/or an average decoding complexity. Provided below is an example of an XML scheme for the adaptation set that includes the minimum decoding complexity, the maximum decoding complexity, and the average decoding complexity as an "xs:string" value.
  •  <!-- Group to contain information common to an adaptation set;
          extends RepresentationBaseType -->
      <xs:complexType name="AdaptationSetType">
        <xs:complexContent>
          <xs:extension base="RepresentationBaseType">
            <xs:sequence>
              <xs:element name="Accessibility" type="DescriptorType" minOccurs="0" maxOccurs="unbounded"/>
              <xs:element name="Role" type="DescriptorType" minOccurs="0" maxOccurs="unbounded"/>
              <xs:element name="Rating" type="DescriptorType" minOccurs="0" maxOccurs="unbounded"/>
              <xs:element name="Viewpoint" type="DescriptorType" minOccurs="0" maxOccurs="unbounded"/>
              <xs:element name="ContentComponent" type="ContentComponentType" minOccurs="0" maxOccurs="unbounded"/>
              <xs:element name="BaseURL" type="BaseURLType" minOccurs="0" maxOccurs="unbounded"/>
              <xs:element name="SegmentBase" type="SegmentBaseType" minOccurs="0"/>
              <xs:element name="SegmentList" type="SegmentListType" minOccurs="0"/>
              <xs:element name="SegmentTemplate" type="SegmentTemplateType" minOccurs="0"/>
              <xs:element name="Representation" type="RepresentationType" minOccurs="0" maxOccurs="unbounded"/>
              <xs:any namespace="##other" processContents="lax" minOccurs="0" maxOccurs="unbounded"/>
            </xs:sequence>
            <xs:attribute ref="xlink:href"/>
            <xs:attribute ref="xlink:actuate" default="onRequest"/>
            <xs:attribute name="id" type="xs:string"/>
            <xs:attribute name="group" type="xs:unsignedInt"/>
            <xs:attribute name="lang" type="xs:language"/>
            <xs:attribute name="contentType" type="xs:string"/>
            <xs:attribute name="minBandwidth" type="xs:unsignedInt"/>
            <xs:attribute name="maxBandwidth" type="xs:unsignedInt"/>
            <xs:attribute name="minWidth" type="xs:unsignedInt"/>
            <xs:attribute name="maxWidth" type="xs:unsignedInt"/>
            <xs:attribute name="minHeight" type="xs:unsignedInt"/>
            <xs:attribute name="maxHeight" type="xs:unsignedInt"/>
            <xs:attribute name="minFrameRate" type="FrameRateType"/>
            <xs:attribute name="maxFrameRate" type="FrameRateType"/>
            <xs:attribute name="segmentAlignment" type="ConditionalUintType" default="false"/>
            <xs:attribute name="subsegmentAlignment" type="ConditionalUintType" default="false"/>
            <xs:attribute name="subsegmentStartsWithSAP" type="SAPType" default="0"/>
            <xs:attribute name="bitStreamSwitching" type="xs:boolean"/>
            <xs:attribute name="minDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
            <xs:attribute name="maxDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
            <xs:attribute name="avrDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
          </xs:extension>
        </xs:complexContent>
      </xs:complexType>

      <!-- Conditional Unsigned Integer (unsignedInt or boolean) -->
      <xs:simpleType name="ConditionalUintType">
        <xs:union memberTypes="xs:unsignedInt xs:boolean"/>
      </xs:simpleType>
    The string representing the minimum decoding complexity, the maximum decoding complexity, and/or the average decoding complexity may be formatted having sub-strings, as described herein. While the decoding complexity is represented as a string value, the decoding complexity may be represented as one or more integer values, as described herein.
  • The decoding complexity may be included at the representation level. The representation level may provide a finer level of adaptation compared to the period and/or the adaptation set level. The minimum, maximum, and/or average expected consumption of power resources that may be used to receive, decode, and/or display a segment or sub-representation may be included at the representation level. The decoding complexity at the representation level may include absolute or relative values. The relative values may be relative to a prior absolute or relative value. For example, the first representation within an adaptation set may include an absolute value. Subsequent representation(s) may include absolute or relative values.
  • The decoding complexity at the representation level may include a minimum decoding complexity, a maximum decoding complexity, and/or an average decoding complexity. Provided below is an example of an XML scheme for the representation level that includes the minimum decoding complexity, the maximum decoding complexity, and the average decoding complexity as an "xs:string" value.
  •  <!-- A Representation of the presentation content for a specific Period -->
      <xs:complexType name="RepresentationType">
        <xs:complexContent>
          <xs:extension base="RepresentationBaseType">
            <xs:sequence>
              <xs:element name="BaseURL" type="BaseURLType" minOccurs="0" maxOccurs="unbounded"/>
              <xs:element name="SubRepresentation" type="SubRepresentationType" minOccurs="0"/>
              <xs:element name="SegmentBase" type="SegmentBaseType" minOccurs="0"/>
              <xs:element name="SegmentList" type="SegmentListType" minOccurs="0"/>
              <xs:element name="SegmentTemplate" type="SegmentTemplateType" minOccurs="0"/>
            </xs:sequence>
            <xs:attribute name="id" type="StringNoWhitespaceType" use="required"/>
            <xs:attribute name="bandwidth" type="xs:unsignedInt" use="required"/>
            <xs:attribute name="qualityRanking" type="xs:unsignedInt"/>
            <xs:attribute name="mediaStreamStructureId" type="StringVectorType"/>
            <xs:attribute name="minDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
            <xs:attribute name="maxDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
            <xs:attribute name="avrDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
          </xs:extension>
        </xs:complexContent>
      </xs:complexType>

      <!-- String without white spaces -->
      <xs:simpleType name="StringNoWhitespaceType">
        <xs:restriction base="xs:string">
          <xs:pattern value="[^\r\n\t\p{Z}]*"/>
        </xs:restriction>
      </xs:simpleType>

      <!-- Whitespace-separated list of strings -->
      <xs:simpleType name="StringVectorType">
        <xs:list itemType="xs:string"/>
      </xs:simpleType>
  • The string representing the minimum decoding complexity, the maximum decoding complexity, and/or the average decoding complexity may be formatted having sub-strings, as described herein. While the decoding complexity is represented as a string value, the decoding complexity may be represented as one or more integer values, as described herein.
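  • A client reading these representation-level attributes from an MPD might do so as sketched below in Python. The miniature MPD is hypothetical and omits namespaces for brevity; the attribute names follow the schema above.
      import xml.etree.ElementTree as ET

      mpd = """<MPD><Period><AdaptationSet>
        <Representation id="720kbps" bandwidth="792000"
            minDecodingComplexity="1, 80, MIPS"
            maxDecodingComplexity="1, 120, MIPS"
            avrDecodingComplexity="1, 100, MIPS"/>
      </AdaptationSet></Period></MPD>"""

      # Print the decoding complexity attributes of every representation.
      for rep in ET.fromstring(mpd).iter("Representation"):
          print(rep.get("id"), rep.get("minDecodingComplexity"),
                rep.get("maxDecodingComplexity"),
                rep.get("avrDecodingComplexity"))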
  • The decoding complexity may also be included at the sub-representation level in a similar manner as the representation level. A sub-representation may be used to provide a version of a representation to enable a video player to fast-forward and/or rewind efficiently. The sub-representation may be a lower-quality version of the representation. Provided below is an example of an XML scheme for the sub-representation level that includes the minimum decoding complexity, the maximum decoding complexity, and the average decoding complexity as an "xs:string" value.
  •  <!-- SubRepresentation -->
      <xs:complexType name="SubRepresentationType">
        <xs:complexContent>
          <xs:extension base="RepresentationBaseType">
            <xs:attribute name="level" type="xs:unsignedInt"/>
            <xs:attribute name="dependencyLevel" type="UIntVectorType"/>
            <xs:attribute name="bandwidth" type="xs:unsignedInt"/>
            <xs:attribute name="contentComponent" type="StringVectorType"/>
            <xs:attribute name="minDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
            <xs:attribute name="maxDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
            <xs:attribute name="avrDecodingComplexity" type="xs:string" minOccurs="0" maxOccurs="1"/>
          </xs:extension>
        </xs:complexContent>
      </xs:complexType>

      <!-- Whitespace-separated list of unsigned integers -->
      <xs:simpleType name="UIntVectorType">
        <xs:list itemType="xs:unsignedInt"/>
      </xs:simpleType>
  • While the decoding complexity is represented as a string value, the decoding complexity may be represented as one or more integer values, as described herein.
  • The decoding complexity, and/or other attributes, may be signaled in MPEG DASH using generic descriptors. The generic descriptors may be declared at the adaptation set level, the representation level, and/or the sub-representation level. The generic descriptors may indicate whether the decoding complexity is an essential or supplemental element of the MPEG DASH protocol. Essential protocol elements of the MPEG DASH protocol may be supported by each compliant video player. Supplemental protocol elements may include information that may or may not be understood by each compliant video player.
  • Provided below is an example of an XML scheme for the representation level that includes generic descriptors that indicate the decoding complexity as a supplemental element of the MPEG DASH protocol.
  •  <Representation id="720kbps" bandwidth="792000" width="640" height="368" >
        <SubRepresentation level="1" contentComponent="481" maxPlayoutRate="32"/>
        <SegmentBase timescale="90000" presentationTimeOffset="162000000"/>
        <Supplemental schemeIdUri="urn:sdoX:dashX:min-decoding-complexity" value="10" />
        <Supplemental schemeIdUri="urn:sdoX:dashX:max-decoding-complexity" value="100" />
        <Supplemental schemeIdUri="urn:sdoX:dashX:avg-decoding-complexity" value="50" />
      </Representation>
    If uniform resource names (URNs) are used in the protocol, as shown above, the URNs may include "urn:sdoX:dashX:min-decoding-complexity" for the minimum decoding complexity, "urn:sdoX:dashX:max-decoding-complexity" for the maximum decoding complexity, and/or "urn:sdoX:dashX:avg-decoding-complexity" for the average decoding complexity.
  • The decoding complexity may be included as a parameter to a media streaming format. The media streaming format may be the session description protocol (SDP) for a session initiation protocol, a real-time streaming protocol (RTSP), and/or the like. The decoding complexity may be included in the SDP following a media attributes variable "m=video," for example. The decoding complexity may include the minimum, maximum, and/or average expected processing resources that may be used to receive, decode, and/or display the media stream. The decoding complexity may include absolute or relative values. The first media component within the SDP may include absolute values. Subsequent media components may include absolute or relative values. The relative values may be relative to the values in the first media component within the SDP or may be relative to the previous media component.
  • SDP may implement a decoding complexity attribute. For example, the decoding complexity attribute "a" may be implemented as follows: a=DecodingComplexity:<decodingAbsolute> <minDecodingComplexity> <maxDecodingComplexity> <avrDecodingComplexity> <decodingUnits>. The decoding complexity attribute may include an indication of whether the decoding complexity is an absolute or relative value, a minimum decoding complexity, a maximum decoding complexity, an average decoding complexity, and/or the units in which the decoding complexity may be measured. The decoding complexity attribute may be sent during an initial offer and/or answer negotiation, such as a negotiation between the WTRU and the content providing device for example. The decoding complexity attribute may be sent during an ongoing video streaming session to modify one or more SDP attributes.
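  • Parsing the SDP attribute might be sketched as follows in Python, assuming the five space-separated fields in the order listed above.
      def parse_sdp_decoding_complexity(line):
          """Parse an a=DecodingComplexity: attribute line into its fields."""
          fields = line.split(":", 1)[1].split()
          absolute, min_v, max_v, avr_v, units = fields
          return {"absolute": absolute == "1", "minValue": int(min_v),
                  "maxValue": int(max_v), "avrValue": int(avr_v),
                  "units": units}

      print(parse_sdp_decoding_complexity("a=DecodingComplexity:1 300 700 500 mW"))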
  • The decoding complexity may be included in real-time transport control protocol (RTCP) messages. For example, the decoding complexity may be included in an RTCP extension in an RTCP message. Content providing devices may send the decoding complexity in sender reports to one or more receiving devices, such as a WTRU or other network device for example.
  • The receiving devices may be reference devices that may send decoding complexity feedback information to the content providing devices to inform the content providing devices of the actual decoding complexity associated with selected multimedia content. For example, the receiving devices may include the actual decoding complexity in RTCP messages sent to the content providing devices. The decoding complexity feedback information may inform the content providing devices of the actual decoding complexity of selected media content at the receiving devices on which the media content has been received, decoded, and/or displayed. The decoding complexity feedback information may be stored at the content providing devices and/or used by the content providing devices to provide the decoding complexity to other receiving devices. The decoding complexity may be sent to other receiving devices during the initial SDP offer/answer negotiation, for example.
  • A codec control message (CCM) may be used to convey control messages that may carry feedback information from receiving devices to content providing devices. For example, the CCM may be used for a real-time transport protocol (RTP) audio-visual profile with feedback (AVPF) profile to convey the control messages. The control messages may include an intra-frame request and/or a temporary maximum bit rate.
  • FIG. 7 is a diagram that depicts an example message 700 that may be used for sending decoding complexity feedback information. The message 700 may include a source identifier 702 of the message 700. The source identifier 702 may indicate the source from which the message 700 may be transmitted. The message 700 may include feedback control information 706. The feedback control information 706 may include the decoding complexity feedback information.
  • The feedback control information 706 may be indicated using a binary representation. A byte (e.g., the first byte) of the feedback control information 706 may indicate whether the decoding complexity feedback information is indicated as an absolute (e.g., '1') or relative (e.g., '0') measurement. A first message 700 may carry absolute measurements. Subsequent messages may carry relative measurements.
  • The feedback control information 706 may include a minValue, maxValue, and/or avrValue of the decoding complexity. The minValue, maxValue, and/or avrValue may follow (e.g., respectively) the byte indicating whether the decoding complexity feedback information is indicated as an absolute or relative value. The minValue, maxValue, and/or avrValue may be 32-bit values. The feedback control information 706 (e.g., in the last bytes) may be assigned to the string of the units of measurement. The length of the RTCP packet may be used to determine the last character and/or the length of the string of the units. The number of bytes N in the units string may be determined by subtracting, from the length of the RTCP packet, the byte(s) indicating the absolute or relative value and the 32-bit values for the minValue, maxValue, and/or avrValue (e.g., 12 bytes where all 3 are used). For example, the number of bytes N may be determined by N = (length of RTCP packet) − 12 − 1.
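  • The byte layout described above may be sketched as follows: one flag byte, three 32-bit values, and a trailing units string whose length is recovered from the packet length. This is a minimal Python illustration of the feedback control information only, not a complete RTCP packet; relative reports carrying negative deltas would use signed fields ("!Biii") instead.
      import struct

      def pack_fci(absolute, min_v, max_v, avr_v, units):
          """Flag byte + three 32-bit values + units string."""
          return struct.pack("!BIII", 1 if absolute else 0,
                             min_v, max_v, avr_v) + units.encode("ascii")

      def unpack_fci(data):
          absolute, min_v, max_v, avr_v = struct.unpack("!BIII", data[:13])
          units = data[13:].decode("ascii")  # N = len(data) - 12 - 1 bytes
          return absolute == 1, min_v, max_v, avr_v, units

      print(unpack_fci(pack_fci(True, 300, 700, 500, "mW")))
      # (True, 300, 700, 500, 'mW')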
  • The message 700 may be in an RTCP format. The message 700 may include a header that may include a version (V) field 708, a padding bit (P) field 710, a feedback message type (FMT) field 712, a payload type (PT) field 714, and/or a length field 716. The V field 708 may include a number of bits (e.g., 2 bits) that may identify the RTP version. The P field 710 may include a number of bits (e.g., 1 bit) that may indicate whether the packet includes additional padding octets at the end that may be included in the length field 716. The FMT field 712 may indicate the type of feedback message for the message 700. The message 700 may be interpreted relative to the FMT field 712. An FMT type of fifteen may indicate an application-layer feedback message, for example. The PT field 714 may include a number of bits (e.g., 8 bits) that may identify the packet type. The PT field 714 having a value of '206' may indicate that the packet type is an RTCP feedback message, for example. The length field 716 may include a number of bits (e.g., 16 bits) that may indicate the length of the packet (e.g., in 32-bit words minus one), and may include the header and/or any padding bits.
  • The source identifier 702 may be a synchronization source (SSRC) packet sender field. The source identifier field 702 may include a number of bits (e.g., 32 bits) that may indicate the source of the message 700. The message 700 may include a media source field 704. The media source field 704 may be an SSRC media source field. The media source field 704 may be an identifier that may uniquely identify the source of the media. While the message 700 includes a number of fields, additional fields and/or a subset of the described fields may be implemented.
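  • For illustration, the header fields described above may be packed as follows. This Python sketch assumes FMT=15 and PT=206 as in the example, and that the feedback control information has already been padded to a 32-bit boundary; it is a sketch of the described format, not a complete RTCP implementation.

       import struct

       def build_feedback_message(ssrc_sender: int, ssrc_media: int, fci: bytes) -> bytes:
           assert len(fci) % 4 == 0, "FCI is assumed to be 32-bit aligned"
           first_byte = (2 << 6) | (0 << 5) | 15      # V=2, P=0, FMT=15
           # Length field: total packet size in 32-bit words, minus one
           # (4-byte header + sender SSRC + media SSRC + FCI).
           length = (4 + 4 + 4 + len(fci)) // 4 - 1
           header = struct.pack(">BBH", first_byte, 206, length)  # PT=206
           return header + struct.pack(">II", ssrc_sender, ssrc_media) + fci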
  • The decoding complexity may be indicated in media files. For example, decoding complexity descriptors may be included in metadata within the media files. Decoding complexity descriptors may be included within stored media files. The decoding complexity descriptors may indicate the amount of power that may be used to decode the media content in the media files. The decoding complexity descriptors may be based on actual data measured for receiving, decoding, and/or displaying the media files. For example, the decoding complexity descriptors may indicate a minimum value, a maximum value, and/or an average value of power used to receive, decode, and/or display the media file or portions (e.g., video frames, audio segments, etc.) thereof. The decoding complexity descriptors may indicate a minimum amount of time, a maximum amount of time, and/or an average amount of time to receive, decode, and/or display a media file or portions thereof. The decoding complexity descriptors may indicate a minimum amount of computing resources, a maximum amount of computing resources, and/or an average amount of computing resources to receive, decode, and/or display a media file or portions thereof. The decoding complexity descriptors may include absolute values or relative values. The relative values may indicate an increase or decrease value from a value of a prior relative decoding complexity descriptor. The decoding complexity descriptors may include the units for the decoding complexity values. Storing the media files with the decoding complexity may allow content providing devices to use the media files within the underlying streaming protocols.
  • The decoding complexity descriptors may indicate the decoding complexity of one or more segments of a segmented media file. Each segment may have its own decoding complexity descriptor, or a decoding complexity descriptor may indicate a decoding complexity for a group of segments. If a relative value is used to indicate the decoding complexity of a media file segment, the value may be relative to a decoding complexity value of a prior segment of the media file. For example, the first decoding complexity descriptor of a media file may include an absolute value, while other decoding complexity descriptors may include a relative value.
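  • A minimal sketch of resolving such a descriptor sequence into absolute values, assuming each descriptor is an (is_absolute, value) pair and a relative value is a signed delta applied to the previously resolved value:

       def resolve_complexities(descriptors):
           # descriptors: iterable of (is_absolute, value) pairs; the first
           # descriptor of a media file is expected to carry an absolute value.
           resolved, current = [], None
           for is_absolute, value in descriptors:
               if is_absolute or current is None:
                   current = value
               else:
                   current += value   # relative delta; may be negative
               resolved.append(current)
           return resolved

       # e.g. [(True, 500), (False, -20), (False, 35)] resolves to [500, 480, 515]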
  • The media files may be in a 3GP file format. The 3GP media file may be configured for DASH streaming. The 3GP file format may be structurally based on the ISO media file format. The ISO media file format may be an object-oriented format. The objects in the ISO file format may be referred to as boxes. The boxes may include the media data. In a 3GP adaptive-streaming profile, the boxes may include video data. For example, in video profiles that are branded '3gh9', the media data may include movie data. The movie data may be included in movie fragment boxes (moofs).
  • In 3GP media files, the decoding complexity may be included in sample index data, a fragment header, and/or fragment decoding data. The decoding complexity may be indicated in sample index data using a segment index box (sidx). The sidx may provide a compact index of a track or sub-segment within the media segment to which it may apply. The sub-segment may be a media stream within the media segment, for example. The sidx may refer to other sidxs in a media segment which may refer to other sub-segments. Each sidx may document how a segment may be divided into one or more sub-segments.
  • The decoding complexity data for each sub-segment may be included in a sidx as shown below.
  •  aligned(8) class SegmentIndexBox extends FullBox('sidx', version, 0) {
       unsigned int(32) reference_ID;
       unsigned int(32) timescale;
       if (version==0) {
           unsigned int(32) earliest_presentation_time;
           unsigned int(32) first_offset;
       } else {
           unsigned int(64) earliest_presentation_time;
           unsigned int(64) first_offset;
       }
       unsigned int(16) reserved = 0;
       unsigned int(16) reference_count;
       unsigned int(8) Absolute;
       unsigned int(8)[8] UnitsDecodingComplexity;

       for (i=1; i <= reference_count; i++)
       {
            bit(1)                           reference_type;
            unsigned int(31)                 referenced_size;
            unsigned int(32)                 subsegment_duration;
            bit(1)                           starts_with_SAP;
            unsigned int(3)                  SAP_type;
            unsigned int(28)                 SAP_delta_time;
            unsigned int(32)                 minDecodingComplexity;
            unsigned int(32)                 maxDecodingComplexity;
            unsigned int(32)                 avrDecodingComplexity;
       }
     }
  • The sidx box may include a minimum, maximum, and/or average decoding complexity. The decoding complexity in the sidx box may include absolute or relative values. The relative values may be relative to a prior absolute or relative value. For example, the first decoding complexity in a sidx box may include an absolute value. Subsequent decoding complexity values may include absolute or relative values. The sidx box may include the units of the decoding complexity.
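  • A non-normative Python sketch of walking the payload of the extended sidx box shown above (the fields after the FullBox header). The layout follows the modified box in this description, not the base ISO segment index box:

       import struct

       def parse_extended_sidx_payload(body: bytes, version: int):
           off = 0
           reference_id, timescale = struct.unpack_from(">II", body, off); off += 8
           if version == 0:
               earliest_pt, first_offset = struct.unpack_from(">II", body, off); off += 8
           else:
               earliest_pt, first_offset = struct.unpack_from(">QQ", body, off); off += 16
           _reserved, reference_count = struct.unpack_from(">HH", body, off); off += 4
           is_absolute = body[off]; off += 1
           units = body[off:off + 8].rstrip(b"\x00").decode("ascii"); off += 8
           entries = []
           for _ in range(reference_count):
               w1, dur, w3, min_c, max_c, avr_c = struct.unpack_from(">6I", body, off)
               off += 24                      # 6 x 32-bit words per reference
               entries.append({
                   "reference_type": w1 >> 31,
                   "referenced_size": w1 & 0x7FFFFFFF,
                   "subsegment_duration": dur,
                   "starts_with_SAP": w3 >> 31,
                   "SAP_type": (w3 >> 28) & 0x7,
                   "SAP_delta_time": w3 & 0x0FFFFFFF,
                   "minDecodingComplexity": min_c,
                   "maxDecodingComplexity": max_c,
                   "avrDecodingComplexity": avr_c,
               })
           return is_absolute, units, entries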
  • The decoding complexity data for each sub-segment may be included in a track fragment header (tfhd) box. The tfhd box may provide basic information about each movie fragment of an underlying media track. The decoding complexity may be included in the header information for each fragment as shown below.
  •  aligned(8) class TrackFragmentHeaderBox
       extends FullBox('tfhd', 0, tf_flags) {
       unsigned int(32) track_ID;
       unsigned int(64) base_data_offset;
       unsigned int(32) sample_description_index;
       unsigned int(32) default_sample_duration;
       unsigned int(32) default_sample_size;
       unsigned int(32) default_sample_flags;
       unsigned int(8) Absolute;
       unsigned int(32)    minDecodingComplexity;
       unsigned int(32)    maxDecodingComplexity;
       unsigned int(32)    avrDecodingComplexity;
       unsigned int(8)[8] UnitsDecodingComplexity;
     }
  • The tfhd box may include a minimum, maximum, and/or average decoding complexity. The decoding complexity in the tfhd box may include absolute or relative values. The relative values may be relative to a prior absolute or relative value. For example, the first decoding complexity in a tfhd box may include an absolute value. Subsequent decoding complexity values may include absolute or relative values. The tfhd box may include the units of the decoding complexity.
  • The decoding complexity may be included in a track fragment base media decode time (tfdt) box. The tfdt box may provide the decode time of a sample in the track fragment. The sample in which the decoding time may be included may be the first sample of the track fragment. The tfdt box may be used when performing random access in a file. The decoding complexity for each fragment or sub-sample may be included as shown in the class below.
  •  aligned(8) class TrackFragmentBaseMediaDecodeTimeBox
       extends FullBox('tfdt', version, 0) {
       if (version==1) {
           unsigned int(64) baseMediaDecodeTime;
       } else { // version==0
           unsigned int(32) baseMediaDecodeTime;
       }
       unsigned int(8) Absolute;
       unsigned int(32)    minDecodingComplexity;
       unsigned int(32)    maxDecodingComplexity;
       unsigned int(32)    avrDecodingComplexity;
       unsigned int(8)[8] UnitsDecodingComplexity;
     }
  • The tfdt box may include a minimum, maximum, and/or average decoding complexity. The decoding complexity in the tfdt box may include absolute or relative values. The relative values may be relative to a prior absolute or relative value. For example, the first decoding complexity in a tfdt box may include an absolute value. Subsequent decoding complexity values may include absolute or relative values. The tfdt box may include the units of the decoding complexity.
  • The decoding complexity may be included in real-time protocol (RTP) and/or real-time control protocol (RTCP) streaming. For example, if media tracks are stored in fragmented boxes, the decoding complexity may be included in tfdt and/or tfhd boxes that may be applied to RTP and/or RTCP streaming. Hint tracks may carry transport-specific information that may be used by a content providing device to create the RTP and/or RTCP packets.
  • The decoding complexity may be included in a hint media header (hmhd) box. For example, the decoding complexity may be included for a whole video hint track, which may provide a coarse approach for track selection by the WTRU or other receiving device of the media content. An example of an hmhd box including the decoding complexity is shown below.
  • aligned(8) class HintMediaHeaderBox
       extends FullBox('hmhd', version = 0, 0) {
       unsigned int(16) maxPDUsize;
       unsigned int(16) avgPDUsize;
       unsigned int(32) maxbitrate;
       unsigned int(32) avgbitrate;
       unsigned int(32) reserved = 0;
       unsigned int(8) Absolute;
       unsigned int(32)    minDecodingComplexity;
       unsigned int(32)    maxDecodingComplexity;
       unsigned int(32)    avrDecodingComplexity;
       unsigned int(8)[8] UnitsDecodingComplexity;
     }
  • The hmhd box may include a minimum, maximum, and/or average decoding complexity. The decoding complexity in the hmhd box may include absolute or relative values. The relative values may be relative to a prior absolute or relative value. For example, the first decoding complexity in an hmhd box may include an absolute value. Subsequent decoding complexity values may include absolute or relative values. The hmhd box may include the units of the decoding complexity.
  • The decoding complexity feedback may be included in quality of experience (QoE) reporting. For example, decoding complexity descriptors may be included within an event reporting service that may be sent from the WTRU. The actual WTRU decoding complexity feedback may be used by the content providing device to attach or update the decoding complexity descriptors. The decoding complexity descriptors may be included within a file format, such as a 3GP file format for example, or within one or more streaming protocols.
  • The QoE reporting may include DASH QoE reporting. For a DASH client, QoE reporting may be optional. The QoE reporting may use an average throughput metric, an initial playout delay metric, and/or a buffer level metric. TABLE 1 shows an example of including the decoding complexity feedback as a metric in QoE reports.
    [TABLE 1: decoding complexity feedback as a QoE reporting metric (table image not reproduced)]
    As shown in TABLE 1, the decoding complexity feedback may indicate the decoding complexity for receiving, decoding, and/or displaying media content at a WTRU. The decoding complexity may be in terms of an absolute or relative value. The absolute or relative value may be a boolean value that may indicate if the reported values are in absolute or relative metrics (e.g., '1' for absolute and/or '0' for relative). The decoding complexity may include a minimum, maximum, and/or average decoding complexity value measured during a measurement interval. The minimum, maximum, and/or average decoding complexity value may be an integer value, a character value, and/or the like. The decoding complexity feedback may indicate the units for the reported decoding complexity values.
  • The decoding complexity feedback may indicate a start time and/or a duration of the measurement interval. The start time may include a real time value at which the decoding complexity values were measured. The duration of the measurement interval may include a period of time over which the decoding complexity values were measured. For example, the duration may include an integer value that indicates the time (e.g., in milliseconds, seconds, etc.) over which the measurements were performed.
  • The decoding complexity feedback may include the device configurations or device specifications that may identify the WTRU at which the measurements are performed. For example, decoding complexity feedback may include the device type, make, model, release date, OS version, device drivers, player agent, available power, power consumption configurations, and/or the like of the WTRU at which the measurements were performed for receiving, decoding, and/or displaying the media content. The decoding complexity feedback may include a unique identifier that indicates the device from which the decoding complexity feedback is being received.
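  • A minimal sketch of assembling such a QoE report entry, assuming hypothetical key names (the exact schema of TABLE 1 is not reproduced here):

       def build_decoding_complexity_metric(samples, units="mW", start_time=None,
                                            duration_ms=None, device=None):
           # samples: decoding complexity measurements collected over the
           # measurement interval (e.g., per-segment power draw).
           return {
               "absolute": True,                     # '1' absolute / '0' relative
               "minDecodingComplexity": min(samples),
               "maxDecodingComplexity": max(samples),
               "avrDecodingComplexity": sum(samples) / len(samples),
               "units": units,
               "measurementStartTime": start_time,   # real time the interval began
               "measurementDurationMs": duration_ms, # length of the interval
               "device": device or {},               # e.g., make, model, OS version
           }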
  • The RTCP protocol may allow or enable WTRUs to send WTRU reports that may be extended to include decoding complexity reports from the WTRUs in an application specific (APP) RTCP type. The RTCP extension may be used by WTRUs to report QoE metrics. An RTCP application type (e.g., FMT=15) may be used to send device specs from WTRUs to content delivery devices during (e.g., at the beginning of) a streaming session.
  • FIG. 8A is a diagram that depicts an example communications system 800 in which one or more disclosed embodiments may be implemented. The communications system 800 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users. The communications system 800 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth. For example, the communications system 800 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and/or the like.
  • As shown in FIG. 8A, the communications system 800 may include wireless transmit/receive units (WTRUs) 802a, 802b, 802c, and/or 802d (which generally or collectively may be referred to as WTRU 802), a radio access network (RAN) 803/804/805, a core network 806/807/809, a public switched telephone network (PSTN) 808, the Internet 810, and other networks 812, though any number of WTRUs, base stations, networks, and/or network elements may be used. Each of the WTRUs 802a, 802b, 802c, and/or 802d may be any type of device configured to operate and/or communicate in a wireless environment. The WTRUs 802a, 802b, 802c, and/or 802d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and/or the like.
  • The communications system 800 may also include a base station 814a and/or a base station 814b. Each of the base stations 814a, 814b may be any type of device configured to wirelessly interface with at least one of the WTRUs 802a, 802b, 802c, and/or 802d to facilitate access to one or more communication networks, such as the core network 806/807/809, the Internet 810, and/or the networks 812. The base stations 814a and/or 814b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and/or the like. While the base stations 814a, 814b are each depicted as a single element, the base stations 814a, 814b may include any number of interconnected base stations and/or network elements.
  • The base station 814a may be part of the RAN 803/804/805, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc. The base station 814a and/or the base station 814b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown). The cell may further be divided into cell sectors. For example, the cell associated with the base station 814a may be divided into three sectors. Thus, the base station 814a may include three transceivers, e.g., one for each sector of the cell. The base station 814a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
  • The base stations 814a and/or 814b may communicate with one or more of the WTRUs 802a, 802b, 802c, and/or 802d over an air interface 815/816/817, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.). The air interface 815/816/817 may be established using any suitable radio access technology (RAT).
  • The communications system 800 may be a multiple access system and/or may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and/or the like. For example, the base station 814a in the RAN 803/804/805 and the WTRUs 802a, 802b, and/or 802c may implement a radio technology such as UMTS Terrestrial Radio Access (UTRA), which may establish the air interface 815/816/817 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
  • The base station 814a and the WTRUs 802a, 802b, and/or 802c may implement a radio technology such as E-UTRA, which may establish the air interface 815/816/817 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
  • The base station 814a and the WTRUs 802a, 802b, and/or 802c may implement radio technologies such as IEEE 802.16 (e.g., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and/or the like.
  • The base station 814b in FIG. 8A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and/or the like. The base station 814b and the WTRUs 802c, 802d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). The base station 814b and the WTRUs 802c, 802d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). The base station 814b and the WTRUs 802c, 802d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell. As shown in FIG. 8A, the base station 814b may have a direct connection to the Internet 810. Thus, the base station 814b may not access the Internet 810 via the core network 806/807/809.
  • The RAN 803/804/805 may be in communication with the core network 806/807/809, which may be any type of network configured to provide content, such as voice, data, applications, media, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 802a, 802b, 802c, and/or 802d. For example, the core network 806/807/809 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication. Although not shown in FIG. 8A, the RAN 803/804/805 and/or the core network 806/807/809 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 803/804/805 or a different RAT. For example, in addition to being connected to the RAN 803/804/805, which may be utilizing an E-UTRA radio technology, the core network 806/807/809 may also be in communication with another RAN (not shown) employing a GSM radio technology.
  • The core network 806/807/809 may serve as a gateway for the WTRUs 802a, 802b, 802c, and/or 802d to access the PSTN 808, the Internet 810, and/or other networks 812. The PSTN 808 may include circuit-switched telephone networks that provide plain old telephone service (POTS). The Internet 810 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite. The networks 812 may include wired and/or wireless communications networks owned and/or operated by other service providers. For example, the networks 812 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 803/804/805 or a different RAT.
  • Some or all of the WTRUs 802a, 802b, 802c, and/or 802d in the communications system 800 may include multi-mode capabilities, e.g., the WTRUs 802a, 802b, 802c, and/or 802d may include multiple transceivers for communicating with different wireless networks over different wireless links. For example, the WTRU 802c shown in FIG. 8A may be configured to communicate with the base station 814a, which may employ a cellular-based radio technology, and with the base station 814b, which may employ an IEEE (e.g., IEEE 802.11) radio technology.
  • FIG. 8B depicts a system diagram of an example WTRU 802. As shown in FIG. 8B, the WTRU 802 may include a processor 818, a transceiver 820, a transmit/receive element 822, a speaker/microphone 824, a keypad 826, a display/touchpad 828, non-removable memory 830, removable memory 832, a power source 834, a global positioning system (GPS) chipset 836, and/or other peripherals 838. The WTRU 802 may include any sub-combination of the foregoing elements.
  • Other devices described herein may include some or all of the elements depicted in FIG. 8B and described herein. For example, the base stations 814a and 814b, and/or the nodes that base stations 814a and 814b may represent, such as, but not limited to, a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved home node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include some or all of the elements depicted in FIG. 8B and described herein. A content providing device may also include some or all of the elements depicted in FIG. 8B and described herein. For example, the content providing device may use the non-removable memory 830 and/or removable memory 832 to store WTRU identifiers, WTRU configuration information, decoding complexity information, and/or other information described herein.
  • The processor 818 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 818 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 802 to operate in a wireless environment and/or operate to process video content. The processor 818 may be coupled to the transceiver 820, which may be coupled to the transmit/receive element 822. While FIG. 8B depicts the processor 818 and the transceiver 820 as separate components, the processor 818 and the transceiver 820 may be integrated together in an electronic package or chip.
  • The transmit/receive element 822 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 814a) over the air interface 815/816/817. For example, the transmit/receive element 822 may be an antenna configured to transmit and/or receive RF signals. The transmit/receive element 822 may be an emitter/detector configured to transmit and/or receive IR, UV, and/or visible light signals, for example. The transmit/receive element 822 may be configured to transmit and receive RF and light signals. The transmit/receive element 822 may be configured to transmit and/or receive any combination of wireless signals.
  • Although the transmit/receive element 822 is depicted in FIG. 8B as a single element, the WTRU 802 may include any number of transmit/receive elements 822. More specifically, the WTRU 802 may employ MIMO technology. Thus, the WTRU 802 may include two or more transmit/receive elements 822 (e.g., multiple antennas) for transmitting and/or receiving wireless signals over the air interface 815/816/817.
  • The transceiver 820 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 822 and/or to demodulate the signals that are received by the transmit/receive element 822. As noted above, the WTRU 802 may have multi-mode capabilities. Thus, the transceiver 820 may include multiple transceivers for enabling the WTRU 802 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example.
  • The processor 818 of the WTRU 802 may be coupled to, and may receive user input data from, the speaker/microphone 824, the keypad 826, and/or the display/touchpad 828 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 818 may output user data to the speaker/microphone 824, the keypad 826, and/or the display/touchpad 828. The processor 818 may access information from, and/or store data in, any type of suitable memory, such as the non-removable memory 830 and/or the removable memory 832. The non-removable memory 830 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 832 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and/or the like. The processor 818 may access information from, and/or store data in, memory that is not physically located on the WTRU 802, such as on a server or a home computer (not shown).
  • The processor 818 may receive power from the power source 834, and may be configured to distribute and/or control the power to the other components in the WTRU 802. The power source 834 may be any suitable device for powering the WTRU 802. For example, the power source 834 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and/or the like.
  • The processor 818 may also be coupled to the GPS chipset 836, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 802. In addition to, or in lieu of, the information from the GPS chipset 836, the WTRU 802 may receive location information over the air interface 815/816/817 from a base station (e.g., base stations 814a, 814b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. The WTRU 802 may acquire location information by way of any suitable location-determination method.
  • The processor 818 may further be coupled to other peripherals 838, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 838 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and/or the like.
  • FIG. 8C depicts a system diagram of the RAN 803 and the core network 806 according to an embodiment. As noted above, the RAN 803 may employ a UTRA radio technology to communicate with the WTRUs 802a, 802b, and/or 802c over the air interface 815. The RAN 803 may also be in communication with the core network 806. As shown in FIG. 8C, the RAN 803 may include Node-Bs 840a, 840b, and/or 840c, which may each include one or more transceivers for communicating with the WTRUs 802a, 802b, and/or 802c over the air interface 815. The Node-Bs 840a, 840b, and/or 840c may each be associated with a particular cell (not shown) within the RAN 803. The RAN 803 may also include RNCs 842a and/or 842b. The RAN 803 may include any number of Node-Bs and RNCs.
  • As shown in FIG. 8C, the Node-Bs 840a and/or 840b may be in communication with the RNC 842a. The Node-B 840c may be in communication with the RNC 842b. The Node-Bs 840a, 840b, and/or 840c may communicate with the respective RNCs 842a, 842b via an Iub interface. The RNCs 842a, 842b may be in communication with one another via an Iur interface. Each of the RNCs 842a, 842b may be configured to control the respective Node-Bs 840a, 840b, and/or 840c to which it is connected. Each of the RNCs 842a, 842b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and/or the like.
  • The core network 806 shown in FIG. 8C may include a media gateway (MGW) 844, a mobile switching center (MSC) 846, a serving GPRS support node (SGSN) 848, and/or a gateway GPRS support node (GGSN) 850. While each of the foregoing elements are depicted as part of the core network 806, any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • The RNC 842a in the RAN 803 may be connected to the MSC 846 in the core network 806 via an IuCS interface. The MSC 846 may be connected to the MGW 844. The MSC 846 and the MGW 844 may provide the WTRUs 802a, 802b, and/or 802c with access to circuit-switched networks, such as the PSTN 808, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and traditional land-line communications devices.
  • The RNC 842a in the RAN 803 may be connected to the SGSN 848 in the core network 806 via an IuPS interface. The SGSN 848 may be connected to the GGSN 850. The SGSN 848 and the GGSN 850 may provide the WTRUs 802a, 802b, and/or 802c with access to packet-switched networks, such as the Internet 810, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and IP-enabled devices.
  • As noted above, the core network 806 may also be connected to the networks 812, which may include other wired and/or wireless networks that may be owned and/or operated by other service providers.
  • FIG. 8D depicts a system diagram of the RAN 804 and the core network 807 according to an embodiment. As noted above, the RAN 804 may employ an E-UTRA radio technology to communicate with the WTRUs 802a, 802b, and/or 802c over the air interface 816. The RAN 804 may also be in communication with the core network 807.
  • The RAN 804 may include eNode-Bs 860a, 860b, and/or 860c, though the RAN 804 may include any number of eNode-Bs. The eNode-Bs 860a, 860b, and/or 860c may each include one or more transceivers for communicating with the WTRUs 802a, 802b, and/or 802c over the air interface 816. The eNode-Bs 860a, 860b, and/or 860c may implement MIMO technology. Thus, the eNode-B 860a, for example, may use multiple antennas to transmit wireless signals to, and/or receive wireless signals from, the WTRU 802a.
  • Each of the eNode-Bs 860a, 860b, and/or 860c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and/or the like. As shown in FIG. 8D, the eNode-Bs 860a, 860b, and/or 860c may communicate with one another over an X2 interface.
  • The core network 807 shown in FIG. 8D may include a mobility management gateway (MME) 862, a serving gateway 864, and a packet data network (PDN) gateway 866. While each of the foregoing elements are depicted as part of the core network 807, any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • The MME 862 may be connected to each of the eNode-Bs 860a, 860b, and/or 860c in the RAN 804 via an S1 interface and may serve as a control node. For example, the MME 862 may be responsible for authenticating users of the WTRUs 802a, 802b, and/or 802c, bearer activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 802a, 802b, and/or 802c, and/or the like. The MME 862 may provide a control plane function for switching between the RAN 804 and other RANs (not shown) that may employ other radio technologies, such as GSM or WCDMA.
  • The serving gateway 864 may be connected to each of the eNode-Bs 860a, 860b, and/or 860c in the RAN 804 via the S1 interface. The serving gateway 864 may generally route and/or forward user data packets to/from the WTRUs 802a, 802b, and/or 802c. The serving gateway 864 may perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 802a, 802b, and/or 802c, managing and storing contexts of the WTRUs 802a, 802b, and/or 802c, and/or the like.
  • The serving gateway 864 may be connected to the PDN gateway 866, which may provide the WTRUs 802a, 802b, and/or 802c with access to packet-switched networks, such as the Internet 810, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and IP-enabled devices.
  • The core network 807 may facilitate communications with other networks. For example, the core network 807 may provide the WTRUs 802a, 802b, and/or 802c with access to circuit-switched networks, such as the PSTN 808, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and traditional land-line communications devices. For example, the core network 807 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 807 and the PSTN 808. The core network 807 may provide the WTRUs 802a, 802b, and/or 802c with access to the networks 812, which may include other wired and/or wireless networks that may be owned and/or operated by other service providers.
  • FIG. 8E depicts a system diagram of the RAN 805 and the core network 809. The RAN 805 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 802a, 802b, and/or 802c over the air interface 817. As will be further described herein, the communication links between the different functional entities of the WTRUs 802a, 802b, and/or 802c, the RAN 805, and the core network 809 may be defined as reference points.
  • As shown in FIG. 8E, the RAN 805 may include base stations 880a, 880b, and/or 880c, and an ASN gateway 882, though the RAN 805 may include any number of base stations and/or ASN gateways. The base stations 880a, 880b, and/or 880c may each be associated with a particular cell (not shown) in the RAN 805 and/or may each include one or more transceivers for communicating with the WTRUs 802a, 802b, and/or 802c over the air interface 817. The base stations 880a, 880b, and/or 880c may implement MIMO technology. Thus, the base station 880a, for example, may use multiple antennas to transmit wireless signals to, and/or receive wireless signals from, the WTRU 802a. The base stations 880a, 880b, and/or 880c may provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and/or the like. The ASN gateway 882 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 809, and/or the like.
  • The air interface 817 between the WTRUs 802a, 802b, and/or 802c and the RAN 805 may be defined as an R1 reference point that implements the IEEE 802.16 specification. Each of the WTRUs 802a, 802b, and/or 802c may establish a logical interface (not shown) with the core network 809. The logical interface between the WTRUs 802a, 802b, and/or 802c and the core network 809 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.
  • The communication link between each of the base stations 880a, 880b, and/or 880c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and/or the transfer of data between base stations. The communication link between the base stations 880a, 880b, and/or 880c and the ASN gateway 882 may be defined as an R6 reference point. The R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 802a, 802b, and/or 802c.
  • As shown in FIG. 8E, the RAN 805 may be connected to the core network 809. The communication link between the RAN 805 and the core network 809 may be defined as an R3 reference point that may include protocols for facilitating data transfer and mobility management capabilities, for example. The core network 809 may include a mobile IP home agent (MIP-HA) 884, an authentication, authorization, accounting (AAA) server 886, and/or a gateway 888. While each of the foregoing elements are depicted as part of the core network 809, any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • The MIP-HA 884 may be responsible for IP address management, and may enable the WTRUs 802a, 802b, and/or 802c to roam between different ASNs and/or different core networks. The MIP-HA 884 may provide the WTRUs 802a, 802b, and/or 802c with access to packet-switched networks, such as the Internet 810, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and IP-enabled devices. The AAA server 886 may be responsible for user authentication and for supporting user services. The gateway 888 may facilitate interworking with other networks. For example, the gateway 888 may provide the WTRUs 802a, 802b, and/or 802c with access to circuit-switched networks, such as the PSTN 808, to facilitate communications between the WTRUs 802a, 802b, and/or 802c and traditional land-line communications devices. The gateway 888 may provide the WTRUs 802a, 802b, and/or 802c with access to the networks 812, which may include other wired and/or wireless networks that may be owned and/or operated by other service providers.
  • Although not shown in FIG. 8E, the RAN 805 may be connected to other ASNs and/or the core network 809 may be connected to other core networks. The communication link between the RAN 805 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 802a, 802b, and/or 802c between the RAN 805 and the other ASNs. The communication link between the core network 809 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.
  • Although features and elements are described above in particular combinations, each feature or element may be used alone or in any combination with the other features and/or elements, as long as these combinations fall within the scope of the invention which is defined by the appended claims. The methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media include electronic signals (transmitted over wired or wireless connections) and/or computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and/or optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, content providing device, and/or any host computer.
  • Claims (15)

    1. A method performed by a device, to determine decoding complexity information associated with media content, the method comprising:
      sending a request for media content to a content providing device;
      receiving the media content;
      decoding the media content;
      displaying the media content;
      the method characterized by:
      determining the decoding complexity information that indicates at least one of the following: at least one of a maximum, a minimum, or an average amount of power, at least one of a maximum, a minimum, or an average amount of time, or at least one of a maximum, a minimum, or an average amount of computing resources, used to decode or display the media content at the device; and
      sending the decoding complexity information to a remote source.
    2. A device, comprising:
      a processor configured to:
      send a request for media content to a content providing device;
      receive the media content;
      decode the media content;
      display the media content; the device characterized by the processor being further configured to:
      determine decoding complexity information that indicates at least one of the following: at least one of a maximum, a minimum, or an average amount of power, at least one of a maximum, a minimum, or an average amount of time, or at least one of a maximum, a minimum, or an average amount of computing resources, used to decode or display the media content at the device; and
      send the decoding complexity information to a remote source.
    3. The method of claim 1, further comprising:
      receiving decoding complexity information associated with second media content, wherein the decoding complexity information associated with the second media content indicates at least one of power, time, or computing resources, used to decode or display the second media content at one or more other reference devices;
      determining to receive the second media content based on the decoding complexity information associated with the second media content; and
      sending a request to the content providing device to send the second media content.
    4. The method of claim 3, further comprising:
      receiving decoding complexity information for each of a plurality of different types of the second media content;
      determining to receive a preferred type out of the plurality of different types of the second media content based on the decoding complexity information associated with each type of the second media content; and
      sending a request to the content providing device to send the preferred type of the second media content.
    5. The method of claim 1, wherein the media content comprises a plurality of segments, and wherein the method further comprises:
      determining a corresponding decoding complexity information for each segment of the media content; and
      sending the decoding complexity information for each segment of the media content to the remote source.
    6. The method of claim 1 or the device of claim 2, wherein the decoding complexity information comprises a time at which decoding complexity was measured, a duration for which the decoding complexity was measured, or device specifications relating to the device.
    7. The method of claim 1, further comprising measuring an amount of power used while decoding or displaying the media content to determine the decoding complexity information associated with the media content.
    8. The method of claim 1, further comprising measuring the maximum amount of power, the minimum amount of power, or the average amount of power used while decoding or displaying the media content to determine the decoding complexity information associated with the media content.
    9. The method of claim 3, further comprising receiving a media presentation descriptor, MPD, file for the second media content, the MPD file comprising the decoding complexity information associated with the second media content.
    10. The method of claim 1 or the device of claim 2, wherein the decoding complexity information is included in a Hypertext Transfer Protocol, HTTP, request sent to a dynamic adaptive streaming over HTTP, DASH, server.
    11. The device of claim 2, wherein the processor is further configured to:
      receive decoding complexity information associated with second media content,
      wherein the decoding complexity information associated with the second media content indicates at least one of power, time, or computing resources, used to decode or display the second media content at one or more other reference devices;
      determine to receive the second media content based on the decoding complexity information associated with the second media content; and
      send a request to the content providing device to send the second media content.
    12. The device of claim 11, wherein the processor is further configured to:
      receive decoding complexity information for each of a plurality of different types of the second media content;
      determine to receive a preferred type out of the plurality of different types of the second media content based on the decoding complexity information associated with each type of the second media content; and
      send a request to the content providing device to send the preferred type of the second media content.
    13. The device of claim 2, wherein the processor is further configured to measure an amount of power used while decoding or displaying the media content to determine the decoding complexity information associated with the media content.
    14. The device of claim 2, wherein the processor is further configured to measure a maximum amount of power, a minimum amount of power, or an average amount of power used while decoding or displaying the media content to determine the decoding complexity information associated with the media content.
    15. The device of claim 11, wherein the processor is further configured to receive a media presentation descriptor, MPD, file for the second media content, the MPD file comprising the decoding complexity information associated with the second media content.
    EP18211297.9A 2012-10-18 2013-10-18 Decoding complexity for mobile multimedia streaming Active EP3528450B1 (en)

    Applications Claiming Priority (3)

    Application Number Priority Date Filing Date Title
    US201261715466P 2012-10-18 2012-10-18
    PCT/US2013/065638 WO2014063026A1 (en) 2012-10-18 2013-10-18 Decoding complexity for mobile multimedia streaming
    EP13785748.8A EP2909990B1 (en) 2012-10-18 2013-10-18 Decoding complexity for mobile multimedia streaming

    Related Parent Applications (2)

    Application Number Title Priority Date Filing Date
    EP13785748.8A Division EP2909990B1 (en) 2012-10-18 2013-10-18 Decoding complexity for mobile multimedia streaming
    EP13785748.8A Division-Into EP2909990B1 (en) 2012-10-18 2013-10-18 Decoding complexity for mobile multimedia streaming

    Publications (2)

    Publication Number Publication Date
    EP3528450A1 EP3528450A1 (en) 2019-08-21
    EP3528450B1 true EP3528450B1 (en) 2021-12-01

    Family

    ID=49515526

    Family Applications (2)

    Application Number Title Priority Date Filing Date
    EP18211297.9A Active EP3528450B1 (en) 2012-10-18 2013-10-18 Decoding complexity for mobile multimedia streaming
    EP13785748.8A Active EP2909990B1 (en) 2012-10-18 2013-10-18 Decoding complexity for mobile multimedia streaming

    Family Applications After (1)

    Application Number Title Priority Date Filing Date
    EP13785748.8A Active EP2909990B1 (en) 2012-10-18 2013-10-18 Decoding complexity for mobile multimedia streaming

    Country Status (6)

    Country Link
    US (3) US10237321B2 (en)
    EP (2) EP3528450B1 (en)
    JP (2) JP6097840B2 (en)
    KR (2) KR101748505B1 (en)
    CN (2) CN109890071B (en)
    WO (1) WO2014063026A1 (en)

    Families Citing this family (10)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    KR101748505B1 (en) * 2012-10-18 2017-06-16 브이아이디 스케일, 인크. Decoding complexity for mobile multimedia streaming
    WO2015061794A1 (en) * 2013-10-25 2015-04-30 Futurewei Technologies, Inc. Associating representations in adaptive streaming
    US9860612B2 (en) * 2014-04-10 2018-01-02 Wowza Media Systems, LLC Manifest generation and segment packetization
    US11622137B2 (en) * 2015-02-11 2023-04-04 Vid Scale, Inc. Systems and methods for generalized HTTP headers in dynamic adaptive streaming over HTTP (DASH)
    US10412132B2 (en) * 2015-02-16 2019-09-10 Lg Electronics Inc. Broadcasting signal transmission device, broadcast signal reception device, broadcast signal transmission method, and broadcast signal reception method
    EP3345436B1 (en) * 2015-09-03 2020-09-02 Deutsche Telekom AG Method for an enhanced power consumption management of a user equipment using a mobile communication network, wherein different sets of radio resources are used in dependency of a power consumption information, mobile communication network, program comprising a computer readable program code, and computer program product
    US10034200B2 (en) * 2015-10-23 2018-07-24 Motorola Mobility Llc Iteratively transmitting random linear network encoded packets from multiple transmission nodes
    US10237681B2 (en) 2017-02-06 2019-03-19 Samsung Electronics Co., Ltd. Registration management method for terminal accessing 5G network on non-3GPP access
    KR102165255B1 (en) * 2017-02-06 2020-10-13 삼성전자 주식회사 Registration management method for terminal accessing 5g network on non-3gpp
    US11438670B2 (en) * 2020-11-20 2022-09-06 At&T Intellectual Property I, L.P. Video complexity detection for network traffic management

    Family Cites Families (51)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    KR19990072122A (en) * 1995-12-12 1999-09-27 바자니 크레이그 에스 Method and apparatus for real-time image transmission
    US7032236B1 (en) * 1998-02-20 2006-04-18 Thomson Licensing Multimedia system for processing program guides and associated multimedia objects
    US6285405B1 (en) * 1998-10-14 2001-09-04 Vtel Corporation System and method for synchronizing data signals
    US6470378B1 (en) * 1999-03-31 2002-10-22 Intel Corporation Dynamic content customization in a clientserver environment
    JP2002077377A (en) 2000-08-29 2002-03-15 Toshiba Corp Portable terminal and method of suppressing power consumption of the portable terminal
    JP2002132622A (en) 2000-10-19 2002-05-10 Seiko Epson Corp Device for supplying program and method of supplying program, portable terminal, network system and computer-readable medium
    US8352991B2 (en) * 2002-12-09 2013-01-08 Thomson Licensing System and method for modifying a video stream based on a client or network environment
    US7146185B2 (en) 2003-06-12 2006-12-05 Richard Lane Mobile station-centric method for managing bandwidth and QoS in error-prone system
    EP1642464B1 (en) * 2003-06-27 2008-10-15 Nxp B.V. Method of encoding for handheld apparatuses
    GB0400658D0 (en) 2004-01-13 2004-02-11 Koninkl Philips Electronics Nv Portable device for receiving media content
    RU2402885C2 (en) 2005-03-10 2010-10-27 Квэлкомм Инкорпорейтед Classification of content for processing multimedia data
    US20060235883A1 (en) * 2005-04-18 2006-10-19 Krebs Mark S Multimedia system for mobile client platforms
    US8031766B2 (en) * 2005-08-02 2011-10-04 Lsi Corporation Performance adaptive video encoding with concurrent decoding
    US7653386B2 (en) * 2006-05-05 2010-01-26 Broadcom Corporation Access point multi-level transmission power control supporting periodic high power level transmissions
    US8358704B2 (en) * 2006-04-04 2013-01-22 Qualcomm Incorporated Frame level multimedia decoding with frame information table
    US9380096B2 (en) * 2006-06-09 2016-06-28 Qualcomm Incorporated Enhanced block-request streaming system for handling low-latency streaming
    US7804435B2 (en) * 2006-08-31 2010-09-28 Ati Technologies Ulc Video decoder with reduced power consumption and method thereof
    CN101601288A (en) * 2006-12-08 2009-12-09 艾利森电话股份有限公司 Be used for receiver action and enforcement that efficient media is handled
    US20090017860A1 (en) * 2007-07-09 2009-01-15 Sony Ericsson Mobile Communications Ab Intelligent Power-Aware Downloading for Mobile Communication Devices
    US8165644B2 (en) * 2007-08-29 2012-04-24 Qualcomm Incorporated Server initiated power mode switching in portable communication devices
    US8948822B2 (en) * 2008-04-23 2015-02-03 Qualcomm Incorporated Coordinating power management functions in a multi-media device
    CN100571396C (en) 2008-04-25 2009-12-16 清华大学 Be used for the method for video coding under the decoding complex degree restriction
    US20090304085A1 (en) * 2008-06-04 2009-12-10 Novafora, Inc. Adaptive Deblocking Complexity Control Apparatus and Method
    EP3200423B1 (en) * 2008-06-06 2023-05-31 Amazon Technologies, Inc. Media host transmitting media stream with adapted bit rate
    US8838824B2 (en) * 2009-03-16 2014-09-16 Onmobile Global Limited Method and apparatus for delivery of adapted media
    US8538484B2 (en) 2009-08-14 2013-09-17 Google Inc. Providing a user with feedback regarding power consumption in battery-operated electronic devices
    US9124642B2 (en) * 2009-10-16 2015-09-01 Qualcomm Incorporated Adaptively streaming multimedia
    EP3145189B1 (en) * 2010-01-06 2019-06-19 Dolby Laboratories Licensing Corporation Complexity-adaptive scalable decoding and streaming for multi-layered video systems
    CN102771134B (en) * 2010-01-18 2016-04-13 瑞典爱立信有限公司 For supporting method and the device of play content
    US9716920B2 (en) * 2010-08-05 2017-07-25 Qualcomm Incorporated Signaling attributes for network-streamed video data
    US8689267B2 (en) * 2010-12-06 2014-04-01 Netflix, Inc. Variable bit video streams for adaptive streaming
    WO2012090962A1 (en) * 2010-12-28 2012-07-05 シャープ株式会社 Image decoding device, image encoding device, data structure of encoded data, arithmetic decoding device, and arithmetic encoding device
    CN102547272B (en) * 2010-12-30 2015-03-11 中国移动通信集团公司 Decoding method, device and terminal
    US9661104B2 (en) * 2011-02-07 2017-05-23 Blackberry Limited Method and apparatus for receiving presentation metadata
    JP5837621B2 (en) * 2011-02-11 2015-12-24 インターデイジタル パテント ホールディングス インコーポレイテッド Content distribution and reception method and apparatus
    US20120275502A1 (en) * 2011-04-26 2012-11-01 Fang-Yi Hsieh Apparatus for dynamically adjusting video decoding complexity, and associated method
    CN102209242B (en) * 2011-05-26 2012-11-07 大连理工大学 Optimal retractable video transmitting and decoding system based on power consumption model
    JP5626129B2 (en) 2011-06-06 2014-11-19 ソニー株式会社 Reception apparatus and method, information distribution apparatus and method, and information distribution system
    CN102857746B (en) * 2011-06-28 2017-03-29 中兴通讯股份有限公司 Loop filtering decoding method and device
    JP2013038766A (en) * 2011-07-12 2013-02-21 Sharp Corp Transmitter, transmitter control method, control program, and recording medium
    US20130042013A1 (en) * 2011-08-10 2013-02-14 Nokia Corporation Methods, apparatuses and computer program products for enabling live sharing of data
    US9503497B2 (en) * 2011-12-10 2016-11-22 LogMeIn, Inc. Optimizing transfer to a remote access client of a high definition (HD) host screen image
    ES2571863T3 (en) * 2012-01-17 2016-05-27 Telefonaktiebolaget LM Ericsson (publ) Management of reference picture lists
    CA2869132C (en) * 2012-04-06 2017-08-22 Vidyo, Inc. Level signaling for layered video coding
    JP5994367B2 (en) * 2012-04-27 2016-09-21 Fujitsu Limited Moving picture coding apparatus and moving picture coding method
    TWI558183B (en) * 2012-07-09 2016-11-11 Vid Scale, Inc. Power aware video decoding and streaming
    US8949440B2 (en) * 2012-07-19 2015-02-03 Alcatel Lucent System and method for adaptive rate determination in mobile video streaming
    US9357272B2 (en) * 2012-08-03 2016-05-31 Intel Corporation Device orientation capability exchange signaling and server adaptation of multimedia content in response to device orientation
    US8923880B2 (en) * 2012-09-28 2014-12-30 Intel Corporation Selective joinder of user equipment with wireless cell
    KR101748505B1 (en) * 2012-10-18 2017-06-16 Vid Scale, Inc. Decoding complexity for mobile multimedia streaming
    US11812116B2 (en) * 2019-10-16 2023-11-07 Charter Communications Operating, LLC Apparatus and methods for enhanced content control, consumption and delivery in a content distribution network

    Non-Patent Citations (1)

    * Cited by examiner, † Cited by third party
    Title
    None *

    Also Published As

    Publication number Publication date
    CN109890071A (en) 2019-06-14
    EP2909990A1 (en) 2015-08-26
    CN104871514B (en) 2019-04-05
    JP2017142803A (en) 2017-08-17
    JP2016503600A (en) 2016-02-04
    KR102059084B1 (en) 2019-12-24
    EP2909990B1 (en) 2019-01-16
    WO2014063026A1 (en) 2014-04-24
    KR20150074078A (en) 2015-07-01
    US10237321B2 (en) 2019-03-19
    WO2014063026A8 (en) 2015-04-16
    KR20170068616A (en) 2017-06-19
    JP6097840B2 (en) 2017-03-15
    CN109890071B (en) 2023-05-12
    JP6463788B2 (en) 2019-02-06
    US20150237103A1 (en) 2015-08-20
    US20220394075A1 (en) 2022-12-08
    CN104871514A (en) 2015-08-26
    KR101748505B1 (en) 2017-06-16
    EP3528450A1 (en) 2019-08-21
    US11368509B2 (en) 2022-06-21
    US20190158562A1 (en) 2019-05-23

    Similar Documents

    Publication Publication Date Title
    US20220394075A1 (en) Decoding complexity for mobile multimedia streaming
    JP7072592B2 (en) Quality driven streaming
    US9986003B2 (en) Mediating content delivery via one or more services
    AU2017202995B2 (en) System and method for adapting video communications
    US8782165B2 (en) Method and transcoding proxy for transcoding a media stream that is delivered to an end-user device over a communications network
    US20140019635A1 (en) Operation and architecture for DASH streaming clients
    EP2901751A1 (en) Energy-aware multimedia adaptation for streaming and conversational services
    US20140282811A1 (en) Media distribution network system with media burst transmission via an access network
    Liu et al. A measurement study of resource utilization in internet mobile streaming
    Mavromoustakis et al. Mobile video streaming resource management

    Legal Events

    Date Code Title Description
    PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

    Free format text: ORIGINAL CODE: 0009012

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

    17P Request for examination filed

    Effective date: 20181210

    AC Divisional application: reference to earlier application

    Ref document number: 2909990

    Country of ref document: EP

    Kind code of ref document: P

    AK Designated contracting states

    Kind code of ref document: A1

    Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

    GRAP Despatch of communication of intention to grant a patent

    Free format text: ORIGINAL CODE: EPIDOSNIGR1

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: GRANT OF PATENT IS INTENDED

    INTG Intention to grant announced

    Effective date: 20200624

    GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

    Free format text: ORIGINAL CODE: EPIDOSDIGR1

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

    INTC Intention to grant announced (deleted)

    GRAP Despatch of communication of intention to grant a patent

    Free format text: ORIGINAL CODE: EPIDOSNIGR1

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: GRANT OF PATENT IS INTENDED

    INTG Intention to grant announced

    Effective date: 20201208

    GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

    Free format text: ORIGINAL CODE: EPIDOSDIGR1

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

    INTC Intention to grant announced (deleted)

    GRAP Despatch of communication of intention to grant a patent

    Free format text: ORIGINAL CODE: EPIDOSNIGR1

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: GRANT OF PATENT IS INTENDED

    INTG Intention to grant announced

    Effective date: 20210527

    GRAS Grant fee paid

    Free format text: ORIGINAL CODE: EPIDOSNIGR3

    GRAA (expected) grant

    Free format text: ORIGINAL CODE: 0009210

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: THE PATENT HAS BEEN GRANTED

    REG Reference to a national code

    Ref country code: DE

    Ref legal event code: R079

    Ref document number: 602013080306

    Country of ref document: DE

    Free format text: PREVIOUS MAIN CLASS: H04L0029060000

    Ipc: H04L0065000000

    AC Divisional application: reference to earlier application

    Ref document number: 2909990

    Country of ref document: EP

    Kind code of ref document: P

    AK Designated contracting states

    Kind code of ref document: B1

    Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

    REG Reference to a national code

    Ref country code: GB

    Ref legal event code: FG4D

    REG Reference to a national code

    Ref country code: AT

    Ref legal event code: REF

    Ref document number: 1452866

    Country of ref document: AT

    Kind code of ref document: T

    Effective date: 20211215

    Ref country code: CH

    Ref legal event code: EP

    REG Reference to a national code

    Ref country code: IE

    Ref legal event code: FG4D

    REG Reference to a national code

    Ref country code: DE

    Ref legal event code: R096

    Ref document number: 602013080306

    Country of ref document: DE

    REG Reference to a national code

    Ref country code: NL

    Ref legal event code: FP

    REG Reference to a national code

    Ref country code: LT

    Ref legal event code: MG9D

    REG Reference to a national code

    Ref country code: AT

    Ref legal event code: MK05

    Ref document number: 1452866

    Country of ref document: AT

    Kind code of ref document: T

    Effective date: 20211201

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: RS

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: LT

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: FI

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: BG

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20220301

    Ref country code: AT

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: SE

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: PL

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: NO

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20220301

    Ref country code: LV

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: HR

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: GR

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20220302

    Ref country code: ES

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: SM

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: SK

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: RO

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: PT

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20220401

    Ref country code: EE

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: CZ

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    REG Reference to a national code

    Ref country code: DE

    Ref legal event code: R097

    Ref document number: 602013080306

    Country of ref document: DE

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: IS

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20220401

    PLBE No opposition filed within time limit

    Free format text: ORIGINAL CODE: 0009261

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: DK

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: AL

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    26N No opposition filed

    Effective date: 20220902

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: SI

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: MC

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    Ref country code: IT

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    REG Reference to a national code

    Ref country code: CH

    Ref legal event code: PL

    P01 Opt-out of the competence of the unified patent court (upc) registered

    Effective date: 20230514

    REG Reference to a national code

    Ref country code: BE

    Ref legal event code: MM

    Effective date: 20221031

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: LU

    Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

    Effective date: 20221018

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: LI

    Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

    Effective date: 20221031

    Ref country code: CH

    Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

    Effective date: 20221031

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: BE

    Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

    Effective date: 20221031

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: IE

    Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

    Effective date: 20221018

    PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

    Ref country code: NL

    Payment date: 20231026

    Year of fee payment: 11

    PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

    Ref country code: GB

    Payment date: 20231024

    Year of fee payment: 11

    PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

    Ref country code: FR

    Payment date: 20231026

    Year of fee payment: 11

    Ref country code: DE

    Payment date: 20231027

    Year of fee payment: 11

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: HU

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

    Effective date: 20131018

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: CY

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201

    PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

    Ref country code: MK

    Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

    Effective date: 20211201