WO2014138331A2 - Power aware adaptation for video streaming - Google Patents

Power aware adaptation for video streaming

Info

Publication number
WO2014138331A2
WO2014138331A2 (PCT/US2014/020999)
Authority
WO
WIPO (PCT)
Prior art keywords
pdr
complexity level
complexity
level
processor
Prior art date
Application number
PCT/US2014/020999
Other languages
English (en)
Other versions
WO2014138331A3 (fr)
Inventor
Yuwen He
Markus KUNSTNER
Yan Ye
Ralph Neff
Original Assignee
Interdigital Patent Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2015561634A priority Critical patent/JP2016517197A/ja
Priority to US14/773,078 priority patent/US10063921B2/en
Priority to KR1020157027754A priority patent/KR101879318B1/ko
Priority to EP18208512.6A priority patent/EP3499905B1/fr
Priority to CN201811154594.6A priority patent/CN109510999B/zh
Priority to KR1020187019821A priority patent/KR101991214B1/ko
Priority to EP22190891.6A priority patent/EP4156703A1/fr
Priority to EP14715156.7A priority patent/EP2965529A2/fr
Application filed by Interdigital Patent Holdings, Inc. filed Critical Interdigital Patent Holdings, Inc.
Priority to CN201480012503.9A priority patent/CN105191329B/zh
Publication of WO2014138331A2 publication Critical patent/WO2014138331A2/fr
Publication of WO2014138331A3 publication Critical patent/WO2014138331A3/fr
Priority to US16/038,881 priority patent/US10469904B2/en
Priority to US16/580,848 priority patent/US11153645B2/en
Priority to US17/503,809 priority patent/US11695991B2/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/23439Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements for generating different versions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/4424Monitoring of the internal components or processes of the client device, e.g. CPU or memory load, processing speed, timer, counter or percentage of the hard disk space used
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4436Power management, e.g. shutting down unused components of the receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4621Controlling the complexity of the content stream or additional data, e.g. lowering the resolution or bit-rate of the video stream for a mobile client with a small screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments

Definitions

  • SoC Systems on Chip
  • WiFi wireless communication technologies
  • a user may use a mobile device to access a service at any time and at any place.
  • Video streaming is an often requested video service for mobile platforms operating in a wireless network.
  • Power aware adaptation for a power aware video streaming system may be based on complexity information, which may be conveyed in a number of ways.
  • a complexity level of a data stream, such as a video data stream, may be selected as a function of a remaining battery power of a wireless transmit/receive unit (WTRU) and of a state set of a plurality of state sets that may be stored and/or managed by the WTRU.
  • WTRU wireless transmit/receive unit
  • state sets may correspond to, for example, different content sources and/or different complexity estimation algorithms and may be used to select the complexity level of the data stream.
  • the data stream may then be received at the selected complexity level.
  • the complexity level and/or a bitrate of the data stream may be adapted to accommodate, for example, the remaining battery power and/or other circumstances.
  • the adaptation may be customized according to the objectives of use cases.
  • a decoder device, such as a WTRU, may set a limit on the number of state sets it may track and may delete state sets when this limit may be exceeded.
  • the decoder device may merge state sets that may be similar, and/or may quantize complexity levels to power dissipation rate (PDR) states.
  • PDR power dissipation rate
  • a device for power aware streaming may be provided.
  • the device may include a processor that may perform a number of actions.
  • a complexity level for a data segment may be determined.
  • the complexity for the data segment may be received from a server or via a signal.
  • the data segment may be a segment of a video stream.
  • the processor may determine a complexity level for a data segment that may be used by a decoder.
  • the PDR for the complexity level may be based on a power that may be dissipated while decoding the data segment.
  • the PDR for the complexity level may be determined using a first battery level and a second battery level. A state, such as a PDR state, may be calculated using the PDR.
  • a second PDR may be determined for the complexity level.
  • the state such as the PDR state for the complexity level, may be calculated using a first PDR and a second PDR.
  • the PDR state may be calculated by calculating a weighted average of the first PDR and the second PDR.
  • the PDR state may be calculated by calculating a first weighted PDR by applying a first weight to the first PDR; calculating a second weighted PDR by applying a second weight to the second PDR; and setting the PDR state to the average of the first weighted PDR and the second weighted PDR.
  • An amount of power to play a video stream may be determined. For example, the length or duration of the video stream may be determined.
  • the power needed to play the video stream at a complexity level may be calculated by multiplying the length or duration of the video stream by the PDR (e.g. the PDR state) for the complexity level.
  • a remaining battery capacity may be determined.
  • the power that may be used to decode and/or play a video stream at a complexity level may be determined.
  • a determination may be made as to whether the power may exceed the remaining battery capacity. If the power may exceed the remaining battery capacity, another complexity level may be used to decode and play the video stream within the remaining battery capacity.
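The feasibility check described above can be sketched as follows. This is a minimal illustration, not the patented method: the function names, PDR values, and energy units are all assumptions introduced for the example.

```python
def power_to_play(duration_s, pdr_state):
    """Energy needed to decode/play the stream: duration times PDR."""
    return duration_s * pdr_state

def highest_feasible_level(duration_s, pdr_by_level, remaining_capacity):
    """Pick the highest complexity level whose estimated energy use fits
    within the remaining battery capacity, or None if none fits.
    pdr_by_level maps complexity level -> learned PDR state."""
    feasible = [lvl for lvl, pdr in pdr_by_level.items()
                if power_to_play(duration_s, pdr) <= remaining_capacity]
    return max(feasible) if feasible else None

# Example: 600 s of video, three learned PDR states, limited battery.
pdrs = {1: 0.5, 2: 0.9, 3: 1.4}   # energy units per second, per level
print(highest_feasible_level(600, pdrs, remaining_capacity=600))  # 2
```

Level 3 would need 840 units of energy, exceeding the remaining 600, so the check falls back to level 2 (540 units), which can still play the stream to completion.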
  • a device for power aware streaming may be provided.
  • the device may include a processor that may be configured to perform a number of actions. For example, a first complexity level may be determined for a data segment. The complexity level for a data segment may be determined by receiving the complexity level from a server or via a signal.
  • a computing load for a decoder may be determined.
  • a computing threshold may be determined. The computing threshold may be set by a user. It may be determined that the computing load may be above or below the computing threshold.
  • a second complexity level may be selected using the computing load.
  • a bit rate may be determined for the data segment.
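A hedged sketch of the load-based selection just described: when the decoder's computing load exceeds a threshold, step the complexity level down; with headroom, step it up. The threshold semantics and level bounds are assumptions made for illustration.

```python
def adapt_level_for_load(current_level, cpu_load, load_threshold,
                         min_level=1, max_level=5):
    """Step the complexity level down when the decoder's computing load
    exceeds the threshold, and up when there is headroom, staying
    within [min_level, max_level]."""
    if cpu_load > load_threshold:
        return max(min_level, current_level - 1)
    return min(max_level, current_level + 1)

print(adapt_level_for_load(3, cpu_load=0.95, load_threshold=0.8))  # 2
print(adapt_level_for_load(3, cpu_load=0.40, load_threshold=0.8))  # 4
```

Moving one level at a time keeps quality transitions gradual, in the same spirit as the gradual switch-down behavior described later in this document.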
  • FIG. 1 depicts an example HTTP-based video streaming system
  • FIG. 2 depicts an example block-based video encoder.
  • FIG. 3 depicts an example block-based video decoder.
  • FIG. 4 depicts an example of power usage in an example video playback scenario.
  • FIG. 5 depicts an example power aware streaming system.
  • FIG. 6 depicts example contents that may be generated at the server side by considering resolution, bitrate, and complexity.
  • FIG. 7 depicts an example of a complexity aware media presentation description (MPD) file.
  • FIG. 8 depicts an example process that may be implemented by power adaptation logic for a quality mode.
  • FIG. 9 depicts an example process that may be implemented by power adaptation logic for a multitasking environment.
  • FIG. 10 depicts an example system in which a decoder device may stream media content from multiple different content sources.
  • FIG. 11 depicts an example of quantizing complexity levels.
  • FIG. 12 depicts an example of bias in computation of power dissipation state using a reduced state set.
  • FIG. 13 depicts an example of interpolation for reducing or eliminating bias when updating power dissipation states of a reduced state set.
  • FIG. 14A is a system diagram of an example communications system in which one or more disclosed embodiments may be implemented.
  • FIG. 14B is a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in FIG. 14A.
  • FIG. 14C is a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in FIG. 14A.
  • FIG. 14D is a system diagram of another example radio access network and another example core network that may be used within the communications system illustrated in FIG. 14A.
  • FIG. 14E is a system diagram of another example radio access network and another example core network that may be used within the communications system illustrated in FIG. 14A.
  • power adaptation may be performed at the client side. For example, power adaptation may be applied in a power aware streaming system.
  • Power dissipation states may be tracked, maintained, and used by a decoder device, such as a WTRU.
  • content providers or content sources may use different algorithms to estimate complexity of content, which may be related to the power dissipation associated with decoding the content.
  • a decoder device may recognize and adapt to these different algorithms,
  • Video streaming may be a requested video service for mobile platforms that may operate in a wireless network.
  • network conditions may vary, display sizes may vary, processing capabilities may vary, and battery life may be limited.
  • DASH dynamic adaptive streaming over HTTP
  • EdgeCast and Level 3 use Smooth Streaming from Microsoft
  • Akamai and CloudFront use Dynamic HTTP Streaming from Adobe
  • iOS devices may support Apple's HTTP Live Streaming.
  • media may be organized into segments that may be decodable.
  • the content may be encoded at different qualities or resolutions and may be chopped into segments.
  • the information of those contents, such as bitrate, byte range, and URL, may be described in an XML-based manifest file (MF) called a Media Presentation Description (MPD).
  • MF XML-based manifest file
  • MPD Media Presentation Description
  • the client may access this content through HTTP and may select the segments that may fulfill its bandwidth or resolution criteria according to the MPD file.
  • FIG. 1 depicts an example HTTP-based video streaming system, such as system 200.
  • Captured content may be compressed and may be chopped into small segments. For example, segments may be between 2 and 10 seconds long in some streaming systems.
  • the segments may be stored in one or more HTTP streaming servers and/or distributed via content delivery network (CDN).
  • CDN content delivery network
  • the client may request and receive an MPD file from an HTTP streaming server at 202.
  • the client may decide which segments to request according to its capabilities, such as its display resolution and the available bandwidth.
  • the client may request the appropriate segments from a server at 202, which may send the video segments to the client per its request.
  • the segments transmitted via HTTP may be cached in an HTTP cache server or servers at 204. This may allow cached segments to be reused for other users, and may allow the system to provide streaming service on a large scale.
  • FIG. 2 is a block diagram illustrating an example block-based single layer video encoder 300 that may be used to generate bitstreams for a streaming system, such as streaming system 200.
  • a single layer encoder may employ, for example, spatial prediction (which may be referred to as intra prediction) at 302 and/or temporal prediction (which may be referred to as inter prediction and/or motion compensated prediction) at 304 to predict the input video signal.
  • the encoder may also have mode decision logic 306 that may choose a form (e.g., the most suitable form) of prediction, for example, based on certain criteria, such as a combination of rate and distortion considerations.
  • the encoder may transform at 308 and quantize at 310 the prediction residual (which may be the difference signal between the input signal and the prediction signal).
  • the quantized residual, together with the mode information (e.g., intra or inter prediction) and prediction information (such as motion vectors, reference picture indexes, intra prediction modes, etc.), may be further compressed at an entropy coder 312 and packed into an output video bitstream 314. As shown in FIG. 2, the encoder may also generate a reconstructed video signal.
  • the reconstructed video signal may go through a loop filter process 322 (for example, a deblocking filter, Sample Adaptive Offsets, or Adaptive Loop Filters) and may be stored in a reference picture store 324 that may be used to predict a video signal.
  • FIG. 3 is a block diagram of an example block-based single layer decoder 400 that may receive a video bitstream produced by an encoder, such as encoder 300, and may reconstruct the video signal to be displayed.
  • the bitstream may be parsed by an entropy decoder 402.
  • the residual coefficients may be inverse quantized at 404 and inverse transformed at 406 to obtain a reconstructed residual.
  • the coding mode and prediction information may be used to obtain the prediction signal using spatial prediction 408 and/or temporal prediction 410.
  • the prediction signal and the reconstructed residual may be added together at 412 to get a reconstructed video.
  • the reconstructed video may go through loop filtering at 414, may be stored in a reference picture store 416, may be displayed at 418, and/or may be used to decode a video signal.
  • a number of video coding standards that may use a block-based encoder, such as encoder 300, and a decoder, such as decoder 400, include MPEG-2 Video, MPEG4 Visual, H.264/AVC, and HEVC.
  • the power endurance of mobile devices may affect application performance.
  • Power usage for a reference mobile platform may be analyzed under various conditions. For example, power consumption of components, e.g., processor, display, internal and external memory, and radio (e.g., GSM/3G/4G/WiFi), may be evaluated. In different applications, power consumption percentages among those parts may be different.
  • video playback may be an intensive power consumption application as it may involve both computation and memory access. Moreover, video playback may display the pictures with sufficient luminance levels, which may also consume much power.
  • FIG. 4 depicts an example of power usage in an example video playback scenario in which CPU, graphics, and display (backlight and LCD) parts may consume power.
  • Power saving approaches may adaptively switch working modes according to system status. For example, when the system status may be idle, the device processor may transition to a low power state, keeping requested modules working to reduce power consumption. Other power saving approaches may switch the processor's frequency adaptively. For example, when the processor's frequency decreases, the voltage of a power supply may also decrease to reduce power consumption.
  • the power dissipated by a chip may be formulated as:
  • P = C × V² × f
  • where C is the capacitance, V is the voltage, and f is the switching frequency.
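A quick numeric illustration of the P = C × V² × f relationship (the capacitance, voltage, and frequency values below are arbitrary): because power scales with the square of voltage and linearly with frequency, halving both cuts dynamic power by a factor of eight.

```python
def dynamic_power(c, v, f):
    """Dynamic power dissipation of a chip: P = C * V^2 * f."""
    return c * v * v * f

p_full = dynamic_power(1e-9, 1.2, 2e9)   # full voltage and frequency
p_half = dynamic_power(1e-9, 0.6, 1e9)   # half voltage, half frequency
print(p_full / p_half)                   # ratio is 8: 2^2 (voltage) * 2 (frequency)
```

This is why lowering the supply voltage together with the clock frequency, as described in the surrounding text, saves more power than lowering frequency alone.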
  • the frequency of the processor may be configured for different tasks. For example, because decoding complexity may differ from picture to picture, a processor's clock frequency may be reduced when decoding pictures that are easier to decode.
  • An encoded picture size may be used to estimate a picture's decoding time.
  • the processor frequency may be decreased, for example, when the estimated picture decoding time may be less than the picture duration.
  • power may be allocated among different modules, such as the display, the decoding module, etc. For example, a client device may decrease the luminance of the display and/or skip some frame decoding if the system determines that the remaining power may not be sufficient to play the remaining video.
  • Power saving methods may be client side technologies and may prevent full-length playback even at the cost of frame dropping and/or motion jerkiness. Power saving issues for mobile devices may be addressed based on power aware computing. For example, the server and the client may collaborate. For example, the server may prepare the video content with power consumption considerations, and the client may subscribe to presentations with different complexities according to the available bandwidth, remaining battery, and remaining video playback time.
  • Clients may try to decode and playback video after they receive the video segments from the server.
  • decoding and playback may occupy a portion of the processor's resources to meet a time criterion.
  • Methods may be used to prevent the processor's resources from becoming burdened, which may prevent the client from becoming unable to play back video smoothly when the processor struggles to decode it in real time.
  • methods may be used to prevent the client from dropping frames or presenting asynchronous audio and video.
  • methods may allow an improvement of the system's response time in a multi-task oriented environment.
  • Power aware adaptation methods may be used by a power aware adaptation control module in a power aware video streaming system. These power aware adaptation methods may achieve power savings and/or better processor load balancing. Power adaptation may be implemented at the client side, which may be applied in power aware streaming systems.
  • FIG. 5 is a block diagram illustrating an example power aware streaming system 600
  • the power aware streaming system 600 may include a power aware video encoder 602, a complexity aware MPD 800, a power aware adaptation control module 604, and/or a power sensing module 606.
  • the power aware adaptation control module 604 and/or the power sensing module 606 may be implemented in, for example, a power aware streaming client 608.
  • a power aware video encoder 602 may generate various versions of video segments with different complexity levels.
  • FIG. 6 depicts a graph comparing example contents that may be generated at the server side by considering resolution, bitrate, and complexity. In the example shown in FIG. 6, a number of complexity levels may be available for a bitrate, such as at 702, 704 and 706.
  • the complexity level information may be added in an MPD file, media description file, or other types of manifest file that may be signaled to a client.
  • FIG. 7 illustrates an example of a complexity aware MPD file 800.
  • a description 802 of a representation element may include a complexityLevel attribute 804 that may indicate a bitstream complexity level.
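A hedged sketch of how a client might read such an attribute from a manifest. The XML fragment below is invented for illustration only; real DASH MPDs use a namespaced schema, and the `complexityLevel` attribute shown here simply mirrors the attribute named in the description above.

```python
import xml.etree.ElementTree as ET

# Invented MPD-like fragment; not the actual DASH MPD schema.
mpd = """
<MPD>
  <Representation id="1" bandwidth="500000" complexityLevel="2"/>
  <Representation id="2" bandwidth="900000" complexityLevel="4"/>
</MPD>
"""

root = ET.fromstring(mpd)
# Collect each representation's advertised complexity level.
levels = {r.get("id"): int(r.get("complexityLevel"))
          for r in root.iter("Representation")}
print(levels)  # {'1': 2, '2': 4}
```

With the levels in hand, the adaptation logic can match them against the learned power dissipation rates when choosing which representation to request.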
  • the MPD file or other types of manifest files may be used as an example in this disclosure to carry the complexity information.
  • the complexity information may be embedded in video bitstreams using high level parameter sets such as Video Parameter Set (VPS), Sequence Parameter Set (SPS), and the like.
  • Complexity information may be conveyed in an MPEG Media Transport (MMT).
  • MMT MPEG Media Transport
  • complexity information may be coded as a property element in an MMT Asset, or it may be conveyed as an MMT message to clients.
  • the Green MPEG Call for Proposals (CfP) may define metadata files that may be used to carry useful information to reduce power consumption on the devices. Such metadata files may be used to convey the complexity information.
  • a power aware adaptation control at the client side may adaptively choose segments for a receiver according to information that may be obtained from bandwidth sensing, power sensing, and/or CPU load status.
  • the adaptation logic may promote full-length playback using the remaining battery power on a device; achieving acceptable (e.g., improved) video quality; satisfying the currently available bandwidth; and/or achieving CPU load balancing.
  • power adaptation logic 604 may be customized based on objectives of different applications.
  • the power adaptation logic 604 may be used in a variety of use cases such as if the user configures the client to a quality mode. If the user prefers quality over power, the power adaptation logic 604 may prioritize full-length playback with the remaining power and the best available video quality. If the user configures the client to a load balance mode, e.g., in a multitasking environment, the power adaptation logic 604 may prioritize full-length playback within the processor load budget. If the user configures the client to a power saving mode, e.g., if the user prefers to conserve the remaining battery power, the power adaptation logic 604 may prioritize full-length playback within the processor load budget and/or consuming as little power as possible.
  • a power aware client 608 may measure a battery level (BL) and/or may determine battery usage.
  • the power sensing module 606 may use an interface that may provide the current battery level.
  • the power sensing module 606 may periodically use a power management interface that may be provided, for example, by an operating system such as the Android operating system, which may indicate a remaining percentage of an entire battery capacity.
  • the power sensing module 606 may use an interface that reports battery usage by an application or a process.
  • the Android operating system may have an interface that may provide power and processor usage statistics for one or more applications (e.g., an application) and/or the power usage for the display.
  • the power sensing module 606 may determine power usage due to video decoding and display in a multi-task or multi-process environment.
  • the power aware client 608 may then apply the power adaptation logic 604 to ensure efficient power usage.
  • Adaptation may involve updating a power dissipation rate (PDR) and adapting a complexity level (CL) to satisfy configurable objectives.
  • PDR power dissipation rate
  • CL complexity level
  • the power dissipation rate (PDR) for a complexity level may be measured and updated periodically, for example, according to the equation:
  • PDR(c_i, t_i) = α × (BL_{i-1} − BL_i) / (t_i − t_{i-1}) + (1 − α) × PDR(c_i, t_{i-1})  (1)
  • k is the complexity level
  • t_i is the time of the i-th segment
  • CL_MIN and CL_MAX may be the minimum and maximum complexity levels, respectively, in the system
  • c_i may be the complexity level of the i-th segment
  • PDR(c_i, t_i) may be the PDR value of complexity level c_i at time t_i
  • BL_i may be the remaining battery level at time t_i
  • α may be the factor to control the updating speed, which may be set to, for example, 0.6, in which case a PDR value updated using equation (1) may be a weighted combination of 0.6 times a current PDR observation and 0.4 times a previous PDR state.
  • the PDR values of a complexity value (e.g., all complexity values) at the beginning of the video session may be initialized to 0. The PDR value of complexity levels may be updated. If the battery level does not change, then the PDR statistics may be kept. Otherwise, the PDR statistics may be updated accordingly.
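The update rule above can be sketched as follows. This is a minimal illustration under assumed inputs: the battery readings, times, and α value are invented, and battery level is taken in percent so the PDR comes out in percent per second.

```python
def pdr_update(pdr_prev, bl_prev, bl_now, t_prev, t_now, alpha=0.6):
    """Update the PDR state for the complexity level of the segment just
    played: blend the observed dissipation rate (battery drop over
    elapsed time) with the previous state, as in equation (1)."""
    observed = (bl_prev - bl_now) / (t_now - t_prev)
    return alpha * observed + (1 - alpha) * pdr_prev

# Battery fell from 80% to 79% over a 10 s segment -> 0.1 %/s observed.
new_state = pdr_update(pdr_prev=0.08, bl_prev=80.0, bl_now=79.0,
                       t_prev=0.0, t_now=10.0)
print(new_state)  # 0.6*0.1 + 0.4*0.08 ≈ 0.092 %/s
```

Note the guard described in the text: if the reported battery level has not changed between samples, the observation is skipped and the previous PDR statistics are kept.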
• the amount of power requested to play back the remaining video at complexity level CL_i, denoted as PC(CL_i, t_i), may be estimated as:
• PC(CL_i, t_i) = PDR(CL_i, t_i) × (T − t_i)   (2)
where T may be the total duration of the video.
  • the client may decide whether to switch up or down or to keep the current complexity level according to the customized objectives.
• the power adaptation logic 604 may try to achieve full-length video playback before the battery power may be exhausted.
• FIG. 8 depicts an example process 900 that may be implemented by the power adaptation logic 604 (shown in FIG. 5) for a quality mode, which may provide the best video quality given the remaining battery level and the available bandwidth.
  • the bandwidth statistics and the power statistics for a current complexity level (CL) may be updated.
  • the power consumption (PC) may be estimated for a complexity level or complexity levels for the remaining video playback.
• it may be determined whether the power consumption for the current complexity level may be less than the remaining battery life (BL_i); whether the current complexity level may be less than the maximum complexity level; and whether the power consumption for the next higher complexity level may be less than the remaining battery life, e.g., whether it may be feasible to play the remaining video at the next higher complexity level given the remaining battery life. This may prevent the client from switching up and down too frequently and may promote smoother decoded video quality. This decision may be made based on the PDR learned from previous statistics during playback. If these conditions are met, the decoder may be notified at 910 to switch to a normal decoding mode, and the complexity level may be switched upward. The segment may be downloaded at the adjusted complexity level and selected bitrate at 912.
  • the complexity level may be switched downward at 916.
  • the complexity level may be switched downward by switching to a next lower complexity level. This approach to switching down the complexity level may enable the complexity level to be switched down gradually to allow a smoother quality transition.
• the complexity level may also be switched downward by searching for a complexity level for which the remaining power may be enough to playback the remaining video. This may be done using, for example, the following in Table 1:
• This approach may switch the complexity level downward more quickly to save more power. However, if this switching method erroneously decides to switch down too many levels, e.g., if the PDR estimation given by equation (1) may not be accurate enough, the power adaptation logic may decide later to switch the complexity level up again. Regardless of how the complexity level may be switched downward, the segment may then be downloaded at the adjusted complexity level and selected bitrate at 912.
• Similarly, gradual or more aggressive logics may also be applied when the client decides whether or not to switch up the complexity level. For example, a more gradual switch-up method may switch up by one complexity level at a time, as shown in FIG. 8, whereas a more aggressive switch-up method may switch up to the next highest complexity level that may be feasible. A more aggressive method may not only cause larger video quality changes due to more frequent switching, as in the case of switching the complexity level downward, but may also run a higher risk of using up the battery power before full-length playback may be achieved.
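A search of the more aggressive kind described above (the contents of Table 1 are not reproduced in this text) might be sketched as follows; this is an assumed implementation, with illustrative names, that finds the highest complexity level whose estimated power consumption fits the remaining battery.

```python
def switch_down_aggressive(cl_current, pdr, remaining_time, battery_level, cl_min=1):
    """Search downward for the highest complexity level whose estimated
    power consumption, PC = PDR(cl) * remaining_time per equation (2),
    fits within the remaining battery. Falls back to cl_min if none fits."""
    for cl in range(cl_current - 1, cl_min - 1, -1):
        if pdr.get(cl, 0.0) * remaining_time <= battery_level:
            return cl
    return cl_min
```

A gradual variant would instead simply return `cl_current - 1`, trading power savings for a smoother quality transition, as the text notes.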
• the power consumption for the current complexity level may be greater than the remaining battery life, e.g., there may be insufficient battery power to playback the video at the current complexity level, or the current complexity level may not be higher than the minimum complexity level (e.g., the lowest complexity level available from the server is already being used).
• it may be determined whether the power consumption for the current (e.g., minimum) complexity level may be greater than the remaining battery life and/or whether the bitrate (BR) may be the minimum bitrate. If the lowest bitrate may be reached and the power consumption for the current (e.g., minimum) complexity level may be greater than the remaining battery life, e.g., there may be insufficient battery power to playback the video at the current (e.g., minimum) complexity level, at 922 the current (e.g., minimum) complexity level may be kept and the decoder may be switched to a lower power decoding mode.
• in-loop filtering, such as deblocking and/or SAO in HEVC, may be bypassed for non-reference frames.
  • the segment may then be downloaded at the complexity level and bitrate at 912.
  • the bitrate may be switched to a lower bitrate, and the complexity level may be set to a new complexity level at the lower bitrate, for example, a higher complexity level or the highest complexity level at the lower bitrate.
• the segment may be downloaded at the new (e.g., higher or highest) complexity level and lower bitrate at 912.
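The quality-mode branching of process 900 described above might be sketched as a single decision function; the return convention and names are assumptions for illustration, not part of the described process.

```python
def quality_mode_decision(cl, br, pdr, remaining_time, battery,
                          cl_min, cl_max, br_min):
    """One adaptation decision of the quality mode (FIG. 8), as a sketch.
    Returns (complexity level, bitrate, decoder mode)."""
    pc = lambda level: pdr.get(level, 0.0) * remaining_time  # equation (2)
    if pc(cl) < battery:
        # feasible at the current level; switch up if the next level also fits
        if cl < cl_max and pc(cl + 1) < battery:
            return cl + 1, br, "normal"
        return cl, br, "normal"
    if cl > cl_min:                      # infeasible: step down gradually
        return cl - 1, br, "normal"
    if br > br_min:                      # already at CL_MIN: lower the bitrate
        return cl_max, br - 1, "normal"  # e.g., highest CL at the lower bitrate
    return cl, br, "low-power"           # last resort: low power decoding mode
```

The segment would then be downloaded at the returned complexity level and bitrate, corresponding to step 912.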
• FIG. 9 depicts an example process 1000 that may be implemented by power adaptation logic, such as the power adaptation logic 604 (shown in FIG. 5), for a multitasking environment.
  • the process 1000 may involve a number of decisions and actions that may not be performed in the process 900.
• the process 1000 may involve decisions and actions that may be used by the client to balance the system load while trying to provide full-length playback. Before checking remaining power, at 1032, the client may check whether the system may be overloaded by measuring average CPU usage during a short period.
• the system may switch down the complexity level at 1034 until the complexity level reaches a minimum bound CL_MIN. If the system may still be overloaded at CL_MIN, the client may switch to a lower bitrate representation and may set the complexity level to a new complexity level (e.g., a higher or the highest complexity level) at 1036 if the current bitrate may be greater than the lowest bitrate (BR_MIN). If the client reaches the lowest bitrate and the lowest complexity level, it may notify the decoder to apply a low complexity decoding mode at 1038 to reduce operations, for example, by skipping some in-loop filtering operations for non-reference frames.
  • the system processor may not be overloaded, then it may apply the same power aware adaptation logic as in the process 900 to provide the full-length playback.
• the client may switch down bit rates before it switches down complexity levels. For example, the client may switch to a lower bit rate before it may reach the minimum complexity level CL_MIN, or the client may ignore complexity levels and only switch to lower bit rates.
  • the client may choose the lowest complexity level at the lowest bitrate to minimize the power consumption during video playback. If the remaining power may not be enough for full-length playback, the client may notify the decoder to apply additional power saving decoding modes, such as those described herein.
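The overload handling that precedes the power checks in process 1000 might be sketched as follows; the overload threshold of 0.9 and the names are assumptions for illustration.

```python
def multitask_adaptation(cpu_usage, cl, br, cl_min, br_min,
                         overload_threshold=0.9):
    """Overload check of process 1000 (sketch). `cpu_usage` is the average
    CPU load measured over a short period. Returns (cl, br, decoder mode)."""
    if cpu_usage < overload_threshold:
        return cl, br, "normal"          # not overloaded: fall through to the
                                         # quality-mode power logic (process 900)
    if cl > cl_min:
        return cl - 1, br, "normal"      # shed load by lowering complexity
    if br > br_min:
        # switch to a lower bitrate; the complexity level could then be
        # re-selected (e.g., a higher or the highest level) at that bitrate
        return cl, br - 1, "normal"
    return cl, br, "low-complexity"      # last resort: low complexity decoding
```

When the system is not overloaded, the client would proceed with the same power-aware adaptation logic as in process 900.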
  • Adaptation decisions may be based on factors such as the remaining battery level, the availability of multimedia content segments at different complexity levels and bitrates, and/or power dissipation states tracked by the decoder device.
  • the decoder device may track power dissipation states according to equation (1).
• the resulting power dissipation states PDR(k, t_j) may provide a device-specific understanding of the power dissipation rate that may be expected for a number of complexity levels k.
  • Power dissipation states may be tracked, maintained, and used by a decoder device, such as a WTRU.
  • Different content providers or content sources may use different algorithms to estimate complexity.
  • Signaled complexity level values from one content source may map to different power dissipation rates than signaled complexity levels from a different content source.
  • a decoder device may recognize this and may adapt to the use of different algorithms to estimate complexity.
• a decoder device may have limited memory for state tracking. A decoder device may manage the state memory while still accurately tracking power dissipation states.
• FIG. 10 depicts an example system 1100 in which a decoder device 1102, e.g., a WTRU, may stream media content from multiple different content sources 1104, 1106, 1108, 1110.
• These content sources may be different content web sites, content providers, content services, etc.
• a first source 1104 may be YouTube
• a second source 1106 may be CNN Video.
  • the different content sources may provide multiple versions of content at different complexity levels and/or different bit rates, as described herein. Complexity level estimates may be provided for the available content versions.
  • the media content from different content sources may have been encoded using different encoding tools.
• the source 1104 may provide content encoded using a first video encoder
  • the source 1106 may provide content encoded using a second video encoder.
  • the different encoding tools may be provided by different encoder vendors, for example.
  • the complexity estimation algorithm may be standardized across content sources.
  • a decoder device may track power dissipation states based on its own observations of battery usage when playing back video at the various signaled complexity levels. This may allow the decoder to interpret the complexity levels for different decoding resources in light of the power dissipation performance.
  • the complexity estimation algorithm used by the content sources may not be standardized across content sources.
  • a content source may customize a complexity estimation algorithm based on its own requests, and the complexity estimation algorithm may also be changed and improved over time.
  • the decoder devices may adapt to changes in the complexity estimation algorithm,
  • Different content sources may provide complexity level estimates generated using different complexity estimation algorithms.
  • the complexity level estimates provided by one content source may not be compatible with the complexity level estimates provided by a different content source.
  • a first content source may provide complexity level estimates using integers from 1 to 10
  • a second content source may provide complexity level estimates using integers from 1 to 100.
• different value ranges or complexity scales may make the estimates incompatible
  • other algorithm differences may also render the complexity estimates from one content source incompatible with the complexity estimates from another content source.
  • one content source may give a particular weighting to addition operations and a different weighting to multiplication operations when creating the complexity estimation value .
  • Another content source may factor in the availability of specialized decoding hardware to the complexity estimate.
• a complexity level value (e.g., "ComplexityLevel=10") signaled by a first content source may correspond to one power dissipation rate, and the same complexity level value signaled by a second content source may correspond to a different power dissipation rate, from the point of view of a decoder device.
• the decoder device 1102 may track multiple state sets that may correspond to different complexity estimation algorithms that may be used by different content sources. As illustrated in FIG. 10, the decoder device 1102 may have multiple state sets 1112, and may have a state manager component 1114 that may be used to create and manage the multiple state sets 1112.
• a state set of the multiple state sets may comprise a set of power dissipation states computed from observations of power dissipation while decoding various segments of video at different advertised complexity levels.
• a state set may have power dissipation states PDR(k, t_j) as computed using equation (1).
• a state set may be constructed from power dissipation data observed during a single streaming session, or from data observed across multiple streaming sessions. For purposes of brevity, the time index t_j may be omitted from the notation PDR(k, t_j); accordingly, the notation PDR(k) may represent the most recent power dissipation rate state for complexity level k.
  • State sets may correspond to different content sources.
  • a decoder device may maintain a different state set for a content source.
  • Different state sets may correspond to different content web sites, content providers, or content services.
• the decoder device may distinguish content sources using a domain name (e.g., youtube.com vs. cnn.com)
  • the state set may be maintained across multiple streaming sessions, so that compatible observations may be collected to the proper state set.
  • This technique may resolve issues of data sparsity, and may allow the decoder device to begin a streaming session with an already developed state model.
• the decoder device 1102 may have a first streaming session in which content may be streamed from YouTube.
• the state manager 1114 may create and initialize a new state set, and the state set may be progressively updated based on observations of the power dissipation rate in light of the complexity level labels provided by YouTube for the various content segments.
  • the power dissipation states may be updated during the first streaming session using Equation (1), for example.
  • the decoder device may end the first streaming session and may engage in other streaming sessions with other content sites resulting in additional (e.g., separate) state sets being created or updated.
  • the decoder device may have a second streaming session in which content may be streamed from YouTube.
  • the decoder device may recognize that a state set for YouTube may exist. For example, the decoder device may match the domain name (youtube.com) from the new streaming session to the same domain name associated with an existing state set. Based on this recognition/matching, the decoder device may utilize the existing YouTube state set for the second streaming session. For example, the decoder device may begin the second streaming session with the developed power dissipation states from the existing state set. The decoder device may use these previously stored power dissipation states to drive adaptation decisions from the beginning of the second streaming session. The decoder device may progressively update the existing state set using power dissipation rate observations from the second streaming session, e.g., based on equation (1).
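The per-source state set management described above might be sketched as follows; the class name and structure are illustrative assumptions. The key point is that a state set is looked up by domain name and reused across streaming sessions, with a new set created only for a first-seen source.

```python
from urllib.parse import urlparse

class StateManager:
    """Sketch of a state manager keeping one PDR state set per content
    source, keyed by domain name (e.g., youtube.com vs. cnn.com)."""
    def __init__(self):
        self.state_sets = {}  # domain -> {complexity level: PDR state}

    def get_or_create(self, url):
        domain = urlparse(url).hostname
        # reuse an existing state set across streaming sessions, or
        # create and initialize a new one for a first-seen source
        return self.state_sets.setdefault(domain, {})
```

A second session with the same domain would then begin with the already developed power dissipation states, which could drive adaptation decisions from the start of the session.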
  • State sets may correspond to different complexity estimation algorithms, regardless of the content source.
• a first content source may provide content encoded using a third party encoding tool, such as the Acme Encoder v1.0, and this encoding tool may have a built-in algorithm to estimate the complexity of the video segments it encodes.
  • a second content source may provide different content encoded using the same third party encoding tool.
  • the first content source and the second content source (e.g., two different content sources) may provide complexity level estimates produced by the same complexity estimation algorithm, such as the complexity estimation algorithm embedded in the third party encoding tool.
• an identifier, e.g., a complexity estimation identifier (CEID), may be used to distinguish different complexity estimation algorithms.
  • the decoding device may create and/or maintain a different state set for each different complexity estimation algorithm it may encounter, regardless of the content source.
  • the CEID may be, for example, an identification string or number that identifies a complexity estimation algorithm.
  • the CEID may be assigned by a registration authority.
  • the CEID may be created, assigned, or randomly generated by the provider of the complexity estimation algorithm.
• the provider may generate an identification string, e.g., "Acme-CEID-Version-1-0-5", or a Globally Unique Identifier (GUID) number, e.g., "138294578321", to distinguish complexity estimates provided by a particular version of Acme video encoding software.
  • Such a GUID number may be random.
  • the provider may provide a different identification string or a different random number to distinguish complexity estimates provided by a different version of its software.
• the provider may make the CEID available to the content source that uses the encoding software so that the content source may signal the CEID to the decoder device together with the complexity level estimate values. This may be done in an automated way.
  • a version of the encoder software may be aware of the CEID corresponding to the complexity level estimation algorithm embedded in the software, and the software may output the CEID along with the complexity level estimates when encoding content.
  • the software may be capable of generating raw data for inclusion in an MPD file advertising the encoded content, or may be capable of generating the MPD itself.
  • the data or the MPD produced by the software may include the CEID in addition to the complexity level estimates.
  • a decoder device may recognize the same CEID when streaming content from different content sources, may recognize that the complexity level estimate values associated with such content were produced by the same complexity estimation algorithm, and may utilize the same state set for content with the same advertised CEID.
  • a decoder device may recognize that some content available from a content source may have been produced using a first complexity estimation algorithm corresponding to a first CEID, and that other content available from the same content source may have been produced using a second complexity estimation algorithm corresponding to a second CEID.
  • the decoder device may use two different state sets to track power dissipation rates for content corresponding to the two different CEIDs. For example, a content site may update its encoding software and/or its complexity estimation algorithm on a certain date, such that content encoded before the date may be associated with one CEID, and content encoded after the date may be associated with a different CEID.
  • a decoder device may recognize the two distinct CEIDs and may maintain two different state sets corresponding to the two different CEIDs.
• a decoder device may utilize a state set when it encounters a CEID that it recognizes as associated with the state set. If the decoder device does not have a state set corresponding to a CEID it encounters while streaming, then the decoder device may create a state set associated with the newly encountered CEID.
• a decoder device, e.g., a state manager on a decoder device, may have and/or may use management functions to reduce or limit the number of state sets that the decoder device tracks.
• the management functions may detect when a state set may not have been used for a time period (e.g., unused for two weeks), or when a state set has been used seldom (e.g., used two or fewer times in a three-month period). This may be the case, for example, if the content source may not be a popular content source or if the user of the decoder device infrequently streams from a particular content source.
• the management functions may delete a state set which may be unused or seldom used, thus saving memory on the decoder device.
• a decoder device may have a limit (e.g., an upper limit) to the number of state sets it may track and/or to the amount of memory it may use for tracking state sets.
  • the management functions may detect when such limits may be exceeded and may delete one or more state sets to bring the number of state sets or the memory used to store state sets back under the limit.
  • the state sets may be deleted based on a reverse priority order; for example, the least frequently used state sets may be deleted first.
• Deletion of state sets to reduce state set memory may be performed during a streaming session. For example, if a streaming session involves the creation of a state set, and creating the state set may cause the decoder device to exceed a maximum number of state sets, a management function may be called during the streaming session to delete the lowest priority (e.g., least frequently used) pre-existing state set to make room for the new state set. Deletion of state sets may be performed between streaming sessions or during idle periods.
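The reverse-priority deletion described above might be sketched as follows, with least-frequently-used state sets evicted first; the function name and the `use_counts` bookkeeping are assumptions for illustration.

```python
def enforce_state_set_limit(state_sets, use_counts, max_sets):
    """Sketch of a management function deleting state sets in reverse
    priority order (least frequently used first) when the number of
    tracked state sets exceeds a limit.

    state_sets -- dict mapping a key (domain or CEID) to a state set
    use_counts -- dict mapping each key to its usage count
    """
    while len(state_sets) > max_sets:
        victim = min(state_sets, key=lambda k: use_counts.get(k, 0))
        del state_sets[victim]
```

A similar function could evict by last-use timestamp instead of use count, matching the "unused for a time period" criterion in the text.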
• in some cases, the decoder device may delete a state set that would have been useful for a later streaming session.
  • the decoder device may create a state set and begin tracking power dissipation states at the beginning of the streaming session. While the benefits of using a preexisting state set may be lost in this case, the system may still function and may be able to make adaptation decisions based on the newly created state set.
• State sets may be merged opportunistically. A decoder device may detect that two state sets may be similar so that the state sets may be merged.
• two or more content sources may use the same complexity level estimation algorithm, but may not advertise CEIDs, such that the decoder device may not be able to tell from the top level signaling that the complexity level estimation algorithms may be the same.
• the decoder device may compare the two state sets and may determine similarity. If similarity may be determined, the decoder device may merge the two state sets and reduce the overall number of state sets maintained by the decoder device. Detection of similarity across state sets may be based on a variety of factors.
  • State sets may be sufficiently evolved to permit evaluation or comparison.
  • the decoder device may track the number of power dissipation observations used to construct a state set or the total playback time observed to construct the state set.
  • the decoder device may apply a threshold to consider a state set mature enough for comparison.
  • the threshold may be global to the state set.
  • An example threshold may be that a state set may be mature enough to permit comparison when it has been updated using at least eight minutes of video playback.
  • the threshold may be applied per power dissipation state.
  • An example threshold may be that a state set is mature enough to permit comparison once a power dissipation state PDR(k) has been updated based on at least five video playback segments.
  • State sets may have compatible complexity level values to permit evaluation or comparison.
• a first state set may evolve such that it may have states for PDR(k) with k ∈ {1, 2, 3, ..., 10}.
• the first state set may be compared to a second state set that may also have evolved to have states for PDR(k) with k ∈ {1, 2, 3, ..., 10}.
• It may be difficult to compare the first state set with a third state set that may have evolved to have states for PDR(k) with k ∈ {20, 25, 30, 35, 40, 45, 50}. If two state sets may be produced based on a different set or a different range of signaled complexity level values, the underlying complexity estimation algorithms may not be directly comparable.
  • additional signaling may be added to the MPD and/or sent via external/alternative means to indicate the full range of complexity level values used by the server and/or the encoder. This may allow the device to more easily detect the full range of complexity level values (e.g., instead of having to make multiple observations of the complexity levels in multiple video segments) used by different content sources, servers, and/or encoders and to more easily assess the similarity of the underlying complexity estimation algorithms used by different content sources, servers, and/or encoders. If CEID is used, the full range of complexity levels may be signaled together with each CEID.
  • State sets may be compared using a suitable comparison metric.
  • the state sets may be compared using a comparison metric.
  • the power dissipation states may be put in vector form and a norm of the difference between the two state vectors may be computed.
• the norm may be an L1 norm or an L2 norm, for example.
• the norm of the difference may be compared to a threshold, and if the norm may be under the threshold, the two state sets may be considered similar enough that the state sets may be merged.
  • Other comparison metrics may be used.
• a decoder device may compute a difference between corresponding states of one state set and another state set, and may compare this metric to a threshold in order to determine whether two state sets may be sufficiently similar to merge.
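The comparison metric described above might be sketched as follows, using the L1 norm of the difference over the complexity levels the two state sets have in common; the function name and threshold convention are illustrative assumptions.

```python
def state_sets_similar(pdr1, pdr2, threshold):
    """Sketch: compare two state sets over their common complexity levels
    using the L1 norm of the difference, one possible comparison metric.

    pdr1, pdr2 -- dicts mapping complexity level -> PDR state
    """
    common = sorted(set(pdr1) & set(pdr2))
    if not common:
        return False          # no comparable states
    l1 = sum(abs(pdr1[k] - pdr2[k]) for k in common)
    return l1 < threshold
```

An L2 norm (square root of the sum of squared differences) could be substituted without changing the structure.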
• state sets may be compared even if they may be produced based on different sets or different ranges of signaled complexity level values, taking into account the possibility that some states of a state set may not yet have any data observations or may have insufficient data observations to be considered a reliable metric for power dissipation at a signaled complexity level.
• a state set may have mature state values for PDR(k) for only a subset of complexity levels k
  • the remaining mature states may sufficiently characterize the state set to allow comparison with other state sets.
• the state set may be compared to other state sets. For example, states that may not be mature or that may not be available in either of the two state sets under comparison may be removed from the comparison, and corresponding states that may be mature (e.g., that may have been updated using at least a minimum amount of data, for example, as determined by a threshold) may be compared in order to determine the similarity.
• the similarity may be determined using a comparison metric, e.g., a norm (such as the L1 or L2 norm) of the difference between the reduced state set vectors.
  • States that may not be mature or that may not be available in a state set may be interpolated from neighboring states that may be mature in the same state set. For example, linear interpolation may be used to fill in missing or immature states to allow a full comparison to another state set to be made.
  • a decoder device may merge the state sets. This may reduce the total number of state sets tracked by the decoder device, which may save memory.
  • the data for a state of the two state sets to be merged may be averaged to produce a corresponding state of the merged state set. This may be done as a simple average, or it may be done as a weighted average, e.g.,
• PDR_merged(k) = A · PDR_1(k) + B · PDR_2(k), where A and B may be weighting factors (e.g., with A + B = 1).
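The weighted-average merge above might be sketched as follows; weighting each set by the number of observations behind it is an assumed choice of A and B, and states present in only one set are carried over unchanged.

```python
def merge_state_sets(pdr1, n1, pdr2, n2):
    """Sketch of merging two similar state sets by a weighted average,
    PDR_merged(k) = A*PDR1(k) + B*PDR2(k), with weights A and B taken
    proportional to the observation counts n1 and n2 (an assumption)."""
    a, b = n1 / (n1 + n2), n2 / (n1 + n2)
    merged = {}
    for k in set(pdr1) | set(pdr2):
        if k in pdr1 and k in pdr2:
            merged[k] = a * pdr1[k] + b * pdr2[k]
        else:                       # keep a state present in only one set
            merged[k] = pdr1.get(k, pdr2.get(k))
    return merged
```

A simple average corresponds to A = B = 0.5, i.e., equal observation counts.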
• the merged state set may be associated with contexts that may have been valid for the component state sets that may have merged. For example, if a first state set corresponding to YouTube may be merged with a second state set corresponding to CNN video, the resulting merged state set may be associated with both YouTube and CNN video.
  • the state sets may be associated with contexts, for example, via domain names, content service names or identifiers, and/or CEIDs as disclosed herein.
  • a merged set may correspond to more than two such contexts, as may be the result of multiple merges.
  • the decoder device or its state manager may perform state set comparisons and merging during a streaming session (e.g., in reaction to the current state set in use evolving to a point where it may be mature enough for comparison, or to a point where it may be sufficiently similar to another existing state set to be merged). Comparisons and merging may be done outside of an active streaming session, e.g. , periodically as a housekeeping activity or during idle periods.
  • Merging of state sets may reduce the overall state set memory requirements, and may reduce the number of distinct state sets that may be tracked by the decoder device. By combining the data from two or more similar state sets, issues of data sparsity may be resolved.
  • PDR states may be quantized.
• the state tracking technique described by equation (1) may assign a state to a complexity level k that may be signaled by a content source. For example, if a content source signals complexity levels k with k ∈ {1, 2, 3, ..., 10}, a state set may be produced with ten corresponding values, PDR(k) with k ∈ {1, 2, 3, ..., 10}. If a content source may use a fine granularity of complexity levels (e.g., complexity levels k with k ∈ {1, 2, 3, ..., 1000}), tracking a state for each possible complexity level may increase the memory used for tracking the state set, and may also result in a data sparsity issue. For example, the decoder device may not see enough data at a complexity level to reliably compute a corresponding power dissipation state.
  • the decoder device may quantize the complexity level space to a number of discrete bins.
  • a bin may correspond to a power dissipation rate state. Data observations from multiple neighboring complexity levels may be fed into a bin, and the storage size and tracking complexity for a state set may be reduced and/or limited.
• FIG. 11 depicts an example of quantizing complexity levels.
• complexity levels 1202 that may be signaled by a content source may range from CL_min to CL_max.
• the decoder device may divide this range into a number of bins, such as at 1204, 1206, 1208, 1210, and 1212. While FIG. 11 illustrates an example of five bins 1204, 1206, 1208, 1210, 1212, more or fewer bins may be used. For example, a larger number of bins (e.g., ten bins or twenty bins) may be used to characterize the relationship between complexity level and power dissipation rate.
• the decoder device may request and/or receive a video segment associated with a signaled complexity level, CL. While playing back the video segment, the decoder device may observe the changing battery level and/or may compute a power dissipation rate for the playback of the video segment. The decoder device may map the signaled complexity level to the appropriate bin, as illustrated in FIG. 11. The decoder device may use the observed battery levels and/or the computed power dissipation rate to update a power dissipation state corresponding to the mapped bin. The update may be done according to equation (1), with the appropriate mapping of signaled complexity level CL to mapped bin k.
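The bin mapping and the binned state update described above might be sketched as follows; uniform bins and the function names are illustrative assumptions.

```python
def cl_to_bin(cl, cl_min, cl_max, num_bins):
    """Map a signaled complexity level to one of num_bins quantization
    bins (sketch of FIG. 11; uniform bin widths assumed)."""
    span = (cl_max - cl_min + 1) / num_bins      # complexity levels per bin
    k = int((cl - cl_min) // span)
    return min(k, num_bins - 1)                  # clamp the top edge

def update_binned_pdr(pdr, cl, observed, cl_min, cl_max,
                      num_bins=5, alpha=0.6):
    """Feed an observed power dissipation rate into the state of the
    mapped bin, using the same weighted update as equation (1)."""
    k = cl_to_bin(cl, cl_min, cl_max, num_bins)
    pdr[k] = alpha * observed + (1 - alpha) * pdr.get(k, 0.0)
    return k
```

Observations from several neighboring complexity levels thus feed the same bin state, which limits the storage and tracking cost of the state set.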
• a state set with a small number of states (e.g., five states as shown in FIG. 11) may be produced from a more dense set of complexity levels signaled by a content source.
• a value PDR(k) from this small state set may correspond to the power dissipation rate for the complexity level (CL) value in the center of a bin k. This may be illustrated in FIG. 11 by the unfilled circles 1214, 1216, 1218, 1220, 1222 at the centers of the bins.
  • the values PDR(k) may be used to predict the power dissipation rates corresponding to the bin centers. To predict power dissipation rates corresponding to other values of CL, interpolation may be employed.
  • linear interpolation or polynomial interpolation may be used to interpolate between neighboring values of PDR(k).
  • a reduced state set may be used to predict power dissipation rates for any signaled value of CL, and the reduced state model may be used to drive adaptation decisions.
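The interpolation between bin-center states described above might be sketched as follows; `bin_centers` (the CL value at the center of each bin) and the function name are assumptions for illustration.

```python
def predict_pdr(cl, pdr_bins, bin_centers):
    """Predict the power dissipation rate for an arbitrary complexity
    level by linear interpolation between neighboring bin-center states.

    pdr_bins    -- list of PDR(k) values, one per bin
    bin_centers -- sorted list of the CL values at each bin center
    """
    if cl <= bin_centers[0]:        # clamp below the first bin center
        return pdr_bins[0]
    if cl >= bin_centers[-1]:       # clamp above the last bin center
        return pdr_bins[-1]
    for i in range(len(bin_centers) - 1):
        c0, c1 = bin_centers[i], bin_centers[i + 1]
        if c0 <= cl <= c1:
            w = (cl - c0) / (c1 - c0)
            return (1 - w) * pdr_bins[i] + w * pdr_bins[i + 1]
```

Polynomial interpolation could be substituted for the linear weights without changing the overall approach.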
  • non-uniform quantization of the full range of complexity level values may be used. For example, finer quantization may be applied to the complexity level values in the middle (that is, the middle bins may each span a smaller number of complexity levels, for example 3 complexity level values instead of the 6 shown), while coarser quantization may be applied to the complexity level values on the left/right sides (that is, the left/right side bins may each span a larger number of complexity levels, for example 9 complexity level values instead of the 6 shown).
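A non-uniform bin layout like the one described can be expressed as an explicit edge list. The edges below (9-level outer bins and 3-level middle bins over an assumed 30-level range) are an illustrative assumption mirroring the 3-versus-9 example in the text.

```python
import bisect

# Hypothetical non-uniform bin edges: coarse 9-level bins at the extremes,
# fine 3-level bins in the middle. Bin i covers [EDGES[i], EDGES[i+1]).
EDGES = [0, 9, 12, 15, 18, 21, 30]   # 6 bins with widths 9, 3, 3, 3, 3, 9

def map_to_bin_nonuniform(cl):
    """Return the bin index for complexity level cl under non-uniform edges."""
    k = bisect.bisect_right(EDGES, cl) - 1
    return min(max(k, 0), len(EDGES) - 2)   # clamp at the range edges
```

Only the edge list changes relative to uniform binning; the PDR(k) update and interpolation machinery can stay the same.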
  • the alignment of complexity values relative to bin dividers may introduce a bias into the power dissipation rate updates when a reduced state set model may be used.
  • the frequency of use of complexity levels may introduce a bias. For example, as illustrated in FIG. 12, a signaled complexity level 1302 aligned near an edge 1304 of a bin 1306 may be played back frequently during the evolution of the reduced state set, while other signaled complexity levels 1308 within the bin 1306 may be played back less frequently.
  • the data used to represent the bin 1306 may be biased toward one side of the bin 1306, and the resulting state value PDR(k) may not accurately represent the center of the bin 1306.
  • the observed or computed values of battery-level change or power dissipation may be remapped to equivalent values corresponding to the center of bin k, before using such values to update the power dissipation rate state PDR(k).
  • the decoder device may store observations of complexity level CL and power dissipation rate PDR_orig(CL) corresponding to the original complexity level space.
  • the decoder device may interpolate between neighboring values 1402, 1404 of (CL, PDR_orig(CL)) to determine a mapped value 1406 of the power dissipation rate corresponding to the center of bin k.
  • This mapped power dissipation rate may be used to update PDR(k), the power dissipation rate state corresponding to bin k of the reduced state set.
  • FIG. 13 illustrates the use of linear interpolation; other types of interpolation (e.g., polynomial interpolation, etc.) may also be used.
  • Interpolation may be used to remap an original power dissipation rate to an equivalent rate at the center of the bin.
  • a decoder device may interpolate between a power dissipation observation and a power dissipation rate state value of a neighboring bin.
  • this may correspond to replacing (CL_2, PDR_orig(CL_2)) on the right side of the interpolation with (CL_equiv(k+1), PDR(k+1)), which may correspond to the equivalent complexity level and the power dissipation rate at the center of the next higher bin.
  • Interpolation may be done using either the next higher bin (e.g., k+1) or the next lower bin (e.g., k-1), depending on whether the current complexity level observation CL is to the left or right, respectively, of the center of bin k. Remapping to the center of bin k may be performed without storing multiple power dissipation rate values.
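The remapping without stored per-observation values can be sketched as follows: the observation is interpolated against the neighboring bin's state value (the next higher bin if the observation lies left of the bin center, the next lower bin otherwise). The bin layout and the fallback when no usable neighbor exists are assumptions.

```python
def remap_to_bin_center(cl_obs, pdr_obs, pdr_states, cl_min=0, cl_max=30):
    """Remap an observed (CL, PDR) pair to the equivalent PDR at the center of
    the bin CL falls in, interpolating toward a neighboring bin's state."""
    num_bins = len(pdr_states)
    width = (cl_max - cl_min) / num_bins
    k = min(int((cl_obs - cl_min) // width), num_bins - 1)
    center_k = cl_min + (k + 0.5) * width
    # next higher bin if the observation is left of the center, else next lower
    nb = k + 1 if cl_obs < center_k else k - 1
    if nb < 0 or nb >= num_bins or pdr_states[nb] is None:
        return pdr_obs                  # no usable neighbor: use the raw value
    center_nb = cl_min + (nb + 0.5) * width
    # evaluate the line through the observation and the neighbor's bin center
    t = (center_k - cl_obs) / (center_nb - cl_obs)
    return pdr_obs + t * (pdr_states[nb] - pdr_obs)

pdr_states = [100.0, 120.0, 140.0, 160.0, 180.0]   # example reduced-state values
remapped = remap_to_bin_center(7, 110.0, pdr_states)  # obs left of bin-1 center
```

The choice of neighbor keeps the bin center between the observation and the interpolation partner, so this is always interpolation rather than extrapolation.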
  • a decoder device may use approaches disclosed herein to develop multiple reduced state sets having the same number of states N.
  • the decoder device may create and maintain multiple state sets (e.g., corresponding to different content sources), where a state set may be a reduced state set comprising state values PDR(k), k ∈ {1, 2, 3, ..., N}. Because state sets may be based on the same reduced set of complexity levels k, the state sets may be more easily compared for merging. Pairs of the order-N reduced state sets may have compatible complexity level values that may permit evaluation or comparison. Comparison for potential merging may be performed for pairs of the order-N reduced state sets that may have sufficiently mature data to permit a comparison.
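A comparison-and-merge of two order-N reduced state sets might look like the sketch below. The maturity threshold (`min_count`), the relative tolerance, and the count-weighted merge rule are all assumed details; the patent does not specify the comparison criterion.

```python
def maybe_merge(set_a, set_b, counts_a, counts_b, tol=0.1, min_count=3):
    """Compare two order-N reduced state sets and merge them if their mature
    bin states agree within a relative tolerance (thresholds are assumptions)."""
    assert len(set_a) == len(set_b)
    # bins with enough observations on both sides count as "mature"
    comparable = [k for k in range(len(set_a))
                  if counts_a[k] >= min_count and counts_b[k] >= min_count]
    if not comparable:
        return None                         # not enough mature data to compare
    for k in comparable:
        if abs(set_a[k] - set_b[k]) > tol * max(abs(set_a[k]), abs(set_b[k])):
            return None                     # states disagree: keep sets separate
    # merge by count-weighted average, bin by bin
    merged = []
    for k in range(len(set_a)):
        n = counts_a[k] + counts_b[k]
        merged.append((set_a[k] * counts_a[k] + set_b[k] * counts_b[k]) / n
                      if n else None)
    return merged

a = [100.0, 120.0, 140.0, 160.0, 180.0]
b = [102.0, 118.0, 142.0, 158.0, 182.0]
merged = maybe_merge(a, b, [5] * 5, [5] * 5)
```

Because both sets share the same bin grid, the comparison reduces to N pairwise checks; no re-binning is needed.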
  • FIG. 14A is a diagram of an example communications system 100 in which one or more disclosed embodiments may be implemented.
  • the communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users.
  • the communications system 100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth.
  • the communications systems 100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.
  • the communications system 100 may include wireless transmit/receive units (WTRUs) 102a, 102b, 102c, and/or 102d (which generally or collectively may be referred to as WTRU 102), a radio access network (RAN) 103/104/105, a core network 106/107/109, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements. Each of the WTRUs 102a, 102b, 102c, 102d may be any type of device configured to operate and/or communicate in a wireless environment.
  • the WTRUs 102a, 102b, 102c, 102d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and the like.
  • the communications system 100 may also include a base station 114a and a base station 114b.
  • Each of the base stations 114a, 114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102a, 102b, 102c, 102d to facilitate access to one or more communication networks, such as the core network 106/107/109, the Internet 110, and/or the networks 112.
  • the base stations 114a, 114b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114a, 114b are each depicted as a single element, it will be appreciated that the base stations 114a, 114b may include any number of interconnected base stations and/or network elements.
  • the base station 114a may be part of the RAN 103/104/105, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc.
  • the base station 114a and/or the base station 114b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown).
  • the cell may further be divided into cell sectors.
  • the cell associated with the base station 114a may be divided into three sectors.
  • the base station 114a may include three transceivers, i.e., one for each sector of the cell.
  • the base station 114a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
  • the base stations 114a, 114b may communicate with one or more of the WTRUs 102a, 102b, 102c, 102d over an air interface 115/116/117, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.).
  • the air interface 115/116/117 may be established using any suitable radio access technology (RAT).
  • the communications system 100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like.
  • the base station 114a in the RAN 103/104/105 and the WTRUs 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA).
  • WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+).
  • HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
  • the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 115/116/117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
  • the base station 114a and the WTRUs 102a, 102b, 102c may implement radio technologies such as IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
  • the base station 114b in FIG. 14A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like.
  • the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN).
  • the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN).
  • the base station 114b and the WTRUs 102c, 102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell.
  • the base station 114b may have a direct connection to the Internet 110.
  • the base station 114b may not be required to access the Internet 110 via the core network 106/107/109.
  • the RAN 103/104/105 may be in communication with the core network 106/107/109, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 102a, 102b, 102c, 102d.
  • the core network 106/107/109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication.
  • the RAN 103/104/105 and/or the core network 106/107/109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 103/104/105 or a different RAT.
  • in addition to being connected to the RAN 103/104/105, which may be utilizing an E-UTRA radio technology, the core network 106/107/109 may also be in communication with another RAN (not shown) employing a GSM radio technology.
  • the core network 106/107/109 may also serve as a gateway for the WTRUs 102a, 102b, 102c, 102d to access the PSTN 108, the Internet 110, and/or other networks 112.
  • the PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS).
  • the Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite.
  • the networks 112 may include wired or wireless communications networks owned and/or operated by other service providers.
  • the networks 112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 103/104/105 or a different RAT.
  • Some or all of the WTRUs 102a, 102b, 102c, 102d in the communications system 100 may include multi-mode capabilities, i.e., the WTRUs 102a, 102b, 102c, 102d may include multiple transceivers for communicating with different wireless networks over different wireless links.
  • the WTRU 102c shown in FIG. 14A may be configured to communicate with the base station 114a, which may employ a cellular-based radio technology, and with the base station 114b, which may employ an IEEE 802 radio technology.
  • FIG. 14B is a system diagram of an example WTRU 102. As shown in FIG. 14B, the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and other peripherals 138. It will be appreciated that the WTRU 102 may include any subcombination of the foregoing elements while remaining consistent with an embodiment.
  • base stations 114a and 114b, and/or the nodes that base stations 114a and 114b may represent, such as, but not limited to, a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved home node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include some or all of the elements depicted in FIG. 14B and described herein.
  • the processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, and the like.
  • the processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment.
  • the processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122.
  • While FIG. 14B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
  • the transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114a) over the air interface 115/116/117.
  • the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example.
  • the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
  • the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 115/116/117.
  • the transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122.
  • the WTRU 102 may have multi-mode capabilities.
  • the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example.
  • the processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128.
  • the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.
  • the non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
  • the processor 118 may access information from, and store data in, memory that may not be physically located on the WTRU 102, such as on a server or a home computer (not shown).
  • the processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102.
  • the power source 134 may be any suitable device for powering the WTRU 102.
  • the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
  • the processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102.
  • the WTRU 102 may receive location information over the air interface 115/116/117 from a base station (e.g., base stations 114a, 114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
  • the processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
  • FIG. 14C is a system diagram of the RAN 103 and the core network 106 according to an embodiment.
  • the RAN 103 may employ a UTRA radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 115.
  • the RAN 103 may also be in communication with the core network 106.
  • the RAN 103 may include Node-Bs 140a, 140b, 140c, which may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 115.
  • the Node-Bs 140a, 140b, 140c may each be associated with a particular cell (not shown) within the RAN 103.
  • the RAN 103 may also include RNCs 142a, 142b. It will be appreciated that the RAN 103 may include any number of Node-Bs and RNCs while remaining consistent with an embodiment.
  • the Node-Bs 140a, 140b may be in communication with the RNC 142a. Additionally, the Node-B 140c may be in communication with the RNC 142b. The Node-Bs 140a, 140b, 140c may communicate with the respective RNCs 142a, 142b via an Iub interface. The RNCs 142a, 142b may be in communication with one another via an Iur interface. Each of the RNCs 142a, 142b may be configured to control the respective Node-Bs 140a, 140b, 140c to which it is connected. In addition, each of the RNCs 142a, 142b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and the like.
  • the core network 106 shown in FIG. 14C may include a media gateway (MGW) 144, a mobile switching center (MSC) 146, a serving GPRS support node (SGSN) 148, and/or a gateway GPRS support node (GGSN) 150. While each of the foregoing elements are depicted as part of the core network 106, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • the RNC 142a in the RAN 103 may be connected to the MSC 146 in the core network 106 via an IuCS interface.
  • the MSC 146 may be connected to the MGW 144.
  • the MSC 146 and the MGW 144 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and traditional land-line communications devices.
  • the RNC 142a in the RAN 103 may also be connected to the SGSN 148 in the core network 106 via an IuPS interface.
  • the SGSN 148 may be connected to the GGSN 150.
  • the SGSN 148 and the GGSN 150 may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.
  • the core network 106 may also be connected to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
  • FIG. 14D is a system diagram of the RAN 104 and the core network 107 according to an embodiment.
  • the RAN 104 may employ an E-UTRA radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 116.
  • the RAN 104 may also be in communication with the core network 107.
  • the RAN 104 may include eNode-Bs 160a, 160b, 160c, though it will be appreciated that the RAN 104 may include any number of eNode-Bs while remaining consistent with an embodiment.
  • the eNode-Bs 160a, 160b, 160c may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 116. In one embodiment, the eNode-Bs 160a, 160b, 160c may implement MIMO technology.
  • the eNode-B 160a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102a.
  • Each of the eNode-Bs 160a, 160b, 160c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in FIG. 14D, the eNode-Bs 160a, 160b, 160c may communicate with one another over an X2 interface.
  • the core network 107 shown in FIG. 14D may include a mobility management entity (MME) 162, a serving gateway 164, and a packet data network (PDN) gateway 166. While each of the foregoing elements are depicted as part of the core network 107, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • the MME 162 may be connected to each of the eNode-Bs 160a, 160b, 160c in the RAN 104 via an S1 interface and may serve as a control node.
  • the MME 162 may be responsible for authenticating users of the WTRUs 102a, 102b, 102c, bearer activation/deactivation, and the like.
  • the MME 162 may also provide a control plane function for switching between the RAN 104 and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA.
  • the serving gateway 164 may be connected to each of the eNode-Bs 160a, 160b, 160c in the RAN 104 via the S1 interface.
  • the serving gateway 164 may generally route and forward user data packets to/from the WTRUs 102a, 102b, 102c.
  • the serving gateway 164 may also perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 102a, 102b, 102c, managing and storing contexts of the WTRUs 102a, 102b, 102c, and the like.
  • the serving gateway 164 may also be connected to the PDN gateway 166, which may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.
  • the core network 107 may facilitate communications with other networks.
  • the core network 107 may provide the WTRUs 102a, 102b, 102c with access to circuit- switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and traditional land-line communications devices.
  • the core network 107 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 107 and the PSTN 108.
  • FIG. 14E is a system diagram of the RAN 105 and the core network 109 according to an embodiment.
  • the RAN 105 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 117.
  • the communication links between the different functional entities of the WTRUs 102a, 102b, 102c, the RAN 105, and the core network 109 may be defined as reference points.
  • the RAN 105 may include base stations 180a, 180b, 180c, and an ASN gateway 182, though it will be appreciated that the RAN 105 may include any number of base stations and ASN gateways while remaining consistent with an embodiment.
  • the base stations 180a, 180b, 180c may each be associated with a particular cell (not shown) in the RAN 105 and may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 117.
  • the base stations 180a, 180b, 180c may implement MIMO technology.
  • the base station 180a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102a.
  • the base stations 180a, 180b, 180c may also provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and the like.
  • the ASN gateway 182 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 109, and the like.
  • the air interface 117 between the WTRUs 102a, 102b, 102c and the RAN 105 may be defined as an R1 reference point that implements the IEEE 802.16 specification.
  • each of the WTRUs 102a, 102b, 102c may establish a logical interface (not shown) with the core network 109.
  • the logical interface between the WTRUs 102a, 102b, 102c and the core network 109 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.
  • the communication link between each of the base stations 180a, 180b, 180c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and the transfer of data between base stations.
  • the communication link between the base stations 180a, 180b, 180c and the ASN gateway 182 may be defined as an R6 reference point.
  • the R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 102a, 102b, 102c.
  • the RAN 105 may be connected to the core network 109.
  • the communication link between the RAN 105 and the core network 109 may be defined as an R3 reference point that includes protocols for facilitating data transfer and mobility management capabilities, for example.
  • the core network 109 may include a mobile IP home agent (MIP-HA) 184, an authentication, authorization, accounting (AAA) server 186, and a gateway 188. While each of the foregoing elements are depicted as part of the core network 109, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
  • the MIP-HA may be responsible for IP address management, and may enable the WTRUs 102a, 102b, 102c to roam between different ASNs and/or different core networks.
  • the MIP-HA 184 may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.
  • the AAA server 186 may be responsible for user authentication and for supporting user services.
  • the gateway 188 may facilitate interworking with other networks.
  • the gateway 188 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and traditional land- line communications devices.
  • the gateway 188 may provide the WTRUs 102a, 102b, 102c with access to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
  • the RAN 105 may be connected to other ASNs and the core network 109 may be connected to other core networks.
  • the communication link between the RAN 105 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 102a, 102b, 102c between the RAN 105 and the other ASNs.
  • the communication link between the core network 109 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.
  • a WTRU may refer to an identity of the physical device, or to the user's identity, such as subscription-related identities, e.g., MSISDN, SIP URI, etc.
  • a WTRU may refer to application-based identities, e.g., user names that may be used per application.
  • Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
  • a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Power-aware adaptation for a power-aware video streaming system may be based on complexity information conveyed in different ways. A complexity level of a data stream, such as a video data stream, may be selected as a function of a remaining battery power of a wireless transmit/receive unit (WTRU) and of a state set, from among a plurality of state sets, that may be stored and/or managed by the WTRU. These state sets may correspond, for example, to different content sources and/or to different complexity-estimation algorithms, and may be used to select the complexity level of the data stream. The data stream may then be received at the selected complexity level. The complexity level and/or a bitrate of the data stream may be adapted, for example, to the remaining battery power and/or to other circumstances. The adaptation may be customized according to the objectives of the usage scenarios.
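The abstract describes selecting a complexity level from the remaining battery power and one of several stored state sets. The following is a minimal, hypothetical sketch of that idea: the function name, the per-source state sets, and the linear battery-to-level mapping are illustrative assumptions, not the claimed method.

```python
# Hypothetical sketch of power-aware complexity-level selection.
# All names, thresholds, and the mapping are illustrative assumptions.

def select_complexity_level(remaining_battery: float, state_set: list) -> int:
    """Map remaining battery (0.0-1.0) to one of the complexity levels
    in state_set (e.g. [1, 2, 3], higher = more complex / higher quality)."""
    if not 0.0 <= remaining_battery <= 1.0:
        raise ValueError("remaining_battery must be in [0, 1]")
    levels = sorted(state_set)
    # More remaining battery -> afford a higher-complexity stream.
    index = min(int(remaining_battery * len(levels)), len(levels) - 1)
    return levels[index]

# State sets could differ per content source or complexity-estimation
# algorithm, as the abstract suggests (names here are made up).
state_sets = {"source_a": [1, 2, 3], "source_b": [1, 2, 3, 4, 5]}

print(select_complexity_level(0.9, state_sets["source_a"]))  # high battery -> 3
print(select_complexity_level(0.2, state_sets["source_b"]))  # low battery  -> 2
```

A real client would refine this with the conveyed complexity information and could also adapt the bitrate alongside the complexity level.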
PCT/US2014/020999 2013-03-06 2014-03-06 Power aware adaptation for video streaming WO2014138331A2 (fr)

Priority Applications (12)

Application Number Priority Date Filing Date Title
EP22190891.6A EP4156703A1 (fr) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
KR1020157027754A KR101879318B1 (ko) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
EP18208512.6A EP3499905B1 (fr) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
CN201811154594.6A CN109510999B (zh) 2013-03-06 2014-03-06 A WTRU and a method performed by a WTRU
KR1020187019821A KR101991214B1 (ko) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
JP2015561634A JP2016517197A (ja) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
EP14715156.7A EP2965529A2 (fr) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
US14/773,078 US10063921B2 (en) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
CN201480012503.9A CN105191329B (zh) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
US16/038,881 US10469904B2 (en) 2013-03-06 2018-07-18 Power aware adaptation for video streaming
US16/580,848 US11153645B2 (en) 2013-03-06 2019-09-24 Power aware adaptation for video streaming
US17/503,809 US11695991B2 (en) 2013-03-06 2021-10-18 Power aware adaptation for video streaming

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361773379P 2013-03-06 2013-03-06
US61/773,379 2013-03-06
US201461936838P 2014-02-06 2014-02-06
US61/936,838 2014-02-06

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US14/773,078 A-371-Of-International US10063921B2 (en) 2013-03-06 2014-03-06 Power aware adaptation for video streaming
US16/038,881 Continuation US10469904B2 (en) 2013-03-06 2018-07-18 Power aware adaptation for video streaming

Publications (2)

Publication Number Publication Date
WO2014138331A2 true WO2014138331A2 (fr) 2014-09-12
WO2014138331A3 WO2014138331A3 (fr) 2014-12-04

Family

ID=50434270

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/020999 WO2014138331A2 (fr) 2013-03-06 2014-03-06 Adaptation sensible à l'énergie pour diffusion vidéo en flux

Country Status (2)

Country Link
KR (1) KR101879318B1 (fr)
WO (1) WO2014138331A2 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160044732A (ko) * 2014-10-16 Cloud streaming service system, still-image-based cloud streaming service method, and apparatus therefor
KR20160044819A (ko) * 2014-10-16 Method and apparatus for HTTP adaptive streaming in a wireless network environment
WO2016105812A1 (fr) * 2014-12-24 Media content streaming
EP3185123A1 (fr) 2015-12-21 Method for managing resources on a terminal
WO2017160731A1 (fr) * 2016-03-15 Brown out condition detection and device calibration
US10264269B2 (en) 2014-10-13 2019-04-16 Apple Inc. Metadata hints to support best effort decoding for green MPEG applications
US10284612B2 (en) 2013-04-19 2019-05-07 Futurewei Technologies, Inc. Media quality information signaling in dynamic adaptive video streaming over hypertext transfer protocol
FR3074629A1 (fr) * 2017-12-05 Method for managing the electricity consumption of an electronic device
CN110493609A (zh) * 2019-08-07 Live streaming method, terminal, and computer-readable storage medium
US20230269408A1 (en) * 2015-12-11 2023-08-24 Interdigital Madison Patent Holdings, Sas Scheduling multiple-layer video segments

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040158878A1 (en) * 2003-02-07 2004-08-12 Viresh Ratnakar Power scalable digital video decoding
EP1841172B1 (fr) * 2006-03-31 2018-03-14 Google Technology Holdings LLC Redirection d'un flux de données multimédia dans un dispositif de communication sans fil
US8396114B2 (en) * 2009-01-29 2013-03-12 Microsoft Corporation Multiple bit rate video encoding using variable bit rate and dynamic resolution for adaptive video streaming
WO2014011622A2 (fr) * 2012-07-09 2014-01-16 Vid Scale, Inc. Power aware video decoding and streaming

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10284612B2 (en) 2013-04-19 2019-05-07 Futurewei Technologies, Inc. Media quality information signaling in dynamic adaptive video streaming over hypertext transfer protocol
US10264269B2 (en) 2014-10-13 2019-04-16 Apple Inc. Metadata hints to support best effort decoding for green MPEG applications
KR102185876B1 (ko) 2020-12-02 Method and apparatus for HTTP adaptive streaming in a wireless network environment
KR20160044819A (ko) * 2016-04-26 Method and apparatus for HTTP adaptive streaming in a wireless network environment
KR102273143B1 (ko) 2021-07-05 Cloud streaming service system, still-image-based cloud streaming service method, and apparatus therefor
KR20160044732A (ko) * 2016-04-26 Cloud streaming service system, still-image-based cloud streaming service method, and apparatus therefor
WO2016105812A1 (fr) * 2016-06-30 Media content streaming
US9860294B2 (en) 2014-12-24 2018-01-02 Intel Corporation Media content streaming
US12003794B2 (en) * 2015-12-11 2024-06-04 Interdigital Madison Patent Holdings, Sas Scheduling multiple-layer video segments
US20230269408A1 (en) * 2015-12-11 2023-08-24 Interdigital Madison Patent Holdings, Sas Scheduling multiple-layer video segments
EP3185123A1 (fr) 2017-06-28 Method for managing resources on a terminal
US10045301B2 (en) 2015-12-21 2018-08-07 Orange Method for managing resources on a terminal
US11023027B2 (en) 2016-03-15 2021-06-01 Roku, Inc. Brown out condition detection and device calibration
US10437304B2 (en) 2016-03-15 2019-10-08 Roku, Inc. Brown out condition detection and device calibration
US20170269664A1 (en) * 2016-03-15 2017-09-21 Roku, Inc. Brown out condition detection and device calibration
US11983057B2 (en) 2016-03-15 2024-05-14 Roku, Inc. Brown out condition detection and device calibration
WO2017160731A1 (fr) * 2017-09-21 Brown out condition detection and device calibration
EP3496407A1 (fr) * 2019-06-12 Method for managing the electricity consumption of an electronic device
FR3074629A1 (fr) * 2019-06-07 Method for managing the electricity consumption of an electronic device
US11252471B2 (en) 2017-12-05 2022-02-15 Orange Method for managing the electricity consumption of an electronic device
CN110493609A (zh) * 2019-11-22 Live streaming method, terminal, and computer-readable storage medium
CN110493609B (zh) * 2022-02-01 Live streaming method, terminal, and computer-readable storage medium

Also Published As

Publication number Publication date
KR101879318B1 (ko) 2018-07-18
KR20150128848A (ko) 2015-11-18
WO2014138331A3 (fr) 2014-12-04

Similar Documents

Publication Publication Date Title
US11695991B2 (en) Power aware adaptation for video streaming
US10063921B2 (en) Power aware adaptation for video streaming
JP7072592B2 (ja) Quality-driven streaming
US11516485B2 (en) Power aware video decoding and streaming
WO2014138331A2 (fr) Power aware adaptation for video streaming

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480012503.9

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14715156

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 14773078

Country of ref document: US

ENP Entry into the national phase in:

Ref document number: 2015561634

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2014715156

Country of ref document: EP

ENP Entry into the national phase in:

Ref document number: 20157027754

Country of ref document: KR

Kind code of ref document: A