EP1743467B1 - Refined quality feedback in streaming services - Google Patents
- Publication number
- EP1743467B1 (application EP05737952.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- playback
- quality
- client
- media stream
- metric
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H04L41/509 — Network service management where the managed service relates to media content delivery, e.g. audio, video or TV
- H04L43/065 — Generation of reports related to network devices
- H04L43/0852 — Monitoring or testing based on specific metrics: delays
- H04L43/087 — Monitoring or testing based on specific metrics: jitter
- H04L43/106 — Active monitoring using time-related information in packets, e.g. by adding timestamps
- H04L43/55 — Testing of service level quality, e.g. simulating service usage
- H04L65/80 — Network arrangements for real-time applications: responding to QoS
- H04N21/23406 — Processing of video elementary streams involving management of the server-side video buffer
- H04N21/24 — Monitoring of processes or resources, e.g. server load, available bandwidth, upstream requests
- H04N21/44004 — Processing of video elementary streams involving client-side video buffer management
- H04N21/442 — Monitoring of processes or resources at the client
- H04N21/6125 — Network physical structure or signal processing adapted to the downstream path, involving transmission via Internet
- H04N21/6377 — Control signals issued by the client directed to the server
- H04N21/643 — Communication protocols
- H04N21/64322 — Communication protocols: IP
- H04N21/658 — Transmission of management data by the client directed to the server
- H04N21/6581 — Reference data, e.g. a movie identifier
- H04N21/6582 — Data stored in the client, e.g. viewing habits, hardware capabilities
Definitions
- This invention relates to a method, a computer program, a computer program product, a system, a client, a server and a protocol for quality feedback in a streaming service, wherein at least one media stream is streamed to a client for playback.
- Streaming refers to the ability of an application residing in a client to play back synchronized media streams, such as speech, audio and video, in a continuous way while those streams are being transmitted to the client over a data network.
- Streaming also refers to real-time low-delay applications such as conversational applications.
- Applications that can be built on top of streaming services can be classified into on-demand and live information delivery applications. Examples of the first category are music and news-on-demand applications. Live delivery of radio and television programs are examples of the second category.
- Real-time low-delay applications are, for example, multimedia (video) telephony, Voice over IP, and any other type of conversational multimedia application.
- IP Internet Protocol
- 3G Third Generation
- PSS Packet-switched Streaming Service
- 3GPP TS 26.233
- TS 26.234 3G Packet-switched Streaming Service
- The PSS enables mobile streaming applications, wherein the complexity of the terminals is lower than that required for conversational services, because no media input devices and encoders are required, and because less complex protocols can be used.
- The PSS includes a basic set of streaming control protocols, transport protocols, media codecs and scene description protocols.
- Fig. 1 schematically depicts the PSS protocol stack 1 that controls the transfer of both streamable and non-streamable content between a content or media server and a client.
- Streamable content 101 such as video, audio and speech, is first converted to the payload format of the Real-time Transport Protocol (RTP) 102 in an adaptation layer 103.
- RTP Real-time Transport Protocol
- Said RTP as defined by the IETF provides means for sending real-time or streaming data by using the services of an underlying User Datagram Protocol (UDP) 104, which in turn uses the services of an underlying IP protocol 105.
- UDP User Datagram Protocol
- Non-streamable content 106, such as multimedia content which is not created for streaming purposes (e.g. MMS clips recorded on a terminal device), still images, bitmap and vector graphics, text, timed text and synthetic audio, is transferred by the Hypertext Transfer Protocol (HTTP) 107, which uses the services of the underlying Transmission Control Protocol (TCP) 108 and the further underlying IP 105.
- HTTP Hypertext Transfer Protocol
- TCP Transmission Control Protocol
- For non-streamable content, the built-in session set-up and control capabilities of HTTP 107 are sufficient to transfer the content.
- For streamed content, an advanced session set-up and control protocol has to be invoked, for instance to start, stop and pause a streaming video that is transferred from the content server to the client via RTP/UDP/IP.
- This task is performed by the Real-time Streaming Protocol (RTSP) 109, which may either use the underlying TCP 108 or the underlying UDP 104.
- RTSP requires at least a presentation description 110 to set up a streaming session.
- Such a presentation description 110 may for instance be available in the form of a Session Description Protocol (SDP) file.
- Said SDP file contains the description of the session, for instance the session name and author, the type of media to be presented, information needed to receive said media (for instance addresses, ports and formats), and the bitrate of the media.
- SDP Session Description Protocol
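- As an illustration only (the concrete field values below are hypothetical and not taken from the patent), a minimal SDP presentation description of the kind described above, carrying session name and author, media types, reception address, ports, payload formats and the media bitrate, might read:

```
v=0
o=alice 2890844526 2890842807 IN IP4 198.51.100.1
s=Example news-on-demand session
c=IN IP4 198.51.100.1
b=AS:128
t=0 0
m=video 49170 RTP/AVP 96
a=rtpmap:96 H263-2000/90000
m=audio 49172 RTP/AVP 97
a=rtpmap:97 AMR/8000
```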
- URI Universal Resource Identifier
- WAP Wireless Application Protocol
- A URI, typically obtained by the user via a WAP or HTML browser, specifies a streaming or RTSP server and the address of the content on that or another content server.
- The corresponding SDP file may be obtained in a number of ways: it may be provided via a link inside the HTML page that the user downloads, for instance via an embed tag, or it may be obtained directly by typing it as a URI.
- In these cases, the SDP file, i.e. the presentation description 110, is transferred via HTTP 107, as indicated in the middle column of the protocol stack of Fig. 1.
- It may also be obtained through RTSP 109 signaling, for instance by using the DESCRIBE method of RTSP 109, as indicated by the right column of the protocol stack in Fig. 1.
- The presentation description may equally well be transmitted by said RTP 102; however, for simplicity of presentation, this possibility is not included in Fig. 1.
- The subsequent session establishment is the process in which the browser or the user of the mobile terminal invokes a streaming client to set up the session with the content server.
- The terminal is expected to have an active radio bearer that enables IP-based packet transmission at the start of session establishment signaling.
- The subsequent set-up of the streaming service is done by sending an RTSP SETUP message for each media stream chosen by the client; the reply returns the UDP 104 and/or TCP 108 port to be used for the respective media stream.
- Finally, the client sends an RTSP PLAY message to the content server, which then starts to send one or more streams over the IP network.
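- The session establishment sequence above (DESCRIBE for the presentation description, one SETUP per chosen media stream, then PLAY) can be sketched as follows. This is a minimal illustration, not the patent's text: the helper function, server address, content path, ports and session identifier are all hypothetical examples.

```python
def rtsp_request(method, url, cseq, extra=None):
    """Build a minimal RTSP/1.0 request as a CRLF-delimited string."""
    lines = ["{} {} RTSP/1.0".format(method, url), "CSeq: {}".format(cseq)]
    for name, value in (extra or {}).items():
        lines.append("{}: {}".format(name, value))
    return "\r\n".join(lines) + "\r\n\r\n"

url = "rtsp://example.com/media/clip"   # hypothetical content URI

session_setup = [
    # 1. Obtain the presentation description (SDP) for the content.
    rtsp_request("DESCRIBE", url, 1, {"Accept": "application/sdp"}),
    # 2. One SETUP per media stream chosen by the client; the reply
    #    carries the UDP/TCP ports to use for that stream.
    rtsp_request("SETUP", url + "/trackID=1", 2,
                 {"Transport": "RTP/AVP;unicast;client_port=4588-4589"}),
    # 3. PLAY starts the server sending the stream(s) over the IP network.
    rtsp_request("PLAY", url, 3, {"Session": "12345678"}),
]
```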
- Streaming service quality metrics have been introduced in PSS systems, as presented in 3GPP Technical Document (Tdoc) S4-040073, "Draft Rel-6 PSS Quality Metrics Permanent Document v.0.11", from 3GPP TSG-SA4 meeting #30 in Malaga, Spain, February 23-27, 2004.
- The streaming client measures and feeds back information on the quality of the actual streaming application (Quality of Experience, QoE) to a streaming server, wherein said quality is defined in terms of said quality metrics.
- Said streaming server may for instance be an RTSP server, and said quality metrics may for instance be transported by using said RTSP and SDP.
- The service is transparent to the type of RAN and CN; only the streaming client and the streaming server are impacted by the PSS quality metrics.
- The measurements may not rely on information from protocol layers below the RTP layer (e.g. UDP, IP, PDCP, SNDCP, LLC, RLC, MAC, physical layer).
- The terminal in a PSS system with quality feedback is responsible for performing the quality measurements in accordance with the measurement definitions, aggregating them into streaming client quality metrics, and reporting the metrics to the streaming server. This requirement does not preclude the possibility for the streaming client to report raw quality measurements to be processed by the streaming server into quality metrics.
- The streaming server is responsible for signaling the activation of the streaming client's quality metrics reporting and for gathering the streaming client's quality metrics.
- The streaming server may process the received streaming client's quality metrics to build aggregated quality metrics; e.g. it could receive a raw lost-packets report and build the Min, Max, Avg and Std packet loss rate for a particular streaming client.
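- The server-side aggregation described above can be sketched as follows; this is a minimal illustration under the assumption that the client's raw report has been parsed into a list of per-interval packet loss rates (the function name and sample values are hypothetical).

```python
import statistics

def aggregate_loss(loss_rates):
    """Aggregate raw per-interval packet loss rates into the Min, Max,
    Avg and Std summary metrics mentioned in the text."""
    return {
        "Min": min(loss_rates),
        "Max": max(loss_rates),
        "Avg": statistics.mean(loss_rates),
        "Std": statistics.pstdev(loss_rates),  # population std deviation
    }

# Hypothetical raw loss rates reported by one streaming client.
report = aggregate_loss([0.01, 0.03, 0.02, 0.02])
```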
- Corruption duration is the time period from the first corrupted frame to the first subsequent good frame or the end of the reporting period (whichever is sooner).
- The unit of this metric is seconds, and the value can be fractional. This metric is only applicable to audio, video and speech, and is not applicable to other media types.
- Rebuffering is defined as any stall in playback time due to any involuntary event at the client side. The unit of this metric is seconds, and the value can be fractional.
- Initial buffering is the time from receiving the first RTP packet until playback starts. The unit of this metric is seconds, and the value can be fractional.
- The objective of the above quality metric definitions is to obtain consistent measurements across content types, terminals, and types of Radio Access Network (RAN).
- RAN Radio Access Network
- The constraints are to minimize both the size of the quality metrics report sent to the streaming server and the complexity for the terminal.
- The actual quality metrics feedback can be conveyed to the PSS server by using the SET_PARAMETER method of RTSP with a feedback header 2 as depicted in Fig. 2 (with reference to IETF Request for Comments (RFC) document 2327). In particular cases, however, it is more efficient to use other methods to carry the information, for instance the TEARDOWN or PAUSE message.
- RFC Request for Comments
- Stream-url is the RTSP session or media control URL identifier for the feedback parameter.
- The Metrics field in the Parameters definition contains the name of the metrics/measurements (for instance corruption duration, etc.).
- The Value field indicates the results. The same event may occur more than once during a monitoring period; in that case the metrics value can occur more than once, which indicates the number of events to the server.
- The optional Range field indicates the reporting period.
- The Timestamp field in the feedback header 2 of Fig. 2 indicates the time when the event (or measurement) occurred, or when the metric was calculated, since the beginning of the session.
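- Purely as an illustration of how the fields above (stream URL, Metrics, Value, optional Range, Timestamp) fit together, one could compose a feedback entry as sketched below. The exact wire syntax is defined in Fig. 2 and Tdoc S4-040073; the function name, separators and values here are hypothetical.

```python
def feedback_param(stream_url, metric, value, timestamp, range_=None):
    """Compose one quality-feedback entry: metric name and value with a
    timestamp, plus an optional Range field giving the reporting period."""
    entry = "{} {}={};ts={}".format(stream_url, metric, value, timestamp)
    if range_ is not None:          # optional reporting period
        entry += ";range={}".format(range_)
    return entry

# Hypothetical corruption-duration report of 1.5 s at session time 12.0 s.
line = feedback_param("rtsp://example.com/media/clip/trackID=1",
                      "Corruption_Duration", 1.5, 12.0, "0-30")
```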
- The four quality metrics defined by Tdoc S4-040073 only allow for a coarse characterization of the playback quality of multimedia streams as experienced by a user. For instance, consider two streaming sessions with identical values for these four quality metrics, where in the first session a perfect synchronization between audio and video exists, whereas in the second session said synchronization has been lost: the reported quality based on the four quality metrics is the same, while the actually experienced playback quality is quite different.
- Furthermore, the four quality metrics defined by Tdoc S4-040073 do not differentiate between the different frame types contained in a multimedia stream, so that, for instance, the loss of frame types that are of crucial importance for the experienced playback quality cannot be distinguished from the loss of less important frame types when reporting quality.
- Document US 2003/046384 A1 discloses a method of providing streaming service quality metrics from a client to a server.
- Different quality metrics can be used such as peak bandwidth, low bandwidth, high/low latency, jitter, client video-rendering frame rate.
- Document XP009028505 discloses a method for efficient frame scheduling for MPEG video streams. The method is evaluated using the standard deviation of the frame rate, which is related to a mean value of frame rates over multiple streams.
- Document XP002374403 discloses using effective frame rate as a metric used for video streaming.
- The present invention defines a method according to claim 1, a computer program according to claim 13, a computer program product according to claim 14, a system according to claim 15, a client according to claim 16 and a server according to claim 28. Further embodiments are set forth in the dependent claims 2-12, 17-27, 29 and 30.
- The present invention proposes special quality metrics to be used in quality feedback for streaming services in order to refine that feedback.
- In the following, the proposed quality metrics and their associated timestamps are described in more detail.
- This quality metric may only be applicable for audio, video and speech, and it may not be applicable to other media types. It gives information on the playback frame rate. Frame rate deviation happens when the playback frame rate deviates from a pre-defined value.
- This quality metric may contain both the time duration of the event and the frame rate deviation value, i.e. the difference between a pre-defined frame rate and the actual playback frame rate.
- the time duration may be expressed in units of seconds, and may be a fractional value.
- the deviation value may be expressed in units of frames per second, and may also be a fractional value.
- Said pre-defined value may be a default value known by both the server and the client, or it may be provided by the server during QoE negotiation. If it is to be provided by the server, the server may decide the value by checking the media bitstream or any other means.
- The value indicates the average frame rate calculated when the media stream is locally played back.
- The frame rate value for the time period from second n-1 to second n is equal to the number of frames played back during that period.
- The time axis may represent NPT (Normal Play Time) and may originate from the starting time of the QoE reporting period.
- The timestamp associated with the frame-rate-deviation metric indicates the time when the frame rate deviation occurred.
- The value of the timestamp may be equal to the NPT of the first played frame during the frame rate deviation event, relative to the starting time of the QoE reporting period. If there is no played frame during the event, the value may be equal to the NPT of the last played frame before the event or the starting time of the QoE reporting period, whichever is later, relative to the starting time of the QoE reporting period.
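- The per-second frame-rate computation and deviation detection described above can be sketched as follows. This is only an illustration under stated assumptions: the input is a hypothetical list of NPT play times of rendered frames, and a deviation event is taken to span consecutive one-second periods whose rate differs from the pre-defined value.

```python
def per_second_rates(play_times, duration_s):
    """Count frames played back in each one-second period [n-1, n)."""
    rates = [0] * duration_s
    for t in play_times:
        if 0 <= t < duration_s:
            rates[int(t)] += 1
    return rates

def deviation_events(rates, predefined_rate):
    """Return (start_second, duration_s, avg_deviation) per event, where
    avg_deviation is the pre-defined rate minus the actual playback rate."""
    events, start = [], None
    for n, r in enumerate(rates):
        if r != predefined_rate and start is None:
            start = n                                  # event begins
        elif r == predefined_rate and start is not None:
            devs = [predefined_rate - x for x in rates[start:n]]
            events.append((start, n - start, sum(devs) / len(devs)))
            start = None                               # event ends
    if start is not None:                              # event reaches end
        devs = [predefined_rate - x for x in rates[start:]]
        events.append((start, len(rates) - start, sum(devs) / len(devs)))
    return events

# Hypothetical play times over 4 s with a pre-defined rate of 2 fps.
rates = per_second_rates([0.1, 0.6, 1.2, 1.7, 2.3, 3.5], 4)   # → [2, 2, 1, 1]
events = deviation_events(rates, predefined_rate=2)           # → [(2, 2, 1.0)]
```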
- This quality metric may be only applicable for audio, video and speech, and may not be applicable to other media types.
- A playback jitter happens when the absolute difference between the actual playback time and the scheduled playback time of a frame is larger than a pre-defined value.
- This metric may be expressed in units of seconds, and may be a fractional value.
- Said pre-defined value may be a default value known by both the server and the client, e.g. 100 milliseconds.
- The timestamp associated with the jitter-duration metric indicates the time when the playback jitter occurred.
- The value of the timestamp may be equal to the NPT of the first played frame in the playback jitter, relative to the starting time of the QoE reporting period.
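- The jitter rule above can be sketched as follows, as a minimal illustration with hypothetical per-frame data: an event exists while the absolute difference between a frame's actual and scheduled playback times exceeds the pre-defined value (the 100 ms default mentioned above), and the event duration here is taken over the scheduled times of the affected frames.

```python
THRESHOLD_S = 0.100  # pre-defined value, e.g. the 100 ms default

def jitter_events(frames, threshold=THRESHOLD_S):
    """frames: list of (scheduled_time, actual_time) per played frame,
    in playback order. Returns (start_scheduled_time, duration) per
    jitter event."""
    events, start, last = [], None, None
    for sched, actual in frames:
        if abs(actual - sched) > threshold:
            if start is None:
                start = sched        # first frame of the jitter event
            last = sched
        elif start is not None:
            events.append((start, last - start))
            start = None
    if start is not None:            # event still open at end of data
        events.append((start, last - start))
    return events

# Hypothetical frames: two land 150 ms late, exceeding the threshold.
events = jitter_events([(0.0, 0.0), (0.1, 0.25), (0.2, 0.35), (0.3, 0.31)])
```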
- This quality metric may be applicable for any pair of media types.
- A value A is defined as the difference between the playback time of the last played frame of a first media stream of said at least one media streams and the playback time of the last played frame of a second media stream of said at least one media streams.
- A value B is defined as the difference between the scheduled playback time of said last played frame of said first media stream and the scheduled playback time of said last played frame of said second media stream.
- A synchronization loss happens when the absolute difference between said value A and said value B is larger than a pre-defined value.
- This quality metric may be expressed in units of seconds, and may be a fractional value.
- Said pre-defined value may be a default value known by both the server and the client, e.g. 100 milliseconds.
- The timestamp associated with the synchronization-loss-duration metric indicates the time when the playback synchronization loss occurred.
- The value of the timestamp may be equal to the NPT of the first played frame in the synchronization loss, relative to the starting time of the QoE reporting period.
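- The synchronization-loss test above, on the values A and B just defined, can be sketched as below. This is only an illustration; the function name and input times are hypothetical, and the 100 ms threshold is the example default mentioned above.

```python
def sync_loss(actual_1, actual_2, sched_1, sched_2, threshold=0.100):
    """Inputs are the actual and scheduled playback times of the last
    played frames of two media streams. Returns (is_loss, skew)."""
    a = actual_1 - actual_2          # value A: actual playback skew
    b = sched_1 - sched_2            # value B: scheduled playback skew
    skew = abs(a - b)                # synchronization loss when > threshold
    return skew > threshold, skew

# Hypothetical example: audio plays 350 ms behind its scheduled offset.
loss, skew = sync_loss(10.35, 10.0, 10.0, 10.0)   # → (True, ~0.35)
```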
- This quality metric may be only applicable for video, and may not be applicable to other media types.
- This metric refers to the number of corrupted scene cut frames during the QoE reporting period. This metric may be expressed in units of integers larger than or equal to zero. If this metric is supported, and if no such metric is reported for a QoE reporting period, this may indicate that no scene cut frame has been corrupted during the QoE reporting period.
- This quality metric may be used if the server has information indicating which frames are scene cut frames, or if the server implements a method to derive the information, for instance by using a scene cut detection algorithm. In this case, it may be advantageous that the server makes the information available to the client either via in-band or out-of-band signaling. This quality metric may also be used if the client implements a method to derive the information, for instance by using a scene cut detection algorithm. For example, in ITU-T H.264 (a.k.a. ISO/IEC MPEG-4 Part 10), scene cut information can be conveyed to the client in-band using the scene information supplemental enhancement information (SEI) message.
- SEI Supplemental Enhancement Information
- The timestamp associated with the number-of-corrupted-scene-cut-frames metric need not be defined, since the time when the number of corrupted scene cut frames is measured is, by definition, the ending time of the QoE reporting period.
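- Assuming the client can flag scene cut frames (for instance from the H.264 scene information SEI message mentioned above), the metric reduces to a count over the QoE reporting period. A minimal sketch with hypothetical frame records:

```python
def corrupted_scene_cuts(frames):
    """frames: iterable of (is_scene_cut, is_corrupted) flags, one per
    frame in the QoE reporting period. Returns the metric value."""
    return sum(1 for cut, bad in frames if cut and bad)

# Hypothetical period: one corrupted scene cut frame out of three frames.
n = corrupted_scene_cuts([(True, True), (True, False), (False, True)])  # → 1
```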
- An example protocol syntax for each of the new quality metrics proposed by the present invention uses the following fields:
- The Value1 field indicates the time duration of the frame rate deviation event, and the Value2 field indicates the frame rate deviation value.
- The Value field indicates the time duration of the playback jitter, the time duration of the synchronization loss, or the number of corrupted scene cut frames.
- The semantics of the Timestamp field are as specified above.
- A quality feedback value may occur more than once, which indicates the number of events to the server.
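- As a hedged illustration of the field semantics above (not the patent's actual syntax, which is defined by the protocol figures), a frame-rate-deviation entry carries two values while the other metrics carry one; the serialization below and all its names are hypothetical.

```python
def serialize_metric(metric, values, timestamp):
    """Serialize one metric report: Value1/Value2 for the frame-rate
    deviation metric, a single Value for the other metrics."""
    joined = ",".join(str(v) for v in values)
    return "{}={{{}}};ts={}".format(metric, joined, timestamp)

# Frame rate deviation: 2.0 s event with a 5 fps deviation, at NPT 8.0 s.
frd = serialize_metric("Framerate_Deviation", [2.0, 5], 8.0)
# Playback jitter: a single duration value, at NPT 3.5 s.
jit = serialize_metric("Jitter_Duration", [0.2], 3.5)
```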
- For the frame-rate-deviation metric, a pre-defined frame rate value may need to be sent to the client.
- The value can be added as one additional parameter to the RTSP header "QoE-Header" or the SDP attribute "QoE-Metrics", as specified in Tdoc S4-040073.
- A syntax for carrying the pre-defined frame rate value may be defined accordingly.
- The present invention proposes that a quality feedback value determined according to the corruption-duration metric is only reported if said duration is larger than a pre-defined value. During playback, if only a single frame or a few frames in a short time period are not played back, the end user typically cannot perceive the difference.
- The receiving terminal may decide not to play back a frame, for instance for one of the following reasons: a non-reference frame has, for any reason, not been transmitted from the server; a non-reference frame is partially or entirely lost; a non-reference frame is completely received, but cannot be correctly decoded; a non-reference frame is completely received, but is not decoded due to delay or a lack of computing capability; a reference or non-reference frame is completely received and correctly decoded, but its scheduled display time has expired.
- Non-playback of such frames over a short time period does not affect the user experience, so that reporting such frames would be a waste of transmission bandwidth.
- a corruption duration is only reported when it exceeds said pre-defined value, which may for instance be a default value set by the server and/or the client or a value prescribed by a protocol.
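Purely by way of illustration, the threshold-based reporting rule above could be sketched as follows; the threshold value and function name are illustrative assumptions, not prescribed by the invention:

```python
# Hypothetical sketch: a corruption duration is only reported when it
# exceeds a pre-defined value (e.g. a default set by the server/client
# or prescribed by a protocol). Names and the 0.2 s default are assumed.

CORRUPTION_THRESHOLD_S = 0.2  # assumed default, e.g. agreed during QoE negotiation

def reportable_corruption_durations(durations_s, threshold=CORRUPTION_THRESHOLD_S):
    """Keep only corruption durations (in seconds) that exceed the threshold."""
    return [d for d in durations_s if d > threshold]

# A single lost frame (40 ms at 25 fps) is filtered out; longer stalls are kept.
reports = reportable_corruption_durations([0.04, 0.5, 0.1, 1.2])
```

In this sketch, only the 0.5 s and 1.2 s events would be reported to the server, saving transmission bandwidth for the imperceptible events.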
- Fig. 3 depicts a flowchart of a method according to the present invention.
- A streaming session is set up between a streaming client and a streaming server.
- One or more quality metrics from a pre-defined set of quality metrics, which comprises the quality metrics as proposed by the present invention, are negotiated between the streaming client and the streaming server for use in the quality feedback procedure that is performed by the streaming client.
- Both said session set-up and negotiation may be based on the RTSP in combination with the SDP, or on the RTCP or SIP.
- Step 301 may also be performed together with step 300.
- A corresponding timestamp metric may be associated with at least some of the negotiated quality metrics for the streaming session.
- The actual streaming is started, for instance when a media stream is transmitted to the streaming client and played back on the terminal in which said streaming client is set up.
- In a step 303, it is checked whether a quality feedback is required or not. This may for instance be accomplished by continuously checking if an event, which has to be reported to the streaming server according to the negotiated quality metric, occurs or not. This may for instance be a loss of synchronization event.
- A periodical quality report may also have been negotiated, for instance the periodical feedback of the number of corrupted scene cut frames in a certain time interval. In said step 303, both the event-driven and the periodical quality feedback are triggered.
- In a step 304, a quality feedback value is determined according to each negotiated quality metric. If one or several of said negotiated quality metrics are associated with a timestamp metric, corresponding timestamps are determined in a step 305. Said step 305 may equally well be performed before the step 304.
- The quality feedback values and the corresponding timestamps are then reported to the streaming server in a step 306, for instance via the RTSP, RTCP or SIP.
- After quality feedback, or if it is decided that no quality feedback is required, it is checked in a step 307 if streaming is to be stopped. If this is not the case, it is again checked in a step 303 if a new quality feedback is required or not.
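The client-side loop of Fig. 3 (steps 303 to 307) could be sketched as follows; the session class is a hypothetical stand-in for a real streaming client, not part of the specification:

```python
# Illustrative sketch of the feedback loop of Fig. 3 (steps 303-307).
# The FeedbackSession class and its attributes are assumptions for
# illustration; a real client would measure metrics during playback.

class FeedbackSession:
    def __init__(self, pending_events):
        self.pending = list(pending_events)  # (quality value, timestamp) pairs
        self.reports = []                    # what was sent to the server

    def stop_requested(self):                # step 307: is streaming to be stopped?
        return not self.pending

    def feedback_required(self):             # step 303: event-driven or periodic
        return bool(self.pending)

    def report(self, value, timestamp):      # step 306: e.g. via RTSP, RTCP or SIP
        self.reports.append((value, timestamp))

def run_quality_feedback(session):
    while not session.stop_requested():
        if session.feedback_required():
            value, timestamp = session.pending.pop(0)  # steps 304/305
            session.report(value, timestamp)

session = FeedbackSession([("Syncloss_Duration=0.25", 12.0)])
run_quality_feedback(session)
# session.reports now holds one (value, timestamp) entry
```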
- Fig. 4 schematically depicts the functional components of a system according to the present invention.
- This embodiment exemplarily refers to a PSS system that uses an RTSP to control the streaming.
- The SIP could be used here with a slightly modified underlying protocol stack and with an additional network instance that sniffs or captures the quality feedbacks and timestamps that are sent from the client 601 (a first party) to the server 600 (a second party).
- The PSS system in Fig. 4 comprises a streaming client 601 and a streaming server 600, wherein both client 601 and server 600 have at least one RTSP entity 401, 400 that is capable of operating the RTSP.
- The RTSP entities 400, 401 use the services of underlying protocol layers that are operated by further protocol entities, of which only the TCP/UDP entities 402, 403 and the IP entities 404, 405 are shown.
- The streaming client 601 is further connected to a streaming quality monitor instance 407, which monitors the quality of the actual streaming application in terms of the negotiated quality metrics and possibly a corresponding timestamp metric, and inputs monitored quality feedback values into said RTSP entity 401.
- Said streaming quality monitor may for instance be provided by the terminal, in which said streaming client is set up.
- The streaming quality monitor 407 determines a timestamp according to said timestamp metric, and transfers said monitored quality feedback values and said corresponding timestamps, via the client RTSP entity 401, to the RTSP peer entity in the streaming server 600. There, they are input into a quality data processing instance 406 for evaluation and analysis. This analysis may for instance aim at improving the quality of the streaming application by enhancing the error resilience of the streams if it is found that corruption duration events become too frequent, or may just aim at statistical quality data collection, charging or other purposes.
Description
- This invention relates to a method, a computer program, a computer program product, a system, a client, a server and a protocol for quality feedback in a streaming service, wherein at least one media stream is streamed to a client for playback.
- Streaming, on the one hand, refers to the ability of an application residing in a client to play back synchronized media streams, like speech, audio and video streams, in a continuous way while those streams are being transmitted to the client over a data network. On the other hand, streaming also refers to real-time low-delay applications such as conversational applications.
- Applications that can be built on top of streaming services can be classified into on-demand and live information delivery applications. Examples of the first category are music and news-on-demand applications. Live delivery of radio and television programs are examples of the second category. Real-time low-delay applications are, for example, multimedia (video) telephony, Voice over IP and any type of conversational multimedia application.
- Streaming over fixed Internet Protocol (IP) networks is already a major application today. While the Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C) have developed a set of protocols used in fixed-IP streaming services, no complete standardized streaming framework has yet been defined. For Third Generation (3G) mobile communications systems according to the standards developed by the Third Generation Partnership Project (3GPP), the 3G Packet-switched Streaming Service (PSS, 3GPP TS 26.233, TS 26.234) fills the gap between the 3G Multimedia Messaging Service (MMS), for instance downloading applications and multimedia content, and conversational & streaming services.
- The PSS enables mobile streaming applications, wherein the complexity of the terminals is lower than that required for conversational services, because no media input devices and encoders are required, and because less complex protocols can be used. The PSS includes a basic set of streaming control protocols, transport protocols, media codecs and scene description protocols.
- Fig. 1 schematically depicts the PSS protocol stack 1 that controls the transfer of both streamable and non-streamable content between a content or media server and a client.
- Streamable content 101, such as video, audio and speech, is first converted to the payload format of the Real-time Transport Protocol (RTP) 102 in an adaptation layer 103. Said RTP as defined by the IETF provides means for sending real-time or streaming data by using the services of an underlying User Datagram Protocol (UDP) 104, which in turn uses the services of an underlying IP protocol 105.
- Non-streamable content 106, as for instance multimedia content which is not created for streaming purposes (e.g. MMS clips recorded on a terminal device), still images, bitmap and vector graphics, text, timed text and synthetic audio, is transferred by the Hypertext Transfer Protocol (HTTP) 107, which uses the services of the underlying Transport Control Protocol (TCP) 108 and the further underlying IP 105.
- Whereas for the non-streamable content 106, the built-in session set-up and control capabilities of the HTTP 107 are sufficient to transfer the content, in case of streamable content 101, an advanced session set-up and control protocol has to be invoked, for instance to start, stop and pause a streaming video that is transferred from the content server to the client via the RTP/UDP/IP. This task is performed by the Real-time Streaming Protocol (RTSP) 109, which may either use the underlying TCP 108 or the underlying UDP 104. The RTSP requires a presentation description 110 at least to set up a streaming session. Such a presentation description 110 may for instance be available in the form of a Session Description Protocol (SDP) file. Said SDP file contains the description of the session, for instance the session name and author, the type of media to be presented, information required to receive said media, as for instance addresses, ports, formats and so on, and the bitrate of the media.
- If streaming content is to be viewed at the client side, for instance at a mobile terminal, the user of said terminal is first provided with a Universal Resource Identifier (URI) to specific content that suits his terminal. This URI may come from a WWW server, a Wireless Application Protocol (WAP) server, or may have been entered manually via the keyboard of the terminal. This URI specifies a streaming or RTSP server and the address of the content on that or another content server. The corresponding SDP file may now be obtained in a number of ways. It may be provided in a link inside the HTML page that the user downloads, for instance via an embed tag, or may also be directly obtained by typing it as a URI. The SDP file, i.e. the presentation description 110, is then transferred via the HTTP 107 as indicated in the middle column of the protocol stack of Fig. 1. Alternatively, it may also be obtained through RTSP 109 signaling, for instance by using the DESCRIBE method of the RTSP 109, as indicated by the right column of the protocol stack in Fig. 1. Note that the presentation description may equally well be transmitted by said RTP 102. However, for simplicity of presentation, this possibility was not included in Fig. 1.
- The subsequent session establishment is the process in which the browser or the user of the mobile terminal invokes a streaming client to set up the session against the content server. The terminal is expected to have an active radio bearer that enables IP-based packet transmission at the start of session establishment signaling.
- The subsequent set-up of the streaming service is done by sending an RTSP SETUP message for each media stream chosen by the client. This returns the UDP 104 and/or TCP 108 port to be used for the respective media stream. The client sends an RTSP PLAY message to the content server that then starts to send one or more streams over the IP network.
- In order to offer service providers in PSS systems means to evaluate the end user streaming experience, streaming service quality metrics have been introduced in PSS systems, as presented in 3GPP Technical document (Tdoc) S4-040073: "Draft Rel-6 PSS Quality Metrics Permanent Document v.0.11", which refers to 3GPP TSG-SA4 meeting #30 in Malaga, Spain, February 23-27, 2004. The streaming client measures and feeds back information on the quality of the actual streaming application (Quality of Experience, QoE) to a streaming server, wherein said quality is defined in terms of said quality metrics. Said streaming server may for instance be an RTSP server, and said quality metrics may for instance be transported by using said RTSP and SDP.
- Because the service is transparent to the type of RAN and CN, only the streaming client and the streaming server are impacted by the PSS quality metrics. One consequence of this is that the measurements may not rely on information from protocol layers below the RTP layer (e.g. UDP, IP, PDCP, SNDCP, LLC, RLC, MAC, Physical Layer).
- The terminal in a PSS system with quality feedback is responsible for performing the quality measurements in accordance with the measurement definitions, aggregating them into streaming client quality metrics and reporting the metrics to the streaming server. This requirement does not preclude the possibility for the streaming client to report raw quality measurements to be processed by the streaming server into quality metrics.
- The streaming server is responsible for signaling the activation of the streaming client's quality metrics reporting and for gathering the streaming client's quality metrics. The streaming server may process the received streaming client's quality metrics to build aggregated quality metrics. E.g., it could receive a raw lost-packets report and build the minimum, maximum, average and standard deviation of the packet loss rate for a particular streaming client.
- The following four quality metrics are defined by Tdoc S4-040073:
- Corruption duration is the time period from the first corrupted frame to the first subsequent good frame or the end of the reporting period (whichever is sooner). The unit of this metric is seconds, and the value can be fractional.
- Rebuffering is defined as any stall in playback time due to any involuntary event at the client side. This metric is only applicable for audio, video and speech, and is not applicable to other media types. The unit of this metric is seconds, and the value can be fractional.
- Initial buffering is the time from receiving the first RTP packet until playback starts. The unit of this metric is seconds, and the value can be fractional.
- The number of content packets lost in succession per media channel.
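The corruption-duration definition above can be sketched as follows; the (time, is_good) frame representation is an assumed simplification for illustration:

```python
# Illustrative computation of the corruption-duration metric: the time from
# the first corrupted frame to the first subsequent good frame, or to the
# end of the reporting period, whichever is sooner.

def corruption_duration(frames, reporting_period_end_s):
    """frames: list of (playback_time_s, is_good) in playback order."""
    start = None
    for t, good in frames:
        if not good and start is None:
            start = t                        # first corrupted frame
        elif good and start is not None:
            return t - start                 # first subsequent good frame
    if start is not None:
        return reporting_period_end_s - start  # corruption ran to period end
    return 0.0

duration = corruption_duration(
    [(0.0, True), (0.1, False), (0.2, False), (0.3, True)], 1.0)
# corruption lasted from 0.1 s to 0.3 s
```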
- The objective of the above quality metric definition is to obtain consistent measurements across content type, terminals, and types of Radio Access Network (RAN).
- The constraints are to minimize the size of the quality metrics report that will be sent to the streaming server, and the complexity for the terminal.
- The actual quality metrics feedback can be conveyed to the PSS server by using the SET_PARAMETER method of the RTSP with a feedback header 2 as depicted in Fig. 2 (with reference to IETF Request for Comments (RFC) document 2327). However, in particular cases, it is more efficient to use other methods to carry the information, as for instance the TEARDOWN message or the PAUSE message.
- In the feedback header 2 of Fig. 2, Stream-url is the RTSP session or media control URL identifier for the feedback parameter. The Metrics field in the Parameters definition contains the name of the metrics/measurements (for instance corruption duration, etc.). The Value field indicates the results. There is the possibility that the same event occurs more than once during a monitoring period. In that case the metrics value can occur more than once, which indicates the number of events to the server. The optional Range field indicates the reporting period.
- The optional Timestamp field in the feedback header 2 of Fig. 2 indicates the time when the event (or measurement) occurred or when the metric was calculated, relative to the beginning of the session.
- The four quality metrics defined by Tdoc S4-040073 only allow for a coarse characterization of the quality of the playback of multimedia streams as experienced by a user. For instance, if two streaming sessions have the same values of the four quality metrics defined by Tdoc S4-040073, and if in the first of said sessions, a perfect synchronization between audio and video data exists, whereas in the second of said sessions, said synchronization between audio and video has been lost, the reported quality based on the four quality metrics defined by Tdoc S4-040073 is the same, while the actually experienced quality of playback is quite different. Furthermore, the four quality metrics defined by Tdoc S4-040073 do not differentiate between the different frame types contained in said multimedia stream, so that, for instance, the loss of frame types that are of crucial importance for the experienced quality of the playback cannot be differentiated from the loss of less important types of frames when reporting quality.
- Document US 2003/046384 A1 discloses a method of providing streaming service quality metrics from a client to a server. Different quality metrics can be used, such as peak bandwidth, low bandwidth, high/low latency, jitter, and client video-rendering frame rate.
- Document XP009028505 discloses a method for efficient frame scheduling for MPEG video streams. The method is evaluated by using the standard deviation of the frame rate, which is related to a mean value of frame rates over multiple streams.
- Document XP002374403 discloses using effective frame rate as a metric used for video streaming.
- The present invention defines a method according to claim 1, a computer program according to claim 13, a computer program product according to claim 14, a system according to claim 15, a client according to claim 16 and a server according to claim 28. Further embodiments are set forth in the dependent claims 2-12, 17-27, 29 and 30.
- The figures show:
- Fig. 1:
- A schematic representation of a Packet-Switched Streaming Service (PSS) protocol stack according to the prior art,
- Fig. 2:
- a definition of a Real-time Streaming Protocol (RTSP) negotiation header according to the prior art,
- Fig. 3:
- a flowchart of the method of the present invention, and
- Fig. 4:
- a schematic representation of a system according to the present invention.
- The present invention proposes special quality metrics to be used in quality feedback for streaming services in order to refine quality feedback. In the following, the proposed quality metrics and their associated timestamps will be described in more detail.
- This quality metric may only be applicable for audio, video and speech, and it may not be applicable to other media types. It gives information on the playback frame rate. Frame rate deviation happens when the playback frame rate deviates from a pre-defined value. This quality metric may contain both the time duration of the event and the frame rate deviation value, i.e. the difference between a pre-defined frame rate and the actual playback frame rate. The time duration may be expressed in units of seconds, and may be a fractional value. The deviation value may be expressed in units of frames per second, and may also be a fractional value.
- Said pre-defined value may be a default value known by both the server and the client, or it may be provided by the server during QoE negotiation. If it is to be provided by the server, the server may decide the value by checking the media bitstream or any other means.
- It may be advantageous that the value indicates the average frame rate calculated when the media stream is locally played back.
- From an implementation point of view, the following method may be specified to calculate the frame rate. It is assumed that the frame rate changes only at integer seconds. The frame rate value of the time period from second n-1 to second n is equal to the number of frames played back during the period. The time axis may represent NPT time and may originate from the starting time of the QoE reporting period.
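The per-second frame rate calculation described above could be implemented along the following lines; the function names are illustrative and the deviation helper is an assumption showing how the frame-rate-deviation metric could be derived from the per-second rates:

```python
# Sketch of the calculation above: the frame rate for the period from
# second n-1 to second n equals the number of frames played back during
# that period, on an NPT axis originating at the start of the QoE
# reporting period. Frame rate is assumed to change only at integer seconds.

import math
from collections import Counter

def per_second_frame_rates(frame_play_times_s, period_length_s):
    """frame_play_times_s: NPT (in seconds) of each played frame, relative
    to the start of the QoE reporting period."""
    counts = Counter(math.floor(t) for t in frame_play_times_s)
    return [counts.get(n, 0) for n in range(period_length_s)]

def frame_rate_deviations(rates, predefined_fps):
    # (second index, deviation from the pre-defined rate) for each deviating second
    return [(n, predefined_fps - r) for n, r in enumerate(rates) if r != predefined_fps]

rates = per_second_frame_rates([0.0, 0.4, 0.8, 1.1, 1.6], period_length_s=3)
# 3 frames fall into second 0, 2 into second 1, none into second 2
```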
- The timestamp associated with the frame-rate-deviation metric indicates the time when the frame rate deviation has occurred. The value of the timestamp may be equal to the NPT of the first played frame during the frame rate deviation event, relative to the starting time of the QoE reporting period. If there is no played frame during the event, the value may be equal to the NPT of the last played frame before the event or the starting time of the QoE reporting period, whichever is later, relative to the starting time of the QoE reporting period.
- This quality metric may be only applicable for audio, video and speech, and may not be applicable to other media types. A playback jitter happens when the absolute difference between the actual playback time and the scheduled playback time is larger than a pre-defined value. This metric may be expressed in units of seconds, and may be a fractional value. Said pre-defined value may be a default value known by both the server and the client, e.g. 100 milliseconds.
- The timestamp associated with the jitter-duration metric indicates the time when the playback jitter has occurred. The value of the timestamp may be equal to the NPT of the first played frame in the playback jitter, relative to the starting time of the QoE reporting period.
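The playback jitter condition above can be sketched as follows; the per-frame (actual, scheduled) representation and the function name are assumptions for illustration, and 100 ms is the example default named above:

```python
# Hedged sketch of playback jitter detection: a jitter occurs when the
# absolute difference between the actual playback time and the scheduled
# playback time exceeds a pre-defined value (assumed here: 100 ms).

JITTER_THRESHOLD_S = 0.1

def jitter_events(frames, threshold=JITTER_THRESHOLD_S):
    """frames: (actual_npt_s, scheduled_npt_s) per played frame.
    Returns (timestamp, jitter) pairs; the timestamp is the NPT of the
    first frame played inside the jitter event."""
    events = []
    for actual, scheduled in frames:
        if abs(actual - scheduled) > threshold:
            events.append((actual, abs(actual - scheduled)))
    return events

events = jitter_events([(0.00, 0.00), (0.25, 0.10), (0.40, 0.40)])
# only the second frame deviates by more than 100 ms from its schedule
```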
- This quality metric may be applicable for any pair of media types. Define a value A as the difference between the playback time of the last played frame of a first media stream of said at least one media stream and the playback time of the last played frame of a second media stream of said at least one media stream, and define a value B as the difference between the scheduled playback time of said last played frame of said first media stream and the scheduled playback time of said last played frame of said second media stream.
- A synchronization loss happens when the absolute difference between said value A and said value B is larger than a pre-defined value. This quality metric may be expressed in units of seconds, and may be a fractional value. Said pre-defined value may be a default value known by both the server and the client, e.g. 100 milliseconds.
- The timestamp associated with the synchronization-loss-duration metric indicates the time when the playback synchronization loss has occurred. The value of the timestamp may be equal to the NPT of the first played frame in the synchronization loss, relative to the starting time of the QoE reporting period.
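The A/B test defined above can be sketched as follows; the argument structure and 100 ms threshold are illustrative assumptions:

```python
# Illustrative check for playback synchronization loss: a loss occurs
# when |A - B| exceeds a pre-defined value (assumed here: 100 ms), with
# A and B as defined in the text above.

SYNC_THRESHOLD_S = 0.1

def synchronization_lost(stream1_last, stream2_last, threshold=SYNC_THRESHOLD_S):
    """Each argument: (actual_playback_time_s, scheduled_playback_time_s)
    of the last played frame of the respective media stream."""
    value_a = stream1_last[0] - stream2_last[0]  # difference of actual playback times
    value_b = stream1_last[1] - stream2_last[1]  # difference of scheduled playback times
    return abs(value_a - value_b) > threshold

# the first stream has drifted 150 ms relative to the second, against schedule:
lost = synchronization_lost((10.15, 10.0), (10.0, 10.0))
```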
- This quality metric may be only applicable for video, and may not be applicable to other media types. This metric refers to the number of corrupted scene cut frames during the QoE reporting period. This metric may be expressed in units of integers larger than or equal to zero. If this metric is supported, and if no such metric is reported for a QoE reporting period, this may indicate that no scene cut frame has been corrupted during the QoE reporting period.
- This quality metric may be used if the server has information indicating which frames are scene cut frames, or if the server implements a method to derive the information, for instance by using a scene cut detection algorithm. In this case, it may be advantageous that the server makes the information available to the client either via in-band or out-of-band signaling. This quality metric may also be used if the client implements a method to derive the information, for instance by using a scene cut detection algorithm. For example, in ITU-T H.264 (a.k.a. ISO/IEC MPEG-4 Part 10), scene cut information can be conveyed to the client in-band using the scene information supplemental enhancement information (SEI) message.
- The timestamp associated with the number-of-corrupted-scene-cut-frames metric may not be defined, since it may be clear that the time when the number of corrupted scene cut frames is measured is equal to the ending time of the QoE reporting period.
- An example protocol syntax for each of the new quality metrics as proposed by the present invention may for instance read as follows:
Framerate_Deviation = "Framerate_Deviation" "=" "{" SP / (Value1 SP Value2 [SP Timestamp])#("," Value1 SP Value2 [SP Timestamp]) "}";
Jitter_Duration = "Jitter_Duration" "=" "{" SP / (Value [SP Timestamp])#("," Value [SP Timestamp]) "}";
Syncloss_Duration = "Syncloss_Duration" "=" "{" SP / (Value [SP Timestamp])#("," Value [SP Timestamp]) "}";
Corrupted_Scene_Cuts = "Corrupted_Scene_Cuts" "=" "{" SP / Value "}";
Pre-defined Frame Rate = "FR" "=" 1*DIGIT "." 1*DIGIT
QoE-Header = "QoE-Metrics" ":" "Off" / 1# (stream-url ";" Metrics ";" Sending-rate [";" Range] [";" FR]) CRLF and a=QoE-Metrics: Metrics ";" Sending-rate [";" Range] [";" FR] CRLF
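A rough serialization of quality feedback values following the example syntax above could look as follows; this is a deliberate simplification of the ABNF (single-space separators, no SP escaping), not a conformant implementation:

```python
# Hypothetical sketch: serialize (value, timestamp) pairs into the
# name={value SP timestamp,...} pattern of the example syntax above.
# The timestamp may be omitted (None), as it is optional in the syntax.

def format_metric(name, entries):
    """entries: list of (value, timestamp) pairs; timestamp may be None."""
    parts = []
    for value, timestamp in entries:
        parts.append(f"{value} {timestamp}" if timestamp is not None else str(value))
    return name + "={" + ",".join(parts) + "}"

header = format_metric("Jitter_Duration", [(0.25, 12.0), (0.4, None)])
# 'Jitter_Duration={0.25 12.0,0.4}'
```

Note that, as stated above, the same quality feedback value may occur more than once inside the braces, which indicates the number of events to the server.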
Claims (30)
- A method comprising:- determining (304) a quality feedback value according to at least one quality metric from a pre-defined set of quality metrics, the at least one quality metric being at least a quality metric related to a playback of a media stream streamed to a client (601) in a streaming service; and- reporting (306) said quality feedback value to a server (600),characterized in that
said quality metric related to said playback of said media stream is a frame-rate-deviation metric that is related to a deviation of a playback frame rate of said playback of said media stream from a pre-defined value provided by said server. - The method according to claim 1, wherein said quality metric contains a frame rate deviation value indicating a difference between a pre-defined frame rate and the actual playback frame rate of said playback of said media stream.
- The method according to claim 1, wherein said frame-rate-deviation metric further is related to a duration of said deviation.
- The method according to claim 1, wherein said quality metric is reported together with a timestamp.
- The method according to claim 1, wherein said frame-rate-deviation metric is reported together with a timestamp, and wherein said timestamp is equal to a play time of a first played frame during said deviation relative to a starting time of a quality reporting period.
- The method according to claim 5, wherein said play time is a Normal Play Time NPT.
- The method according to claim 1, wherein said pre-defined set of quality metrics contains a further quality metric related to said playback of said media stream, wherein said further quality metric is a jitter-duration metric that is related to a duration of a jitter event that occurs when a difference between a playback time of said media stream and a scheduled playback time is larger than a pre-defined value.
- The method according to claim 1, wherein said streaming of said media stream is based on a Real-time Transport Protocol RTP (102).
- The method according to claim 1, wherein said streaming is at least partially controlled by a Real-time Streaming Protocol RTSP (109).
- The method according to claim 9, wherein said quality feedback value is reported to said server (600) via said RTSP (109).
- The method according to claim 1, wherein said streaming service is a Packet-switched Streaming Service PSS in a 3G mobile communications system.
- The method according to claim 1, wherein said pre-defined value is provided by said server during QoE negotiation.
- A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method steps of claim 1.
- A computer program product comprising a computer program according to claim 13.
- A system, comprising:- at least one server (600), and- at least one client (601),wherein a media stream is streamed to said at least one client (601) for playback, wherein a quality feedback value is determined according to at least one quality metric from a pre-defined set of quality metrics, the at least one quality metric being at least a quality metric related to said playback of said media stream, and wherein said quality feedback value is reported to said at least one server (600),
characterized in that
said quality metric related to said playback of said media stream is a frame-rate-deviation metric that is related to a deviation of a playback frame rate of said playback of said media stream from a pre-defined value provided by said server. - A client (601), comprising:- means for receiving a media stream that is streamed to said client (601) for playback in a streaming service,- means for determining a quality feedback value according to at least one quality metric from a pre-defined set of quality metrics, the at least one quality metric being at least a quality metric related to said playback of said media stream; and- means for reporting said quality feedback value to a server (600),characterized in that
said quality metric related to said playback of said media stream is a frame-rate-deviation metric that is related to a deviation of a playback frame rate of said playback of said media stream from a pre-defined value provided by said server (600). - The client (601) according to claim 16, wherein said quality metric contains a frame rate deviation value indicating a difference between a pre-defined frame rate and the actual playback frame rate of said playback of said media stream.
- The client (601) according to claim 16, wherein said frame-rate-deviation metric further is related to a duration of said deviation.
- The client (601) according to claim 16, wherein said quality metric is reported together with a timestamp.
- The client (601) according to claim 16, wherein said frame-rate-deviation metric is reported together with a timestamp, and wherein said timestamp is equal to a play time of a first played frame during said deviation relative to a starting time of a quality reporting period.
- The client (601) according to claim 20, wherein said play time is a Normal Play Time NPT.
- The client (601) according to claim 16, wherein said pre-defined set of quality metrics contains a further quality metric related to said playback of said media stream, wherein said further quality metric is a jitter-duration metric that is related to a duration of a jitter event that occurs when a difference between a playback time of said media stream and a scheduled playback time is larger than a pre-defined value.
- The client (601) according to claim 16, wherein said streaming of said media stream is based on a Real-time Transport Protocol RTP (102).
- The client (601) according to claim 16, wherein said streaming is at least partially controlled by a Real-time Streaming Protocol RTSP (109).
- The client (601) according to claim 24, wherein said quality feedback value is reported to said server via said RTSP (109).
- The client (601) according to claim 16, wherein said streaming service is a Packet-switched Streaming Service PSS in a 3G mobile communications system.
- The client (601) according to claim 16, wherein said pre-defined value is provided by said server (600) during QoE negotiation.
- A server (600) comprising: means for receiving a quality feedback value that is reported by a client (601) to said server (600), wherein said quality feedback value is determined by said client (601) according to at least one quality metric from a pre-defined set of quality metrics, the at least one quality metric being at least a quality metric related to a playback of a media stream that is streamed to said client (601), characterized in that said quality metric related to said playback of said media stream is a frame-rate-deviation metric that is related to a deviation of a playback frame rate of said playback of said media stream from a pre-defined value, and in that said server further comprises means for providing said pre-defined value to said client.
- The server (600) according to claim 28, wherein said quality metric contains a frame rate deviation value indicating a difference between a pre-defined frame rate and the actual playback frame rate of said playback of said media stream.
- The server (600) according to claim 28, wherein said means for providing said pre-defined value to said client are means for providing said pre-defined value to said client during QoE negotiation.
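The two playback-quality metrics in the claims above can be illustrated with a minimal sketch: a frame-rate-deviation value (difference between a pre-defined frame rate and the actual playback frame rate over a reporting interval) and a jitter-duration value (length of an event in which playback lags its scheduled time by more than a pre-defined threshold). All function and parameter names here are hypothetical, for illustration only; they do not come from the patent or from any 3GPP/RTSP specification.

```python
def frame_rate_deviation(predefined_fps, frames_played, interval_seconds):
    """Hypothetical frame-rate-deviation value: difference between a
    pre-defined frame rate and the actual playback frame rate measured
    over one quality-reporting interval."""
    actual_fps = frames_played / interval_seconds
    return predefined_fps - actual_fps


def jitter_durations(scheduled_times, actual_times, threshold):
    """Hypothetical jitter-duration metric: durations of events where the
    actual playback time lags the scheduled playback time by more than
    `threshold` seconds."""
    durations = []
    event_start = None
    for sched, actual in zip(scheduled_times, actual_times):
        lagging = (actual - sched) > threshold
        if lagging and event_start is None:
            event_start = sched          # jitter event begins
        elif not lagging and event_start is not None:
            durations.append(sched - event_start)  # jitter event ends
            event_start = None
    if event_start is not None:          # event still open at interval end
        durations.append(scheduled_times[-1] - event_start)
    return durations
```

For example, a client configured for 15 fps that renders 120 frames in a 10-second reporting period would report a deviation of 3.0 fps; a playback run that drifts 0.3 s behind schedule between media times 1 s and 3 s, against a 0.2 s threshold, yields one jitter event of 2 s.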
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/841,228 US8010652B2 (en) | 2004-05-07 | 2004-05-07 | Refined quality feedback in streaming services |
PCT/IB2005/001296 WO2005109821A1 (en) | 2004-05-07 | 2005-05-03 | Refined quality feedback in streaming services |
Publications (2)
Publication Number | Publication Date |
---|---|
EP1743467A1 EP1743467A1 (en) | 2007-01-17 |
EP1743467B1 true EP1743467B1 (en) | 2019-02-06 |
Family
ID=34967332
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP05737952.1A Active EP1743467B1 (en) | 2004-05-07 | 2005-05-03 | Refined quality feedback in streaming services |
Country Status (8)
Country | Link |
---|---|
US (3) | US8010652B2 (en) |
EP (1) | EP1743467B1 (en) |
JP (1) | JP2007536859A (en) |
KR (2) | KR100906158B1 (en) |
CN (1) | CN1951083B (en) |
AU (1) | AU2005241687B2 (en) |
TW (1) | TWI309121B (en) |
WO (1) | WO2005109821A1 (en) |
Families Citing this family (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6930709B1 (en) | 1997-12-04 | 2005-08-16 | Pentax Of America, Inc. | Integrated internet/intranet camera |
EP1723539A4 (en) * | 2004-03-03 | 2009-02-25 | Packetvideo Network Solutions | System and method for retrieving digital multimedia content from a network node |
KR100640862B1 (en) * | 2004-08-03 | 2006-11-02 | 엘지전자 주식회사 | A dynamic control method of an timeout measurement in a forward message transmission |
US8374166B1 (en) * | 2005-09-22 | 2013-02-12 | Verizon Patent And Licensing Inc. | Method and system for providing call waiting features in a SIP-based network |
CN101026616B (en) * | 2006-02-18 | 2013-01-09 | 华为技术有限公司 | Multimedia subsystem based interactive media session establishing system and method |
WO2007133697A2 (en) | 2006-05-11 | 2007-11-22 | Cfph, Llc | Methods and apparatus for electronic file use and management |
US20090313376A1 (en) * | 2006-06-02 | 2009-12-17 | Mats Cedervall | Method and apparatuses for establishing a session between a client terminal and a media supply system to transport a unicast media stream over an ip network |
PL2615833T3 (en) * | 2006-10-19 | 2014-10-31 | Ericsson Telefon Ab L M | A method for determining video quality |
US8959239B2 (en) * | 2006-12-29 | 2015-02-17 | Telefonaktiebolaget L M Ericsson (Publ) | Method and apparatus for reporting streaming media quality |
KR100810223B1 (en) | 2007-01-19 | 2008-03-06 | 삼성전자주식회사 | System and method for providing live streaming service between terminals |
US20080228912A1 (en) * | 2007-03-16 | 2008-09-18 | Ramakrishna Vedantham | Enhanced Quality Reporting for Transmission Sessions |
US7991904B2 (en) | 2007-07-10 | 2011-08-02 | Bytemobile, Inc. | Adaptive bitrate management for streaming media over packet networks |
US7987285B2 (en) | 2007-07-10 | 2011-07-26 | Bytemobile, Inc. | Adaptive bitrate management for streaming media over packet networks |
US7996482B1 (en) * | 2007-07-31 | 2011-08-09 | Qurio Holdings, Inc. | RDMA based real-time video client playback architecture |
US20090125636A1 (en) * | 2007-11-13 | 2009-05-14 | Qiong Li | Payload allocation methods for scalable multimedia servers |
US8762476B1 (en) | 2007-12-20 | 2014-06-24 | Qurio Holdings, Inc. | RDMA to streaming protocol driver |
US20090210552A1 (en) * | 2008-02-15 | 2009-08-20 | Alcatel Lucent | Facilitating access to IPTV content using a portable device while roaming |
WO2009104869A1 (en) * | 2008-02-20 | 2009-08-27 | Electronics And Telecommunications Research Institute | Method and apparatus for svc video and aac audio synchronization using npt |
KR100916505B1 (en) * | 2008-02-20 | 2009-09-08 | 한국전자통신연구원 | Method and apparatus for svc video and aac audio synchronization using ntp |
US8060904B1 (en) | 2008-02-25 | 2011-11-15 | Qurio Holdings, Inc. | Dynamic load based ad insertion |
US20100029266A1 (en) * | 2008-07-02 | 2010-02-04 | Nokia Corporation | System and methods for quality of experience reporting |
KR101324222B1 (en) * | 2008-09-04 | 2013-11-01 | 삼성테크윈 주식회사 | Image processing apparatus |
US8775665B2 (en) * | 2009-02-09 | 2014-07-08 | Citrix Systems, Inc. | Method for controlling download rate of real-time streaming as needed by media player |
US9154375B2 (en) * | 2009-11-20 | 2015-10-06 | Carrier Iq, Inc. | Method for recording user experience or performance of a peripheral device |
CN102238152B (en) * | 2010-05-06 | 2015-09-23 | 华为技术有限公司 | Control the methods, devices and systems of content report behavior |
US8423658B2 (en) * | 2010-06-10 | 2013-04-16 | Research In Motion Limited | Method and system to release internet protocol (IP) multimedia subsystem (IMS), session initiation protocol (SIP), IP-connectivity access network (IP-CAN) and radio access network (RAN) networking resources when IP television (IPTV) session is paused |
EP2448265A1 (en) | 2010-10-26 | 2012-05-02 | Google, Inc. | Lip synchronization in a video conference |
TWI583160B (en) | 2011-02-11 | 2017-05-11 | 內數位專利控股公司 | Method and apparatus for synchronizing mobile station media flows during a collaborative session |
US9288251B2 (en) | 2011-06-10 | 2016-03-15 | Citrix Systems, Inc. | Adaptive bitrate management on progressive download with indexed media files |
EP2719144B1 (en) | 2011-06-10 | 2018-08-08 | Citrix Systems, Inc. | On-demand adaptive bitrate management for streaming media over packet networks |
US9210302B1 (en) | 2011-08-10 | 2015-12-08 | Google Inc. | System, method and apparatus for multipoint video transmission |
US20130195119A1 (en) * | 2011-10-14 | 2013-08-01 | Qualcomm Incorporated | Feedback channel for wireless display devices |
US9769281B2 (en) * | 2011-12-19 | 2017-09-19 | Google Technology Holdings LLC | Method and apparatus for determining a multimedia representation for a multimedia asset delivered to a client device |
US8917309B1 (en) | 2012-03-08 | 2014-12-23 | Google, Inc. | Key frame distribution in video conferencing |
US10489389B2 (en) | 2012-06-07 | 2019-11-26 | Wormhole Labs, Inc. | Experience analytic objects, systems and methods |
US8791982B1 (en) | 2012-06-27 | 2014-07-29 | Google Inc. | Video multicast engine |
US9125073B2 (en) * | 2012-08-03 | 2015-09-01 | Intel Corporation | Quality-aware adaptive streaming over hypertext transfer protocol using quality attributes in manifest file |
US9100698B2 (en) * | 2012-10-26 | 2015-08-04 | Motorola Solutions, Inc. | Systems and methods for sharing bandwidth across multiple video streams |
US9348903B2 (en) | 2013-02-08 | 2016-05-24 | John Moran | Methods, devices and computer readable mediums for a music recognition game |
US9774869B2 (en) * | 2013-03-25 | 2017-09-26 | Blackberry Limited | Resilient signal encoding |
US9665895B2 (en) * | 2013-08-12 | 2017-05-30 | Mov, Inc. | Technologies for video-based commerce |
GB2521078B (en) * | 2013-10-24 | 2016-01-13 | Motorola Solutions Inc | Systems and methods for sharing bandwidth across multiple video streams |
CN103594103B (en) * | 2013-11-15 | 2017-04-05 | 腾讯科技(成都)有限公司 | Audio-frequency processing method and relevant apparatus |
WO2015113960A1 (en) * | 2014-01-29 | 2015-08-06 | Koninklijke Kpn N.V. | Establishing a streaming presentation of an event |
US9992500B2 (en) * | 2014-03-18 | 2018-06-05 | Intel Corporation | Techniques for evaluating compressed motion video quality |
KR101507032B1 (en) * | 2014-08-26 | 2015-04-01 | 계명대학교 산학협력단 | System and method for real-time synchronized playback of streaming media |
US11265359B2 (en) | 2014-10-14 | 2022-03-01 | Koninklijke Kpn N.V. | Managing concurrent streaming of media streams |
US9681163B1 (en) * | 2015-03-26 | 2017-06-13 | Amazon Technologies, Inc. | Identify bad files using QoS data |
US9609275B2 (en) | 2015-07-08 | 2017-03-28 | Google Inc. | Single-stream transmission method for multi-user video conferencing |
US11089183B1 (en) * | 2019-08-20 | 2021-08-10 | Amazon Technologies, Inc. | Multiple device audio-video synchronization |
WO2021178900A1 (en) | 2020-03-06 | 2021-09-10 | Christopher Renwick Alston | Technologies for augmented-reality |
US12058193B2 (en) * | 2021-06-30 | 2024-08-06 | Tencent America LLC | Bidirectional presentation datastream |
Family Cites Families (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE4101270A1 (en) | 1991-01-17 | 1992-07-23 | Siemens Ag | METHOD FOR TRANSMITTING DIGITAL SIGNALS |
US5493584A (en) * | 1993-12-27 | 1996-02-20 | Emeott; Stephen P. | Method for determining a channel quality metric in a receiver |
US5533021A (en) | 1995-02-03 | 1996-07-02 | International Business Machines Corporation | Apparatus and method for segmentation and time synchronization of the transmission of multimedia data |
US5652749A (en) | 1995-02-03 | 1997-07-29 | International Business Machines Corporation | Apparatus and method for segmentation and time synchronization of the transmission of a multiple program multimedia data stream |
JP3967443B2 (en) | 1998-01-22 | 2007-08-29 | 富士通株式会社 | Image data transmission / reception system, transmitting apparatus thereof, receiving apparatus thereof, and storage medium storing program thereof |
US6259677B1 (en) | 1998-09-30 | 2001-07-10 | Cisco Technology, Inc. | Clock synchronization and dynamic jitter management for voice over IP and real-time data |
US7606164B2 (en) * | 1999-12-14 | 2009-10-20 | Texas Instruments Incorporated | Process of increasing source rate on acceptable side of threshold |
US6658027B1 (en) | 1999-08-16 | 2003-12-02 | Nortel Networks Limited | Jitter buffer management |
US7143432B1 (en) * | 1999-10-01 | 2006-11-28 | Vidiator Enterprises Inc. | System for transforming streaming video data |
US7574351B2 (en) * | 1999-12-14 | 2009-08-11 | Texas Instruments Incorporated | Arranging CELP information of one frame in a second packet |
EP1130839B1 (en) | 2000-03-02 | 2005-06-08 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for retransmitting video data frames with priority levels |
GB2362533A (en) * | 2000-05-15 | 2001-11-21 | Nokia Mobile Phones Ltd | Encoding a video signal with an indicator of the type of error concealment used |
US7286652B1 (en) | 2000-05-31 | 2007-10-23 | 3Com Corporation | Four channel audio recording in a packet based network |
US6973102B2 (en) | 2000-07-31 | 2005-12-06 | Telefonaktiebolaget Lm Ericsson (Publ) | Jitter reduction in differentiated services (DiffServ) networks |
FI120125B (en) * | 2000-08-21 | 2009-06-30 | Nokia Corp | Image Coding |
US6785353B1 (en) * | 2000-09-06 | 2004-08-31 | Telogy Networks, Inc. | Synchronization loss detection in a V.34 receiver |
US7336613B2 (en) | 2000-10-17 | 2008-02-26 | Avaya Technology Corp. | Method and apparatus for the assessment and optimization of network traffic |
EP1338131B1 (en) * | 2000-11-29 | 2009-08-19 | BRITISH TELECOMMUNICATIONS public limited company | Transmitting and receiving real-time data |
US6934318B2 (en) * | 2000-12-22 | 2005-08-23 | Qualcomm, Incorporated | Method and system for energy based frame rate determination |
US20020080719A1 (en) * | 2000-12-22 | 2002-06-27 | Stefan Parkvall | Scheduling transmission of data over a transmission channel based on signal quality of a receive channel |
US6856601B1 (en) | 2001-04-03 | 2005-02-15 | Cisco Technology, Inc. | Shared digital signal processing resources for communications devices |
US20020178330A1 (en) * | 2001-04-19 | 2002-11-28 | Schlowsky-Fischer Mark Harold | Systems and methods for applying a quality metric to caching and streaming of multimedia files over a network |
US7047308B2 (en) | 2001-08-31 | 2006-05-16 | Sharp Laboratories Of America, Inc. | System and method for simultaneous media playout |
US7185084B2 (en) * | 2001-09-05 | 2007-02-27 | Intel Corporation | Server-side measurement of client-perceived quality of service |
JP2003304523A (en) * | 2002-02-08 | 2003-10-24 | Ntt Docomo Inc | Information delivery system, information delivery method, information delivery server, content delivery server, and terminal |
US7010598B2 (en) * | 2002-02-11 | 2006-03-07 | Akamai Technologies, Inc. | Method and apparatus for measuring stream availability, quality and performance |
US7596373B2 (en) | 2002-03-21 | 2009-09-29 | Mcgregor Christopher M | Method and system for quality of service (QoS) monitoring for wireless devices |
US6981184B2 (en) * | 2002-04-11 | 2005-12-27 | Motorola, Inc. | Apparatus and method for processing a corrupted frame |
JP3964751B2 (en) | 2002-07-15 | 2007-08-22 | 日本電信電話株式会社 | Network quality estimation control method |
CN1669322A (en) * | 2002-07-15 | 2005-09-14 | 诺基亚有限公司 | Method for error concealment in video sequences |
US7038710B2 (en) * | 2002-07-17 | 2006-05-02 | Koninklijke Philips Electronics, N.V. | Method and apparatus for measuring the quality of video data |
US20040073690A1 (en) * | 2002-09-30 | 2004-04-15 | Neil Hepworth | Voice over IP endpoint call admission |
WO2004040928A1 (en) | 2002-10-29 | 2004-05-13 | Telefonaktiebolaget Lm Ericsson (Publ) | Reporting for multi-user services in wireless networks |
AU2003295515A1 (en) | 2002-11-11 | 2004-06-03 | Supracomm, Inc. | Multicast videoconferencing |
US7251267B2 (en) | 2002-12-13 | 2007-07-31 | Lucent Technologies Inc. | System and method for determining a best round trip delay indicator |
US7630612B2 (en) * | 2003-02-10 | 2009-12-08 | At&T Intellectual Property, I, L.P. | Video stream adaptive frame rate scheme |
US7394833B2 (en) | 2003-02-11 | 2008-07-01 | Nokia Corporation | Method and apparatus for reducing synchronization delay in packet switched voice terminals using speech decoder modification |
EP1453269A1 (en) * | 2003-02-25 | 2004-09-01 | Matsushita Electric Industrial Co., Ltd. | A method of reporting quality metrics for packet switched streaming |
US20040170163A1 (en) | 2003-02-28 | 2004-09-02 | Zarlink Semiconductor V.N. Inc. | Data structure providing storage and bandwidth savings for hardware RTCP statistics collection applications |
US20050163047A1 (en) | 2003-03-20 | 2005-07-28 | Christopher M. Mcgregor, Gregory M. Mcgregor And Travis M. Mcgregor | Method and system for processing quality of service (QOS) performance levels for wireless devices |
US20050010462A1 (en) | 2003-07-07 | 2005-01-13 | Mark Dausch | Knowledge management system and methods for crude oil refining |
JP2007503741A (en) * | 2003-08-21 | 2007-02-22 | ビディアトアー エンタープライジズ インコーポレイテッド | Quality of experience (QOE) metrics for wireless communication networks |
WO2005022865A1 (en) | 2003-09-02 | 2005-03-10 | Nokia Corporation | Transmission of embedded information relating to a quality of service |
JP4219930B2 (en) | 2003-09-10 | 2009-02-04 | 富士通株式会社 | Transmission parameter control device |
US20050076113A1 (en) * | 2003-09-12 | 2005-04-07 | Finisar Corporation | Network analysis sample management process |
US7158899B2 (en) | 2003-09-25 | 2007-01-02 | Logicvision, Inc. | Circuit and method for measuring jitter of high speed signals |
US20050089092A1 (en) * | 2003-10-22 | 2005-04-28 | Yasuhiro Hashimoto | Moving picture encoding apparatus |
-
2004
- 2004-05-07 US US10/841,228 patent/US8010652B2/en active Active
-
2005
- 2005-05-03 JP JP2007512580A patent/JP2007536859A/en active Pending
- 2005-05-03 EP EP05737952.1A patent/EP1743467B1/en active Active
- 2005-05-03 KR KR1020067023233A patent/KR100906158B1/en active IP Right Grant
- 2005-05-03 AU AU2005241687A patent/AU2005241687B2/en not_active Ceased
- 2005-05-03 KR KR1020077026408A patent/KR100813929B1/en active IP Right Grant
- 2005-05-03 WO PCT/IB2005/001296 patent/WO2005109821A1/en active Application Filing
- 2005-05-03 CN CN2005800144295A patent/CN1951083B/en active Active
- 2005-05-05 TW TW094114476A patent/TWI309121B/en active
-
2008
- 2008-04-02 US US12/080,494 patent/US7743141B2/en not_active Expired - Lifetime
-
2010
- 2010-04-29 US US12/799,656 patent/US8060608B2/en not_active Expired - Lifetime
Non-Patent Citations (2)
Title |
---|
SEELAM N ET AL: "A hysteresis based approach for quality, frame rate, and buffer management for video streaming using TCP", SECURITY IN COMMUNICATION NETWORKS : THIRD INTERNATIONAL CONFERENCE ; REVISED PAPERS / SCN 2002, AMALFI, ITALY, SEPTEMBER 11 - 13, 2002; [LECTURE NOTES IN COMPUTER SCIENCE , ISSN 0302-9743], SPRINGER VERLAG, DE, vol. 2216, 29 October 2001 (2001-10-29), pages 1 - 15, XP002374403, ISBN: 978-3-540-24128-7, DOI: 10.1007/3-540-45508-6_1 * |
WAN H-H ET AL: "EFFICIENT FRAME SCHEDULING FOR MPEG VIDEO STREAMS", PROCEEDINGS OF THE IASTED CONFERENCE. INTERNET AND MULTIMEDIASYSTEMS AND APPLICATIONS. PROCEEDINGS OF CONFERENCE ON INTERNETAND MULTIMEDIA SYSTEMS AND APPLICATIONS, XX, XX, 19 November 2000 (2000-11-19), pages 398 - 403, XP009028505 * |
Also Published As
Publication number | Publication date |
---|---|
US20050259947A1 (en) | 2005-11-24 |
TW200623774A (en) | 2006-07-01 |
CN1951083A (en) | 2007-04-18 |
US8010652B2 (en) | 2011-08-30 |
KR20070112433A (en) | 2007-11-23 |
KR20060135939A (en) | 2006-12-29 |
US20080189412A1 (en) | 2008-08-07 |
US20100215339A1 (en) | 2010-08-26 |
AU2005241687A1 (en) | 2005-11-17 |
EP1743467A1 (en) | 2007-01-17 |
KR100813929B1 (en) | 2008-03-18 |
KR100906158B1 (en) | 2009-07-03 |
TWI309121B (en) | 2009-04-21 |
US8060608B2 (en) | 2011-11-15 |
JP2007536859A (en) | 2007-12-13 |
CN1951083B (en) | 2012-07-11 |
US7743141B2 (en) | 2010-06-22 |
AU2005241687B2 (en) | 2008-09-04 |
WO2005109821A1 (en) | 2005-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1743467B1 (en) | Refined quality feedback in streaming services | |
AU2004317111B2 (en) | Timing of quality of experience metrics | |
US20200296150A1 (en) | Classified media quality of experience | |
EP2098033B1 (en) | Method and apparatus for reporting streaming media quality | |
EP2364017B1 (en) | Method, system and user device for obtaining key frame in streaming media service | |
EP2568669B1 (en) | Method, apparatus and system for controlling content reporting behaviors | |
KR100808981B1 (en) | Timing of quality of experience metrics | |
KR20060108771A (en) | Classified media quality of experience | |
JP2012147504A (en) | Quality of classified media experiences |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20061018 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: HANNUKSELA, MISKA Inventor name: CURCIO, IGOR Inventor name: WANG, YE-KUI Inventor name: AKSU, EMRE |
|
DAX | Request for extension of the european patent (deleted) | ||
17Q | First examination report despatched |
Effective date: 20080213 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: NOKIA CORPORATION |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: NOKIA TECHNOLOGIES OY |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20180906 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP Ref country code: AT Ref legal event code: REF Ref document number: 1095564 Country of ref document: AT Kind code of ref document: T Effective date: 20190215 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602005055361 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190606 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1095564 Country of ref document: AT Kind code of ref document: T Effective date: 20190206 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190606 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190506 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190507 |
|
RAP2 | Party data changed (patent owner data changed or rights of a patent transferred) |
Owner name: NOKIA TECHNOLOGIES OY |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602005055361 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 |
|
26N | No opposition filed |
Effective date: 20191107 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190531 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190531 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20190531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190503 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190503 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190206 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20050503 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602005055361 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: H04L0029060000 Ipc: H04L0065000000 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230527 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL Payment date: 20240415 Year of fee payment: 20 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240404 Year of fee payment: 20 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240403 Year of fee payment: 20 |