WO2006035438A1 - Media player and method for operating a media player - Google Patents
Media player and method for operating a media player
- Publication number
- WO2006035438A1 (PCT/IL2005/001040)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- frame
- frames
- display
- time
- media player
- Prior art date
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/80—Responding to QoS
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/75—Media network packet handling
- H04L65/764—Media network packet handling at the destination
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/132—Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/154—Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/156—Availability of hardware or computational resources, e.g. encoding based on power-saving criteria
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/44—Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/587—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
Definitions
- This invention relates to media players.
- a key element in the modern networked environment is streaming of multi-media over LAN (Local Area Network), WAN (Wide Area Network), Internet, cellular and other networks.
- streaming quality is constrained by network resources, such as bit rate, that are inherently limited.
- video compression according to the H.261, H.263, H.264 and MPEG-4 standards is used for streaming video.
- the compression algorithm used in these cases is based on the principle that temporal and spatial redundancy in motion pictures makes up the majority of the visual information that humans perceive.
- the data indicative of a frame may be encoded in a way that substantially reduces the data storage and transfer requirements for the medium.
- Fig. 1 shows a prior art multimedia player 100 that is used to handle an incoming data input stream 1.
- the player 100 comprises three basic components.
- a network interface 2 receives data packets of the input stream 1 from a source (not shown), and stores them in an associated buffer 16 for subsequent processing.
- a decoder 4 retrieves data packets from the buffer 16 and analyzes the retrieved data for information that is needed to prepare the data for presentation. This includes, for example, decoding the data required for generation of a picture to be displayed on a computer monitor.
- a renderer 6 uses the decoded data to generate data indicative of a frame and sends the data to a display device 18.
- the decoded data may be RGB data of a video image to be displayed on a computer monitor.
- When implemented as a software module, a player uses central processing unit (CPU) resources to execute its tasks. When the central processor performs other tasks simultaneously with operating the player, CPU utilization time is shared by several tasks, including the player, under the control of the operating system.
- the frame rate of the media input stream to be displayed is determined by the encoding element of the source where the bit stream originated. Since the player has to process data that is continuously streamed over the network from an independent source, it should be allocated sufficient CPU time in order to properly process incoming data prior to the arrival of new data. If the CPU power allocated to the player at any moment is not sufficient for real-time performance of all the tasks of the player, degradation in the quality of the displayed video occurs that may be evident in delays or artifacts. Examples of such phenomena are:
- Delayed presentation of video - This occurs when the network interface 2, being a high-priority task, is capable of retrieving all data packets, but the subsequent stages (decoding, rendering) fail to process the data in a timely manner, and thus the actual display of the picture is delayed.
- the delay may vary per picture frame (dependent upon the momentary CPU load) and thus a video stream may be displayed with a distorted or non-constant timing.
- Loss of data - During periods of CPU overload, the network interface 2 might not be able to cope with the rate that the data packets are received, resulting in loss of data packets. In the case of video streaming, this results in a distorted picture in the form of artifacts, and it might take some time to regain a high quality picture when sufficient CPU resources are subsequently allocated to the player.
- the present invention provides a media player and methods and systems for managing CPU power allocated to a media player.
- the player may be, for example, an audio player, a video player, or a multimedia player.
- the player receives data packets in a data stream from a source.
- the player is used together with a device, referred to herein generally as a "display device", "output device" or "window", for displaying frames.
- the display device might be one or more monitor screens.
- the display device is one or more speakers.
- the display device might be a combination of one or more monitor screens and one or more speakers.
- the present invention provides a media player.
- the player of the invention has a controlled renderer that uses a selection criterion to select frames for display on the display device based on directives received from a renderer control.
- the selection criterion selects some of the frames, but not necessarily all of the frames, from the frames that were decoded from the input data stream for display on the display device so as to reduce the processing power required of the CPU while providing a continuous display of frames.
- the selection criterion selects frames at a predetermined spacing between consecutive frames that is larger than the spacing between frames in the initial input stream.
- a "forecasted time for display" is calculated for each input frame, and the controlled renderer selects for display the frame having a forecasted time - A -
- the display rate of the displayed frames does not exceed a maximum rate that is at or slightly above the minimum display rate that is optimal for the specific application - which in some cases may be lower than the transmitted frame rate.
- a display rate of 30 frames per second is around the minimum rate that produces an impression of smooth and continuous display of the video frames.
- data frames are discarded so as to produce a data stream for display having a frequency at or slightly above 30 frames per second.
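As an illustrative sketch (not part of the patent text), the frame-discarding idea above can be expressed as a simple decimation routine; the function name and the millisecond timestamps are assumptions for the example.

```python
def decimate(timestamps_ms, interval_ms):
    """Keep only the frames needed to sustain one displayed frame per
    interval_ms; all other frames are discarded."""
    if not timestamps_ms:
        return []
    selected = [timestamps_ms[0]]
    next_due = timestamps_ms[0] + interval_ms
    for t in timestamps_ms[1:]:
        if t >= next_due:          # this frame is the first at or past the due time
            selected.append(t)
            next_due += interval_ms
    return selected

# a 50 fps stream (20 ms spacing) reduced to a 25 fps display rate (40 ms)
frames = [i * 20 for i in range(10)]   # 0, 20, ..., 180 ms
kept = decimate(frames, 40)            # every other frame survives
```

Integer millisecond timestamps are used here only to keep the comparison arithmetic exact; any monotonic clock would serve.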
- the frame rate of the output stream to be displayed on one or more display devices or on one or more fields on a single display device may be dynamically changed based on criteria derived from prevailing conditions that may be internal or external to the application. Examples of such circumstances might include the size of a window or field on the display device, the relative importance of a window, the number of opened windows, etc.
- the frame rate of the output streams to be displayed may be changed dynamically to match the current CPU power allocated to the player.
- the displayed frame rate for one or more display devices is lowered when the CPU power allocated to the display device is low.
- graceful degradation of the quality of displayed video when the required CPU power exceeds the available power is achieved by raising the priority of the network interface and the decoding processes and lowering the priority of the rendering process. This reduces the chances of losing synchronization with the incoming stream data, and reduces the occurrence of serious artifacts.
- large volumes of data may accumulate in the pre-renderer buffer waiting for further processing.
- the buffered data are selectively processed in order to reduce or eliminate the risk of a subsequent period of CPU power allotment that is insufficient to process all of the accumulated data.
- system may be a suitably programmed computer.
- the invention contemplates a computer program being readable by a computer for executing the method of the invention.
- the invention further contemplates a machine-readable memory tangibly embodying a program of instructions executable by the machine for executing the method of the invention.
- the invention provides a media player comprising a controlled renderer rendering media frames for outputting to one or more display devices, the renderer being configured to select frames from an input data stream according to one or more predetermined criteria for output to a display device.
- the invention provides a method for operating a media player comprising rendering media frames for outputting to one or more display devices, the rendering comprising selecting frames decoded from an input data stream according to one or more predetermined criteria for output to a display device.
- the invention provides a program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform method steps for operating a media player comprising rendering media frames for outputting to one or more display devices, the rendering comprising selecting frames from a decoded input data stream according to a predetermined criterion wherein at least one frame in the input stream may not be output to a display device.
- the invention provides a computer program product comprising a computer useable medium having computer readable program code embodied therein for operating a media player, the computer program product comprising rendering media frames for outputting to one or more display devices, the rendering comprising selecting decoded frames from an input data stream according to a predetermined criterion wherein at least one frame in the input stream may not be output to a display device.
- the invention provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
- the invention provides a jitter filter for use in the media player of the invention, the jitter filter being configured to calculate, for at least one frame in an input stream, a forecasted time to display.
- Fig. 1 shows a simplified generalized block diagram of a prior art multi-media player implemented as a software module.
- Fig. 2 shows a block diagram of a multi-media player in accordance with one embodiment of the invention.
- Fig. 2 shows a media player 102 in accordance with one embodiment of the invention.
- the player may be, for example, an audio player, a video player, or a multimedia player.
- the player 102 has a network interface 2 that receives the input stream 1 and stores the data packets of the input stream 1 in the associated buffer 16. From the buffer 16 the undecoded frames 3 are routed to the decoder 4.
- the player 102 includes a jitter filter 8.
- the jitter filter 8 receives the undecoded frames from the buffer 16 and calculates a "forecasted time to display" 9 of each received frame.
- the forecasted time to display is obtained by adding to the "time received" (the time the undecoded frame 3 is received from the network interface module) the "average time between received pictures", the latter being continuously updated by the jitter filter 8 based on a predetermined number of the most recent frames.
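A minimal sketch of such a jitter filter follows; the class and method names are ours, not the patent's, and the averaging window is parameterized on the number of recent arrival times retained.

```python
from collections import deque

class JitterFilter:
    """Forecast a frame's display time as its arrival time plus the
    average inter-frame spacing over the most recent `stages` frames."""

    def __init__(self, stages=8):
        self.recent = deque(maxlen=stages)   # recent arrival times, newest last

    def forecast(self, time_received):
        self.recent.append(time_received)
        if len(self.recent) < 2:
            # no spacing history yet: forecast defaults to the arrival time
            return time_received
        avg_gap = (self.recent[-1] - self.recent[0]) / (len(self.recent) - 1)
        return time_received + avg_gap
```

With the Table 1 figures, the average gap over Frames 21 to 28 comes out near 0.0396 sec, so the forecast for Frame 28 (received at 0.278 sec) lands near the 0.3176 sec shown in the table.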
- the forecasted time to display 9 generated by the jitter filter 8 is attached to the decoded picture 5 generated by the decoder 4 for the same frame.
- the forecasted time to display 9 and the decoded picture 5 are placed in a pre-renderer buffer 10 which is a cyclic buffer.
- the controlled renderer 14 selectively retrieves a selected frame 12 from the pre-renderer buffer 10, based on one or more selection criteria received through the renderer control 13, and sends each selected frame 12 to the display device 18 for display.
- the pre-renderer buffer 10 acts as a "flexible joint" that is written to by the decoder 4 and read from by the renderer 14, each of these being a separate task that is executed independently of the other.
- the selection criteria select some of the frames for display, but not necessarily all of the frames, so as to reduce the processing power required of the CPU while providing a continuous display of frames. Examples
- This example considers video data that are generated and input to the player 102 at an input rate of 25 frames per second, and are to be rendered at a frame rate of 12.5 frames per second.
- the process which takes place in order to achieve this is described in Table 1.
- the "Time Received” column 15 lists the time attached to each frame in the "undecoded Frames " 3 stream. Note that, even though, originally, consecutive frames were spaced by exactly 0.04 sec, at the receiving end there is a "jitter " in the time difference, due to the properties of the network transmission.
- the "Time Between Frames " column 16 presents the time difference between a frame and the previous frame. Thus, as shown in Column 15, Frame 21 was received at time 0.001 sec, and Frame 22 was received at time 0.0039 sec.
- the "Average Time Between Frames” column 17 presents the average time between frames for a predetermined number of the most recent frames.
- the predetermined number is the 8 most recent frames (this is referred to herein as an "8-stage jitter filter").
- the average time between frames for Frames 21 to 28 is 0.0396 sec.
- the "Forecasted Time to Display” column 18 presents, for each input frame the time at which it would be displayed if no special treatment were to be carried out by the controlled renderer 14, and it is the sum of the "Time received" for the frame plus the "Average Time Between Frames " of the previous 8 frames.
- the forecasted time to display of Frame 28 (0.3176 sec) is the sum of the time Frame 28 was received (0.278 sec) and the average time between frames prior to Frame 28 (0.0396 sec).
- the resulting "forecasted time to display" shown in Column 18 is attached to the decoded picture 5 and stored in the pre-renderer buffer 10.
- Since, in this example, the renderer 14 is required to render frames at 12.5 frames per second, it will request frames from the pre-renderer buffer 10 using a selection criterion that selects frames 0.08 sec apart. In response to each request, the controlled renderer 14 will receive the frame best fitting the request, which is the frame having a "Forecasted Time to Display" at or immediately prior to the time defined by the selection criteria 11. In the example of Table 1, this means that Frame 28, which is associated in the pre-renderer buffer 10 with the time 0.3176 sec, will be displayed at time 0.32 sec. The selected frames will be displayed at the times shown in the "Actual Time Displayed" column 19. In this example approximately 50% of the frames are discarded (as indicated in column 20), as would be expected when rendering a stream at half its input frame rate. The displayed frames are thus refreshed at a constant rate, unlike the jittery input stream.
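The selection step just described can be sketched as follows; the buffer representation and names are illustrative assumptions, not the patent's implementation.

```python
def select_frame(buffer, display_time):
    """Return the frame whose forecasted time to display is the latest
    one at or immediately prior to display_time. Earlier eligible frames
    are superseded and dropped from the buffer (mutated in place).

    buffer: list of (forecasted_time, frame) pairs, oldest first.
    """
    chosen = None
    while buffer and buffer[0][0] <= display_time:
        chosen = buffer.pop(0)[1]   # a later eligible frame replaces this one
    return chosen

# request at 0.32 sec against Table-1-like forecasts: Frame 28 is chosen
buf = [(0.2784, 27), (0.3176, 28), (0.3572, 29)]
picked = select_frame(buf, 0.32)
```

For a request at 0.32 sec against forecasts of 0.2784, 0.3176 and 0.3572 sec, the frame forecast at 0.3176 sec is returned, matching the Table 1 behaviour.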
- Table 2 presents another example in which an input stream of 25 frames per second is displayed at 10 frames per second. (For simplicity, in Table 2 and all subsequent examples, no jitter is shown in the "Time Received" column).
- a relevant scenario might be application directed - the application can specifically direct the renderer to generate the video output at a reduced rate relative to the source input. Examples of such cases are given below.
- Another possible scenario is presented by cases in which the input frame rate from the source is higher than required for a human observer, so that the data stream contains redundant or excessive information.
- An example might be a situation in which video originally recorded at 25 frames per second is streamed for playback at double speed - namely a theoretic rate of 50 frames per second. As there is no added value in displaying video at rates higher than 30 frames per second, discarding part of the frames will not cause any operational degradation to the viewing experience. Table 3 presents this example.
- the frame rate for selected video windows may be determined to match the CPU power currently allocated to the player.
- a renderer module is separately implemented for each window. In addition to the elimination of redundant network interface and decoder modules, this also enables flexibility in setting a different frame rate for each copy of the displayed data.
- Table 4 presents a situation in which an input stream at 25 frames per second is displayed on two windows - one at 25 frames per sec. and the other at 12.5 frames per sec. TABLE 4: Handling of data frames to be displayed on two windows at different frame rates.
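The two-window arrangement of Table 4 can be sketched as one decoded stream feeding several per-window renderers, each with its own target interval; the function and window names are illustrative assumptions.

```python
def schedule_windows(frame_times_ms, intervals_ms):
    """For each window, decide which frames of a shared decoded stream
    it would display, given its own target frame interval.

    frame_times_ms: arrival times of decoded frames, in milliseconds.
    intervals_ms: mapping of window name -> frame interval in ms.
    Returns a mapping of window name -> list of displayed frame indices."""
    shown_per_window = {}
    for window, interval in intervals_ms.items():
        shown, next_due = [], frame_times_ms[0]
        for i, t in enumerate(frame_times_ms):
            if t >= next_due:          # frame is due for this window
                shown.append(i)
                next_due = t + interval
        shown_per_window[window] = shown
    return shown_per_window

# 25 fps input (40 ms spacing): a large window at 25 fps, a small one at 12.5 fps
times = [i * 40 for i in range(10)]
plan = schedule_windows(times, {"large": 40, "small": 80})
```

The large window consumes every frame while the small one takes every other frame, mirroring the 25 / 12.5 frames-per-second split of Table 4.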
- An example of a scenario in which this feature is useful is a security system in which video streams from many cameras are displayed in small windows, and one stream is also analyzed in more detail - displayed in a large window.
- When the CPU is overloaded, the network interface module is assigned higher priority, resulting in input data packets being buffered for later processing by the decoder and the renderer.
- When CPU power becomes available again, a typical scenario would result in the decoder and renderer having to cope with large quantities of buffered data, leading once again to overloading of the CPU.
- the player 102 deals with this problem by discarding data from the buffer 16 while maintaining the time smoothness of the video stream.
- the "Desired Time to Display” of Frame 144, 0.58 sec is delayed until an "Actual Time Displayed” of 0.61 sec, which sets the “Desired Time to Display” of Frame 145 to 0.65 sec, and so on.
- the renderer may find itself with "too many" new (not yet displayed) frames in the buffer. This may be seen at Frame 150, which has a "Desired Time to Display" of 0.96 sec, at which point there are three new frames with a "Forecasted Time to Display" of 0.96 sec or less. Only one of them is used, and the other two are discarded.
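The catch-up behaviour above can be sketched as a drain step that shows only the newest eligible frame and drops the rest; the representation and names are illustrative assumptions.

```python
def drain_stale(buffer, desired_time):
    """When several buffered frames are already due after a CPU stall,
    show only the newest frame with forecast <= desired_time and discard
    the others, keeping the display smooth.

    buffer: list of (forecasted_time, frame) pairs, oldest first.
    Returns (frame_to_show, number_dropped, remaining_buffer)."""
    due = [entry for entry in buffer if entry[0] <= desired_time]
    remaining = [entry for entry in buffer if entry[0] > desired_time]
    if not due:
        return None, 0, remaining
    return due[-1][1], len(due) - 1, remaining

# three frames due at a desired time of 0.96 sec: one shown, two dropped
buf = [(0.88, 148), (0.92, 149), (0.96, 150), (1.00, 151)]
shown, dropped, rest = drain_stale(buf, 0.96)
```

This reproduces the Frame 150 situation described above: of the three frames forecast at or before 0.96 sec, one is displayed and the other two are discarded.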
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US61322404P | 2004-09-28 | 2004-09-28 | |
US60/613,224 | 2004-09-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006035438A1 true WO2006035438A1 (en) | 2006-04-06 |
Family
ID=35482192
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2005/001040 WO2006035438A1 (en) | 2004-09-28 | 2005-09-28 | Media player and method for operating a media player |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2006035438A1 (en) |
- 2005-09-28: WO PCT/IL2005/001040 patent/WO2006035438A1/en not_active Application Discontinuation
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1317109A1 (en) * | 2001-11-29 | 2003-06-04 | Sony International (Europe) GmbH | System and method for controlling the adaptation of adaptive distributed multimedia applications |
US20040125816A1 (en) * | 2002-12-13 | 2004-07-01 | Haifeng Xu | Method and apparatus for providing a buffer architecture to improve presentation quality of images |
Non-Patent Citations (1)
Title |
---|
TEZUKA H ET AL: "Experiences with building a continuous media application on Real-Time Mach", REAL-TIME COMPUTING SYSTEMS AND APPLICATIONS, 1995. PROCEEDINGS., SECOND INTERNATIONAL WORKSHOP ON TOKYO, JAPAN 25-27 OCT. 1995, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 25 October 1995 (1995-10-25), pages 88 - 95, XP010196662, ISBN: 0-8186-7106-8 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012013858A1 (en) * | 2010-07-30 | 2012-02-02 | Nokia Corporation | Method and apparatus for determining and equalizing one or more segments of a media track |
CN103053157A (en) * | 2010-07-30 | 2013-04-17 | 诺基亚公司 | Method and apparatus for determining and equalizing one or more segments of a media track |
CN113965786A (en) * | 2021-09-29 | 2022-01-21 | 杭州当虹科技股份有限公司 | Method for accurately controlling video output and playing |
CN113965786B (en) * | 2021-09-29 | 2024-03-26 | 杭州当虹科技股份有限公司 | Method for precisely controlling video output playing |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2095205B1 (en) | Hybrid buffer management | |
KR102324326B1 (en) | Streaming multiple encodings encoded using different encoding parameters | |
EP1374593B1 (en) | Adaptive bandwidth system and method for video transmission | |
US6275536B1 (en) | Implementation architectures of a multi-channel MPEG video transcoder using multiple programmable processors | |
RU2368940C2 (en) | Synchronised graphic data and region data for systems of remote handling graphic data | |
US8005149B2 (en) | Transmission of stream video in low latency | |
US8675728B2 (en) | Transmitting apparatus and method, and receiving apparatus and method | |
US8290036B2 (en) | Method, apparatus and system for concurrent processing of multiple video streams | |
US20140233637A1 (en) | Managed degradation of a video stream | |
US9612965B2 (en) | Method and system for servicing streaming media | |
KR20180031547A (en) | Method and apparatus for adaptively providing multiple bit rate stream media in server | |
JP2006050604A (en) | Method and apparatus for flexibly adjusting buffer amount when receiving av data depending on content attribute | |
KR100643270B1 (en) | Client and method for playing video stream | |
MXPA05006315A (en) | Method and apparatus for providing a buffer architecture to improve presentation quality of images. | |
TW201933867A (en) | Sending device, sending method, and program | |
US9215396B2 (en) | Faster access to television channels | |
WO2006035438A1 (en) | Media player and method for operating a media player | |
CN210670365U (en) | Video pre-monitoring system | |
JP5264146B2 (en) | Synchronous distribution system, synchronous reproduction system, and synchronous distribution reproduction system | |
WO2023144964A1 (en) | Video processing system, compression device, video processing method and program | |
Schwenke et al. | Dynamic Rate Control for JPEG 2000 Transcoding | |
KR20060030879A (en) | The multi-media streaming method and system of a network adaptation live broadcasting for packet filtering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 69(1) EPC OF 120707 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 05788507 Country of ref document: EP Kind code of ref document: A1 |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 5788507 Country of ref document: EP |