WO2005069635A1 - Method and apparatus for performing synchronised audio and video presentation - Google Patents
- Publication number
- WO2005069635A1 (PCT/EP2004/013243)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- audio
- processing means
- data packets
- transmission time
- audio data
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 20
- 230000001360 synchronised effect Effects 0.000 title abstract description 14
- 238000012545 processing Methods 0.000 claims abstract description 63
- 230000005540 biological transmission Effects 0.000 claims abstract description 39
- 238000001914 filtration Methods 0.000 claims description 8
- 230000033458 reproduction Effects 0.000 claims 7
- 238000009877 rendering Methods 0.000 description 5
- 238000010586 diagram Methods 0.000 description 2
- 230000002730 additional effect Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 230000035484 reaction time Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/24—Systems for the transmission of television signals using pulse code modulation
- H04N7/52—Systems for transmission of a pulse code modulated video signal with one or more other pulse code modulated signals, e.g. an audio signal or a synchronizing signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video stream to a specific local network, e.g. a Bluetooth® network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/242—Synchronization processes, e.g. processing of PCR [Program Clock References]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43072—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4341—Demultiplexing of audio and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4344—Remultiplexing of multiplex streams, e.g. by modifying time stamps or remapping the packet identifiers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/439—Processing of audio elementary streams
Definitions
- the invention relates to the synchronised presentation or reproduction of video and audio streams using non-synchronised processing means.
- MPEG-4 is an international standard developed by the Motion Picture Experts Group (MPEG) which also developed a number of other MPEG-type standards for compressing audio and video data, for example MPEG-1 and MPEG-2.
- the encoded/compressed data is treated as object data and both video and audio data are combined into a single bitstream. Since an MPEG-4 system is configured to treat data as object data, it is easy to re-organise a received bitstream by separating it into multiple single packets of data. An MPEG-4 player then allows the audio and video data to be reproduced on a computer or another device.
- a problem to be solved by the invention is to provide synchronised presentation or reproduction of video and audio using separate devices whose operation is basically not synchronised with each other. This problem is solved by the method disclosed in claim 1. An apparatus that utilises this method is disclosed in claim 8.
- a data stream comprising video and audio streams is received by first processing means, the received data stream is separated into video and audio streams, and the audio data packets of the audio stream are timestamped by the first processing means. Then the audio data packets are forwarded to second processing means, a local system time of the second processing means is determined, and the transmission time periods of the audio data packets from the first processing means to the second processing means are calculated from the local system time and the timestamp of each audio data packet. Subsequently, synchronised audio and video rendering/presentation based on the transmission time periods is performed.
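The timestamping and transmission-time measurement described above can be illustrated with a minimal Python sketch (function and field names are illustrative assumptions, not taken from the patent):

```python
import time

def timestamp_packet(payload: bytes, clock=time.time) -> dict:
    # Sender side (PC_A): attach the local send time to the audio packet.
    return {"timestamp": clock(), "payload": payload}

def transmission_time(packet: dict, clock=time.time) -> float:
    # Receiver side (PC_B): the difference between the local arrival time
    # and the sender's timestamp is taken as the transmission time period.
    return clock() - packet["timestamp"]
```

Because the two clocks are not synchronised, each measured value mixes the true network delay with the unknown clock offset; the method therefore filters many such measurements rather than trusting any single one.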
- the process of rendering is accompanied by lowpass filtering the transmission time periods whereby a mean transmission time is obtained and used for synchronisation of video and audio presentation or reproduction.
- a median filter can be used for lowpass filtering the measured transmission time periods in order to improve the measurement result.
- the present invention solves the above-mentioned problems of the prior art, and provides a method capable of fast response at start-up, as well as high stability during processing.
- the median filter is also very robust against large measurement errors.
- An MPEG-type stream is separated into video data and audio data, wherein the video data is processed on the first device PC_A and the audio data is timestamped and forwarded to the second device PC_B which compares the received timestamp to the local time. The difference is considered to be the required transmission time.
- the internal time clocks of the first processing device and the second processing device are not synchronised.
- the time reference for synchronisation of the video and audio streams is obtained by subtracting the mean transmission time period from the local time of the second processing device PC_B. Subsequently, an additional lowpass filtering can be performed by a digital filter such as a Butterworth filter having a cut-off frequency below that of the high-frequency motion (jitter) which needs to be eliminated.
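A sketch of this correction step (illustrative only): the synchronisation reference is the receiver's local time minus the filtered mean transmission time. The one-pole IIR lowpass below is a self-contained stand-in for the Butterworth filter named in the text; a real implementation would design a proper Butterworth filter with a cut-off below the jitter frequency.

```python
class OnePoleLowpass:
    # Stand-in for the Butterworth filter mentioned in the text: a simple
    # one-pole IIR lowpass. alpha close to 1 gives a low cut-off frequency,
    # suppressing high-frequency jitter in the filtered value.
    def __init__(self, alpha: float = 0.95):
        self.alpha = alpha
        self.state = None

    def __call__(self, x: float) -> float:
        if self.state is None:
            self.state = x  # initialise with the first sample
        else:
            self.state = self.alpha * self.state + (1.0 - self.alpha) * x
        return self.state

def time_reference(local_time_b: float, mean_transmission_time: float) -> float:
    # The synchronisation reference is the receiver's local time corrected
    # by the (filtered) mean transmission time of the audio packets.
    return local_time_b - mean_transmission_time
```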
- Fig. 1 Block diagram illustrating a network of first and second processing means configured to perform audio and video presentation or reproduction
- Fig. 2 Flowchart of the inventive process.
- the reference numeral 100 denotes an MPEG-4 player which sends an MPEG-4 data stream 102 to first processing means PC_A 104 which include a video player 108.
- the received MPEG-type stream comprises system, video and audio streams, which further contain video data packets 116 and audio data packets 134.
- a stream analysing stage 110 examines the streams since the system stream also includes the structure and the configuration of the video and audio players.
- the first computer PC_A 104 processes video data obtained from the MPEG-4 video stream and displays it using e.g. an attached monitor.
- the timestamping stage 112 checks the local time clock 106 and inserts timestamps into audio data packets.
- a network 118, e.g. of type Ethernet (TCP/IP), connects the first processing means 104 with second processing means 120, e.g. a second computer PC_B, which processes the audio data packets received from the first computer PC_A, using audio player 126.
- the time base 114 of the first computer 104 and the time base 132 of the second computer 120 are not synchronised with each other and they have a tendency to drift away from each other.
- the second computer or the network or the first computer checks the local time clock 122 and compares the received timestamp 124 to the local time of time clock 122.
- the second computer or the network or the first computer calculates the corresponding transmission time periods.
- a median filter 128 can be used for lowpass filtering of the transmission time periods in order to obtain a mean transmission time, which is in turn used for synchronisation of audio and video rendering.
- a Butterworth filter 130 provides additional lowpass filtering in order to improve the final result.
- MPEG-4 player 100 sends the MPEG-4 stream of data to the first processing means PC_A which processes video data and also forwards the actualised and timestamped audio data packets to the second computer PC_B through the network.
- the second computer PC_B compares the received timestamp to the local time. The difference is considered to be the transmission time period.
- the time base of the video processing computer 104 is not synchronised with the time base of the audio processing computer 120.
- the internal time clocks of the first and the second computer are not synchronised and slowly drift from each other.
- the timestamps received by the second computer can be considered altered in value because the real transmission time cannot be specified exactly. This may have different causes, for example: traffic on the network line or lines, the configuration of TCP/IP and Ethernet, thread processing of the operating system, the varying amount of data, etc.
- the time difference between the sending of the packets and their receiving is calculated. This difference is then filtered with a median filter.
- a median filter is a time-discrete, non-linear filter which stores the acquired samples, sorts them and provides the middle sample (or the average of the two middle samples in the case of an even number of input values) as its output.
- the median filter used for the invention is very flexible with respect to the number of input samples it processes. Initially all sample values are set to zero. After having collected a pre-defined first number of samples, e.g. 19, the median filter starts outputting the mean transmission time, whereby the length of the median filter corresponds to said first number. As an option, upon receiving further input samples, the filter length used is increased by one per additional input sample received, up to a pre-defined maximum length, e.g. 499.
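A growing-window median filter of this kind might be sketched as follows (class and parameter names are illustrative; the start and maximum lengths 19 and 499 are the examples given in the text; here the filter simply withholds output until the first window is full, rather than pre-filling with zeros):

```python
from collections import deque
from statistics import median

class GrowingMedianFilter:
    """Median filter whose window grows by one sample per input, from
    `start_len` (when it begins producing output) up to `max_len`."""

    def __init__(self, start_len: int = 19, max_len: int = 499):
        self.start_len = start_len
        # A bounded deque caps the window length at max_len automatically.
        self.samples = deque(maxlen=max_len)

    def push(self, value: float):
        self.samples.append(value)
        if len(self.samples) < self.start_len:
            return None  # still collecting the initial window
        # statistics.median sorts the window and returns the middle sample
        # (or the average of the two middle samples for an even count).
        return median(self.samples)
```

Note that `statistics.median` already returns the average of the two middle samples for an even window, matching the behaviour described above.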
- an MPEG-type stream comprising video and audio streams is received by the first processing means PC_A.
- said MPEG-type data stream is separated into the video and the audio streams, wherein the first processing computer PC_A containing the video player processes the video stream and the second processing computer PC_B containing the audio player processes the audio stream.
- the audio data packets are timestamped by the video processing computer PC_A and forwarded to the audio processing computer PC_B configured to receive audio data from the video processing computer PC_A.
- the local system time of the audio processing computer is determined.
- the audio stream transmission time periods from the first processing means to the second processing means are calculated in step 208.
- in the last step 210, synchronisation of audio and video presentation or reproduction based on the calculated transmission time periods takes place.
- video data packets of the video stream are timestamped by the first processing means and forwarded to the second processing means configured to receive audio data packets.
- Time periods are calculated for the transmission of the video data packets from the first processing means to the second processing means, based on the corresponding local system time and the timestamps of the video data packets.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
Abstract
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/584,637 US8005337B2 (en) | 2004-01-06 | 2004-11-22 | Method and apparatus for performing synchronised audio and video presentation |
EP04821065A EP1702474A1 (en) | 2004-01-06 | 2004-11-22 | Method and apparatus for performing synchronised audio and video presentation |
KR1020067013484A KR101093350B1 (en) | 2004-01-06 | 2004-11-22 | Method and apparatus for performing synchronised audio and video presentation |
JP2006545947A JP2007522696A (en) | 2004-01-06 | 2004-11-22 | Method and apparatus for synchronizing audio and video presentation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04090001A EP1553784A1 (en) | 2004-01-06 | 2004-01-06 | Method and apparatus for performing synchronised audio and video presentation |
EP04090001.1 | 2004-01-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005069635A1 true WO2005069635A1 (en) | 2005-07-28 |
Family
ID=34585985
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2004/013243 WO2005069635A1 (en) | 2004-01-06 | 2004-11-22 | Method and apparatus for performing synchronised audio and video presentation |
Country Status (6)
Country | Link |
---|---|
US (1) | US8005337B2 (en) |
EP (3) | EP1553784A1 (en) |
JP (1) | JP2007522696A (en) |
KR (1) | KR101093350B1 (en) |
CN (1) | CN100499820C (en) |
WO (1) | WO2005069635A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8381086B2 (en) | 2007-09-18 | 2013-02-19 | Microsoft Corporation | Synchronizing slide show events with audio |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1553784A1 (en) * | 2004-01-06 | 2005-07-13 | Deutsche Thomson-Brandt Gmbh | Method and apparatus for performing synchronised audio and video presentation |
CN101137066B (en) * | 2007-05-11 | 2011-01-05 | 中兴通讯股份有限公司 | Multimedia data flow synchronous control method and device |
EP2043323A1 (en) * | 2007-09-28 | 2009-04-01 | THOMSON Licensing | Communication device able to synchronise the received stream with that sent to another device |
US8639830B2 (en) * | 2008-07-22 | 2014-01-28 | Control4 Corporation | System and method for streaming audio |
US9400555B2 (en) | 2008-10-10 | 2016-07-26 | Internet Services, Llc | System and method for synchronization of haptic data and media data |
GB0921668D0 (en) * | 2009-12-10 | 2010-01-27 | Vocality Internat Ltd | Media over IP perfomance enhancement |
JP5483081B2 (en) * | 2010-01-06 | 2014-05-07 | ソニー株式会社 | Receiving apparatus and method, program, and receiving system |
US9179118B2 (en) | 2011-05-12 | 2015-11-03 | Intel Corporation | Techniques for synchronization of audio and video |
CA2890399A1 (en) * | 2012-10-01 | 2014-04-10 | Internet Services, Llc | System and method for synchronization of haptic data and media data |
TWI602437B (en) * | 2015-01-12 | 2017-10-11 | 仁寶電腦工業股份有限公司 | Video and audio processing devices and video conference system |
EP3396964B1 (en) | 2017-04-25 | 2020-07-22 | Accenture Global Solutions Ltd | Dynamic content placement in a still image or a video |
EP3528196A1 (en) * | 2018-02-16 | 2019-08-21 | Accenture Global Solutions Limited | Dynamic content generation |
CN109257642A (en) * | 2018-10-12 | 2019-01-22 | Oppo广东移动通信有限公司 | Video resource playback method, device, electronic equipment and storage medium |
KR102251148B1 (en) * | 2020-05-06 | 2021-05-12 | (주)유브릿지 | Audio-Video Synchronization Processing Method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6236694B1 (en) * | 1994-08-19 | 2001-05-22 | Thomson Licensing S.A. | Bus and interface system for consumer digital equipment |
EP1213630A2 (en) * | 2000-12-06 | 2002-06-12 | Matsushita Electric Industrial Co., Ltd. | Time managing apparatus for managing time to synchronize plural apparatuses |
US20020118679A1 (en) * | 2001-02-27 | 2002-08-29 | Eyer Mark Kenneth | Efficient transport of program clock reference for audio services delivered on an MPEG-2 transport stream |
WO2002073851A1 (en) * | 2001-03-13 | 2002-09-19 | Pulse-Link, Inc. | Maintaining a global time reference among a group of networked devices |
US20020141452A1 (en) * | 1999-12-30 | 2002-10-03 | Oskar Mauritz | Synchronization in packet-switched telecommunications system |
DE10146887A1 (en) * | 2001-09-24 | 2003-04-30 | Steinberg Media Technologies A | Synchronization of digital data streams in an audio processing system, digital tracks are separately processed by different computers and the data played synchronously |
WO2003047134A2 (en) * | 2001-11-28 | 2003-06-05 | Bridgeco Ag | Method for synchronization in networks |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3236226B2 (en) * | 1996-09-11 | 2001-12-10 | 沖電気工業株式会社 | Echo canceller |
JP3173418B2 (en) * | 1997-04-18 | 2001-06-04 | 日本電気株式会社 | Stream playback control method and machine-readable recording medium recording program |
JP2003018600A (en) | 2001-07-04 | 2003-01-17 | Hitachi Ltd | Image decoding apparatus |
JP3984053B2 (en) * | 2002-01-09 | 2007-09-26 | 富士通株式会社 | Home agent |
EP1553784A1 (en) * | 2004-01-06 | 2005-07-13 | Deutsche Thomson-Brandt Gmbh | Method and apparatus for performing synchronised audio and video presentation |
TWI343713B (en) * | 2006-11-24 | 2011-06-11 | Realtek Semiconductor Corp | Signal processing circuit |
-
2004
- 2004-01-06 EP EP04090001A patent/EP1553784A1/en not_active Withdrawn
- 2004-10-22 EP EP04300714A patent/EP1553785A1/en not_active Withdrawn
- 2004-11-22 WO PCT/EP2004/013243 patent/WO2005069635A1/en not_active Application Discontinuation
- 2004-11-22 KR KR1020067013484A patent/KR101093350B1/en not_active IP Right Cessation
- 2004-11-22 CN CNB2004800398390A patent/CN100499820C/en not_active Expired - Fee Related
- 2004-11-22 JP JP2006545947A patent/JP2007522696A/en active Pending
- 2004-11-22 US US10/584,637 patent/US8005337B2/en not_active Expired - Fee Related
- 2004-11-22 EP EP04821065A patent/EP1702474A1/en not_active Withdrawn
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6236694B1 (en) * | 1994-08-19 | 2001-05-22 | Thomson Licensing S.A. | Bus and interface system for consumer digital equipment |
US20020141452A1 (en) * | 1999-12-30 | 2002-10-03 | Oskar Mauritz | Synchronization in packet-switched telecommunications system |
EP1213630A2 (en) * | 2000-12-06 | 2002-06-12 | Matsushita Electric Industrial Co., Ltd. | Time managing apparatus for managing time to synchronize plural apparatuses |
US20020118679A1 (en) * | 2001-02-27 | 2002-08-29 | Eyer Mark Kenneth | Efficient transport of program clock reference for audio services delivered on an MPEG-2 transport stream |
WO2002073851A1 (en) * | 2001-03-13 | 2002-09-19 | Pulse-Link, Inc. | Maintaining a global time reference among a group of networked devices |
DE10146887A1 (en) * | 2001-09-24 | 2003-04-30 | Steinberg Media Technologies A | Synchronization of digital data streams in an audio processing system, digital tracks are separately processed by different computers and the data played synchronously |
WO2003047134A2 (en) * | 2001-11-28 | 2003-06-05 | Bridgeco Ag | Method for synchronization in networks |
Non-Patent Citations (1)
Title |
---|
RANGAN P V ET AL: "CONTINUITY AND SYNCHRONIZATION IN MPEG", IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, IEEE INC. NEW YORK, US, vol. 14, no. 1, 1996, pages 52 - 60, XP000548810, ISSN: 0733-8716 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8381086B2 (en) | 2007-09-18 | 2013-02-19 | Microsoft Corporation | Synchronizing slide show events with audio |
Also Published As
Publication number | Publication date |
---|---|
CN1902941A (en) | 2007-01-24 |
EP1702474A1 (en) | 2006-09-20 |
EP1553785A1 (en) | 2005-07-13 |
US20070162952A1 (en) | 2007-07-12 |
US8005337B2 (en) | 2011-08-23 |
CN100499820C (en) | 2009-06-10 |
KR20060127020A (en) | 2006-12-11 |
EP1553784A1 (en) | 2005-07-13 |
JP2007522696A (en) | 2007-08-09 |
KR101093350B1 (en) | 2011-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8005337B2 (en) | Method and apparatus for performing synchronised audio and video presentation | |
US6654956B1 (en) | Method, apparatus and computer program product for synchronizing presentation of digital video data with serving of digital video data | |
CA3078998C (en) | Embedded appliance for multimedia capture | |
US5774497A (en) | Method and apparatus for PCR jitter measurement in an MPEG-2 transport stream | |
EP1639816A1 (en) | Method and apparatus for testing lip-sync of digital television receiver | |
JPH11513222A (en) | Display Time Stamping Method and Synchronization Method for Multiple Video Objects | |
JP2002509401A (en) | Method and system for measuring the quality of a digital television signal | |
CN110896503A (en) | Video and audio synchronization monitoring method and system and video and audio broadcasting system | |
CN114697712B (en) | Method, device and equipment for downloading media stream and storage medium | |
AU2019204751B2 (en) | Embedded appliance for multimedia capture | |
Azimi et al. | Implementation of MPEG system target decoder | |
AU2013254937B2 (en) | Embedded Appliance for Multimedia Capture | |
CN112911346A (en) | Video source synchronization method and device | |
KR100538001B1 (en) | Method for synchronizing data, and computer readable medium storing thereof | |
CA2914803C (en) | Embedded appliance for multimedia capture | |
CN115209198A (en) | Video data processing method and device, terminal equipment and storage medium | |
AU2012202843A1 (en) | Embedded appliance for multimedia capture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004821065 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007162952 Country of ref document: US Ref document number: 10584637 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006545947 Country of ref document: JP Ref document number: 200480039839.0 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067013484 Country of ref document: KR Ref document number: KR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2004821065 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10584637 Country of ref document: US |