WO1999018728A1 - Interconnection of multimedia data streams having different compression formats - Google Patents

Interconnection of multimedia data streams having different compression formats Download PDF

Info

Publication number
WO1999018728A1
WO1999018728A1 PCT/US1998/020706
Authority
WO
WIPO (PCT)
Prior art keywords
server
codec means
audio
codec
video
Prior art date
Application number
PCT/US1998/020706
Other languages
English (en)
Inventor
Yonik Breton
Charles Nahas
Gordon Neil Wilson Kerr
Original Assignee
General Datacomm, Inc.
Priority date
Filing date
Publication date
Application filed by General Datacomm, Inc. filed Critical General Datacomm, Inc.
Publication of WO1999018728A1 publication Critical patent/WO1999018728A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • H04N7/152 Multipoint control units therefor

Definitions

  • the present invention relates broadly to multimedia telecommunications. More particularly, the present invention relates to methods, apparatus, and systems for the handling of compressed multimedia communication data so that multimedia equipment utilizing different data compression formats can be interconnected with each other.
  • Multimedia communications can be utilized for a number of applications, and in different configurations.
  • One configuration of recent interest has been multimedia conferencing, where several parties can communicate in a conference style.
  • in multimedia conferencing, the video data is handled such that each party can see at least one of the other parties, while the audio data is handled such that each party can hear one, several, or all of the other parties.
  • various telecommunications standards are presently being adopted by the ITU-T and ISO which govern the protocols of multimedia conferencing (see, e.g., H.320, ITU-T.120).
  • various standards have been adopted for the compression of the video and audio data.
  • compression standards applied to video compression are the JPEG (Joint Photographic Experts Group) standards promulgated by the joint ISO/CCITT technical committee ISO/IEC JTC1/SC2/WG10, and the MPEG (Motion Picture Experts Group) standards promulgated by ISO under ISO/IEC 11172 (MPEG-1) and ISO/IEC 13818 (MPEG-2).
  • H.261 provides a very high compression ratio.
  • audio compression standards are the G.722, G.728 and MPEG audio compression standards.
  • ADPCM adaptive differential pulse code modulation
  • the audio, video, and other data streams generated by a user's system 12a are multiplexed together directly in the encoder section of a multimedia encoder/decoder (codec) 14 located at the source/terminal 16, and transported together as an indivisible stream through the transport network 20 (now proposed in ATM format) to a similar "peer" codec 24 at a remote location.
  • the peer codec is either at the remote user site for a point-to-point conference, and/or at a multimedia bridge 26 for a multipoint conference.
  • the multimedia bridge 26 which typically includes a codec/switch 24 and a controller 28, provides conference control (e.g., it determines the signal to be sent to each participant), audio mixing (bridging) and multicasting, audio level detection for conference control, video switching, video mixing (e.g., a quad split, or "continuous presence device” which combines multiple images for display together) when available and/or desirable, and video multicasting.
  • each terminal 12 connected to the conference must use a compatible codec 14.
  • a terminal using a Motion JPEG codec cannot communicate with a terminal using an MPEG or H.320 codec. Similarly, a terminal using an MPEG codec cannot communicate with a terminal using a Motion JPEG or H.320 codec, etc.
  • it has been proposed to provide transcoders within the fabric of an ATM network or at a user site.
  • the proposed transcoders would have the ability to input a data stream, determine the compression standard being used, and, with the use of a transcoding algorithm, output a data stream which uses a different compression standard.
  • the algorithms necessary to convert one compression standard to another are difficult to design.
  • the algorithms provided thus far have exhibited severe degradation of the output data stream.
  • switching and mixing of multimedia data is rendered more difficult when using a transcoder.
  • the methods, apparatus, and systems according to the invention include providing a multipoint multimedia server which includes a plurality of different compression-scheme related codecs, a multipoint switch, separate audio and video processors, and at least one controller.
  • Data streams of different compression standards from different users enter the server wherein the data streams are directed to the appropriate codec and are decoded to either baseband analog form, or more preferably to decompressed digital form.
  • the decompressed digital or baseband analog signals are mixed and switched by the controller(s) and the multipoint switch according to user commands and are then routed back to the appropriate codecs, where the mixed and switched signals are recompressed to the appropriate standard for each user before exiting the server (a schematic sketch of this decode, mix/switch, and re-encode cycle is given after this list).
  • the server is typically located at a remote node which is part of the communications network.
  • the server preferably includes codecs for MPEG, Motion JPEG, and H.320 compression standards and may include other codecs based on proprietary schemes, as desired. While the presently preferred embodiment of the invention converts all of the data streams to analog audio and video prior to mixing and switching, in accord with an alternate embodiment, an appropriate digital multipoint switch can enable the data streams to be mixed and switched in decompressed (baseband) digital form.
  • the multimedia terminals served by the invention may utilize NTSC, PAL, or SECAM video standards and the data streams may include other data as well as audio and video.
  • the server is coupled to an ATM network. However, the server according to the invention may be used with other networks, either LAN or WAN.
  • Figure 1 is a high level diagram of a prior art multimedia conferencing system having a plurality of same-compression-type multimedia conferencing sites coupled by an ATM telecommunications network;
  • Figure 2 is a high level diagram of a multipoint multimedia server according to the invention coupled to a node of an ATM network serving a plurality of multimedia conferencing sites each utilizing a different data compression standard;
  • Figure 2a is a block diagram of the functionality of a portion of the server of Figure 2;
  • Figures 2b and 2c are flow diagrams of, respectively, the audio processor and video processor of the server of Figure 2a;
  • Figure 3a is a block diagram of a Motion JPEG codec used in the server according to the invention;
  • Figure 3b is a block diagram of an MPEG codec used in the server according to the invention; and
  • Figure 3c is a block diagram of an H.320 codec used in the server according to the invention.
  • a first system 100 services a plurality of users 112a, 112b, 112c, each of which is provided with a codec 114a, 114b, 114c and a multimedia source/terminal 116a, 116b, 116c, respectively.
  • the codecs 114a-114c, which are described in greater detail with reference to Figures 3a-3c, act as an interface between the network 120 and the source/terminals 116.
  • the source/terminals typically include cameras 117, microphones 118, and computer terminals 119.
  • the computer terminals 119 typically include a video monitor 119a, speakers 119b, keyboards 119c, etc. as is known in the art.
  • user 112a is provided with a Motion JPEG codec 114a
  • user 112b is provided with an MPEG codec 114b
  • user 112c is provided with an H.320 codec 114c.
  • the network 120 is preferably an ATM network and, as such, the H.320 codec 114c is coupled to the network 120 via an ISDN/ATM gateway 115.
  • each of the users 112a-112c is coupled to the network 120 at a node 122a, 122b, 122c.
  • the system 100 also includes a multipoint multimedia server 130 which may be located anywhere in the ATM network 120.
  • the server 130 includes an ATM switch 132, a plurality of codecs 134a, 134b, 134c...134n, an audio processor 136, a video processor 138, and a controller 140.
  • the codec 134a is a Motion JPEG codec
  • the codec 134b is an MPEG-2 codec
  • the codec 134c is an H.320 codec.
  • the codecs are described in detail below with reference to Figures 3a-3c.
  • each codec receives an appropriate data stream via the switch 132 and passes a decoded signal (either baseband analog or decompressed digital) to the audio processor 136 and the video processor 138.
  • the audio analog signals are converted by an A/D converter 136a into digital samples which are fed via a dedicated bus 136b to a DSP 136c.
  • the digital samples are then processed by the DSP 136c (as discussed in more detail below with reference to Fig. 2b).
  • the audio information may be mixed under control of the controller 140 with the audio information of other users of the multimedia conference.
  • the audio mixing involves summing, or weighting and summing, the audio signals of all of the users in the conference and then subtracting the destination's own source audio signal, i.e., an "N-1" mix (see the mixing sketch after this list).
  • the mixed audio samples are sent via the bus 136b for conversion to a baseband analog signal by a D/A converter 136d.
  • the so-obtained mixed audio information may then be coded by the appropriate codec and forwarded to its destination via the ATM switch 132.
  • the video processor 138 may, under control of system controller 140, simply route (via switch 138a) the video information to the codecs, or may multiplex video information using the split processor 138b for split or multi screen transmission to its destination(s). Details of the video processing are discussed in more detail below with reference to Fig. 2c.
  • the audio and video information being received is provided to the user's codec 114 in the proper compression format for processing, and then provided to the video monitor 119a and to the speaker 119b.
  • the server 130 also includes a data processing subsystem 137 which preferably utilizes an Ethernet interface between the codecs 134 and its T.12x stack 137a-137e.
  • the data processing subsystem 137 can receive conference creation, scheduling, and management information which is passed to the system controller 140.
  • the system controller 140 provides various functions including node control and security 140a, operations, administration and management functions 140b, audio and video control 140c, and conference scheduling 140d.
  • the audio and video control function 140c interfaces with the audio and video processors 136 and 138 as previously described.
  • the audio processor initializes itself by setting up buffers, pointers, and assigning default register values. Then, at 152, the audio processor awaits commands from the system controller 140. Upon receiving a command at 154, the audio processor DSP validates the command at 156, processes it at 158, and sets the required flags for the interrupt service routine.
  • the interrupt service routine of the audio processor DSP 136c starts at 160 with the DSP obtaining from the A/D converters 136a the audio samples available at all ports and storing them into a matrix S.
  • if no sample is available at a port P, S[P] is set to zero.
  • the power P[p] of each port is computed.
  • the loudest port L[p] is found by comparing the powers P[p] for that group.
  • the gain is adjusted according to the effective and targeted power values as described in co-owned, copending U.S. Patent Application SN 08/888,571 to Tabet et al., filed July 7, 1997, which is hereby incorporated by reference herein in its entirety.
  • the mixed audio samples are then sent out at 174 for digital to analog conversion.
  • a flow chart of the video processing function is provided.
  • the video system initializes itself.
  • the video system awaits a command from the controller 140.
  • the video switch is formatted at 186 such that any input (whether from the codecs or the split processor 138b) can be switched to any output (including the codecs and the split processor).
  • the Motion JPEG codec generally includes audio circuitry 200, video circuitry 220, modem type data circuitry 240, and a main processor or host 250 with an associated bus 252 and associated circuitry such as a boot PROM 254, flash memory 256, SRAM 258, a local manager interface 260, a timer 262, a watchdog circuit 264, and a management port UART 266.
  • the boot PROM 254 stores the code which boots the main processor 250.
  • the flash memory 256 is typically utilized to hold the main code and static configuration information.
  • the RAM 258 is typically a dynamic RAM for running the code and temporarily storing data.
  • the timer 262 provides clock signals for the operating system, while the watchdog 264 performs reset functions.
  • the management port UART 266 is provided for access by a system manager (not shown) to the codec, while the local management interface 260 provides the codec with the capability of interfacing with a local manager such as the controller 140 of Figure 2.
  • the audio circuitry 200 includes an analog to digital converter 202, an audio codec interface 204, an audio source time manager 206, an audio packet buffer SRAM 210, a packet buffer and DMA controller 212, an audio channel output interface 214, an audio channel input interface 216, an audio presentation time manager 217, and a digital to analog converter 218.
  • the video circuitry 220 includes a composite NTSC/PAL video decoder 222, a JPEG compressor 224, a video source time manager 226, an outgoing packet buffer SRAM 227, an outgoing video packet buffer and DMA controller 228, a video channel output interface 229, and a video channel input interface 231, an incoming packet buffer and DMA controller 232, an incoming packet buffer SRAM 233, a JPEG decompressor 234, a composite NTSC/PAL video encoder 236, a video presentation time manager 238, and a video sync generator 239.
  • the data circuitry 240 includes a data channel port UART 242, a data packet buffer and DMA controller 244 with an associated data packet buffer RAM 245, a data channel output interface 246, and a data channel input interface 248.
  • outgoing audio information received from a microphone(s) or other audio source is applied to the analog to digital converter 202 which simultaneously provides the digital audio data to the audio codec interface 204 and, in accord with a preferred aspect of the invention, provides a reference clock to the audio source time manager 206.
  • the audio codec interface 204 converts the format of the data received from the A/D converter so that the data may be properly provided to the packet buffer SRAM 210 under control of the packet buffer and DMA (direct memory access) controller.
  • the main processor 250 provides a PES (Program Elementary Stream) header to the SRAM 210 to effectively generate a PES-formatted packet (a sketch of this PES framing appears after this list).
  • the packet buffer and DMA controller 212 controls the movement of the packetized audio data from the SRAM 210 to the channel output interface 214 as required.
  • the channel output interface 214 places the data in a desired format (e.g., a Transport Stream (TS) format, or an ATM format) by inserting a system time indicator (provided by the audio source time manager) into the signal, and provides the desired overhead bits or bytes (including OAM where appropriate).
  • the channel output interface 214 implements a serial channel physical interface by receiving the parallel stream of data from the buffer controller, and converting the parallel stream into a serial stream with an accompanying clock, etc., which is multiplexed with video data by the multiplexer/demultiplexer 270.
  • Incoming audio information is received by the audio channel input interface 216 from the multiplexer/demultiplexer 270.
  • the audio channel input interface 216 frames on the incoming (TS) cell, checks the headers for errors, passes the payload in byte wide format (parallel format) to the packet buffer and DMA controller 212, and passes a time reference marker (Program Clock Reference value, or PCR) to the audio presentation time manager 217.
  • the DMA controller 212 places the payload in desired locations in the SRAM 210.
  • the DMA controller takes the data out of the SRAM 210 and provides it to the audio codec interface 204, which reformats the data into a serial stream for digital to analog conversion by the D/A converter 218.
  • the presentation time manager 217 is provided to recover a local clock from the received PCR values (a toy clock-recovery sketch appears after this list).
  • the video circuitry 220 processes and outputs the video signals. Outgoing video information is received by the video circuitry 220 as a composite analog input.
  • the composite input is decoded by the composite decoder 222 which provides digital luminance and color difference signals to the video compressor 224, and horizontal and vertical synch and a field indicator to the video source time manager 226.
  • the Motion JPEG video compressor 224 compresses the data, and generates a JPEG frame with start of image and end of image markers.
  • the video compressor 224 puts the framed compressed data in parallel format, so that the buffer controller 228 can place the compressed data into the packet buffer SRAM 227.
  • the host (main processor 250) provides PES headers via the buffer controller 228 to desired locations in the SRAM 227 to effectively convert the JPEG frame into a PES packet.
  • the packet buffer and DMA controller 228 provides the channel output interface 229 with the PES packet data at a constant rate. If sufficient data is not available in the packet buffer SRAM 227, the channel output interface 229 generates an "idle cell". Regardless, the channel output interface 229 places the data in a desired format (e.g., TS or ATM format) by inserting a system time indicator (provided by the video source time manager 226) into the signal, and provides the desired overhead bits or bytes (including OAM where appropriate).
  • the channel output interface 229 implements a serial channel physical interface by receiving the parallel stream of data from the buffer controller 228, and converting the parallel stream into a serial stream with an accompanying clock, etc.
  • the outgoing stream of video data is multiplexed with time-related audio data at the multiplexer/demultiplexer 270.
  • video data which is demultiplexed by the multiplexer/demultiplexer 270 is obtained at the video channel input interface 231 which frames on the incoming (TS) cell, checks the headers for errors, passes the payload in byte wide format (parallel format) to the packet buffer and DMA controller 232, and passes a time reference marker to the video presentation time manager 238.
  • the DMA controller 232 places the payload in desired locations in the SRAM 233.
  • when the JPEG decompressor 234 indicates that the next video field is required for display, the DMA controller 232 provides the buffered compressed video from the head of the buffer to the JPEG decompressor 234.
  • the JPEG decompressor 234 decompresses the data, and provides digital luminance and color difference signals to the composite video encoder 236.
  • the composite video encoder 236 operates based on the video timing signals (horizontal line timing or H, vertical field timing or V, and the field indicator) generated by the sync generator, based on the sample clock recovered by the presentation time manager from the channel PCR.
  • the composite video encoder 236 in turn indicates to the JPEG decompressor when it requires video data for the next video field, and which field is required. Based on these timing signals and the decompressed video data, the composite video encoder generates the analog video output for the video monitor 119a or for the video processor 138.
  • an MPEG-2 codec 114b, 134b generally includes audio circuitry 300, video circuitry 320, controller circuitry 340, management port circuitry 350, multiplexer/demultiplexer circuitry 360, and channel interface circuitry 370.
  • the audio circuitry 300 includes an analog to digital converter 302, a digital to analog converter 304, an MPEG-2 audio processor (compressor/decompressor) 306, and an audio buffer controller 308.
  • the video circuitry 320 includes a composite NTSC/PAL video decoder 322, an MPEG-2 compressor 324, an outgoing video buffer controller 326, an incoming video buffer controller 328, an MPEG decompressor 330, and a composite NTSC/PAL video encoder 332.
  • the controller circuitry 340 is coupled to the buffer controllers 308, 326, 328 as well as to the management port circuitry 350.
  • the multiplexer/demultiplexer circuitry 360 is coupled to the buffer controllers 308, 326, 328 as well as to the channel interface circuitry 370.
  • incoming MPEG-2 coded audio/video data is received via the channel interface 370 and split into separate audio and video signal streams by the multiplexer/demultiplexer circuitry 360.
  • the incoming audio data is passed to the audio buffer controller 308 which provides the audio data to the MPEG-2 audio processor 306.
  • the MPEG- 2 audio processor 306 decompresses the audio data and provides baseband (decompressed) digital data which is converted by the digital to analog converter 304 into an analog audio output.
  • the incoming video data is likewise passed to the video decoder buffer controller 328 which provides the video data to the MPEG-2 video decoder 330.
  • the MPEG-2 video decoder 330 decompresses the video data and provides decompressed digital data which is converted by the video encoder 332 into a composite video output.
  • outgoing audio information received from a microphone(s) or other audio source is applied to the analog to digital converter 302 which provides a digital output to the MPEG-2 audio processor 306.
  • the MPEG-2 audio processor 306 in turn provides compressed digital audio data to the audio buffer controller 308.
  • outgoing video data received from a camera or other video source (e.g., a video switch or mixer) is applied to the composite NTSC/PAL video decoder 322, which provides a digital signal to the MPEG-2 video encoder 324.
  • the compressed digital data output of the MPEG-2 encoder 324 is passed through the buffer controller 326 to the multiplexer/demultiplexer 360 where the compressed video data is mixed with the compressed audio data for transmission via the channel interface 370 to the ATM network.
  • an H.320 codec 114c, 134c generally includes a video interface 402, 404, an audio interface 406, 408, a data port 410, a slot controller interface 412, a video compressor 414, a video decompressor 416, an audio compressor 418, an audio decompressor 420, a multiplexer 422 / demultiplexer 424, and a host processor 426.
  • the video interface includes an NTSC or PAL video to CCIR-601 format decoder 402 and a CCIR-601 to NTSC or PAL encoder 404.
  • the audio interface includes an A/D converter 406 and a D/A converter 408.
  • the sampling rates of the converters 406, 408 are preferably programmable from 8 kHz to 48 kHz with either 8- or 16-bit samples.
  • the data port is preferably an RS-232 interface 410.
  • the slot controller interface 412 includes the same CSP as the Motion JPEG codec shown and described with reference to Figure 3a.
  • the interface 412 also includes circuitry for formatting/deformatting an H.320 data stream flowing through an ATM adaptation processor.
  • the video compressor 414 and decompressor 416 support the CCITT H.261 (full CIF and QCIF) recommendation for video codecs. This provides a video frame rate of 15 fps with an H.320 serial speed of 384 Kbps.
  • the audio compressor 418 and decompressor 420 support G.711 (µ-law, A-law, PCM), G.722 (ADPCM), and G.728 (CELP) speech codecs and are preferably implemented with a DSP.
  • the multiplexer 422 and demultiplexer 424 support the H.221 portion of the H.320 standard. Video, audio, data, and signalling are provided in a single stream of 64 Kbps with the upper bound being 384 Kbps.
  • the host processor 426 controls the H.320 codec and is preferably a Power PC.
  • each of the codecs in the server 130 can be arranged to provide either baseband analog or decompressed digital signals to the audio processor and the video processor for mixing and multiplexing according to user commands. From the foregoing description of the codecs, it will be appreciated that if the audio and video processing is to be performed on decompressed digital signals, the various A/D, D/A, and video coder/decoders described in the codecs may be omitted or bypassed.
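
The following Python sketches are illustrative only; they model the processing described in the list above under assumed, simplified interfaces and are not taken from the patent or from any product firmware. First, the overall decode, mix/switch and re-encode cycle of the multipoint multimedia server 130 (the Codec and Baseband types and the mixing/switching callbacks are assumptions of this sketch):

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Baseband:
        audio: List[float]        # decoded (baseband) audio samples
        video: object             # decoded video frames; representation left open

    @dataclass
    class Codec:
        name: str                                    # e.g. "Motion JPEG", "MPEG-2", "H.320"
        decode: Callable[[bytes], Baseband]          # compressed stream -> baseband signal
        encode: Callable[[Baseband], bytes]          # baseband signal -> compressed stream

    def serve_conference(inputs: Dict[str, bytes],
                         codecs: Dict[str, Codec],
                         mix_audio: Callable[[Dict[str, Baseband]], Dict[str, List[float]]],
                         switch_video: Callable[[Dict[str, Baseband], str], object]) -> Dict[str, bytes]:
        # 1. Decode each incoming stream with the codec matching that user's compression standard.
        baseband = {user: codecs[user].decode(data) for user, data in inputs.items()}
        # 2. Mix the audio and switch/compose the video in decompressed form, under controller policy.
        mixes = mix_audio(baseband)
        # 3. Re-encode each outgoing signal in the destination user's own compression standard.
        return {user: codecs[user].encode(Baseband(audio=mixes[user],
                                                   video=switch_video(baseband, user)))
                for user in inputs}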
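
Next, a minimal NumPy illustration of the audio processing attributed to the DSP 136c: the per-port power P[p], loudest-port detection, and the "sum all, subtract own" (N-1) mix. This shows the arithmetic only; the weighting scheme is an assumption and the actual DSP firmware is not reproduced here.

    import numpy as np

    def port_powers(S: np.ndarray) -> np.ndarray:
        """Short-term power P[p] of each port, from the (ports, samples) matrix S."""
        return (S.astype(np.float64) ** 2).mean(axis=1)

    def loudest_port(S: np.ndarray) -> int:
        """Index of the loudest port (e.g. to drive conference control or video switching)."""
        return int(port_powers(S).argmax())

    def n_minus_one_mix(S: np.ndarray, weights: np.ndarray) -> np.ndarray:
        """Weighted sum of all ports' audio, minus each destination's own contribution.
        Row p of the result is the mix sent back to port p."""
        weighted = S * weights[:, None]
        total = weighted.sum(axis=0)
        return total[None, :] - weighted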
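
Third, a simplified sketch of the PES framing performed by the host processor 250: a standard MPEG-2 PES header (start code, stream_id, length, optional PTS) is prepended to a compressed audio or video payload. The default stream_id and the PTS-only optional header are illustrative choices; a real codec would follow the full ISO/IEC 13818-1 syntax.

    from typing import Optional

    def pes_packet(payload: bytes, stream_id: int = 0xC0, pts: Optional[int] = None) -> bytes:
        """Wrap a compressed payload in a minimal MPEG-2 PES packet (0xC0 = first audio stream)."""
        if pts is not None:
            p = pts & ((1 << 33) - 1)                     # PTS is a 33-bit value
            pts_field = bytes([
                0x20 | ((p >> 29) & 0x0E) | 0x01,         # '0010' + PTS[32..30] + marker bit
                (p >> 22) & 0xFF,                         # PTS[29..22]
                ((p >> 14) & 0xFE) | 0x01,                # PTS[21..15] + marker bit
                (p >> 7) & 0xFF,                          # PTS[14..7]
                ((p << 1) & 0xFE) | 0x01,                 # PTS[6..0] + marker bit
            ])
            flags = 0x80                                  # PTS present, no DTS
        else:
            pts_field, flags = b"", 0x00
        optional = bytes([0x80, flags, len(pts_field)]) + pts_field
        body = optional + payload
        # PES_packet_length counts everything after the length field; payloads over 64 KB
        # would need the "unbounded" (zero length) form allowed for video streams.
        return b"\x00\x00\x01" + bytes([stream_id]) + len(body).to_bytes(2, "big") + body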
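
Finally, a toy model of the clock recovery performed by the presentation time managers 217 and 238: the sender's clock rate is estimated from successive Program Clock Reference values and local arrival times. Real hardware would use a filtered phase-locked loop driving a VCXO and would compensate for ATM cell delay variation and PCR wrap-around; the class name, loop gain, and structure here are illustrative assumptions.

    class PcrClockRecovery:
        """Estimate the sender's 27 MHz system clock from (PCR value, local arrival time) pairs."""

        def __init__(self, nominal_hz: float = 27_000_000.0, gain: float = 0.05):
            self.freq_hz = nominal_hz     # current estimate of the sender's clock frequency
            self.gain = gain              # smoothing factor of this first-order loop
            self.last_pcr = None
            self.last_arrival = None

        def update(self, pcr_ticks: int, arrival_s: float) -> float:
            if self.last_pcr is not None and arrival_s > self.last_arrival:
                sent_ticks = pcr_ticks - self.last_pcr            # elapsed sender clock ticks
                elapsed_s = arrival_s - self.last_arrival         # elapsed local time
                measured_hz = sent_ticks / elapsed_s              # instantaneous rate estimate
                self.freq_hz += self.gain * (measured_hz - self.freq_hz)
            self.last_pcr, self.last_arrival = pcr_ticks, arrival_s
            return self.freq_hz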

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

This multipoint multimedia server (130) comprises several coders/decoders (134) for different compression standards, a multipoint switch (132), separate audio (136) and video (138) processors, and at least one control module (140). Data streams in different compression standards enter the server (130) and are directed to the appropriate coder/decoder. The signals are mixed and switched by the controller (140) and the multipoint switch (132), and are then returned to the appropriate coders/decoders. The signals are then recompressed to each user's standard before leaving the server (130).
PCT/US1998/020706 1997-10-02 1998-09-29 Interconnection of multimedia data streams having different compression formats WO1999018728A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US94267597A 1997-10-02 1997-10-02
US08/942,675 1997-10-02

Publications (1)

Publication Number Publication Date
WO1999018728A1 true WO1999018728A1 (fr) 1999-04-15

Family

ID=25478446

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/020706 WO1999018728A1 (fr) 1997-10-02 1998-09-29 Interconnection of multimedia data streams having different compression formats

Country Status (1)

Country Link
WO (1) WO1999018728A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993016557A1 (fr) * 1992-02-11 1993-08-19 Koz Mark C Serveur de fichiers video adaptable et modes d'utilisation
US5488433A (en) * 1993-04-21 1996-01-30 Kinya Washino Dual compression format digital video production system
US5555017A (en) * 1994-07-08 1996-09-10 Lucent Technologies Inc. Seamless multimedia conferencing system using an enhanced multipoint control unit

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001041438A1 (fr) * 1999-12-02 2001-06-07 Madge Networks Limited Systeme de communication video
WO2001089215A2 (fr) * 2000-05-16 2001-11-22 Canal + Technologies Procede de transmission de donnees chiffrees, application d'un tel procede dans un systeme de television numerique a peage et decodeur utilise dans un tel systeme
FR2809269A1 (fr) * 2000-05-16 2001-11-23 Canal Plus Technologies Procede de transmission de donnees chiffrees, application d'un tel procede dans un systeme de television numerique a peage et decodeur utilise dans un tel systeme
WO2001089215A3 (fr) * 2000-05-16 2003-05-15 Canal Plus Technologies Procede de transmission de donnees chiffrees, application d'un tel procede dans un systeme de television numerique a peage et decodeur utilise dans un tel systeme
EP1263205A1 (fr) * 2001-06-02 2002-12-04 Nokia Corporation Procédé pour fournir un terminal avec des signaux d'images fixes codés, système de communication, élément de réseau et module
US7016543B2 (en) 2001-06-02 2006-03-21 Nokia Corporation Method for providing a terminal with coded still image signals, communications system, network element and module
WO2002100112A1 (fr) * 2001-06-03 2002-12-12 Seelive Ltd. Systeme et procede destines a une compression video rapide
US7227922B2 (en) 2001-09-07 2007-06-05 Siemens Aktiengesellschaft Method and device for the transmission of data in a packet-oriented data network
US7457359B2 (en) 2001-09-26 2008-11-25 Mabey Danny L Systems, devices and methods for securely distributing highly-compressed multimedia content
US7295608B2 (en) 2001-09-26 2007-11-13 Jodie Lynn Reynolds System and method for communicating media signals
US7302102B2 (en) 2001-09-26 2007-11-27 Reynolds Jodie L System and method for dynamically switching quality settings of a codec to maintain a target data rate
US8036265B1 (en) 2001-09-26 2011-10-11 Interact Devices System and method for communicating media signals
US7457358B2 (en) 2001-09-26 2008-11-25 Interact Devices, Inc. Polymorphic codec system and method
US8675733B2 (en) 2001-09-26 2014-03-18 Interact Devices, Inc. Polymorphic codec system and method
US7599434B2 (en) 2001-09-26 2009-10-06 Reynolds Jodie L System and method for compressing portions of a media signal using different codecs
US8175395B2 (en) 2001-09-26 2012-05-08 Interact Devices, Inc. System and method for dynamically switching quality settings of a codec to maintain a target data rate
US8064515B2 (en) 2001-09-26 2011-11-22 Interact Devices, Inc. System and method for compressing portions of a media signal using different codecs
US9462228B2 (en) 2003-11-04 2016-10-04 Cisco Technology, Inc. Distributed real-time media composer
WO2005048600A1 (fr) 2003-11-14 2005-05-26 Tandberg Telecom As Composeuse de medias distribues en temps reel
CN100568948C (zh) * 2003-11-14 2009-12-09 坦德伯格电信公司 分布式实时媒体创作器
US8289369B2 (en) 2003-11-14 2012-10-16 Cisco Technology, Inc. Distributed real-time media composer
US7561179B2 (en) 2003-11-14 2009-07-14 Tandberg Telecom As Distributed real-time media composer
US8773497B2 (en) 2003-11-14 2014-07-08 Cisco Technology, Inc. Distributed real-time media composer
CN101036329B (zh) * 2004-10-07 2011-06-08 汤姆逊许可公司 音频/视频路由器
US7774494B2 (en) 2004-10-07 2010-08-10 Thomson Licensing Audio/video router
WO2006042207A1 (fr) * 2004-10-07 2006-04-20 Thomson Licensing Logique d'acheminement audio-video
US8160160B2 (en) 2005-09-09 2012-04-17 Broadcast International, Inc. Bit-rate reduction for multimedia data streams

Similar Documents

Publication Publication Date Title
US5844600A (en) Methods, apparatus, and systems for transporting multimedia conference data streams through a transport network
US6285661B1 (en) Low delay real time digital video mixing for multipoint video conferencing
CA2316738C (fr) Station pivot de traitement central pour un systeme de teleconference multimedia
US6288740B1 (en) Method and apparatus for continuous presence conferencing with voice-activated quadrant selection
US7499416B2 (en) Video teleconferencing system with digital transcoding
EP0789492B1 (fr) Méthode de commande de vidéoconférences multipoints et système capable de synchroniser des paquets vidéo et audio
CA2236907C (fr) Communications multimedia a retards adaptatifs en fonction des systemes
EP0619679B1 (fr) Système de vidéoconférence à une pluralité de stations et appareil de commande
US6356294B1 (en) Multi-point communication arrangement and method
JP2003504897A (ja) 電話回線による高速映像伝送
WO2003081892A3 (fr) Systeme de telecommunication
GB2396988A (en) Apparatus and method for displaying pictures in a mobile terminal
WO1999018728A1 (fr) Interconnexion de flux de donnees multimedia presentant differents formats de compression
US20050021620A1 (en) Web data conferencing system and method with full motion interactive video
JPH1042261A (ja) マルチメディア通信システム向け圧縮領域映像へのテキストオーバーレイ
US7453829B2 (en) Method for conducting a video conference
US5900906A (en) Image communication apparatus having automatic answering and recording function
JP3254304B2 (ja) テレビ会議通信装置
KR0168923B1 (ko) 다양한 멀티미디어 통신 서비스를 통합 제공하는 멀티미디어 단말 장치
KR100257345B1 (ko) 아이에스디엔용 퍼스널컴퓨터 영상회의 시스템
KR0151455B1 (ko) 광대역 종합통신망을 이용한 영상회의 시스템
JPH0662398A (ja) 画像通信端末装置
JPH0522720A (ja) 画像コ−デツクおよびavミ−テイング端末
JPH09149395A (ja) 通信装置
KR19990050414A (ko) 영상 회의를 위한 고화질 동영상 압축 장치

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA