US20050144233A1 - Enhanced multimedia capabilities in video conferencing - Google Patents

Enhanced multimedia capabilities in video conferencing

Info

Publication number
US20050144233A1
US20050144233A1 (application US10/971,030, US97103004A)
Authority
US
United States
Prior art keywords
mms
video
data
video conferencing
multimedia data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/971,030
Other languages
English (en)
Inventor
Snorre Kjesbu
Espen Christensen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tandberg Telecom AS
Original Assignee
Tandberg Telecom AS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tandberg Telecom AS filed Critical Tandberg Telecom AS
Assigned to TANDBERG TELECOM AS: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHRISTENSEN, ESPEN; KJESBU, SNORRE
Publication of US20050144233A1
Priority to US13/016,337 (US8560641B2)
Priority to US14/297,135 (US9462228B2)
Current legal status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals

Definitions

  • the present invention relates to video conferencing and multimedia messaging.
  • the most realistic substitute for real meetings is a high-end video conferencing system.
  • Conventional video conferencing systems comprise a number of end-points communicating real-time video, audio and/or data streams over WAN, LAN and/or circuit switched networks.
  • the end-points include one or more monitors, cameras, microphones and/or data capture devices and a codec, which encodes and decodes outgoing and incoming streams, respectively.
  • a centralized source known as a Multipoint Control Unit (MCU) is needed to link the multiple end-points together.
  • the MCU performs this linking by receiving the multimedia signals (audio, video and/or data) from end-point terminals over point-to-point connections, processing the received signals, and retransmitting the processed signals to selected end-point terminals in the conference.
  • the different conferencing systems are, however, not isolated from each other.
  • Audio and web participants will not achieve the full benefit of the conferencing capabilities when joining a traditional video conference, because of both end-point and system limitations. Audio participants are not able to see the other participants, or data presentations, in the conference, while the video participants are not necessarily even aware of the presence of the audio participants. The latter is sometimes solved by showing an audio participant icon instead of a picture in the video image to indicate that an audio participant is present. This, however, provides little or no information about the participant.
  • embodiments of the present invention describe an MMS Engine adapted to integrate MMS capabilities into a video conference, including one or more video participants associated with a respective video conferencing End Point, and one or more audio participants associated with a respective MMS device.
  • the MMS Engine includes a capturing means configured to capture video conferencing data from a data source originating from one or more video conferencing End Points.
  • a conversion means is configured to convert the video conferencing data to an appropriate format.
  • a message generating means is configured to attach the converted video conferencing data to a message, and to insert into the message an address associated with the respective MMS device.
  • a transmission means is configured to transmit the message according to the inserted address.
  • the MMS Engine includes MMS receiving means configured to receive an MMS message from the respective MMS device and separate attached multimedia data.
  • a conversion means is configured to convert the multimedia data to a format compatible with the video conference, and a transmission means is configured to provide said converted multimedia data to the respective video conferencing End Point.
  • the present invention also provides methods and computer program products directed to the capabilities of the MMS Engine.
  • FIG. 1 depicts the format of an MMS message
  • FIG. 2 depicts a conventional MMS architecture
  • FIG. 3 depicts video conference architecture connected to a part of a conventional MMS architecture
  • FIG. 4 depicts an MMS Engine according to a preferred embodiment of the present invention.
  • FIG. 5 depicts a computer system upon which an embodiment of the present invention may be implemented.
  • the present invention takes advantage of the capabilities of the communication system to which the audio participants are connected, to increase the performance of all the participants in a mixed conference.
  • There are many multimedia features in digital communication networks.
  • An example is the Multimedia Messaging Service (MMS) standardized by the Third Generation Partnership Project (3GPP).
  • MMS has evolved from the popularity of the SMS messaging system, and is using the Wireless Application Protocol (WAP).
  • WAP is a protocol that permits mobile devices to communicate with Internet servers via the mobile radio communications network. Since displays on mobile devices are much smaller (typically, 150×150 pixels) than computer monitor displays (typically, at least 640×480 pixels), a website designed to be displayed on a computer monitor cannot be displayed on a mobile device with any practicality. Also, mobile devices have considerably less processing power than personal computers.
  • MMS is a standard for sending and receiving multimedia messages.
  • the multimedia messages can include any combination of formatted text, images, photographs, audio and video clips.
  • the images can be in any standard format such as GIF and JPEG.
  • Video formats such as MPEG4 and audio formats such as MP3 and MIDI are also supported by MMS.
  • the typical format of an MMS message is illustrated in FIG. 1.
  • the MMS message includes headers 1 .
  • the headers 1 provide the routing information and addresses of the recipients and senders of the MMS message.
  • the message body includes the multimedia message, which in turn may include: images which may be in the form of JPEG, formatted or plain text; audio which may be in the form of a wave file; video which may be in the form of a MPEG file, and may optionally include a presentation file which presents the multimedia content to the recipient of the multimedia message.
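  • As an illustration only (not part of the patent text), the header-plus-body layout of FIG. 1 can be sketched as a simple data structure; the class and field names below are assumptions chosen for readability.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MmsPart:
    """One element of the message body: image, text, audio, video or presentation."""
    content_type: str            # e.g. "image/jpeg", "text/plain", "audio/wav", "video/mpeg"
    payload: bytes

@dataclass
class MmsMessage:
    """Header fields plus a multimedia body, mirroring the layout of FIG. 1."""
    sender: str                  # address of the originator (MSISDN or e-mail)
    recipients: List[str]        # addresses of the recipients
    subject: str = ""
    parts: List[MmsPart] = field(default_factory=list)
    presentation: Optional[bytes] = None   # optional part that lays out the content

# Example: a message carrying one JPEG snapshot and a short text
msg = MmsMessage(sender="conference@gateway.example", recipients=["+4712345678"])
msg.parts.append(MmsPart("image/jpeg", b"...jpeg bytes..."))
msg.parts.append(MmsPart("text/plain", b"Snapshot from the video conference"))
```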
  • An illustration of the MMS traffic routing sequence in traditional peer-to-peer MMS routing is shown in FIG. 2.
  • a user of a mobile device has an MMS message that the user would like to send to another mobile device.
  • the mobile device sends the MMS message to an MMS server via a first Public Land Mobile Network (PLMN X).
  • the MMS server routes messages through the Internet using SMTP and an e-mail address. Since the message sent by the mobile device was addressed to the recipient's MSISDN number, the MMS server must determine the address of the recipient's MMS server in order to route the multimedia message to the recipient's MMS server.
  • the multimedia message is routed to the recipient's MMS server via the Internet, using SMTP and an e-mail address of the recipient's MMS server.
  • the MMS server then sends a multimedia message notification to a Push Access Protocol (PAP) server.
  • the PAP server is a Push Gateway for pushing messages to the mobile device, using the WAP forum standard.
  • the PAP server sends a notification to the mobile device via a second Public Land Mobile Network Y (PLMN Y).
  • the recipient's mobile device pulls the MMS message from the MMS server via PLMN Y.
  • the MMS server routes the multimedia message to the recipient's mobile device via the PLMN Y.
  • the multimedia message is received in the mobile device where it can be presented, played, or displayed to a user of the mobile device.
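  • The routing sequence above can be approximated in a short, hedged sketch. The static operator-domain table and the direct SMTP relay below are simplifying assumptions; real MMS servers resolve the recipient's server through operator infrastructure.

```python
import smtplib
from email.message import EmailMessage

# Hypothetical MSISDN-prefix -> MMS-server-domain table. Real MMS servers resolve
# the recipient's server through operator databases, not a static dictionary.
OPERATOR_DOMAINS = {"+4790": "mms.operator-x.example", "+4791": "mms.operator-y.example"}

def route_mms_over_smtp(sender: str, recipient_msisdn: str, jpeg_bytes: bytes) -> None:
    """Forward a multimedia message to the recipient's MMS server using SMTP."""
    domain = next((d for prefix, d in OPERATOR_DOMAINS.items()
                   if recipient_msisdn.startswith(prefix)), "mms.unknown.example")
    mail = EmailMessage()
    mail["From"] = sender
    mail["To"] = f"{recipient_msisdn}@{domain}"     # phonenumber@domain addressing
    mail["Subject"] = "Multimedia message"
    mail.add_attachment(jpeg_bytes, maintype="image", subtype="jpeg",
                        filename="snapshot.jpg")
    with smtplib.SMTP(domain) as smtp:              # relay towards the recipient's MMS server
        smtp.send_message(mail)
```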
  • the basic idea of the present invention is to utilize the architecture and multimedia capabilities of the MMS system in order to improve the performance and benefits for the audio participants in a video conference.
  • One preferred embodiment is depicted in FIG. 3.
  • the system includes a first number of End Points (EP) with an associated first MCU1 connected to a Local Area Network (LAN), which in turn is connected to a Network server being an embedded Gateway supplemented with an MMS Engine.
  • a second number of EPs with an associated second MCU2 are also in communication with the Network server through an ISDN line.
  • the EPs, MCUs and Gateway operate in the video conferencing system like conventional video conferencing nodes.
  • the EPs capture multimedia data, encode the data, and forward it for further processing in the MCUs.
  • the processing in the MCUs provides mixing of video, and prepares a coded multimedia data signal that is being sent to each of the participating conferencing EPs, which in turn decode the multimedia data signal and present it to the respective users.
  • the Gateway provides communication between EPs and MCUs operating at different protocols.
  • the most important task of a Gateway is to convert the multimedia data dedicated for transmission over ISDN to multimedia data dedicated to IP transmission.
  • the Gateway is conventionally used for connecting a LAN to an external ISDN connection, allowing enterprise EPs to communicate with external EPs.
  • the Gateway is incorporated in a Network server, also including an MMS Engine, providing increased performance of audio participants in a video conference. It will become apparent from the following description that because the MMS Engine and the Gateway have some similar characteristics, they are installed in the same node. As an example, both the MMS Engine and the Gateway provide protocol conversion, and they are both appropriately placed in the boundary between a local and a public communication network.
  • the MMS Engine provides conversion of video conference content to a conventional MMS content, which is to be transmitted to one or more audio participants using, e.g., a cellular phone adjusted to receive MMS messages.
  • the MMS Engine also provides conversion of MMS content, received from one or more audio participants, to a format that is applicable for the video conference in which the audio participant(s) take(s) part.
  • One embodiment of the MMS Engine is illustrated in FIG. 4.
  • On the left-hand side of the data bus, several temporary I/O memories are shown, each associated with media data of a conventional video conferencing format, in addition to a Controller and a Processor.
  • On the right-hand side of the data bus several temporary I/O memories are also shown, each associated with a respective field in a typical MMS message format, in addition to an MMS message I/O memory.
  • the coded video picture is routed via MCU 1 and through the IP Network to the Network Server.
  • the coded video picture is decoded providing a video picture of a conventional video conferencing format like QCIF, CIF or 4CIF.
  • the MMS Engine is configured to capture a snapshot of the video picture at certain predefined time intervals, or at certain events, i.e., selecting one of the images in the stream of still images constituting the video picture. This is implemented by consecutively storing the images in a temporary memory, either the Video I/O Memory or the Data I/O Memory of the MMS Engine, whose content is fetched at the actual moment of snapshot capturing and forwarded to the Processor via the data bus. The actual time of fetching is controlled by the Controller.
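  • A minimal sketch of such interval- or event-driven fetching is given below; the class, callback and parameter names are illustrative assumptions, not part of the patent.

```python
import threading

class SnapshotController:
    """Fetches the newest frame from the Video/Data I/O memory either periodically
    or when an external event (e.g. a page shift or floor change) is signalled."""

    def __init__(self, read_latest_frame, forward_to_processor, interval_s=30.0):
        self.read_latest_frame = read_latest_frame        # callable returning the newest image
        self.forward_to_processor = forward_to_processor  # callback towards the Processor
        self.interval_s = interval_s
        self._event = threading.Event()

    def trigger(self):
        """Signal an event-driven capture."""
        self._event.set()

    def run(self):
        while True:
            # Wake up when an event fires, or after the predefined interval has elapsed
            self._event.wait(timeout=self.interval_s)
            self._event.clear()
            self.forward_to_processor(self.read_latest_frame())
```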
  • the processor determines the original format of the image, and converts the content to a JPEG format according to a pre-stored algorithm. The conversion may also include scaling of the picture for adjusting the size to a small screen.
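  • As a rough example of this conversion step, assuming the Pillow imaging library and a decoded frame available as raw RGB bytes, the format conversion and scaling could look as follows:

```python
from io import BytesIO
from PIL import Image   # Pillow, assumed available for this illustration

def frame_to_mms_jpeg(raw_rgb: bytes, width: int, height: int,
                      target_size=(160, 120), quality=70) -> bytes:
    """Convert one decoded conference frame (raw RGB, e.g. CIF 352x288) into a
    small JPEG suited to the screen of an MMS-capable phone."""
    frame = Image.frombytes("RGB", (width, height), raw_rgb)
    frame.thumbnail(target_size)                    # scale down, preserving aspect ratio
    out = BytesIO()
    frame.save(out, format="JPEG", quality=quality)
    return out.getvalue()
```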
  • the Controller further conveys the JPEG image to the JPEG memory, and when the time has come to transmit an MMS message to one or more audio participants, an MMS message is created by the processor according to the format depicted in FIG. 1, whose content is decided by the Controller.
  • the address of the MMS recipient(s) is/are inserted in the MMS header, and as the complete MMS message is put in the MMS I/O memory, the MMS message with the snapshot from EP1 in JPEG format is ready for transmission to the MMS server(s), with which the recipient(s) is/are associated.
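  • Reusing the illustrative MmsMessage and MmsPart classes sketched earlier, the assembly of such an outgoing message could look roughly like this (all names are assumptions):

```python
def assemble_snapshot_mms(jpeg_bytes: bytes, address_memory,
                          sender="conference@gateway.example"):
    """Attach the converted snapshot and insert the recipient addresses stored in
    the Address memory at conference set-up; the result is ready to be placed in
    the MMS I/O memory for transmission."""
    msg = MmsMessage(sender=sender, recipients=list(address_memory),
                     subject="Video conference snapshot")
    msg.parts.append(MmsPart("image/jpeg", jpeg_bytes))
    return msg
```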
  • the address inserted in the MMS header is an e-mail address.
  • the MMS message is routed to the MMS server in the conventional way using SMTP, and the MMS content is pulled from the MMS server(s) to the recipient(s).
  • the MMS e-mail addresses of participating audio participants must be stored in the Address memory at conference set-up, or when a new audio participant with MMS capabilities enters an on-going conference.
  • the snapshot is not limited to include content from one single participant, but can also include so-called CP pictures (Continuous Presence), which is a mix of pictures from several participants.
  • the image attached to the MMS message is not limited to a certain format or a still picture, but can also be a video sequence, e.g. in MPEG format. The video sequences could be fetched directly from the Video I/O memory, or generated by fetching and merging still pictures.
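  • One possible way to generate such a clip from still pictures, assuming OpenCV is available (an assumption, not something the patent specifies), is sketched below:

```python
import cv2   # OpenCV, assumed available for this illustration

def stills_to_clip(frames, out_path="clip.mp4", fps=2):
    """Merge equally sized BGR still images (NumPy arrays) into a short video clip
    that could be attached to an MMS message instead of a single snapshot."""
    height, width = frames[0].shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, (width, height))
    for frame in frames:
        writer.write(frame)
    writer.release()
    return out_path
```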
  • the video conference is currently viewing a Continuous Presence (CP) view including a video picture of all the participants, except for the receiver of the CP, and a regularly updated still picture captured by the only audio participant in the video conference.
  • the audio participant is provided with a cellular phone with MMS capabilities, and a camera.
  • When the audio participant enters the conference, it is provided with an e-mail address associated with the conference and/or the Network server.
  • the e-mail address may be transmitted from the Network Server as a MMS message (e.g.
  • the transmitting address, or “return path”, of the first MMS message including video conference data transmitted in the opposite direction may be intended for manual use, or may be automatically inserted into the memory of the cellular phone for later use during the conference.
  • E-mail addresses and/or other data may also be exchanged between the network server and the cellular phone/MMS server by means of a proprietary signalling protocol during call set-up.
  • When a call is set up from the cellular phone to the conference, a picture is captured by the camera associated with the cellular phone, and inserted into an MMS message addressed to the conference.
  • the MMS message is then transmitted to the Network Server via the MMS server through the Internet by means of SMTP.
  • a conference ID is either provided by investigating the e-mail address, or by investigating the transmitter address found in the MMS header.
  • the MMS message is inserted in the MMS I/O memory, and the Controller initiates the Processor to separate the different media elements included in the message, and inserts them in the respective memories.
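  • A minimal sketch of this separation step, treating the received MMS as a MIME message and assuming the conference ID is encoded in the local part of the recipient address (an illustrative addressing scheme only), could look like this:

```python
from email import message_from_bytes

def separate_mms_parts(raw_message: bytes):
    """Split a received MMS (delivered to the Network server as a MIME message)
    into its media elements and derive a conference ID from the recipient address.
    Assumes a bare address of the form conf-<id>@<gateway> in the To header."""
    msg = message_from_bytes(raw_message)
    conference_id = (msg["To"] or "").split("@")[0]
    memories = {"image/jpeg": [], "text/plain": [], "audio/wav": [], "video/mpeg": []}
    for part in msg.walk():
        ctype = part.get_content_type()
        if ctype in memories:
            memories[ctype].append(part.get_payload(decode=True))
    return conference_id, memories
```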
  • the JPEG picture now residing in the JPEG memory is then converted to a proper format used in the CP view of the conference, and inserted into the Video or Data I/O memory.
  • the picture is fetched from the memory, then coded and transmitted to the MCU mixing the CP views of the conference, according to the earlier provided conference ID.
  • the MCU then inserts the still picture, originally captured by the cellular phone, in the sixth CP field, together with the five other video pictures.
  • An alternative to conversion could be to transmit the multimedia data separated from the MMS message directly to the MCU or the video conferencing End Points. This would require the receiver to be IP-addressable, e.g. for pushing out the multimedia data.
  • a picture received from a certain audio participant registered in a directory connected to an End-Point or a management tool could be stored in the directory together with other information about the participant.
  • When the audio participant later participates in a conference which includes the video conferencing device with the directory, the corresponding picture can be fetched and used for viewing the audio participant, without having to retransmit the picture.
  • the above described embodiment of the present invention represents an MMS Engine implemented together with, or incorporated in, a Gateway.
  • the MMS Engine does not necessarily have to be connected to a Gateway, but could also be a stand-alone device, or be incorporated in other video conferencing nodes, such as an MCU or the End Points. It could also be a part of a Management Tool associated with the video conferencing system.
  • the MMS Engine, or the node in which it is incorporated or to which it is connected, has to be addressable according to the Internet Protocol. Further, the description also focuses on capturing and transmitting still pictures between a video conference and one or more audio participants with multimedia capabilities.
  • the multimedia content is not limited to still pictures, but can also consist of video, text and/or audio, in which case, it is distributed in the respective memories in the MMS Engine at conversion.
  • it is possible to incorporate more than one MMS engine into the above described embodiments.
  • the present invention also includes an aspect wherein the multimedia data is transferred to MMS capable audio participants by means of e-mails.
  • the multimedia data is in this case attached to a conventional e-mail after conversion, which is transmitted to the MMS device via the MMS server.
  • How the MMS server and device handle the e-mail is operator dependent, but it is well known that an e-mail can be transmitted to an MMS device by addressing it as phonenumber@domain.
  • the MMS device will receive the e-mail as an MMS message, in which the e-mail text and the attachments are inserted in the MMS entries.
  • a snapshot or other multimedia content may be captured and transferred at predefined points of time, or at certain events.
  • such an event occurs when a considerable change in the content of the source picture (video or still picture), from which the present snapshot originates, is detected.
  • the detection may take place e.g. in the Network server illustrated in FIG. 3 .
  • the previously transmitted snapshot is stored in the Network server, and the snapshot is continuously compared to the source picture. The comparison may be carried out pixel by pixel, or one or more values associated with the snapshot may be compared with one or more corresponding values associated with the source picture.
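  • One simple value-based comparison of this kind, assuming the pictures are available as NumPy arrays, is a mean absolute pixel difference against a threshold:

```python
import numpy as np

def considerable_change(previous: np.ndarray, current: np.ndarray,
                        threshold: float = 12.0) -> bool:
    """Return True when the source picture differs enough from the previously
    transmitted snapshot to justify a new transmission. The mean absolute pixel
    difference and the threshold are illustrative choices, not patent values."""
    diff = np.abs(current.astype(np.int16) - previous.astype(np.int16))
    return float(diff.mean()) > threshold
```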
  • the Controller of FIG. 4 enables reading from the Video or Data I/O memory via the data bus to the processor, which in turn converts the snapshot from present format to JPEG, then inserting the JPEG image into the JPEG memory.
  • One event that could trigger a new snapshot transmission is a page shift in a presentation. Another example is when voice switching is active, and the floor is shifting. A completely different image will then occur as the main video picture in the conference, and a new snapshot transmission will be initiated.
  • New snapshots could also be captured and transmitted periodically, with the first snapshot captured at call set-up.
  • Transmission of multimedia data in the opposite direction, from the audio participant(s) to the video conference, could be initiated accordingly, but the decision mechanism is likely to be implemented in the MMS device or somewhere else in the MMS or cellular network architecture. In case of implementing the decision mechanism in the MMS device, some proprietary software would have to be installed.
  • the MMS device is configured with software allowing it to both send and receive signalling messages concerning the snapshot capturing, transmission and reception, and alternatively or in addition to merge content received at different points of time, providing continuity in the presentation of multimedia data from the conference.
  • the software is adjusted to receive and store the e-mail address of the Network server/conference, and automatically fetches this address and inserts it in the MMS header when transmitting multimedia data to the conference.
  • the software is preferably installed as Java-scripts, as this is a flexible tool for providing small devices with tailored features. Additionally, most cellular phones and mobile devices are now equipped with Java-technology.
  • It should also be possible to initiate snapshot capturing at the video conferencing side, either manually or automatically, remotely from an audio participant.
  • the software installed in the cellular phone is therefore configured to be able to generate and transmit a request for snapshot capturing to the Network server.
  • the MMS Engine captures a snapshot (or other multimedia data) from one of the I/O memories, converts it to a proper format and returns the snapshot to the MMS device of the audio participant.
  • the multimedia content received at different times in the MMS device could benefit from being merged together, thereby providing continuity in the data transmitted via MMS from the video conference.
  • a real-time video presentation could be created from a number of snapshots, or video pieces, consecutively transmitted in separate MMS messages from the MMS Engine.
  • the software is in this case also configured to consecutively receive and store the incoming multimedia data, and to present it on the screen of the MMS device in such a way that it appears to be a continuous video stream.
  • a real-time video presentation implies transmission of large and/or many MMS messages, and will probably require a substantial bandwidth all the way to the audio participant.
  • the bandwidth requirement could, however, be reduced by coding the pictures according to standard or proprietary coding techniques, instead of converting the video conference pictures to a JPEG format in the MMS Engine, and inserting the respective encoded pictures into one of the entries of the MMS messages as general attachments.
  • the software in the cellular phone also has to be configured to be able to decode the attachments in the MMS messages, according to the coding techniques used by the MMS Engine.
  • the tasks of the software in the cellular phone described above would require some signalling and negotiation between the MMS device and the MMS Engine. This exchange of information could be inserted in the text or presentation fields (temporarily in the text or presentation memory in the MMS Engine) of the MMS messages still being transmitted, creating a virtual, separate signalling channel. This information may include snapshot requests, type of events initiating snapshot capturing, and synchronisation information.
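  • A toy encoding of such signalling information into the text field could look like the sketch below; the key/value vocabulary is invented for illustration and is not defined by the patent.

```python
def encode_signalling(fields: dict) -> str:
    """Pack signalling data (snapshot requests, trigger events, synchronisation
    information) into the MMS text field as simple key=value pairs."""
    return ";".join(f"{key}={value}" for key, value in fields.items())

def decode_signalling(text: str) -> dict:
    """Recover the signalling fields from a received MMS text part."""
    return dict(item.split("=", 1) for item in text.split(";") if "=" in item)

# Example: the phone asks for a fresh snapshot and names the events it wants
request = encode_signalling({"cmd": "snapshot-request", "trigger": "page-shift", "seq": 7})
```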
  • the embodiments of the present invention appear in the description above as an MMS Engine integrating MMS capabilities in conventional video conferencing.
  • the basic idea of the present invention can also be embodied in an overall method.
  • the method includes, in one direction, capturing video conferencing data, e.g., a snapshot of the video picture of one or more participants, or a CP picture, converting the data to a proper format, and inserting the converted data as an attachment in an MMS message.
  • the MMS message is transmitted from an IP-addressable device to one or more MMS capable audio participants via the MMS infrastructure.
  • the audio participant fetches the video conferencing data attached to the MMS message, and presents the data as a part of the conference in which the audio participant takes part.
  • the audio participant captures some kind of multimedia data, normally a still picture, or some other data presenting the audio participant, and inserts the multimedia data into an MMS message.
  • the MMS message is addressed and transmitted to an IP-addressable node connected to the video conferencing system.
  • the attachment is then fetched from the MMS message and converted to a proper video conferencing format.
  • the converted multimedia data is then coded and transmitted to one or more of the conventional video conferencing participants, optionally subsequent to mixing it with data from other participants.
  • FIG. 5 illustrates a computer system 1201 upon which an embodiment of the present invention may be implemented.
  • the computer system 1201 includes a bus 1202 or other communication mechanism for communicating information, and a processor 1203 coupled with the bus 1202 for processing the information.
  • the computer system 1201 also includes a main memory 1204 , such as a random access memory (RAM) or other dynamic storage device (e.g., dynamic RAM (DRAM), static RAM (SRAM), and synchronous DRAM (SDRAM)), coupled to the bus 1202 for storing information and instructions to be executed by processor 1203 .
  • the main memory 1204 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processor 1203 .
  • the computer system 1201 further includes a read only memory (ROM) 1205 or other static storage device (e.g., programmable ROM (PROM), erasable PROM (EPROM), and electrically erasable PROM (EEPROM)) coupled to the bus 1202 for storing static information and instructions for the processor 1203 .
  • the computer system 1201 also includes a disk controller 1206 coupled to the bus 1202 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 1207 , and a removable media drive 1208 (e.g., floppy disk drive, read-only compact disc drive, read/write compact disc drive, compact disc jukebox, tape drive, and removable magneto-optical drive).
  • the storage devices may be added to the computer system 1201 using an appropriate device interface (e.g., small computer system interface (SCSI), integrated device electronics (IDE), enhanced-IDE (E-IDE), direct memory access (DMA), or ultra-DMA).
  • the computer system 1201 may also include special purpose logic devices (e.g., application specific integrated circuits (ASICs)) or configurable logic devices (e.g., simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field programmable gate arrays (FPGAs)).
  • the computer system 1201 may also include a display controller 1209 coupled to the bus 1202 to control a display 1210 , such as a cathode ray tube (CRT), for displaying information to a computer user.
  • the computer system includes input devices, such as a keyboard 1211 and a pointing device 1212 , for interacting with a computer user and providing information to the processor 1203 .
  • the pointing device 1212 may be a mouse, a trackball, or a pointing stick for communicating direction information and command selections to the processor 1203 and for controlling cursor movement on the display 1210 .
  • a printer may provide printed listings of data stored and/or generated by the computer system 1201 .
  • the computer system 1201 performs a portion or all of the processing steps of the invention in response to the processor 1203 executing one or more sequences of one or more instructions contained in a memory, such as the main memory 1204 .
  • Such instructions may be read into the main memory 1204 from another computer readable medium, such as a hard disk 1207 or a removable media drive 1208 .
  • processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in main memory 1204 .
  • hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
  • the computer system 1201 includes at least one computer readable medium or memory for holding instructions programmed according to the teachings of the invention and for containing data structures, tables, records, or other data described herein.
  • Examples of computer readable media are compact discs, hard disks, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, flash EPROM), DRAM, SRAM, SDRAM, or any other magnetic medium, compact discs (e.g., CD-ROM), or any other optical medium, punch cards, paper tape, or other physical medium with patterns of holes, a carrier wave (described below), or any other medium from which a computer can read.
  • the present invention includes software for controlling the computer system 1201 , for driving a device or devices for implementing the invention, and for enabling the computer system 1201 to interact with a human user (e.g., print production personnel).
  • software may include, but is not limited to, device drivers, operating systems, development tools, and applications software.
  • Such computer readable media further includes the computer program product of the present invention for performing all or a portion (if processing is distributed) of the processing performed in implementing the invention.
  • the computer code devices of the present invention may be any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs), Java classes, and complete executable programs. Moreover, parts of the processing of the present invention may be distributed for better performance, reliability, and/or cost.
  • Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks, such as the hard disk 1207 or the removable media drive 1208.
  • Volatile media includes dynamic memory, such as the main memory 1204 .
  • Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that make up the bus 1202 . Transmission media also may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • Various forms of computer readable media may be involved in carrying out one or more sequences of one or more instructions to processor 1203 for execution.
  • the instructions may initially be carried on a magnetic disk of a remote computer.
  • the remote computer can load the instructions for implementing all or a portion of the present invention remotely into a dynamic memory and send the instructions over a telephone line using a modem.
  • a modem local to the computer system 1201 may receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal.
  • An infrared detector coupled to the bus 1202 can receive the data carried in the infrared signal and place the data on the bus 1202 .
  • the bus 1202 carries the data to the main memory 1204 , from which the processor 1203 retrieves and executes the instructions.
  • the instructions received by the main memory 1204 may optionally be stored on storage device 1207 or 1208 either before or after execution by processor 1203 .
  • the computer system 1201 also includes a communication interface 1213 coupled to the bus 1202 .
  • the communication interface 1213 provides a two-way data communication coupling to a network link 1214 that is connected to, for example, a local area network (LAN) 1215 , or to another communications network 1216 such as the Internet.
  • the communication interface 1213 may be a network interface card to attach to any packet switched LAN.
  • the communication interface 1213 may be an asymmetrical digital subscriber line (ADSL) card, an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of communications line.
  • Wireless links may also be implemented.
  • the communication interface 1213 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • the network link 1214 typically provides data communication through one or more networks to other data devices.
  • the network link 1214 may provide a connection to another computer through a local network 1215 (e.g., a LAN) or through equipment operated by a service provider, which provides communication services through a communications network 1216 .
  • the local network 1215 and the communications network 1216 use, for example, electrical, electromagnetic, or optical signals that carry digital data streams, and the associated physical layer (e.g., CAT 5 cable, coaxial cable, optical fiber, etc).
  • the signals through the various networks and the signals on the network link 1214 and through the communication interface 1213, which carry the digital data to and from the computer system 1201, may be implemented in baseband signals, or carrier wave based signals.
  • the baseband signals convey the digital data as unmodulated electrical pulses that are descriptive of a stream of digital data bits, where the term “bits” is to be construed broadly to mean symbol, where each symbol conveys at least one or more information bits.
  • the digital data may also be used to modulate a carrier wave, such as with amplitude, phase and/or frequency shift keyed signals that are propagated over a conductive media, or transmitted as electromagnetic waves through a propagation medium.
  • the digital data may be sent as unmodulated baseband data through a “wired” communication channel and/or sent within a predetermined frequency band, different than baseband, by modulating a carrier wave.
  • the computer system 1201 can transmit and receive data, including program code, through the network(s) 1215 and 1216 , the network link 1214 and the communication interface 1213 .
  • the network link 1214 may provide a connection through a LAN 1215 to a mobile device 1217 such as a personal digital assistant (PDA), laptop computer, or cellular telephone.
US10/971,030 2003-10-24 2004-10-25 Enhanced multimedia capabilities in video conferencing Abandoned US20050144233A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/016,337 US8560641B2 (en) 2003-10-24 2011-01-28 Enhanced multimedia capabilities in video conferencing
US14/297,135 US9462228B2 (en) 2003-11-04 2014-06-05 Distributed real-time media composer

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NONO20034775 2003-10-24
NO20034775A NO318868B1 (no) 2003-10-24 2003-10-24 Videokonferanse med forbedrede multimediakapabiliteter

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/016,337 Continuation US8560641B2 (en) 2003-10-24 2011-01-28 Enhanced multimedia capabilities in video conferencing

Publications (1)

Publication Number Publication Date
US20050144233A1 (en) 2005-06-30

Family

ID=29775118

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/971,030 Abandoned US20050144233A1 (en) 2003-10-24 2004-10-25 Enhanced multimedia capabilities in video conferencing
US13/016,337 Active 2025-06-23 US8560641B2 (en) 2003-10-24 2011-01-28 Enhanced multimedia capabilities in video conferencing

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/016,337 Active 2025-06-23 US8560641B2 (en) 2003-10-24 2011-01-28 Enhanced multimedia capabilities in video conferencing

Country Status (7)

Country Link
US (2) US20050144233A1 (fr)
EP (1) EP1676439B1 (fr)
JP (1) JP2007513537A (fr)
CN (1) CN1871855A (fr)
AT (1) ATE538596T1 (fr)
NO (1) NO318868B1 (fr)
WO (1) WO2005041574A1 (fr)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040266412A1 (en) * 2003-06-25 2004-12-30 Oracle International Corporation Mobile meeting and collaboration
US20040266408A1 (en) * 2003-06-25 2004-12-30 Oracle International Corporation Mobile messaging concierge
US20060120308A1 (en) * 2004-12-06 2006-06-08 Forbes Stephen K Image exchange for image-based push-to-talk user interface
US20070239825A1 (en) * 2006-04-06 2007-10-11 Sbc Knowledge Ventures L.P. System and method for distributing video conference data over an internet protocol television system
US20080235362A1 (en) * 2007-03-19 2008-09-25 Tandberg Telecom As System and method for conference management
US20080284841A1 (en) * 2007-05-15 2008-11-20 Ori Modai Methods, media, and devices for providing visual resources of video conference participants
US20090047985A1 (en) * 2006-06-30 2009-02-19 Huawei Technologies Co., Ltd. Method, system and apparatus for implementing push to talk over cellular session storing and broadcasting
US20090265603A1 (en) * 2008-04-21 2009-10-22 Samsung Electronics Co., Ltd. Apparatus and method for composing scenes using rich media contents
US20090323552A1 (en) * 2007-10-01 2009-12-31 Hewlett-Packard Development Company, L.P. Systems and Methods for Managing Virtual Collaboration Systems Spread Over Different Networks
US20100085417A1 (en) * 2008-10-07 2010-04-08 Ottalingam Satyanarayanan Service level view of audiovisual conference systems
US20100293469A1 (en) * 2009-05-14 2010-11-18 Gautam Khot Providing Portions of a Presentation During a Videoconference
US20110032943A1 (en) * 2005-09-26 2011-02-10 Huawei Technologies Co., Ltd. Method, system and devices for processing messages in multimedia messaging service
US8028073B2 (en) 2003-06-25 2011-09-27 Oracle International Corporation Mobile meeting and collaboration
US20110258271A1 (en) * 2010-04-19 2011-10-20 Gaquin John Francis Xavier Methods and systems for distributing attachments to messages
US20120306992A1 (en) * 2011-06-02 2012-12-06 Microsoft Corporation Techniques to provide fixed video conference feeds of remote attendees with attendee information
CN104170317A (zh) * 2012-03-16 2014-11-26 株式会社理光 通信控制系统和控制设备
US20150019942A1 (en) * 2013-07-12 2015-01-15 Samsung Electronics Co., Ltd. File attachment method and electronic device thereof
US9525845B2 (en) 2012-09-27 2016-12-20 Dobly Laboratories Licensing Corporation Near-end indication that the end of speech is received by the far end in an audio or video conference
US10693670B2 (en) 2017-05-12 2020-06-23 Fujitsu Limited Information processing apparatus, information processing system, and information processing method

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101212506B (zh) * 2007-12-25 2010-06-02 上海科泰世纪科技有限公司 移动设备上基于彩信实现传情动漫信息收发的方法
US20100218210A1 (en) * 2009-02-23 2010-08-26 Xcast Labs, Inc. Emergency broadcast system
CN102308532B (zh) * 2009-05-21 2013-10-09 华为终端有限公司 点到多点推送消息处理方法、系统及服务器
WO2011022430A2 (fr) * 2009-08-17 2011-02-24 Weigel Broadcasting Co. Système et procédé pour une production audiovisuelle en direct à distance
US9143978B2 (en) 2012-12-07 2015-09-22 At&T Intellectual Property I, L.P. Network congestion prevention and/or mitigation
US9325776B2 (en) 2013-01-08 2016-04-26 Tangome, Inc. Mixed media communication
WO2017042331A1 (fr) * 2015-09-11 2017-03-16 Barco N.V. Procédé et système pour connecter des dispositifs électroniques

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3597022A (en) * 1969-07-22 1971-08-03 Robert D Waldron Diamagnetic levitation and/or stabilizing devices
US5007292A (en) * 1988-08-31 1991-04-16 Amoco Corporation Multicomponent transducer
US5555017A (en) * 1994-07-08 1996-09-10 Lucent Technologies Inc. Seamless multimedia conferencing system using an enhanced multipoint control unit
US5625407A (en) * 1994-07-08 1997-04-29 Lucent Technologies Inc. Seamless multimedia conferencing system using an enhanced multipoint control unit and enhanced endpoint devices
US5673205A (en) * 1996-04-08 1997-09-30 Lucent Technologies Inc. Accessing a video message via video snapshots
US6020915A (en) * 1995-06-27 2000-02-01 At&T Corp. Method and system for providing an analog voice-only endpoint with pseudo multimedia service
US6111972A (en) * 1992-09-28 2000-08-29 Jean Marie Bernard Paul Verdier Diffusing volume electroacoustic transducer
US6130952A (en) * 1996-11-08 2000-10-10 Kabushiki Kaisha Audio-Technica Microphone
US6314302B1 (en) * 1996-12-09 2001-11-06 Siemens Aktiengesellschaft Method and telecommunication system for supporting multimedia services via an interface and a correspondingly configured subscriber terminal
US20020083136A1 (en) * 2000-12-22 2002-06-27 Whitten William B. Method of authorizing receipt of instant messages by a recipient user
US6583806B2 (en) * 1993-10-01 2003-06-24 Collaboration Properties, Inc. Videoconferencing hardware
US20030158902A1 (en) * 2001-10-31 2003-08-21 Dotan Volach Multimedia instant communication system and method
US6611503B1 (en) * 1998-05-22 2003-08-26 Tandberg Telecom As Method and apparatus for multimedia conferencing with dynamic bandwidth allocation
US6625258B1 (en) * 1999-12-27 2003-09-23 Nortel Networks Ltd System and method for providing unified communication services support
US20040034723A1 (en) * 2002-04-25 2004-02-19 Giroti Sudhir K. Converged conferencing appliance and methods for concurrent voice and data conferencing sessions over networks
US6947738B2 (en) * 2001-01-18 2005-09-20 Telefonaktiebolaget Lm Ericsson (Publ) Multimedia messaging service routing system and method
US7043528B2 (en) * 2001-03-08 2006-05-09 Starbak Communications, Inc. Systems and methods for connecting video conferencing to a distributed network
US7069301B2 (en) * 2001-02-07 2006-06-27 Siemens Aktiengesellschaft Method and apparatus for sending messages from an MMS system
US7181538B2 (en) * 2003-11-14 2007-02-20 Sybase 365, Inc. System and method for providing configurable, dynamic multimedia message service pre-transcoding
US7181231B2 (en) * 2001-08-27 2007-02-20 Tcl Communication Technology Holdings Limited System of interoperability between MMS messages and SMS/EMS messages and an associated exchange method
US20070167188A1 (en) * 2003-05-14 2007-07-19 Kjell Linden A system and a device for mobile radio communication
US7630705B2 (en) * 2003-06-30 2009-12-08 Motorola, Inc. Message format conversion in communications terminals and networks

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2696308A1 (fr) 1992-09-28 1994-04-01 Rigondeau Robert Système électro-acoustique à lévitation magnétique et effet de volume.
KR0150698B1 (ko) 1995-08-01 1998-11-02 구자홍 영상기기에 사용되는 스피커 케이스 구조
JP2000115738A (ja) * 1998-10-08 2000-04-21 Ntt Data Corp テレビ会議システム、テレビ会議装置、メール転送装置及び記録媒体
US7181539B1 (en) 1999-09-01 2007-02-20 Microsoft Corporation System and method for data synchronization
JP2003271530A (ja) * 2002-03-18 2003-09-26 Oki Electric Ind Co Ltd 通信システム,システム間関連装置,プログラム,及び,記録媒体

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3597022A (en) * 1969-07-22 1971-08-03 Robert D Waldron Diamagnetic levitation and/or stabilizing devices
US5007292A (en) * 1988-08-31 1991-04-16 Amoco Corporation Multicomponent transducer
US6111972A (en) * 1992-09-28 2000-08-29 Jean Marie Bernard Paul Verdier Diffusing volume electroacoustic transducer
US6583806B2 (en) * 1993-10-01 2003-06-24 Collaboration Properties, Inc. Videoconferencing hardware
US5555017A (en) * 1994-07-08 1996-09-10 Lucent Technologies Inc. Seamless multimedia conferencing system using an enhanced multipoint control unit
US5625407A (en) * 1994-07-08 1997-04-29 Lucent Technologies Inc. Seamless multimedia conferencing system using an enhanced multipoint control unit and enhanced endpoint devices
US6020915A (en) * 1995-06-27 2000-02-01 At&T Corp. Method and system for providing an analog voice-only endpoint with pseudo multimedia service
US5673205A (en) * 1996-04-08 1997-09-30 Lucent Technologies Inc. Accessing a video message via video snapshots
US6130952A (en) * 1996-11-08 2000-10-10 Kabushiki Kaisha Audio-Technica Microphone
US6314302B1 (en) * 1996-12-09 2001-11-06 Siemens Aktiengesellschaft Method and telecommunication system for supporting multimedia services via an interface and a correspondingly configured subscriber terminal
US6611503B1 (en) * 1998-05-22 2003-08-26 Tandberg Telecom As Method and apparatus for multimedia conferencing with dynamic bandwidth allocation
US6625258B1 (en) * 1999-12-27 2003-09-23 Nortel Networks Ltd System and method for providing unified communication services support
US20020083136A1 (en) * 2000-12-22 2002-06-27 Whitten William B. Method of authorizing receipt of instant messages by a recipient user
US6947738B2 (en) * 2001-01-18 2005-09-20 Telefonaktiebolaget Lm Ericsson (Publ) Multimedia messaging service routing system and method
US7069301B2 (en) * 2001-02-07 2006-06-27 Siemens Aktiengesellschaft Method and apparatus for sending messages from an MMS system
US7043528B2 (en) * 2001-03-08 2006-05-09 Starbak Communications, Inc. Systems and methods for connecting video conferencing to a distributed network
US7181231B2 (en) * 2001-08-27 2007-02-20 Tcl Communication Technology Holdings Limited System of interoperability between MMS messages and SMS/EMS messages and an associated exchange method
US20030158902A1 (en) * 2001-10-31 2003-08-21 Dotan Volach Multimedia instant communication system and method
US20040034723A1 (en) * 2002-04-25 2004-02-19 Giroti Sudhir K. Converged conferencing appliance and methods for concurrent voice and data conferencing sessions over networks
US20070167188A1 (en) * 2003-05-14 2007-07-19 Kjell Linden A system and a device for mobile radio communication
US7630705B2 (en) * 2003-06-30 2009-12-08 Motorola, Inc. Message format conversion in communications terminals and networks
US7181538B2 (en) * 2003-11-14 2007-02-20 Sybase 365, Inc. System and method for providing configurable, dynamic multimedia message service pre-transcoding

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040266412A1 (en) * 2003-06-25 2004-12-30 Oracle International Corporation Mobile meeting and collaboration
US20040266408A1 (en) * 2003-06-25 2004-12-30 Oracle International Corporation Mobile messaging concierge
US8028073B2 (en) 2003-06-25 2011-09-27 Oracle International Corporation Mobile meeting and collaboration
US7379733B2 (en) * 2003-06-25 2008-05-27 Oracle International Corporation Mobile meeting and collaboration
US9094805B2 (en) 2003-06-25 2015-07-28 Oracle International Corporation Mobile messaging concierge
US20060120308A1 (en) * 2004-12-06 2006-06-08 Forbes Stephen K Image exchange for image-based push-to-talk user interface
US7596102B2 (en) * 2004-12-06 2009-09-29 Sony Ericsson Mobile Communications Ab Image exchange for image-based push-to-talk user interface
US8537824B2 (en) * 2005-09-26 2013-09-17 Huawei Technologies Co., Ltd. Method, system and devices for processing messages in multimedia messaging service
US20110032943A1 (en) * 2005-09-26 2011-02-10 Huawei Technologies Co., Ltd. Method, system and devices for processing messages in multimedia messaging service
US20070239825A1 (en) * 2006-04-06 2007-10-11 Sbc Knowledge Ventures L.P. System and method for distributing video conference data over an internet protocol television system
US7640301B2 (en) 2006-04-06 2009-12-29 Att Knowledge Ventures, L.P. System and method for distributing video conference data over an internet protocol television system
US9661268B2 (en) 2006-04-06 2017-05-23 At&T Intellectual Property I, L.P. System and method for distributing video conference data over an internet protocol television system
US8706807B2 (en) 2006-04-06 2014-04-22 AT&T Intellectual Protperty I, LP System and method for distributing video conference data over an internet protocol television system
US8140100B2 (en) * 2006-06-30 2012-03-20 Huawei Technologies Co., Ltd. Method, system and apparatus for implementing push to talk over cellular session storing and broadcasting
US20090047985A1 (en) * 2006-06-30 2009-02-19 Huawei Technologies Co., Ltd. Method, system and apparatus for implementing push to talk over cellular session storing and broadcasting
US20080235362A1 (en) * 2007-03-19 2008-09-25 Tandberg Telecom As System and method for conference management
US9009225B2 (en) * 2007-03-19 2015-04-14 Cisco Technology, Inc. System and method for conference management
US8212856B2 (en) 2007-05-15 2012-07-03 Radvision Ltd. Methods, media, and devices for providing visual resources of video conference participants
US20080284841A1 (en) * 2007-05-15 2008-11-20 Ori Modai Methods, media, and devices for providing visual resources of video conference participants
US7990889B2 (en) 2007-10-01 2011-08-02 Hewlett-Packard Development Company, L.P. Systems and methods for managing virtual collaboration systems
US20090323552A1 (en) * 2007-10-01 2009-12-31 Hewlett-Packard Development Company, L.P. Systems and Methods for Managing Virtual Collaboration Systems Spread Over Different Networks
US20090265603A1 (en) * 2008-04-21 2009-10-22 Samsung Electronics Co., Ltd. Apparatus and method for composing scenes using rich media contents
US8707151B2 (en) * 2008-04-21 2014-04-22 Samsung Electronics Co., Ltd Apparatus and method for composing scenes using Rich Media contents
US20100085417A1 (en) * 2008-10-07 2010-04-08 Ottalingam Satyanarayanan Service level view of audiovisual conference systems
US8441516B2 (en) * 2008-10-07 2013-05-14 Cisco Technology, Inc. Service level view of audiovisual conference systems
US9571358B2 (en) 2008-10-07 2017-02-14 Cisco Technology, Inc. Service level view of audiovisual conference systems
US9007424B2 (en) 2008-10-07 2015-04-14 Cisco Technology, Inc. Service level view of audiovisual conference systems
US20100293469A1 (en) * 2009-05-14 2010-11-18 Gautam Khot Providing Portions of a Presentation During a Videoconference
US20110258271A1 (en) * 2010-04-19 2011-10-20 Gaquin John Francis Xavier Methods and systems for distributing attachments to messages
US20120306992A1 (en) * 2011-06-02 2012-12-06 Microsoft Corporation Techniques to provide fixed video conference feeds of remote attendees with attendee information
US8624955B2 (en) * 2011-06-02 2014-01-07 Microsoft Corporation Techniques to provide fixed video conference feeds of remote attendees with attendee information
CN104170317A (zh) * 2012-03-16 2014-11-26 株式会社理光 通信控制系统和控制设备
US9525845B2 (en) 2012-09-27 2016-12-20 Dobly Laboratories Licensing Corporation Near-end indication that the end of speech is received by the far end in an audio or video conference
US20150019942A1 (en) * 2013-07-12 2015-01-15 Samsung Electronics Co., Ltd. File attachment method and electronic device thereof
US9852403B2 (en) * 2013-07-12 2017-12-26 Samsung Electronics Co., Ltd. File attachment method and electronic device thereof
US10693670B2 (en) 2017-05-12 2020-06-23 Fujitsu Limited Information processing apparatus, information processing system, and information processing method

Also Published As

Publication number Publication date
EP1676439B1 (fr) 2011-12-21
US20110134206A1 (en) 2011-06-09
NO318868B1 (no) 2005-05-18
NO20034775D0 (no) 2003-10-24
US8560641B2 (en) 2013-10-15
EP1676439A1 (fr) 2006-07-05
JP2007513537A (ja) 2007-05-24
ATE538596T1 (de) 2012-01-15
WO2005041574A1 (fr) 2005-05-06
CN1871855A (zh) 2006-11-29

Similar Documents

Publication Publication Date Title
US8560641B2 (en) Enhanced multimedia capabilities in video conferencing
CN108055496B (zh) Live broadcast method and system for a video conference
US8316104B2 (en) Method and apparatus for collaborative system
US10678393B2 (en) Capturing multimedia data based on user action
US9635525B2 (en) Voice messaging method and mobile terminal supporting voice messaging in mobile messenger service
US7058689B2 (en) Sharing of still images within a video telephony call
US7561179B2 (en) Distributed real-time media composer
EP2658232A1 (fr) Method and system for an optimized multimedia communication system
JPWO2007055206A1 (ja) Communication device, communication method, communication system, program, and computer-readable recording medium
US8014775B2 (en) Method and system for implementing messaging services and a message application server
JP2002007294A (ja) Image distribution system and method, and storage medium
CN108259813A (zh) Multifunctional screen-casting device, system and method
CN113923470B (zh) Live stream processing method and device
US20110137438A1 (en) Video conference system and method based on video surveillance system
US7792063B2 (en) Method, apparatus, and computer program product for gatekeeper streaming
US20100066806A1 (en) Internet video image producing method
JP2009194661A (ja) Conference terminal device
KR20040081370A (ko) Method and system for controlling a remote video chain
JP4406295B2 (ja) Application cooperation system and application cooperation method
JP2005032172A (ja) Session control proxy system, communication service system, session control method, program, and recording medium
KR100690874B1 (ko) Method for interworking between different messaging services
CN112019791A (zh) Multi-party audio and video call method and system based on educational examinations
CN108270995B (zh) Communication method and system between a terminal and a video surveillance device
CN116582521A (zh) Multi-person video terminal conference data processing system and method
Lewis et al. A Multimodal Instant Messaging System using XML-Based Protocols

Legal Events

Date Code Title Description
AS Assignment

Owner name: TANDBERG TELECOM AS, NORWAY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KJESBU, SNORRE;CHRISTENSEN, ESPEN;REEL/FRAME:016366/0535

Effective date: 20041210

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION