US20160070534A1 - Techniques to remotely manage a multimedia conference event - Google Patents

Techniques to remotely manage a multimedia conference event

Info

Publication number
US20160070534A1
US20160070534A1 (application US14/931,297)
Authority
US
United States
Prior art keywords
multimedia conference
remote control
mobile remote
information
conference server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/931,297
Inventor
Kripalani Kripalani
Ashutosh Tripathi
Namit Sethumathavan Tanesheri
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC filed Critical Microsoft Technology Licensing LLC
Priority to US14/931,297
Publication of US20160070534A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16: Sound input; Sound output
    • G06F 3/167: Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0231: Cordless keyboards
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F 3/0383: Signal control means within the pointing device
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/40: Support for services or applications
    • H04L 65/403: Arrangements for multi-party communication, e.g. for conferences
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/14: Systems for two-way working
    • H04N 7/15: Conference systems
    • H04N 7/152: Multipoint control units therefor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/038: Indexing scheme relating to G06F 3/038
    • G06F 2203/0384: Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/40: Support for services or applications
    • H04L 65/403: Arrangements for multi-party communication, e.g. for conferences
    • H04L 65/4038: Arrangements for multi-party communication, e.g. for conferences with floor control
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/60: Scheduling or organising the servicing of application requests, e.g. requests for application data transmissions using the analysis and optimisation of the required network resources
    • H04L 67/62: Establishing a time schedule for servicing the requests

Definitions

  • a multimedia conference system typically allows multiple participants to communicate and share different types of media content in a collaborative and real-time meeting over a network.
  • the multimedia conference system may display different types of media content using various graphical user interface (GUI) windows or views.
  • one GUI view might include video images of participants
  • another GUI view might include presentation slides
  • yet another GUI view might include text messages between participants, and so forth.
  • In a virtual meeting environment there is typically a conference leader or presenter controlling the various media resources for the multimedia conference system. For example, a conference leader may control presentation slides while delivering a speech. In some cases, however, it may be difficult for the conference leader to manage the various media resources provided by the multimedia conference system. This may limit the effective delivery of media content for a meeting. Techniques directed to improving management of media resources in a virtual meeting environment may therefore enhance user experience and convenience.
  • Various embodiments may be generally directed to multimedia conference systems. Some embodiments may be particularly directed to techniques to manage media resources for a multimedia conference event.
  • the multimedia conference event may include multiple participants, some of which may gather in a conference room, while others may participate in the multimedia conference event from remote locations. One of the participants may utilize a mobile remote control to manage various media resources for the multimedia conference event.
  • an apparatus may comprise a mobile remote control having a communications component operative to establish a wireless connection between a mobile remote control and a multimedia conference server hosting a multimedia conference event.
  • the mobile remote control may include a mobile remote control component communicatively coupled to the communications component, the mobile remote control component operative to manage the multimedia conference event from the mobile remote control by communicating control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection.
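The apparatus described above can be sketched in code: a communications component that establishes the wireless connection, and a remote control component that manages the event by exchanging control and media information with the conference server. This is an illustrative sketch only; all class, method, and field names are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the claimed apparatus: a mobile remote control
# composed of a communications component (wireless link) and a mobile
# remote control component (sends control and media information).

class CommunicationsComponent:
    """Establishes and tracks a wireless connection to the conference server."""

    def __init__(self):
        self.connected_to = None

    def connect(self, server_address):
        # A real device would negotiate a WiFi/RF link here; this sketch
        # simply records the server endpoint.
        self.connected_to = server_address
        return True

    def send(self, message):
        if self.connected_to is None:
            raise ConnectionError("no wireless connection established")
        return {"to": self.connected_to, **message}


class MobileRemoteControlComponent:
    """Manages a multimedia conference event over the wireless connection."""

    def __init__(self, comms):
        self.comms = comms

    def send_control(self, command, **params):
        # Control information: commands meant for the automated system.
        return self.comms.send({"type": "control", "command": command, "params": params})

    def send_media(self, media_kind, payload):
        # Media information: content meant for participants.
        return self.comms.send({"type": "media", "kind": media_kind, "payload": payload})


comms = CommunicationsComponent()
comms.connect("conference.example.com")
remote = MobileRemoteControlComponent(comms)
msg = remote.send_control("next_slide")
```

The two-component split mirrors the claim language: the transport concern (connection) is kept separate from the conferencing concern (what is communicated over it).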
  • FIG. 1 illustrates an embodiment of a multimedia conference system.
  • FIG. 2 illustrates an embodiment of a mobile remote control.
  • FIG. 3 illustrates an embodiment of a mobile remote control component.
  • FIG. 4 illustrates an embodiment of a logic flow.
  • FIG. 5 illustrates an embodiment of a computing architecture.
  • FIG. 6 illustrates an embodiment of an article.
  • Various embodiments include physical or logical structures arranged to perform certain operations, functions or services.
  • the structures may comprise physical structures, logical structures or a combination of both.
  • the physical or logical structures are implemented using hardware elements, software elements, or a combination of both. Descriptions of embodiments with reference to particular hardware or software elements, however, are meant as examples and not limitations. Decisions to use hardware or software elements to actually practice an embodiment depend on a number of external factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds, and other design or performance constraints.
  • the physical or logical structures may have corresponding physical or logical connections to communicate information between the structures in the form of electronic signals or messages.
  • connections may comprise wired and/or wireless connections as appropriate for the information or particular structure. It is worthy to note that any reference to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • Various embodiments may be generally directed to multimedia conference systems arranged to provide meeting and collaboration services to multiple participants over a network.
  • Some multimedia conference systems may be designed to operate with various packet-based networks, such as the Internet or World Wide Web (“web”), to provide web-based conferencing services. Such implementations are sometimes referred to as web conferencing systems.
  • An example of a web conferencing system may include MICROSOFT® OFFICE LIVE MEETING made by Microsoft Corporation, Redmond, Wash.
  • Other multimedia conference systems may be designed to operate for a private network, business, organization, or enterprise, and may utilize a multimedia conference server such as MICROSOFT OFFICE COMMUNICATIONS SERVER made by Microsoft Corporation, Redmond, Wash. It may be appreciated, however, that implementations are not limited to these examples.
  • a multimedia conference system may include, among other network elements, a multimedia conference server or other processing device arranged to provide web conferencing services.
  • a multimedia conference server may include, among other server elements, a server meeting component operative to control and mix different types of media content for a meeting and collaboration event, such as a web conference.
  • a meeting and collaboration event may refer to any multimedia conference event offering various types of multimedia information in a real-time or live online environment, and is sometimes referred to herein as simply a “meeting event,” “multimedia event” or “multimedia conference event.”
  • the multimedia conference system may further include one or more computing devices implemented as meeting consoles.
  • Each meeting console may be arranged to participate in a multimedia event by connecting to the multimedia conference server. Different types of media information from the various meeting consoles may be received by the multimedia conference server during the multimedia event, which in turn distributes the media information to some or all of the other meeting consoles participating in the multimedia event.
  • any given meeting console may have a display with multiple media content views of different types of media content. In this manner various geographically disparate participants may interact and communicate information in a virtual meeting environment similar to a physical meeting environment where all the participants are within one room.
  • In a virtual meeting environment there is typically a conference leader or presenter controlling the various media resources for the multimedia conference system.
  • a conference leader may control presentation slides while delivering a speech.
  • the conference leader may not have convenient access to a meeting console to participate in the multimedia conference system.
  • the conference leader may have access to a meeting console but may be incapable of moving from the meeting console during a presentation. This may limit the effective delivery of media content for a meeting.
  • a conference leader may utilize a mobile remote control to direct, manage or otherwise control various media resources provided by a multimedia conference server.
  • the media resources provided by the multimedia conference server may be controlled via a user interface implemented by the mobile remote control.
  • an operator may use a manual input device, such as a keyboard or pointing device (e.g., stylus, mouse, trackball, touch pad, touch screen, etc.), to enter operator commands into the mobile remote control to control the media resources.
  • an operator may use an audio input device, such as a microphone, speaker and various audio user interface modules, to enter operator commands into the mobile remote control to control the media resources in a “hands-free” mode of operation.
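The hands-free mode described above can be sketched as a mapping from recognized utterances to conference control actions. The command vocabulary and tuple structure below are invented for illustration; a real implementation would sit behind a speech recognition system.

```python
# Minimal sketch of "hands-free" voice control: a recognized utterance is
# normalized and mapped to a (resource, action) control command.
# The vocabulary here is an illustrative assumption.

VOICE_COMMANDS = {
    "next slide": ("slide", "next"),
    "previous slide": ("slide", "previous"),
    "mute all": ("audio", "mute_all"),
    "focus video": ("focus", "video"),
}

def interpret_voice_command(utterance):
    """Map a recognized utterance to a (resource, action) control tuple."""
    key = utterance.strip().lower()
    if key not in VOICE_COMMANDS:
        return None  # unrecognized speech is ignored rather than guessed at
    return VOICE_COMMANDS[key]
```

Returning `None` for unrecognized speech keeps the remote from issuing spurious commands while the leader is simply talking to the audience.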
  • the mobile remote control may provide various advantages over conventional static or immobile meeting consoles.
  • One advantage is that the mobile remote control provides mobility to a conference leader.
  • a conference leader can move around a conference room or presentation hall when presenting to a larger audience.
  • the conference leader can use the mobile remote control to switch presentation slides or to bring certain media content into focus.
  • a conference leader can be performing other activities or tasks while leading a multimedia conference event, such as driving a vehicle.
  • the conference leader can use voice command support implemented by the mobile remote control to lead a multimedia conference event without necessarily using her hands.
  • Another advantage is that the mobile remote control provides portability to a conference leader.
  • the mobile remote control may utilize a form factor that is convenient for single hand use, such as the size of a cellular telephone, handheld computer, or smart phone.
  • the mobile remote control provides feedback or status information of a multimedia conference event for the conference leader.
  • the mobile remote control may dynamically render its user interfaces according to the particular media content in focus. In scenarios where there are multiple presenters, this feedback on the mobile remote control helps the conference leader to find out which particular media content is currently highlighted or in focus. The feedback also provides information about the attendees in the meeting and their status.
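The feedback behavior above can be sketched as a render function: the remote's user interface is rebuilt around whichever media content is currently in focus, and it also surfaces attendee status. The control names and layout fields are illustrative assumptions.

```python
# Sketch of dynamic UI rendering on the mobile remote control: the set of
# controls offered depends on the media content in focus, and attendee
# status is included as feedback for the conference leader.

def render_remote_ui(content_in_focus, attendees):
    controls_by_focus = {
        "slides": ["next", "previous", "jump_to"],
        "video": ["spotlight", "mute_camera"],
        "chat": ["scroll", "reply"],
    }
    return {
        "focused": content_in_focus,
        "controls": controls_by_focus.get(content_in_focus, []),
        "attendees": [{"name": n, "status": s} for n, s in attendees],
    }

ui = render_remote_ui("slides", [("Alice", "presenting"), ("Bob", "muted")])
```

In a multi-presenter scenario, re-running this render whenever the server reports a focus change is what lets the leader see at a glance which content is currently highlighted.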
  • FIG. 1 illustrates a block diagram for a multimedia conference system 100 .
  • Multimedia conference system 100 may represent a general system architecture suitable for implementing various embodiments.
  • Multimedia conference system 100 may comprise multiple elements.
  • An element may comprise any physical or logical structure arranged to perform certain operations.
  • Each element may be implemented as hardware, software, or any combination thereof, as desired for a given set of design parameters or performance constraints.
  • Examples of hardware elements may include devices, components, processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth.
  • Examples of software may include any software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, interfaces, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof.
  • although multimedia conference system 100 as shown in FIG. 1 has a limited number of elements in a certain topology, it may be appreciated that multimedia conference system 100 may include more or fewer elements in alternate topologies as desired for a given implementation. The embodiments are not limited in this context.
  • As used herein, the terms “system,” “subsystem,” “component,” and “module” are intended to refer to a computer-related entity, comprising either hardware, a combination of hardware and software, software, or software in execution.
  • a component can be implemented as a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer.
  • both an application running on a server and the server itself can be a component.
  • One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers as desired for a given implementation. The embodiments are not limited in this context.
  • the multimedia conference system 100 may comprise, or form part of, a wired communications system, a wireless communications system, or a combination of both.
  • the multimedia conference system 100 may include one or more elements arranged to communicate information over one or more types of wired communications links.
  • Examples of a wired communications link may include, without limitation, a wire, cable, bus, printed circuit board (PCB), Ethernet connection, peer-to-peer (P2P) connection, backplane, switch fabric, semiconductor material, twisted-pair wire, co-axial cable, fiber optic connection, and so forth.
  • the multimedia conference system 100 also may include one or more elements arranged to communicate information over one or more types of wireless communications links.
  • Examples of a wireless communications link may include, without limitation, a radio channel, infrared channel, radio-frequency (RF) channel, Wireless Fidelity (WiFi) channel, a portion of the RF spectrum, and/or one or more licensed or license-free frequency bands.
  • the multimedia conference system 100 may be arranged to communicate, manage or process different types of information, such as media information and control information.
  • media information may generally include any data representing content meant for a user, such as voice information, video information, audio information, image information, textual information, numerical information, application information, alphanumeric symbols, graphics, and so forth.
  • Voice information may comprise a subset of audio information, and is broadly meant to include any information communicated by a human being, such as words, speech, speech utterances, sounds, vocal noise, and so forth.
  • Control information may refer to any data representing commands, instructions or control words meant for an automated system. For example, control information may be used to route media information through a system, to establish a connection between devices, instruct a device to process the media information in a predetermined manner, and so forth.
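The media/control distinction above can be sketched as two message envelopes and a router: media information carries content meant for a user, while control information carries commands for the automated system (and may itself direct how media is routed). Field names are assumptions for illustration.

```python
# Sketch of the two information classes defined above and a trivial router
# that dispatches each message class to its own sink.

def make_media_message(kind, payload):
    # Media information: content meant for a user (voice, video, text, ...).
    return {"class": "media", "kind": kind, "payload": payload}

def make_control_message(command, target):
    # Control information: commands/instructions meant for the system.
    return {"class": "control", "command": command, "target": target}

def route(message, media_sink, control_sink):
    sink = media_sink if message["class"] == "media" else control_sink
    sink.append(message)
    return sink

media_q, control_q = [], []
route(make_media_message("audio", b"pcm-bytes"), media_q, control_q)
route(make_control_message("establish_connection", "console-2"), media_q, control_q)
```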
  • multimedia conference system 100 may include a multimedia conference server 130 .
  • the multimedia conference server 130 may comprise any logical or physical entity that is arranged to establish, manage or control a multimedia conference call between meeting consoles 110 - 1 - m over a network 120 .
  • Network 120 may comprise, for example, a packet-switched network, a circuit-switched network, or a combination of both.
  • the multimedia conference server 130 may comprise or be implemented as any processing or computing device, such as a computer, a server, a server array or server farm, a work station, a mini-computer, a main frame computer, a supercomputer, and so forth.
  • the multimedia conference server 130 may comprise or implement a general or specific computing architecture suitable for communicating and processing multimedia information.
  • the multimedia conference server 130 may be implemented using a computing architecture as described with reference to FIG. 5 .
  • Examples for the multimedia conference server 130 may include without limitation a MICROSOFT OFFICE COMMUNICATIONS SERVER, a MICROSOFT OFFICE LIVE MEETING server, and so forth.
  • a specific implementation for the multimedia conference server 130 may vary depending upon a set of communication protocols or standards to be used for the multimedia conference server 130 .
  • the multimedia conference server 130 may be implemented in accordance with the Internet Engineering Task Force (IETF) Multiparty Multimedia Session Control (MMUSIC) Working Group Session Initiation Protocol (SIP) series of standards and/or variants.
  • SIP is a proposed standard for initiating, modifying, and terminating an interactive user session that involves multimedia elements such as video, voice, instant messaging, online games, and virtual reality.
  • the multimedia conference server 130 may be implemented in accordance with the International Telecommunication Union (ITU) H.323 series of standards and/or variants.
  • the H.323 standard defines a multipoint control unit (MCU) to coordinate conference call operations.
  • the MCU includes a multipoint controller (MC) that handles H.245 signaling, and one or more multipoint processors (MP) to mix and process the data streams.
  • Both the SIP and H.323 standards are essentially signaling protocols for Voice over Internet Protocol (VoIP) or Voice Over Packet (VOP) multimedia conference call operations. It may be appreciated that other signaling protocols may be implemented for the multimedia conference server 130 , however, and still fall within the scope of the embodiments.
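Since SIP is a text-based signaling protocol, the kind of session-initiation request a SIP-based conference server would process can be sketched as plain text. The addresses, branch parameter, and Call-ID below are placeholders, and the sketch omits the SDP body a real INVITE would typically carry.

```python
# Sketch of a minimal SIP INVITE request (RFC 3261 style), as might be
# used to initiate a session with a conference server. All endpoint
# values are placeholders; no SDP body is included (Content-Length: 0).

def build_sip_invite(caller, callee, call_id, cseq=1):
    lines = [
        f"INVITE sip:{callee} SIP/2.0",
        "Via: SIP/2.0/UDP host.example.com;branch=z9hG4bK776asdhds",
        "Max-Forwards: 70",
        f"To: <sip:{callee}>",
        f"From: <sip:{caller}>;tag=1928301774",
        f"Call-ID: {call_id}",
        f"CSeq: {cseq} INVITE",
        "Content-Length: 0",
    ]
    # SIP uses CRLF line endings and a blank line before the (empty) body.
    return "\r\n".join(lines) + "\r\n\r\n"

invite = build_sip_invite("leader@example.com", "conf@example.com", "a84b4c76e66710")
```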
  • multimedia conference system 100 may be used for multimedia conference calls.
  • Multimedia conference calls typically involve communicating voice, video, and/or data information between multiple end points.
  • a public or private packet network 120 may be used for audio conferencing calls, video conferencing calls, audio/video conferencing calls, collaborative document sharing and editing, and so forth.
  • the packet network 120 may also be connected to a Public Switched Telephone Network (PSTN) via one or more suitable VoIP gateways arranged to convert between circuit-switched information and packet information.
  • each meeting console 110 - 1 - m may connect to multimedia conference server 130 via the packet network 120 using various types of wired or wireless communications links operating at varying connection speeds or bandwidths, such as a lower bandwidth PSTN telephone connection, a medium bandwidth DSL modem connection or cable modem connection, and a higher bandwidth intranet connection over a local area network (LAN), for example.
  • the multimedia conference server 130 may establish, manage and control a multimedia conference call between meeting consoles 110 - 1 - m .
  • the multimedia conference call may comprise a live web-based conference call using a web conferencing application that provides full collaboration capabilities.
  • the multimedia conference server 130 operates as a central server that controls and distributes media information in the conference. It receives media information from various meeting consoles 110 - 1 - m , performs mixing operations for the multiple types of media information, and forwards the media information to some or all of the other participants.
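The central-server role described above (receive streams from each console, mix per media type, forward to the other participants) can be sketched as follows. The "mixing" here is a trivial grouping stand-in for real audio/video mixing, and the data shapes are illustrative.

```python
# Sketch of the server's receive/mix/forward loop: each console receives
# a per-media-type mix of every *other* console's streams.

def mix_and_forward(streams_by_console):
    """streams_by_console: {console_id: {media_type: payload}}"""
    forwarded = {}
    for receiver in streams_by_console:
        mix = {}
        for sender, streams in streams_by_console.items():
            if sender == receiver:
                continue  # a console does not receive its own stream back
            for media_type, payload in streams.items():
                mix.setdefault(media_type, []).append(payload)
        forwarded[receiver] = mix
    return forwarded

out = mix_and_forward({
    "console-1": {"audio": "a1", "video": "v1"},
    "console-2": {"audio": "a2"},
    "console-3": {"audio": "a3"},
})
```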
  • One or more of the meeting consoles 110 - 1 - m may join a conference by connecting to the multimedia conference server 130 .
  • the multimedia conference server 130 may implement various admission control techniques to authenticate and add meeting consoles 110 - 1 - m in a secure and controlled manner.
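One way to sketch such an admission control technique is a credential check before a console is added: the console presents a token the server can verify. The shared-secret HMAC scheme below is an illustrative assumption, not the mechanism the patent specifies.

```python
# Hypothetical admission-control sketch: consoles are admitted only if
# they present an HMAC token derived from a server-side secret.

import hashlib
import hmac

SERVER_SECRET = b"conference-secret"  # placeholder secret for illustration

def issue_token(console_id):
    return hmac.new(SERVER_SECRET, console_id.encode(), hashlib.sha256).hexdigest()

def admit(console_id, token, admitted):
    expected = issue_token(console_id)
    # compare_digest avoids timing side channels in the comparison.
    if hmac.compare_digest(expected, token):
        admitted.add(console_id)
        return True
    return False

admitted = set()
ok = admit("console-1", issue_token("console-1"), admitted)
bad = admit("console-2", "forged-token", admitted)
```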
  • the multimedia conference system 100 may include one or more computing devices implemented as meeting consoles 110 - 1 - m to connect to the multimedia conference server 130 over one or more communications connections via the network 120 .
  • a computing device may implement a client application that may host multiple meeting consoles each representing a separate conference at the same time.
  • the client application may receive multiple audio, video and data streams. For example, video streams from all or a subset of the participants may be displayed as a mosaic on the participant's display with a top window with video for the current active speaker, and a panoramic view of the other participants in other windows.
  • the meeting consoles 110 - 1 - m may comprise any logical or physical entity that is arranged to participate or engage in a multimedia conference call managed by the multimedia conference server 130 .
  • the meeting consoles 110 - 1 - m may be implemented as any device that includes, in its most basic form, a processing system including a processor and memory, one or more multimedia input/output (I/O) components, and a wireless and/or wired network connection.
  • multimedia I/O components may include audio I/O components (e.g., microphones, speakers), video I/O components (e.g., video camera, display), tactile (I/O) components (e.g., vibrators), user data (I/O) components (e.g., keyboard, thumb board, keypad, touch screen), and so forth.
  • Examples of the meeting consoles 110 - 1 - m may include a telephone, a VoIP or VOP telephone, a packet telephone designed to operate on the PSTN, an Internet telephone, a video telephone, a cellular telephone, a personal digital assistant (PDA), a combination cellular telephone and PDA, a mobile computing device, a smart phone, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a handheld computer, a network appliance, and so forth.
  • the meeting consoles 110 - 1 - m may be implemented using a general or specific computing architecture similar to the computing architecture described with reference to FIG. 5 .
  • the meeting consoles 110 - 1 - m may comprise or implement respective client meeting components 112 - 1 - n .
  • the client meeting components 112 - 1 - n may be designed to interoperate with the server meeting component 132 of the multimedia conference server 130 to establish, manage or control a multimedia conference event.
  • the client meeting components 112 - 1 - n may comprise or implement the appropriate application programs and user interface controls to allow the respective meeting consoles 110 - 1 - m to participate in a web conference facilitated by the multimedia conference server 130 .
  • This may include input equipment (e.g., video camera, microphone, keyboard, mouse, controller, etc.) to capture media information provided by the operator of a meeting console 110 - 1 - m , and output equipment (e.g., display, speaker, etc.) to reproduce media information by the operators of other meeting consoles 110 - 1 - m .
  • client meeting components 112 - 1 - n may include without limitation a MICROSOFT OFFICE COMMUNICATOR or the MICROSOFT OFFICE LIVE MEETING Windows Based Meeting Console, and so forth.
  • a representative meeting console 110 - 1 may be connected to various multimedia input devices and/or multimedia output devices capable of capturing, communicating or reproducing multimedia information.
  • the multimedia input devices may comprise any logical or physical device arranged to capture or receive as input multimedia information from one or more operators or participants of the meeting console 110 - 1 , including audio input devices, video input devices, image input devices, text input devices, and other multimedia input equipment.
  • Examples of multimedia input devices may include without limitation video cameras, microphones, microphone arrays, conference telephones, whiteboards, interactive whiteboards, voice-to-text components, text-to-voice components, voice recognition systems, pointing devices, keyboards, touchscreens, tablet computers, handwriting recognition devices, and so forth.
  • An example of a video camera may include a ringcam, such as the MICROSOFT ROUNDTABLE made by Microsoft Corporation, Redmond, Wash.
  • the MICROSOFT ROUNDTABLE is a videoconferencing device with a 360 degree camera that provides remote meeting participants a panoramic video of everyone sitting around a conference table.
• the multimedia output devices may comprise any logical or physical device arranged to reproduce or display as output multimedia information to one or more operators or participants of the meeting console 110 - 1 , including audio output devices, video output devices, image output devices, text output devices, and other multimedia output equipment. Examples of multimedia output devices may include without limitation electronic displays, video projectors, speakers, vibrating units, printers, facsimile machines, and so forth.
  • the representative meeting console 110 - 1 may include various multimedia input devices arranged to capture media content from one or more participants 154 - 1 - p , and stream the media content to the multimedia conference server 130 .
  • the meeting console 110 - 1 includes various types of multimedia input equipment, such as a video camera 106 and an array of microphones 104 - 1 - r .
  • the video camera 106 may capture video content including video content of one or more participants 154 - 1 - p within video capture range of the video camera 106 , and stream the video content to the multimedia conference server 130 via the meeting console 110 - 1 .
  • audio input devices such as the array of microphones 104 - 1 - r may capture audio content including audio content from one or more participants 154 - 1 - p within audio capture range of the microphones 104 - 1 - r , and stream the audio content to the multimedia conference server 130 via the meeting console 110 - 1 .
  • the meeting console may also include various multimedia output devices, such as one or more speakers 108 - 1 - s and an electronic display. Audio output devices such as the one or more speakers 108 - 1 - s may reproduce audio content for the participants 154 - 1 - p .
  • Video output devices such as the electronic display may be used to reproduce video content from other participants using remote meeting consoles 110 - 2 - m received via the multimedia conference server 130 .
  • the meeting consoles 110 - 1 - m and the multimedia conference server 130 may communicate media information and control information utilizing various media connections established for a given multimedia conference event.
  • the meeting consoles 110 - 1 - m may each comprise a respective communications component 116 - 1 - v .
  • the communications components 116 - 1 - v may comprise various communications resources suitable for establishing the various media connections. Examples of the communications resources may include transmitters, receivers, transceivers, radios, network interfaces, network interface cards, processors, memory, media access control (MAC) layer parts, physical (PHY) layer parts, connectors, communications media, communications interfaces, and so forth.
  • the communications components 116 - 1 - v may establish media connections in general, and audio connections in particular, using various VoIP signaling protocols, such as the SIP series of protocols.
• the SIP series of protocols are application-layer control (signaling) protocols for creating, modifying and terminating sessions with one or more participants. These sessions include Internet multimedia conferences, Internet telephone calls and multimedia distribution. Members in a session can communicate via multicast or via a mesh of unicast relations, or a combination of these.
• SIP is designed as part of the overall IETF multimedia data and control architecture currently incorporating protocols such as the resource reservation protocol (RSVP) (IETF RFC 2205) for reserving network resources, the real-time transport protocol (RTP) (IETF RFC 1889) for transporting real-time data and providing Quality-of-Service (QOS) feedback, the real-time streaming protocol (RTSP) (IETF RFC 2326) for controlling delivery of streaming media, the session announcement protocol (SAP) for advertising multimedia sessions via multicast, the session description protocol (SDP) (IETF RFC 2327) for describing multimedia sessions, and others.
  • the meeting consoles 110 - 1 - m may use SIP as a signaling channel to setup the media connections, and RTP as a media channel to transport media information over the media connections.
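The split between a SIP signaling channel and an RTP media channel can be illustrated with a short sketch that assembles a minimal SIP INVITE whose SDP body advertises an RTP audio stream. The addresses, tags, ports and Call-ID shown are illustrative placeholders only, not values specified by the embodiments.

```python
# Hypothetical sketch: a minimal SIP INVITE carrying an SDP body that
# describes the RTP audio stream for a media connection. All header
# values (addresses, tags, ports) are illustrative placeholders.

def build_sip_invite(caller: str, callee: str, rtp_port: int) -> str:
    """Assemble a minimal SIP INVITE whose SDP body advertises an RTP audio channel."""
    sdp_body = "\r\n".join([
        "v=0",                              # SDP protocol version
        "o=- 0 0 IN IP4 10.0.0.1",          # origin (placeholder address)
        "s=Multimedia Conference Event",    # session name
        "c=IN IP4 10.0.0.1",                # connection address (placeholder)
        "t=0 0",                            # unbounded session time
        f"m=audio {rtp_port} RTP/AVP 0",    # media line: RTP audio, PCMU payload type
    ]) + "\r\n"
    headers = "\r\n".join([
        f"INVITE sip:{callee} SIP/2.0",
        f"From: <sip:{caller}>;tag=1234",
        f"To: <sip:{callee}>",
        "Call-ID: conf-event-001@10.0.0.1",
        "CSeq: 1 INVITE",
        "Content-Type: application/sdp",
        f"Content-Length: {len(sdp_body)}",
    ])
    return headers + "\r\n\r\n" + sdp_body

invite = build_sip_invite("leader@example.com", "conf@example.com", 49170)
```

Once the callee accepts the session described in the SDP body, the actual media information flows separately over RTP on the advertised port, while SIP is used only for session setup and teardown.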
  • the communications components 116 - 1 - v may establish media connections using various circuit-switched techniques.
  • the communications components 116 - 1 - v may establish a media connection using Pulse Code Modulation (PCM) signals over a circuit-switched network.
  • An example of a circuit-switched network may include the Public Switched Telephone Network (PSTN), a private network, and so forth.
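The PCM signals mentioned above can be sketched with a uniform 8-bit quantizer. This is a minimal illustration of the encoding idea only; real telephony codecs over the PSTN typically use companded (e.g., mu-law) rather than uniform quantization.

```python
# Hypothetical sketch: uniform 8-bit Pulse Code Modulation (PCM) of an
# analog waveform, as might be carried over a circuit-switched network.
import math

def pcm_encode(samples, bits=8):
    """Quantize samples in [-1.0, 1.0] to unsigned integer PCM codes."""
    levels = 2 ** bits
    codes = []
    for s in samples:
        s = max(-1.0, min(1.0, s))                           # clip to the nominal range
        codes.append(round((s + 1.0) / 2.0 * (levels - 1)))  # map to 0..levels-1
    return codes

# One cycle of a 1 kHz tone sampled at the telephony rate of 8 kHz.
tone = [math.sin(2 * math.pi * 1000 * n / 8000) for n in range(8)]
codes = pcm_encode(tone)
```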
• a scheduling device 108 may be used to generate a multimedia conference event reservation for the multimedia conference system 100 .
  • the scheduling device 108 may comprise, for example, a computing device having the appropriate hardware and software for scheduling multimedia conference events.
  • the scheduling device 108 may comprise a computer utilizing MICROSOFT OFFICE OUTLOOK® application software, made by Microsoft Corporation, Redmond, Wash.
  • the MICROSOFT OFFICE OUTLOOK application software comprises messaging and collaboration client software that may be used to schedule a multimedia conference event.
  • An operator may use MICROSOFT OFFICE OUTLOOK to convert a schedule request to a MICROSOFT OFFICE LIVE MEETING event that is sent to a list of meeting invitees.
  • the schedule request may include a hyperlink to a virtual room for a multimedia conference event.
  • An invitee may click on the hyperlink, and the meeting console 110 - 1 - m launches a web browser, connects to the multimedia conference server 130 , and joins the virtual room.
• the participants can present a slide presentation, annotate documents or brainstorm on the built-in whiteboard, among other tools.
• In a virtual meeting environment, there is typically a conference leader or presenter controlling the various media resources for the multimedia conference system.
  • a conference leader may control presentation slides while delivering a speech.
  • the conference leader may not have convenient access to a meeting console to participate in the multimedia conference system.
  • the conference leader may have access to a meeting console but may be incapable of moving from the meeting console during a presentation. This may limit the effective delivery of media content for a meeting.
  • a conference leader may utilize a mobile remote control 190 to direct, manage or otherwise control various media resources provided by the multimedia conference server 130 .
  • the mobile remote control 190 may establish a wireless connection 192 with the multimedia conference server 130 to participate in a multimedia conference event with one or more of the meeting consoles 110 - 1 - m .
  • an operator may assume the role of a conference leader and utilize the mobile remote control 190 to direct, manage or otherwise control various media resources and multimedia conference features provided by the multimedia conference server 130 . This may be accomplished using manual user interface controls or voice commands.
  • a conference leader may utilize the mobility and portability of the mobile remote control 190 to direct, manage or control media resources and features for a multimedia conference event with increased convenience to the conference leader.
  • the multimedia conference system 100 in general, and the mobile remote control 190 in particular, may be described in more detail with reference to FIG. 2 .
  • FIG. 2 illustrates a more detailed block diagram for the mobile remote control 190 .
  • the mobile remote control 190 may be implemented using any mobile device having computing capabilities, wireless communication capabilities, a form factor suitable for being held by a single average human hand, and a portable power supply such as one or more batteries. Examples for the mobile remote control 190 may include without limitation a mobile phone, a cellular telephone, a PDA, a combination cellular telephone and PDA, a handheld computer, a mobile computing device, a smart phone, digital media player, and so forth. In one embodiment, for example, the mobile remote control 190 may be implemented as a handheld computer such as a MICROSOFT ZUNE® made by Microsoft Corporation, Redmond, Wash. The embodiments, however, are not limited to this example. In some implementations, the mobile remote control 190 may be implemented using a general or specific computing architecture similar to the computing architecture described with reference to FIG. 5 .
  • the mobile remote control 190 includes the client meeting component 112 and the communications component 116 as described with reference to the meeting consoles 110 - 1 - m .
  • the mobile remote control 190 may also include a display component 202 , a power supply component 204 , and a mobile remote control component 206 .
  • the mobile remote control 190 may implement some or all of the functionality of the meeting consoles 110 - 1 - m in order to participate in a multimedia conference event.
  • the mobile remote control 190 may also be used to remotely manage the multimedia conference event via interoperation with the server meeting component 132 of the multimedia conference server 130 .
  • the display component 202 may include a digital electronic display for the presentation of information supplied as an electrical signal for visual or tactile reception.
  • digital electronic displays may include without limitation electronic paper, nixie tube displays, vacuum fluorescent displays, light-emitting diode displays, electroluminescent displays, plasma display panels, liquid crystal displays, thin-film transistor displays, organic light-emitting diode displays, surface-conduction electron-emitter displays, laser television displays, carbon nanotubes, nanocrystal displays, and so forth.
  • the display component 202 may be implemented as a touch screen display.
  • the power supply component 204 may include one or more portable rechargeable direct current (DC) batteries.
  • DC batteries may include without limitation a nickel-cadmium (NiCd) cell, a nickel metal hydride (NiMH) cell, a lithium-ion (Li-Ion) cell, and so forth.
  • the communications component 116 may be operative to establish the wireless connection 192 between the mobile remote control 190 and the multimedia conference server 130 hosting a multimedia conference event.
  • the wireless connection 192 may include without limitation a radio channel, infrared channel, RF channel, WiFi channel, a portion of the RF spectrum, and/or one or more licensed or license-free frequency bands.
  • the mobile remote control component 206 may be communicatively coupled to the communications component 116 .
  • the mobile remote control component 206 may implement control logic and user interfaces for remotely managing media resources for a multimedia conference event hosted by the multimedia conference server 130 . More particularly, the mobile remote control component 206 may be operative to manage a multimedia conference event from the mobile remote control 190 by communicating control information and media information with the multimedia conference server 130 for the multimedia conference event over the wireless connection 192 .
• the mobile remote control component 206 may implement remote control operations to direct, manage or otherwise control an online conferencing tool from a mobile device with voice command capabilities.
  • the mobile remote control component 206 may be used by a conference leader or presenter who is presenting for the multimedia conference event, such as a conference meeting.
  • the mobile remote control component 206 enables the operator to control basic and enhanced functionalities of the meeting from the mobile remote control 190 .
  • a presenter joins a multimedia conference event using the mobile remote control 190 .
  • the presenter will then have the ability to view some or all of the media content uploaded during the meeting on the mobile remote control 190 .
  • the presenter can then invoke media content such as a slide set and switch slides for the multimedia conference event from the mobile remote control 190 .
• the mobile remote control component 206 also supports other conference functions or features for the presenter such as muting an attendee, removing an attendee from a meeting, changing meeting status (e.g., mood), and so forth.
• the presenter can choose to use the on-screen user interface (or hardware buttons) or voice commands to trigger a command on the mobile remote control 190 .
• In addition to manual operator commands, the mobile remote control 190 also supports context related voice commands.
  • the mobile remote control component 206 automatically adapts to the media content currently viewed by the operator, and supports some or all of the available commands related to the various media content.
  • the mobile remote control component 206 also supports a context sensitive tool bar.
  • the mobile remote control component 206 also takes feedback from the online conferencing tool to keep the operator informed about the current context for a multimedia conference event.
  • the feedback may include a preview of the content currently in focus and the list of attendees currently in the meeting.
  • the control logic built into the mobile remote control component 206 takes care of updating a context related tool bar and a preview window or pane generated by the GUI implemented by the mobile remote control component 206 .
• the control logic also updates the voice commands shell, thereby informing the mobile remote control component 206 about what commands it can expect at that point in time. This may include the creation of grammar files at run time that serve as input to the voice command shell.
  • FIG. 3 illustrates a block diagram for the mobile remote control component 206 .
  • the mobile remote control component 206 may comprise multiple modules.
  • the mobile remote control component 206 includes a polling module 302 , a command generator module 304 , a remote user interface module 306 , a command parser 320 , a dynamic grammar generator module 322 , and a dynamic toolbar customizer module 324 .
  • Each of the modules may be implemented using hardware elements, software elements, or a combination of hardware elements and software elements.
• Although the mobile remote control component 206 as shown in FIG. 3 has a limited number of elements in a certain topology, it may be appreciated that the mobile remote control component 206 may include more or fewer elements in alternate topologies as desired for a given implementation. The embodiments are not limited in this context.
  • the polling module 302 may be arranged to interface with the multimedia conference server 130 .
  • the polling module 302 may be configured to continuously or periodically communicate control information and media information between the mobile remote control 190 and the server meeting component 132 of the multimedia conference server 130 .
  • the polling module 302 and the multimedia conference server 130 may follow a push model where control and media information are pushed to each other when the information is ready for transmission.
• Under the push model, however, the wireless connection 192 may need to remain in continuous service, which can be difficult to maintain when the multimedia conference server 130 is experiencing heavy traffic loads.
  • the push model has the advantage of reducing latency at the expense of increasing power consumption.
  • the polling module 302 may implement a pull model where control and media information are pulled from each other when the information is ready for reception. For example, the polling module 302 may periodically or on demand poll the multimedia conference server 130 to determine whether the server meeting component 132 has any information ready for transmission using a poll request. If so, the polling module 302 may pull the information from the multimedia conference server 130 .
  • the pull model has the advantage of decreased power consumption at the expense of increasing latency in data transmissions.
  • the polling period may be a configurable parameter, thereby allowing the operator to select the amount of latency that can be tolerated against the amount of battery life remaining for the mobile remote control 190 .
  • the polling module 302 may implement a hybrid push and pull model to realize some of the advantages of both models while reducing the corresponding disadvantages.
  • the polling module 302 may utilize a pull model to receive information from the multimedia conference server 130 to increase battery life, and a push model to send control information to the multimedia conference server 130 to reduce latency.
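The hybrid model described above can be sketched as follows: control information is pushed to the server immediately (reducing latency), while inbound updates are pulled on a configurable polling period (conserving battery). The server object, method names and message shapes are illustrative stand-ins, not part of the embodiments.

```python
# Hypothetical sketch of the hybrid push/pull model for the polling module.
# FakeConferenceServer is an illustrative stand-in for the multimedia
# conference server; its interface is assumed for this example only.

class PollingModule:
    def __init__(self, server, poll_period_secs=5.0):
        self.server = server
        self.poll_period_secs = poll_period_secs  # operator-tunable latency/battery trade-off
        self.inbox = []

    def push_control(self, directive):
        """Push model: send a control directive to the server right away."""
        self.server.receive(directive)

    def poll_once(self):
        """Pull model: ask the server whether it has information ready, and drain it."""
        if self.server.has_pending():
            self.inbox.extend(self.server.drain())

class FakeConferenceServer:
    """Minimal stand-in for the multimedia conference server."""
    def __init__(self):
        self.received = []
        self.outbound = []
    def receive(self, directive):
        self.received.append(directive)
    def has_pending(self):
        return bool(self.outbound)
    def drain(self):
        pending, self.outbound = self.outbound, []
        return pending

server = FakeConferenceServer()
remote = PollingModule(server, poll_period_secs=2.0)
remote.push_control({"command": "next_slide"})                    # pushed immediately
server.outbound.append({"event": "attendee_joined", "name": "Alice"})
remote.poll_once()                                                # pulled on the polling period
```

In a real client, `poll_once` would be driven by a timer set to `poll_period_secs`, which corresponds to the configurable polling period described above.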
  • the command generator module 304 may be communicatively coupled to the polling module 302 .
  • the command generator module 304 may be arranged to generate control directives for the multimedia conference server 130 from operator input commands or operator voice commands.
  • the command generator module 304 may map various operator input command or operator voice commands received by the remote user interface module 306 to a set of API commands used by the server meeting component 132 of the multimedia conference server 130 .
  • the mobile remote control 190 may be interoperable with different types of multimedia conferencing applications implemented by the multimedia conference server 130 .
  • the remote user interface module 306 may be trained to accept various types of operator input commands or voice commands to control the media resources or various multimedia conference controls provided by the server meeting component 132 .
  • a conference leader or presenter could train the remote user interface module 306 to accept multiple operator voice commands such as “next slide” or “slide,” and the command generator module 304 may map the multiple operator voice commands to the appropriate API to control the presentation slides shown in the virtual meeting environment.
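The many-phrases-to-one-API mapping just described can be sketched with a simple lookup table. The phrase strings and API command names below are illustrative assumptions; the embodiments do not specify a particular API naming scheme.

```python
# Hypothetical sketch: the command generator module mapping several trained
# operator voice commands to a single server-side API command. All phrase
# and API names here are illustrative placeholders.

VOICE_COMMAND_MAP = {
    "next slide": "Slides.Next",
    "slide": "Slides.Next",          # an alias trained by the presenter
    "previous slide": "Slides.Previous",
    "mute attendee": "Attendees.Mute",
}

def generate_control_directive(voice_command):
    """Translate a recognized phrase into an API control directive, or None."""
    api_command = VOICE_COMMAND_MAP.get(voice_command.strip().lower())
    if api_command is None:
        return None                  # unrecognized phrase; no directive generated
    return {"api": api_command}
```

Training additional aliases then amounts to adding entries that point at an existing API command, so "next slide" and "slide" both drive the same slide control.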
  • the remote user interface module 306 may be communicatively coupled to the command generator module 304 .
  • the remote user interface module 306 may be generally arranged to provide various visual or tactile GUI views for an operator, and accept various forms of operator commands and other input from an operator.
  • the remote user interface module 306 may comprise a GUI module 308 .
  • the GUI module 308 may be arranged to receive operator input commands and display media content from the multimedia conference server 130 .
  • the GUI module 308 may generate and display various on screen input controls to control media resources for the multimedia conference event.
  • the screen input controls may be designed to control various audio resources, video resources and other media resources, such as managing live and recorded video, chat, slide and application sharing, Voice over Internet Protocol (VoIP) and PSTN audio, and audience feedback tools.
  • VoIP Voice over Internet Protocol
  • the GUI module 308 may also display various toolbars to allow an operator to select various features and options for the mobile remote control 190 , such as provided by the client meeting module 112 , the server meeting module 132 , the mobile remote control component 206 , or an operating system (OS) program.
  • the various toolbars may be dynamic and change according to a current context for the multimedia conference event.
  • the GUI module 308 may also generate a GUI view to display media content received from the multimedia conference server 130 in a preview pane.
  • the GUI module 308 may receive feedback information from the multimedia conference server 130 to keep the operator informed about the current context for a multimedia conference event.
  • the feedback may include a preview of the media content currently in focus and the list of attendees currently in the meeting.
  • the preview pane may be part of a GUI view displayed on a touch screen display implemented as the display component 202 .
  • the remote user interface module 306 may further comprise an audio interface module 310 .
  • the audio interface module 310 may be arranged to receive operator voice commands, and reproduce narrations for media content received from the multimedia conference server.
  • the mobile remote control 190 may use the audio interface module 310 to support context related voice commands.
  • the audio interface module 310 automatically adapts to the media content currently viewed by the operator, and supports some or all of the available commands related to the various media content.
  • the audio interface module 310 creates and manages a voice commands shell, and periodically updates the voice commands shell with customized operator commands.
  • the audio interface module 310 may audibly reproduce control information or media information for an operator. This may be convenient, for example, if the operator is working in a hands-free environment, such as when driving. The operator could control the media resources for the multimedia conference event, and receive feedback from the multimedia conference event, using a completely audio-based interface thereby reducing or removing the need for the user to manually operate the mobile remote control 190 .
  • the command parser module 320 may be communicatively coupled to the polling module 302 .
• the command parser module 320 may be arranged to parse control directives received from the multimedia conference server 130 for the mobile remote control 190 .
  • the multimedia conference server 130 may periodically send control directives to the mobile remote control 190 as well.
• the server meeting component 132 of the multimedia conference server 130 may periodically send configuration information, diagnostic information, connection repair information or capabilities information to the mobile remote control 190 to enhance operations for the mobile remote control 190 , or interoperability with the server meeting component 132 .
  • the server meeting component 132 may also send updated API commands, or different command tools based on the currently displayed media content for a multimedia conference event.
  • the command parser module 320 may receive the control directives, parse the control directives and output the parsed control directives to the remote user interface module 306 for visual or audio reproduction of the control directives to an operator.
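A command parser of this kind can be sketched as below. The wire format shown (`TYPE|key=value;key=value`) is an assumption made purely for illustration; the embodiments do not define how control directives are encoded on the wireless connection.

```python
# Hypothetical sketch of the command parser module. The directive wire
# format used here is an illustrative assumption, not specified by the
# embodiments.

def parse_control_directive(raw):
    """Split a raw directive into its type and a parameter dictionary."""
    directive_type, _, param_text = raw.partition("|")
    params = {}
    for pair in filter(None, param_text.split(";")):
        key, _, value = pair.partition("=")
        params[key] = value
    return {"type": directive_type, "params": params}

parsed = parse_control_directive("CONFIG|poll_period=5;preview=on")
```

The parsed structure would then be handed to the remote user interface module for visual or audio reproduction, as described above.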
  • the dynamic grammar generator module 322 may be communicatively coupled to the command parser module 320 , as well as the GUI module 308 and the audio interface module 310 .
  • the dynamic grammar generator module 322 may be arranged to generate context related voice commands.
  • the dynamic grammar generator module 322 may create or generate grammar files at run time.
  • the grammar files may be used as input for the voice command shell used by the audio interface module 310 .
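Run-time grammar generation can be sketched by emitting a small grammar whose single rule accepts only the commands valid in the current context. The JSGF-like syntax is an illustrative choice; the embodiments do not prescribe a grammar format.

```python
# Hypothetical sketch: the dynamic grammar generator module building a
# run-time grammar file for the voice command shell from the commands
# valid in the current context. The grammar syntax is illustrative.

def generate_grammar(context, commands):
    """Emit a small grammar whose single public rule accepts the context's commands."""
    alternatives = " | ".join(commands)
    return (
        f"grammar {context};\n"
        f"public <command> = {alternatives};\n"
    )

# When the slide deck comes into focus, only slide-related phrases are valid.
slide_grammar = generate_grammar("slides", ["next slide", "previous slide", "first slide"])
```

Regenerating the grammar whenever the focused media content changes is what lets the voice command shell reject phrases that make no sense in the current context.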
  • the dynamic toolbar customizer module 324 may also be communicatively coupled to the command parser module 320 , as well as the GUI module 308 and the audio interface module 310 .
  • the dynamic toolbar customizer module 324 may be arranged to create and generate a context sensitive tool bar for the GUI module 308 .
  • the context sensitive tool bar may display various options or features based on a current context for the multimedia conference event. For example, the context sensitive tool bar may display a first set of options or features when a presentation slide deck is in focus and currently active, a second set of options or features when a chat window is in focus and currently active, a third set of options or features when a streaming video window is in focus and currently active, and so forth.
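The context-to-options mapping in the example above can be sketched directly. The context keys and option labels are illustrative placeholders; the embodiments do not enumerate specific tool bar contents.

```python
# Hypothetical sketch of the dynamic toolbar customizer module: each focus
# context maps to the set of tool bar options shown, mirroring the
# slide/chat/video example above. All option names are illustrative.

TOOLBAR_BY_CONTEXT = {
    "slides": ["Next Slide", "Previous Slide", "Annotate"],
    "chat": ["Send Message", "Clear", "Save Transcript"],
    "video": ["Pause", "Resume", "Full Screen"],
}

def toolbar_for(context):
    """Return the tool bar options for the media content currently in focus."""
    return TOOLBAR_BY_CONTEXT.get(context, ["Help"])  # fall back to a default bar
```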
• Operations for the above embodiments may be further described with reference to one or more logic flows. It may be appreciated that the representative logic flows do not necessarily have to be executed in the order presented, or in any particular order, unless otherwise indicated. Moreover, various activities described with respect to the logic flows can be executed in serial or parallel fashion.
  • the logic flows may be implemented using one or more hardware elements and/or software elements of the described embodiments or alternative elements as desired for a given set of design and performance constraints.
  • the logic flows may be implemented as logic (e.g., computer program instructions) for execution by a logic device (e.g., a general-purpose or specific-purpose computer).
  • FIG. 4 illustrates one embodiment of a logic flow 400 .
  • Logic flow 400 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 400 may establish a wireless connection between a mobile remote control and a multimedia conference server hosting a multimedia conference event at block 402 .
  • the communications component 116 of the mobile remote control 190 may establish the wireless connection 192 between the mobile remote control 190 and the multimedia conference server 130 hosting a multimedia conference event.
  • the wireless connection 192 may comprise, for example, a wireless local area network (WLAN) connection (e.g., IEEE 802.11, 802.16, 802.20, and variants) or a cellular network connection (e.g., using a Code Division Multiple Access system, Global System for Mobile Communications system, Time Division Multiple Access system, Universal Mobile Telephone System, and so forth).
  • the logic flow 400 may manage the multimedia conference event from the mobile remote control by communicating control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection at block 404 .
  • the mobile remote control component 206 may manage the multimedia conference event from the mobile remote control 190 by communicating control information and media information with the multimedia conference server 130 for the multimedia conference event over the wireless connection 192 .
  • the mobile remote control component 206 may manage various features, options or functionality provided by the server meeting component 132 of the multimedia conference server 130 , and receive multimedia conference event feedback information from the server meeting component 132 .
  • the mobile remote control 190 may also participate in the multimedia conference event and communicate multimedia conference event information with the server meeting component 132 in a manner similar to the meeting consoles 110 - 1 - m utilizing the client meeting component 112 .
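Logic flow 400 can be sketched end to end: establish the wireless connection (block 402), then manage the event by communicating control information over it (block 404). The classes, method names and message shapes below are illustrative stand-ins only.

```python
# Hypothetical sketch of logic flow 400. WirelessConnection stands in for
# the communications component's connection 192; the message format is an
# illustrative assumption.

class WirelessConnection:
    def __init__(self, server_address):
        self.server_address = server_address
        self.established = False
        self.sent = []

    def establish(self):
        """Block 402: establish the wireless connection to the conference server."""
        self.established = True
        return self

    def send(self, message):
        self.sent.append(message)

def manage_event(connection, control_messages):
    """Block 404: communicate control information over the established connection."""
    if not connection.established:
        raise RuntimeError("connection must be established first")
    for message in control_messages:
        connection.send({"control": message})
    return len(connection.sent)

link = WirelessConnection("conference.example.com").establish()
count = manage_event(link, ["join", "next_slide", "mute_attendee"])
```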
  • FIG. 5 further illustrates a more detailed block diagram of computing architecture 510 suitable for implementing the mobile remote control 190 , meeting consoles 110 - 1 - m and/or the multimedia conference server 130 .
  • computing architecture 510 typically includes at least one processing unit 532 and memory 534 .
  • Memory 534 may be implemented using any machine-readable or computer-readable media capable of storing data, including both volatile and non-volatile memory.
• memory 534 may include read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information.
  • memory 534 may store various software programs, such as one or more application programs 536 - 1 - t and accompanying data.
  • application programs 536 - 1 - t may include server meeting component 132 , client meeting components 112 - 1 - n , or the mobile remote control component 206 .
  • Computing architecture 510 may also have additional features and/or functionality beyond its basic configuration.
  • computing architecture 510 may include removable storage 538 and non-removable storage 540 , which may also comprise various types of machine-readable or computer-readable media as previously described.
  • Computing architecture 510 may also have one or more input devices 544 such as a keyboard, mouse, pen, voice input device, touch input device, measurement devices, sensors, and so forth.
  • Computing architecture 510 may also include one or more output devices 542 , such as displays, speakers, printers, and so forth.
  • Computing architecture 510 may further include one or more communications connections 546 that allow computing architecture 510 to communicate with other devices.
  • Communications connections 546 may be representative of, for example, the communications interfaces for the communications components 116 - 1 - v .
  • Communications connections 546 may include various types of standard communication elements, such as one or more communications interfaces, network interfaces, network interface cards (NIC), radios, wireless transmitters/receivers (transceivers), wired and/or wireless communication media, physical connectors, and so forth.
  • Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media includes wired communications media and wireless communications media.
  • wired communications media may include a wire, cable, metal leads, printed circuit boards (PCB), backplanes, switch fabrics, semiconductor material, twisted-pair wire, co-axial cable, fiber optics, a propagated signal, and so forth.
  • wireless communications media may include acoustic, radio-frequency (RF) spectrum, infrared and other wireless media.
• FIG. 6 illustrates a diagram of an article of manufacture 600 suitable for storing logic for the various embodiments, including the logic flow 400 .
  • the article 600 may comprise a storage medium 602 to store logic 604 .
  • the storage medium 602 may include one or more types of computer-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
  • Examples of the logic 604 may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof.
  • The article 600 and/or the computer-readable storage medium 602 may store logic 604 comprising executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described embodiments.
  • The executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like.
  • The executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain function.
  • The instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language, such as C, C++, Java, BASIC, Perl, Matlab, Pascal, Visual BASIC, assembly language, and others.
  • Various embodiments may be implemented using hardware elements, software elements, or a combination of both.
  • Examples of hardware elements may include any of the examples as previously provided for a logic device, and may further include microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, logic gates, registers, semiconductor devices, chips, microchips, chip sets, and so forth.
  • Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
  • Some embodiments may be described using the terms “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.

Abstract

Techniques to remotely manage a multimedia conference event are described. An apparatus may comprise a mobile remote control having a communications component operative to establish a wireless connection between a mobile remote control and a multimedia conference server hosting a multimedia conference event. The mobile remote control may include a mobile remote control component communicatively coupled to the communications component, the mobile remote control component operative to manage the multimedia conference event from the mobile remote control by communicating control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection. Other embodiments are described and claimed.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of, and claims the benefit of and priority to, previously filed U.S. patent application Ser. No. 12/062,536 entitled “Techniques to Remotely Manage a Multimedia Conference Event” filed on Apr. 4, 2008, the subject matter of which is hereby incorporated by reference in its entirety.
  • BACKGROUND
  • A multimedia conference system typically allows multiple participants to communicate and share different types of media content in a collaborative and real-time meeting over a network. The multimedia conference system may display different types of media content using various graphical user interface (GUI) windows or views. For example, one GUI view might include video images of participants, another GUI view might include presentation slides, yet another GUI view might include text messages between participants, and so forth. In this manner various geographically disparate participants may interact and communicate information in a virtual meeting environment similar to a physical meeting environment where all the participants are within one room.
  • In a virtual meeting environment there is typically a conference leader or presenter controlling the various media resources for the multimedia conference system. For example, a conference leader may control presentation slides while delivering a speech. In some cases, however, it may be difficult for the conference leader to manage the various media resources provided by the multimedia conference system. This may limit the effective delivery of media content for a meeting. Techniques directed to improving management of media resources in a virtual meeting environment may therefore enhance user experience and convenience.
  • SUMMARY
  • Various embodiments may be generally directed to multimedia conference systems. Some embodiments may be particularly directed to techniques to manage media resources for a multimedia conference event. The multimedia conference event may include multiple participants, some of which may gather in a conference room, while others may participate in the multimedia conference event from remote locations. One of the participants may utilize a mobile remote control to manage various media resources for the multimedia conference event.
  • In one embodiment, for example, an apparatus may comprise a mobile remote control having a communications component operative to establish a wireless connection between a mobile remote control and a multimedia conference server hosting a multimedia conference event. The mobile remote control may include a mobile remote control component communicatively coupled to the communications component, the mobile remote control component operative to manage the multimedia conference event from the mobile remote control by communicating control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection. Other embodiments are described and claimed.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an embodiment of a multimedia conference system.
  • FIG. 2 illustrates an embodiment of a mobile remote control.
  • FIG. 3 illustrates an embodiment of a mobile remote control component.
  • FIG. 4 illustrates an embodiment of a logic flow.
  • FIG. 5 illustrates an embodiment of a computing architecture.
  • FIG. 6 illustrates an embodiment of an article.
  • DETAILED DESCRIPTION
  • Various embodiments include physical or logical structures arranged to perform certain operations, functions or services. The structures may comprise physical structures, logical structures or a combination of both. The physical or logical structures are implemented using hardware elements, software elements, or a combination of both. Descriptions of embodiments with reference to particular hardware or software elements, however, are meant as examples and not limitations. Decisions to use hardware or software elements to actually practice an embodiment depend on a number of external factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds, and other design or performance constraints. Furthermore, the physical or logical structures may have corresponding physical or logical connections to communicate information between the structures in the form of electronic signals or messages. The connections may comprise wired and/or wireless connections as appropriate for the information or particular structure. It is worthy to note that any reference to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • Various embodiments may be generally directed to multimedia conference systems arranged to provide meeting and collaboration services to multiple participants over a network. Some multimedia conference systems may be designed to operate with various packet-based networks, such as the Internet or World Wide Web (“web”), to provide web-based conferencing services. Such implementations are sometimes referred to as web conferencing systems. An example of a web conferencing system may include MICROSOFT® OFFICE LIVE MEETING made by Microsoft Corporation, Redmond, Wash. Other multimedia conference systems may be designed to operate for a private network, business, organization, or enterprise, and may utilize a multimedia conference server such as MICROSOFT OFFICE COMMUNICATIONS SERVER made by Microsoft Corporation, Redmond, Wash. It may be appreciated, however, that implementations are not limited to these examples.
  • A multimedia conference system may include, among other network elements, a multimedia conference server or other processing device arranged to provide web conferencing services. For example, a multimedia conference server may include, among other server elements, a server meeting component operative to control and mix different types of media content for a meeting and collaboration event, such as a web conference. A meeting and collaboration event may refer to any multimedia conference event offering various types of multimedia information in a real-time or live online environment, and is sometimes referred to herein as simply a “meeting event,” “multimedia event” or “multimedia conference event.”
  • In one embodiment, the multimedia conference system may further include one or more computing devices implemented as meeting consoles. Each meeting console may be arranged to participate in a multimedia event by connecting to the multimedia conference server. Different types of media information from the various meeting consoles may be received by the multimedia conference server during the multimedia event, which in turn distributes the media information to some or all of the other meeting consoles participating in the multimedia event. As such, any given meeting console may have a display with multiple media content views of different types of media content. In this manner various geographically disparate participants may interact and communicate information in a virtual meeting environment similar to a physical meeting environment where all the participants are within one room.
  • In a virtual meeting environment there is typically a conference leader or presenter controlling the various media resources for the multimedia conference system. For example, a conference leader may control presentation slides while delivering a speech. In some cases, however, it may be difficult for the conference leader to manage the various media resources provided by the multimedia conference system. For example, the conference leader may not have convenient access to a meeting console to participate in the multimedia conference system. In another example, the conference leader may have access to a meeting console but may be incapable of moving from the meeting console during a presentation. This may limit the effective delivery of media content for a meeting.
  • To solve these and other problems, various embodiments are directed to techniques to remotely manage a multimedia conference event. A conference leader may utilize a mobile remote control to direct, manage or otherwise control various media resources provided by a multimedia conference server. The media resources provided by the multimedia conference server may be controlled via a user interface implemented by the mobile remote control. For example, an operator may use a manual input device, such as a keyboard or pointing device (e.g., stylus, mouse, trackball, touch pad, touch screen, etc.), to enter operator commands into the mobile remote control to control the media resources. Additionally or alternatively, an operator may use an audio input device, such as a microphone, speaker and various audio user interface modules, to enter operator commands into the mobile remote control to control the media resources in a “hands-free” mode of operation.
  • The mobile remote control may provide various advantages over conventional static or immobile meeting consoles. One advantage is that the mobile remote control provides mobility to a conference leader. For example, a conference leader can move around a conference room or presentation hall when presenting to a larger audience. The conference leader can use the mobile remote control to switch presentation slides or to bring certain media content into focus. In another example, a conference leader can be performing other activities or tasks while leading a multimedia conference event, such as driving a vehicle. The conference leader can use voice command support implemented by the mobile remote control to lead a multimedia conference event without necessarily using her hands. Another advantage is that the mobile remote control provides portability to a conference leader. For example, the mobile remote control may utilize a form factor that is convenient for single hand use, such as the size of a cellular telephone, handheld computer, or smart phone. Yet another advantage is that the mobile remote control provides feedback or status information of a multimedia conference event for the conference leader. For example, the mobile remote control may dynamically render its user interfaces according to the particular media content in focus. In scenarios where there are multiple presenters, this feedback on the mobile remote control helps the conference leader to find out which particular media content is currently highlighted or in focus. The feedback also provides information about the attendees in the meeting and their status.
  • FIG. 1 illustrates a block diagram for a multimedia conference system 100. Multimedia conference system 100 may represent a general system architecture suitable for implementing various embodiments. Multimedia conference system 100 may comprise multiple elements. An element may comprise any physical or logical structure arranged to perform certain operations. Each element may be implemented as hardware, software, or any combination thereof, as desired for a given set of design parameters or performance constraints. Examples of hardware elements may include devices, components, processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor devices, chips, microchips, chip sets, and so forth. Examples of software may include any software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, interfaces, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Although multimedia conference system 100 as shown in FIG. 1 has a limited number of elements in a certain topology, it may be appreciated that multimedia conference system 100 may include more or fewer elements in alternate topologies as desired for a given implementation. The embodiments are not limited in this context.
  • As used herein the terms “system,” “subsystem,” “component,” and “module” are intended to refer to a computer-related entity, comprising either hardware, a combination of hardware and software, software, or software in execution. For example, a component can be implemented as a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers as desired for a given implementation. The embodiments are not limited in this context.
  • In various embodiments, the multimedia conference system 100 may comprise, or form part of, a wired communications system, a wireless communications system, or a combination of both. For example, the multimedia conference system 100 may include one or more elements arranged to communicate information over one or more types of wired communications links. Examples of a wired communications link may include, without limitation, a wire, cable, bus, printed circuit board (PCB), Ethernet connection, peer-to-peer (P2P) connection, backplane, switch fabric, semiconductor material, twisted-pair wire, co-axial cable, fiber optic connection, and so forth. The multimedia conference system 100 also may include one or more elements arranged to communicate information over one or more types of wireless communications links. Examples of a wireless communications link may include, without limitation, a radio channel, infrared channel, radio-frequency (RF) channel, Wireless Fidelity (WiFi) channel, a portion of the RF spectrum, and/or one or more licensed or license-free frequency bands.
  • In various embodiments, the multimedia conference system 100 may be arranged to communicate, manage or process different types of information, such as media information and control information. Examples of media information may generally include any data representing content meant for a user, such as voice information, video information, audio information, image information, textual information, numerical information, application information, alphanumeric symbols, graphics, and so forth. Voice information may comprise a subset of audio information, and is broadly meant to include any information communicated by a human being, such as words, speech, speech utterances, sounds, vocal noise, and so forth. Control information may refer to any data representing commands, instructions or control words meant for an automated system. For example, control information may be used to route media information through a system, to establish a connection between devices, instruct a device to process the media information in a predetermined manner, and so forth.
  • In various embodiments, multimedia conference system 100 may include a multimedia conference server 130. The multimedia conference server 130 may comprise any logical or physical entity that is arranged to establish, manage or control a multimedia conference call between meeting consoles 110-1-m over a network 120. Network 120 may comprise, for example, a packet-switched network, a circuit-switched network, or a combination of both. In various embodiments, the multimedia conference server 130 may comprise or be implemented as any processing or computing device, such as a computer, a server, a server array or server farm, a work station, a mini-computer, a main frame computer, a supercomputer, and so forth. The multimedia conference server 130 may comprise or implement a general or specific computing architecture suitable for communicating and processing multimedia information. In one embodiment, for example, the multimedia conference server 130 may be implemented using a computing architecture as described with reference to FIG. 5. Examples for the multimedia conference server 130 may include without limitation a MICROSOFT OFFICE COMMUNICATIONS SERVER, a MICROSOFT OFFICE LIVE MEETING server, and so forth.
  • A specific implementation for the multimedia conference server 130 may vary depending upon a set of communication protocols or standards to be used for the multimedia conference server 130. In one example, the multimedia conference server 130 may be implemented in accordance with the Internet Engineering Task Force (IETF) Multiparty Multimedia Session Control (MMUSIC) Working Group Session Initiation Protocol (SIP) series of standards and/or variants. SIP is a proposed standard for initiating, modifying, and terminating an interactive user session that involves multimedia elements such as video, voice, instant messaging, online games, and virtual reality. In another example, the multimedia conference server 130 may be implemented in accordance with the International Telecommunication Union (ITU) H.323 series of standards and/or variants. The H.323 standard defines a multipoint control unit (MCU) to coordinate conference call operations. In particular, the MCU includes a multipoint controller (MC) that handles H.245 signaling, and one or more multipoint processors (MP) to mix and process the data streams. Both the SIP and H.323 standards are essentially signaling protocols for Voice over Internet Protocol (VoIP) or Voice Over Packet (VOP) multimedia conference call operations. It may be appreciated that other signaling protocols may be implemented for the multimedia conference server 130, however, and still fall within the scope of the embodiments.
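As a rough illustration of the SIP signaling mentioned above (not drawn from the patent itself), a minimal INVITE request used to initiate a session might be composed as follows. All addresses, tags, and branch values below are hypothetical placeholders.

```python
# Sketch of a minimal RFC 3261-style SIP INVITE request, such as a meeting
# console might send toward a conference server to initiate a session.
# Every URI, tag, and branch value here is an arbitrary example.

def build_sip_invite(caller, callee, call_id, cseq=1):
    """Return a minimal SIP INVITE request as CRLF-delimited text."""
    lines = [
        f"INVITE sip:{callee} SIP/2.0",
        "Via: SIP/2.0/UDP client.example.com;branch=z9hG4bK776asdhds",
        f"From: <sip:{caller}>;tag=1928301774",
        f"To: <sip:{callee}>",
        f"Call-ID: {call_id}",
        f"CSeq: {cseq} INVITE",
        "Contact: <sip:caller@client.example.com>",
        "Content-Type: application/sdp",
        "Content-Length: 0",
    ]
    # SIP messages end each header with CRLF; a blank line ends the headers.
    return "\r\n".join(lines) + "\r\n\r\n"

msg = build_sip_invite("alice@example.com",
                       "conference@server.example.com",
                       "a84b4c76e66710")
print(msg.splitlines()[0])  # request line of the INVITE
```

A server answering this request would reply with a status line such as `SIP/2.0 200 OK`, after which media negotiation proceeds via the SDP body (elided here with `Content-Length: 0`).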
  • In general operation, multimedia conference system 100 may be used for multimedia conference calls. Multimedia conference calls typically involve communicating voice, video, and/or data information between multiple end points. For example, a public or private packet network 120 may be used for audio conferencing calls, video conferencing calls, audio/video conferencing calls, collaborative document sharing and editing, and so forth. The packet network 120 may also be connected to a Public Switched Telephone Network (PSTN) via one or more suitable VoIP gateways arranged to convert between circuit-switched information and packet information.
  • To establish a multimedia conference call over the packet network 120, each meeting console 110-1-m may connect to multimedia conference server 130 via the packet network 120 using various types of wired or wireless communications links operating at varying connection speeds or bandwidths, such as a lower bandwidth PSTN telephone connection, a medium bandwidth DSL modem connection or cable modem connection, and a higher bandwidth intranet connection over a local area network (LAN), for example.
  • In various embodiments, the multimedia conference server 130 may establish, manage and control a multimedia conference call between meeting consoles 110-1-m. In some embodiments, the multimedia conference call may comprise a live web-based conference call using a web conferencing application that provides full collaboration capabilities. The multimedia conference server 130 operates as a central server that controls and distributes media information in the conference. It receives media information from various meeting consoles 110-1-m, performs mixing operations for the multiple types of media information, and forwards the media information to some or all of the other participants. One or more of the meeting consoles 110-1-m may join a conference by connecting to the multimedia conference server 130. The multimedia conference server 130 may implement various admission control techniques to authenticate and add meeting consoles 110-1-m in a secure and controlled manner.
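The mixing operation described above can be sketched in miniature. The following is a hedged illustration, not the server's actual implementation: it sums 16-bit PCM audio samples arriving from several consoles and clips the result to the valid sample range before forwarding.

```python
# Illustrative sketch of audio mixing as a conference server might perform it:
# sum equal-length streams of 16-bit PCM samples and clip to [-32768, 32767].

def mix_pcm(streams):
    """Mix several lists of 16-bit PCM samples by summation with clipping."""
    if not streams:
        return []
    mixed = []
    for samples in zip(*streams):
        s = sum(samples)
        # Clip to the signed 16-bit range to avoid wrap-around distortion.
        mixed.append(max(-32768, min(32767, s)))
    return mixed

# Two hypothetical consoles; the third sample pair would overflow and clips.
print(mix_pcm([[1000, -2000, 30000], [500, -500, 10000]]))
# → [1500, -2500, 32767]
```

A production mixer would additionally resample, apply gain control, and often exclude each participant's own audio from the mix it receives back.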
  • In various embodiments, the multimedia conference system 100 may include one or more computing devices implemented as meeting consoles 110-1-m to connect to the multimedia conference server 130 over one or more communications connections via the network 120. For example, a computing device may implement a client application that may host multiple meeting consoles each representing a separate conference at the same time. Similarly, the client application may receive multiple audio, video and data streams. For example, video streams from all or a subset of the participants may be displayed as a mosaic on the participant's display with a top window with video for the current active speaker, and a panoramic view of the other participants in other windows.
  • The meeting consoles 110-1-m may comprise any logical or physical entity that is arranged to participate or engage in a multimedia conference call managed by the multimedia conference server 130. The meeting consoles 110-1-m may be implemented as any device that includes, in its most basic form, a processing system including a processor and memory, one or more multimedia input/output (I/O) components, and a wireless and/or wired network connection. Examples of multimedia I/O components may include audio I/O components (e.g., microphones, speakers), video I/O components (e.g., video camera, display), tactile (I/O) components (e.g., vibrators), user data (I/O) components (e.g., keyboard, thumb board, keypad, touch screen), and so forth. Examples of the meeting consoles 110-1-m may include a telephone, a VoIP or VOP telephone, a packet telephone designed to operate on the PSTN, an Internet telephone, a video telephone, a cellular telephone, a personal digital assistant (PDA), a combination cellular telephone and PDA, a mobile computing device, a smart phone, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a handheld computer, a network appliance, and so forth. In some implementations, the meeting consoles 110-1-m may be implemented using a general or specific computing architecture similar to the computing architecture described with reference to FIG. 5.
  • The meeting consoles 110-1-m may comprise or implement respective client meeting components 112-1-n. The client meeting components 112-1-n may be designed to interoperate with the server meeting component 132 of the multimedia conference server 130 to establish, manage or control a multimedia conference event. For example, the client meeting components 112-1-n may comprise or implement the appropriate application programs and user interface controls to allow the respective meeting consoles 110-1-m to participate in a web conference facilitated by the multimedia conference server 130. This may include input equipment (e.g., video camera, microphone, keyboard, mouse, controller, etc.) to capture media information provided by the operator of a meeting console 110-1-m, and output equipment (e.g., display, speaker, etc.) to reproduce media information by the operators of other meeting consoles 110-1-m. Examples for client meeting components 112-1-n may include without limitation a MICROSOFT OFFICE COMMUNICATOR or the MICROSOFT OFFICE LIVE MEETING Windows Based Meeting Console, and so forth.
  • As shown in the illustrated embodiment of FIG. 1, a representative meeting console 110-1 may be connected to various multimedia input devices and/or multimedia output devices capable of capturing, communicating or reproducing multimedia information. The multimedia input devices may comprise any logical or physical device arranged to capture or receive as input multimedia information from one or more operators or participants of the meeting console 110-1, including audio input devices, video input devices, image input devices, text input devices, and other multimedia input equipment. Examples of multimedia input devices may include without limitation video cameras, microphones, microphone arrays, conference telephones, whiteboards, interactive whiteboards, voice-to-text components, text-to-voice components, voice recognition systems, pointing devices, keyboards, touchscreens, tablet computers, handwriting recognition devices, and so forth. An example of a video camera may include a ringcam, such as the MICROSOFT ROUNDTABLE made by Microsoft Corporation, Redmond, Wash. The MICROSOFT ROUNDTABLE is a videoconferencing device with a 360 degree camera that provides remote meeting participants a panoramic video of everyone sitting around a conference table. The multimedia output devices may comprise any logical or physical device arranged to reproduce or display as output multimedia information to one or more operators or participants of the meeting console 110-1, including audio output devices, video output devices, image output devices, text output devices, and other multimedia output equipment. Examples of multimedia output devices may include without limitation electronic displays, video projectors, speakers, vibrating units, printers, facsimile machines, and so forth.
  • In the illustrated embodiment shown in FIG. 1, the representative meeting console 110-1 may include various multimedia input devices arranged to capture media content from one or more participants 154-1-p, and stream the media content to the multimedia conference server 130. The meeting console 110-1 includes various types of multimedia input equipment, such as a video camera 106 and an array of microphones 104-1-r. The video camera 106 may capture video content including video content of one or more participants 154-1-p within video capture range of the video camera 106, and stream the video content to the multimedia conference server 130 via the meeting console 110-1. Similarly, audio input devices such as the array of microphones 104-1-r may capture audio content including audio content from one or more participants 154-1-p within audio capture range of the microphones 104-1-r, and stream the audio content to the multimedia conference server 130 via the meeting console 110-1. The meeting console may also include various multimedia output devices, such as one or more speakers 108-1-s and an electronic display. Audio output devices such as the one or more speakers 108-1-s may reproduce audio content for the participants 154-1-p. Video output devices such as the electronic display may be used to reproduce video content from other participants using remote meeting consoles 110-2-m received via the multimedia conference server 130.
  • The meeting consoles 110-1-m and the multimedia conference server 130 may communicate media information and control information utilizing various media connections established for a given multimedia conference event. In one embodiment, for example, the meeting consoles 110-1-m may each comprise a respective communications component 116-1-v. The communications components 116-1-v may comprise various communications resources suitable for establishing the various media connections. Examples of the communications resources may include transmitters, receivers, transceivers, radios, network interfaces, network interface cards, processors, memory, media access control (MAC) layer parts, physical (PHY) layer parts, connectors, communications media, communications interfaces, and so forth.
  • The communications components 116-1-v may establish media connections in general, and audio connections in particular, using various VoIP signaling protocols, such as the SIP series of protocols. SIP is an application-layer control (signaling) protocol for creating, modifying and terminating sessions with one or more participants. These sessions include Internet multimedia conferences, Internet telephone calls and multimedia distribution. Members in a session can communicate via multicast or via a mesh of unicast relations, or a combination of these. SIP is designed as part of the overall IETF multimedia data and control architecture currently incorporating protocols such as the resource reservation protocol (RSVP) (IETF RFC 2205) for reserving network resources, the real-time transport protocol (RTP) (IETF RFC 1889) for transporting real-time data and providing Quality-of-Service (QoS) feedback, the real-time streaming protocol (RTSP) (IETF RFC 2326) for controlling delivery of streaming media, the session announcement protocol (SAP) for advertising multimedia sessions via multicast, the session description protocol (SDP) (IETF RFC 2327) for describing multimedia sessions, and others. For example, the meeting consoles 110-1-m may use SIP as a signaling channel to set up the media connections, and RTP as a media channel to transport media information over the media connections.
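  • The SIP-for-signaling, RTP-for-media split described above can be sketched in a few lines. The following is a minimal illustration, not a complete implementation of the SIP or SDP specifications; the addresses, tags, and identifiers are hypothetical placeholders.

```python
# Minimal sketch of a SIP INVITE carrying an SDP body that offers one
# RTP audio stream. Simplified illustration only; not a full SIP stack.

def build_sdp_offer(media_ip: str, rtp_port: int) -> str:
    """Describe a single RTP audio stream in SDP form."""
    return "\r\n".join([
        "v=0",
        f"o=console 2890844526 2890844526 IN IP4 {media_ip}",
        "s=Multimedia Conference Event",
        f"c=IN IP4 {media_ip}",
        "t=0 0",
        f"m=audio {rtp_port} RTP/AVP 0",   # payload type 0 = PCMU
        "a=rtpmap:0 PCMU/8000",
    ]) + "\r\n"

def build_invite(from_uri: str, to_uri: str, media_ip: str, rtp_port: int) -> str:
    """Assemble a SIP INVITE whose body is the SDP media offer."""
    body = build_sdp_offer(media_ip, rtp_port)
    headers = [
        f"INVITE {to_uri} SIP/2.0",
        f"From: <{from_uri}>;tag=1928301774",   # tag is a hypothetical value
        f"To: <{to_uri}>",
        "Call-ID: a84b4c76e66710",
        "CSeq: 314159 INVITE",
        "Content-Type: application/sdp",
        f"Content-Length: {len(body)}",
    ]
    return "\r\n".join(headers) + "\r\n\r\n" + body
```

Once the signaling exchange completes, the actual audio and video would flow separately over RTP to the negotiated address and port.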
  • The communications components 116-1-v may establish media connections using various circuit-switched techniques. For example, the communications components 116-1-v may establish a media connection using Pulse Code Modulation (PCM) signals over a circuit-switched network. An example of a circuit-switched network may include the Public Switched Telephone Network (PSTN), a private network, and so forth.
  • In general operation, a scheduling device 108 may be used to generate a multimedia conference event reservation for the multimedia conference system 100. The scheduling device 108 may comprise, for example, a computing device having the appropriate hardware and software for scheduling multimedia conference events. For example, the scheduling device 108 may comprise a computer utilizing MICROSOFT OFFICE OUTLOOK® application software, made by Microsoft Corporation, Redmond, Wash. The MICROSOFT OFFICE OUTLOOK application software comprises messaging and collaboration client software that may be used to schedule a multimedia conference event. An operator may use MICROSOFT OFFICE OUTLOOK to convert a schedule request to a MICROSOFT OFFICE LIVE MEETING event that is sent to a list of meeting invitees. The schedule request may include a hyperlink to a virtual room for a multimedia conference event. An invitee may click on the hyperlink, and the meeting console 110-1-m launches a web browser, connects to the multimedia conference server 130, and joins the virtual room. Once there, the participants can present a slide presentation, annotate documents or brainstorm on the built-in whiteboard, among other tools.
  • In a virtual meeting environment there is typically a conference leader or presenter controlling the various media resources for the multimedia conference system. For example, a conference leader may control presentation slides while delivering a speech. In some cases, however, it may be difficult for the conference leader to manage the various media resources provided by the multimedia conference system. For example, the conference leader may not have convenient access to a meeting console to participate in the multimedia conference system. In another example, the conference leader may have access to a meeting console but may be incapable of moving from the meeting console during a presentation. This may limit the effective delivery of media content for a meeting.
  • To solve these and other problems, various embodiments are directed to techniques to remotely manage a multimedia conference event. A conference leader may utilize a mobile remote control 190 to direct, manage or otherwise control various media resources provided by the multimedia conference server 130. The mobile remote control 190 may establish a wireless connection 192 with the multimedia conference server 130 to participate in a multimedia conference event with one or more of the meeting consoles 110-1-m. Furthermore, an operator may assume the role of a conference leader and utilize the mobile remote control 190 to direct, manage or otherwise control various media resources and multimedia conference features provided by the multimedia conference server 130. This may be accomplished using manual user interface controls or voice commands. In this manner, a conference leader may utilize the mobility and portability of the mobile remote control 190 to direct, manage or control media resources and features for a multimedia conference event with increased convenience to the conference leader. The multimedia conference system 100 in general, and the mobile remote control 190 in particular, may be described in more detail with reference to FIG. 2.
  • FIG. 2 illustrates a more detailed block diagram for the mobile remote control 190. The mobile remote control 190 may be implemented using any mobile device having computing capabilities, wireless communication capabilities, a form factor suitable for being held by a single average human hand, and a portable power supply such as one or more batteries. Examples of the mobile remote control 190 may include without limitation a mobile phone, a cellular telephone, a PDA, a combination cellular telephone and PDA, a handheld computer, a mobile computing device, a smart phone, a digital media player, and so forth. In one embodiment, for example, the mobile remote control 190 may be implemented as a handheld computer such as a MICROSOFT ZUNE® made by Microsoft Corporation, Redmond, Wash. The embodiments, however, are not limited to this example. In some implementations, the mobile remote control 190 may be implemented using a general or specific computing architecture similar to the computing architecture described with reference to FIG. 5.
  • In the illustrated embodiment shown in FIG. 2, the mobile remote control 190 includes the client meeting component 112 and the communications component 116 as described with reference to the meeting consoles 110-1-m. In addition, the mobile remote control 190 may also include a display component 202, a power supply component 204, and a mobile remote control component 206. With the client meeting component 112, the communications component 116, and the display component 202, the mobile remote control 190 may implement some or all of the functionality of the meeting consoles 110-1-m in order to participate in a multimedia conference event. With the addition of the mobile remote control component 206, the mobile remote control 190 may also be used to remotely manage the multimedia conference event via interoperation with the server meeting component 132 of the multimedia conference server 130.
  • The display component 202 may include a digital electronic display for the presentation of information supplied as an electrical signal for visual or tactile reception. Examples of digital electronic displays may include without limitation electronic paper, nixie tube displays, vacuum fluorescent displays, light-emitting diode displays, electroluminescent displays, plasma display panels, liquid crystal displays, thin-film transistor displays, organic light-emitting diode displays, surface-conduction electron-emitter displays, laser television displays, carbon nanotube displays, nanocrystal displays, and so forth. In one embodiment, for example, the display component 202 may be implemented as a touch screen display.
  • The power supply component 204 may include one or more portable rechargeable direct current (DC) batteries. Examples of DC batteries may include without limitation a nickel-cadmium (NiCd) cell, a nickel metal hydride (NiMH) cell, a lithium-ion (Li-Ion) cell, and so forth.
  • In one embodiment, for example, the communications component 116 may be operative to establish the wireless connection 192 between the mobile remote control 190 and the multimedia conference server 130 hosting a multimedia conference event. Examples of the wireless connection 192 may include without limitation a radio channel, infrared channel, RF channel, WiFi channel, a portion of the RF spectrum, and/or one or more licensed or license-free frequency bands.
  • The mobile remote control component 206 may be communicatively coupled to the communications component 116. The mobile remote control component 206 may implement control logic and user interfaces for remotely managing media resources for a multimedia conference event hosted by the multimedia conference server 130. More particularly, the mobile remote control component 206 may be operative to manage a multimedia conference event from the mobile remote control 190 by communicating control information and media information with the multimedia conference server 130 for the multimedia conference event over the wireless connection 192.
  • The mobile remote control component 206 may implement remote control operations to direct, manage or otherwise control an online conferencing tool from a mobile device with voice command capabilities. The mobile remote control component 206 may be used by a conference leader or presenter who is presenting for the multimedia conference event, such as a conference meeting. The mobile remote control component 206 enables the operator to control basic and enhanced functionalities of the meeting from the mobile remote control 190.
  • By way of example, assume a presenter joins a multimedia conference event using the mobile remote control 190. The presenter will then have the ability to view some or all of the media content uploaded during the meeting on the mobile remote control 190. The presenter can then invoke media content such as a slide set and switch slides for the multimedia conference event from the mobile remote control 190. The mobile remote control component 206 also supports other conference functions or features for the presenter such as muting an attendee, removing an attendee from a meeting, changing meeting status (e.g., mood), and so forth. The presenter can choose to use the on-screen user interface (or hardware buttons) or voice commands to trigger a command on the mobile remote control 190.
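  • The presenter operations described above can be sketched as a small command set. The method and command names below are hypothetical placeholders; a real client would transmit the queued commands to the multimedia conference server over the wireless connection 192.

```python
# Sketch of presenter-facing controls: each method queues a control
# directive for transmission to the conference server. Command names
# are hypothetical, not the server's actual API.

class PresenterControls:
    def __init__(self):
        self.outbox = []          # control directives queued for the server

    def _send(self, command: str, **params):
        self.outbox.append({"command": command, **params})

    def next_slide(self):
        """Advance the slide deck currently in focus."""
        self._send("slide.next")

    def mute_attendee(self, attendee_id: str):
        self._send("attendee.mute", attendee=attendee_id)

    def remove_attendee(self, attendee_id: str):
        self._send("attendee.remove", attendee=attendee_id)

    def set_meeting_status(self, mood: str):
        """Change meeting status, e.g. a 'mood' indicator."""
        self._send("meeting.status", mood=mood)
```

Whether a directive originates from an on-screen button, a hardware button, or a recognized voice command, it would funnel through the same small command set.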
  • In addition to manual operator commands, the mobile remote control 190 also supports context related voice commands. The mobile remote control component 206 automatically adapts to the media content currently viewed by the operator, and supports some or all of the available commands related to the various media content. The mobile remote control component 206 also supports a context sensitive tool bar.
  • The mobile remote control component 206 also takes feedback from the online conferencing tool to keep the operator informed about the current context for a multimedia conference event. The feedback may include a preview of the content currently in focus and the list of attendees currently in the meeting.
  • The control logic built into the mobile remote control component 206 handles updating a context related tool bar and a preview window or pane generated by the GUI implemented by the mobile remote control component 206. The control logic also updates the voice commands shell, thereby informing the mobile remote control component 206 about what commands it can expect at that point in time. This may include the creation of grammar files at run time, which go as input to the voice command shell.
  • FIG. 3 illustrates a block diagram for the mobile remote control component 206. The mobile remote control component 206 may comprise multiple modules. In the illustrated embodiment shown in FIG. 3, for example, the mobile remote control component 206 includes a polling module 302, a command generator module 304, a remote user interface module 306, a command parser 320, a dynamic grammar generator module 322, and a dynamic toolbar customizer module 324. Each of the modules may be implemented using hardware elements, software elements, or a combination of hardware elements and software elements. Although the mobile remote control component 206 as shown in FIG. 3 has a limited number of elements in a certain topology, it may be appreciated that the mobile remote control component 206 may include more or fewer elements in alternate topologies as desired for a given implementation. The embodiments are not limited in this context.
  • The polling module 302 may be arranged to interface with the multimedia conference server 130. The polling module 302 may be configured to continuously or periodically communicate control information and media information between the mobile remote control 190 and the server meeting component 132 of the multimedia conference server 130.
  • In one embodiment, for example, the polling module 302 and the multimedia conference server 130 may follow a push model where control and media information are pushed to each other when the information is ready for transmission. In this manner, the wireless connection 192 may be in continuous service, particularly when the multimedia conference server 130 is experiencing heavy traffic loads. The push model has the advantage of reducing latency at the expense of increasing power consumption.
  • Since the mobile remote control 190 is a mobile device utilizing a portable power supply component 204, however, maintaining continuous communications with the multimedia conference server 130 may significantly increase power consumption, thereby reducing the battery life available to power the mobile remote control 190. To extend battery life, the polling module 302 may implement a pull model where control and media information are pulled by the receiving device when the information is ready for reception. For example, the polling module 302 may periodically or on demand poll the multimedia conference server 130 using a poll request to determine whether the server meeting component 132 has any information ready for transmission. If so, the polling module 302 may pull the information from the multimedia conference server 130. The pull model has the advantage of decreased power consumption at the expense of increased latency in data transmissions. The polling period may be a configurable parameter, thereby allowing the operator to balance the amount of latency that can be tolerated against the amount of battery life remaining for the mobile remote control 190.
  • In one embodiment, the polling module 302 may implement a hybrid push and pull model to realize some of the advantages of both models while reducing the corresponding disadvantages. For example, the polling module 302 may utilize a pull model to receive information from the multimedia conference server 130 to increase battery life, and a push model to send control information to the multimedia conference server 130 to reduce latency.
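  • The hybrid model described above can be sketched as follows: outgoing control directives are pushed immediately for low latency, while server-side updates are pulled only when a configurable polling period has elapsed, to conserve battery. The server interface shown is a hypothetical stand-in.

```python
# Sketch of a hybrid push/pull polling module. `server` is any object
# exposing receive() and pending_updates(); names are hypothetical.

class PollingModule:
    def __init__(self, server, poll_period_secs: float = 5.0):
        self.server = server
        self.poll_period = poll_period_secs    # operator-configurable trade-off
        self._last_poll = 0.0

    def push_control(self, directive: dict):
        """Push model: send control information as soon as it is ready."""
        self.server.receive(directive)

    def maybe_pull(self, now: float):
        """Pull model: poll the server only once per polling period."""
        if now - self._last_poll < self.poll_period:
            return []                          # too soon; save battery
        self._last_poll = now
        return self.server.pending_updates()
```

Lengthening `poll_period_secs` trades responsiveness for battery life, which is exactly the configurable latency/battery balance the polling period exposes to the operator.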
  • The command generator module 304 may be communicatively coupled to the polling module 302. The command generator module 304 may be arranged to generate control directives for the multimedia conference server 130 from operator input commands or operator voice commands. The command generator module 304 may map various operator input command or operator voice commands received by the remote user interface module 306 to a set of API commands used by the server meeting component 132 of the multimedia conference server 130. In this manner, the mobile remote control 190 may be interoperable with different types of multimedia conferencing applications implemented by the multimedia conference server 130. Furthermore, the remote user interface module 306 may be trained to accept various types of operator input commands or voice commands to control the media resources or various multimedia conference controls provided by the server meeting component 132. For example, a conference leader or presenter could train the remote user interface module 306 to accept multiple operator voice commands such as “next slide” or “slide,” and the command generator module 304 may map the multiple operator voice commands to the appropriate API to control the presentation slides shown in the virtual meeting environment.
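  • The phrase-to-API mapping described above can be sketched briefly: several trained operator phrases resolve to a single server-side API command. The API command names below are hypothetical placeholders for whatever command set the server meeting component exposes.

```python
# Sketch of the command generator mapping: trained operator input or
# voice phrases map to server API commands. Command names are hypothetical.

class CommandGenerator:
    def __init__(self):
        self._phrase_to_api = {}

    def train(self, api_command: str, *phrases: str):
        """Associate one or more operator phrases with an API command."""
        for phrase in phrases:
            self._phrase_to_api[phrase.lower()] = api_command

    def generate(self, operator_input: str):
        """Map an operator input or voice command to a control directive.

        Returns None when the phrase is not recognized in this context.
        """
        return self._phrase_to_api.get(operator_input.strip().lower())
```

Because the mapping is data rather than code, the same mobile remote control could be retrained against different conferencing applications by swapping the mapping table.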
  • The remote user interface module 306 may be communicatively coupled to the command generator module 304. The remote user interface module 306 may be generally arranged to provide various visual or tactile GUI views for an operator, and accept various forms of operator commands and other input from an operator.
  • The remote user interface module 306 may comprise a GUI module 308. The GUI module 308 may be arranged to receive operator input commands and display media content from the multimedia conference server 130. The GUI module 308 may generate and display various on-screen input controls to control media resources for the multimedia conference event. For example, the screen input controls may be designed to control various audio resources, video resources and other media resources, such as managing live and recorded video, chat, slide and application sharing, Voice over Internet Protocol (VoIP) and PSTN audio, and audience feedback tools. The GUI module 308 may also display various toolbars to allow an operator to select various features and options for the mobile remote control 190, such as provided by the client meeting component 112, the server meeting component 132, the mobile remote control component 206, or an operating system (OS) program. The various toolbars may be dynamic and change according to a current context for the multimedia conference event.
  • The GUI module 308 may also generate a GUI view to display media content received from the multimedia conference server 130 in a preview pane. The GUI module 308 may receive feedback information from the multimedia conference server 130 to keep the operator informed about the current context for a multimedia conference event. The feedback may include a preview of the media content currently in focus and the list of attendees currently in the meeting. In one embodiment, for example, the preview pane may be part of a GUI view displayed on a touch screen display implemented as the display component 202.
  • The remote user interface module 306 may further comprise an audio interface module 310. The audio interface module 310 may be arranged to receive operator voice commands, and reproduce narrations for media content received from the multimedia conference server. In addition to manual operator commands, the mobile remote control 190 may use the audio interface module 310 to support context related voice commands. The audio interface module 310 automatically adapts to the media content currently viewed by the operator, and supports some or all of the available commands related to the various media content. The audio interface module 310 creates and manages a voice commands shell, and periodically updates the voice commands shell with customized operator commands. Furthermore, the audio interface module 310 may audibly reproduce control information or media information for an operator. This may be convenient, for example, if the operator is working in a hands-free environment, such as when driving. The operator could control the media resources for the multimedia conference event, and receive feedback from the multimedia conference event, using a completely audio-based interface thereby reducing or removing the need for the user to manually operate the mobile remote control 190.
  • The command parser module 320 may be communicatively coupled to the polling module 302. The command parser module 320 may be arranged to parse control directives for the mobile remote control received from the multimedia conference server 130. As with the mobile remote control 190 sending control directives to the multimedia conference server 130, the multimedia conference server 130 may periodically send control directives to the mobile remote control 190 as well. For example, the server meeting component 132 of the multimedia conference server 130 may periodically send configuration information, diagnostic information, connection repair information or capabilities information to the mobile remote control 190 to enhance operations for the mobile remote control 190, or interoperability with the server meeting component 132. The server meeting component 132 may also send updated API commands, or different command tools based on the currently displayed media content for a multimedia conference event. The command parser module 320 may receive the control directives, parse the control directives and output the parsed control directives to the remote user interface module 306 for visual or audio reproduction of the control directives to an operator.
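  • Parsing a server-to-client control directive can be sketched as follows. The wire format shown (a "kind;key=value;..." line) is a hypothetical illustration; a deployment could equally encode directives in XML or another format.

```python
# Sketch of a command parser for server-to-client control directives.
# The "kind;key=value;..." wire format is a hypothetical example.

def parse_directive(raw: str) -> dict:
    """Split 'kind;key=value;...' into a directive dict for the UI layer."""
    kind, _, rest = raw.partition(";")
    directive = {"kind": kind}
    for pair in filter(None, rest.split(";")):   # skip empty segments
        key, _, value = pair.partition("=")
        directive[key] = value
    return directive
```

The parsed dictionary would then be handed to the remote user interface module for visual or audio presentation to the operator.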
  • The dynamic grammar generator module 322 may be communicatively coupled to the command parser module 320, as well as the GUI module 308 and the audio interface module 310. The dynamic grammar generator module 322 may be arranged to generate context related voice commands. For example, the dynamic grammar generator module 322 may create or generate grammar files at run time. The grammar files may be used as input for the voice command shell used by the audio interface module 310.
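  • Generating a grammar file at run time from the commands valid in the current context can be sketched briefly. The output below loosely follows the shape of a W3C SRGS XML grammar, but the exact element layout is a simplified illustration rather than a conforming document.

```python
# Sketch of run-time grammar generation: the commands valid in the
# current context become the alternatives the voice shell will accept.
# The XML shape is a simplified, SRGS-like illustration.

def generate_grammar(context: str, commands: list) -> str:
    items = "\n".join(f"      <item>{c}</item>" for c in commands)
    return (
        f'<grammar root="{context}">\n'
        f'  <rule id="{context}">\n'
        f'    <one-of>\n{items}\n    </one-of>\n'
        f'  </rule>\n'
        f'</grammar>\n'
    )
```

Regenerating this file whenever the media content in focus changes is what lets the voice shell accept only the commands that make sense at that moment.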
  • The dynamic toolbar customizer module 324 may also be communicatively coupled to the command parser module 320, as well as the GUI module 308 and the audio interface module 310. The dynamic toolbar customizer module 324 may be arranged to create and generate a context sensitive tool bar for the GUI module 308. The context sensitive tool bar may display various options or features based on a current context for the multimedia conference event. For example, the context sensitive tool bar may display a first set of options or features when a presentation slide deck is in focus and currently active, a second set of options or features when a chat window is in focus and currently active, a third set of options or features when a streaming video window is in focus and currently active, and so forth.
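  • The context-to-toolbar mapping described above can be sketched as a simple lookup. The option names are hypothetical placeholders for the real meeting features.

```python
# Sketch of a context-sensitive toolbar: the set of displayed options
# depends on which media content is in focus. Option names are hypothetical.

TOOLBARS = {
    "slide_deck": ["next slide", "previous slide", "annotate"],
    "chat":       ["send message", "clear history"],
    "video":      ["pause", "mute audio", "full screen"],
}

def toolbar_for(context: str) -> list:
    """Return the options to display for the media content in focus."""
    return TOOLBARS.get(context, ["help"])   # fallback for unknown contexts
```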
  • Operations for the above-described embodiments may be further described with reference to one or more logic flows. It may be appreciated that the representative logic flows do not necessarily have to be executed in the order presented, or in any particular order, unless otherwise indicated. Moreover, various activities described with respect to the logic flows can be executed in serial or parallel fashion. The logic flows may be implemented using one or more hardware elements and/or software elements of the described embodiments or alternative elements as desired for a given set of design and performance constraints. For example, the logic flows may be implemented as logic (e.g., computer program instructions) for execution by a logic device (e.g., a general-purpose or specific-purpose computer).
  • FIG. 4 illustrates one embodiment of a logic flow 400. Logic flow 400 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • As shown in FIG. 4, the logic flow 400 may establish a wireless connection between a mobile remote control and a multimedia conference server hosting a multimedia conference event at block 402. For example, the communications component 116 of the mobile remote control 190 may establish the wireless connection 192 between the mobile remote control 190 and the multimedia conference server 130 hosting a multimedia conference event. The wireless connection 192 may comprise, for example, a wireless local area network (WLAN) connection (e.g., IEEE 802.11, 802.16, 802.20, and variants) or a cellular network connection (e.g., using a Code Division Multiple Access system, Global System for Mobile Communications system, Time Division Multiple Access system, Universal Mobile Telecommunications System, and so forth).
  • The logic flow 400 may manage the multimedia conference event from the mobile remote control by communicating control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection at block 404. For example, the mobile remote control component 206 may manage the multimedia conference event from the mobile remote control 190 by communicating control information and media information with the multimedia conference server 130 for the multimedia conference event over the wireless connection 192. The mobile remote control component 206 may manage various features, options or functionality provided by the server meeting component 132 of the multimedia conference server 130, and receive multimedia conference event feedback information from the server meeting component 132. The mobile remote control 190 may also participate in the multimedia conference event and communicate multimedia conference event information with the server meeting component 132 in a manner similar to the meeting consoles 110-1-m utilizing the client meeting component 112.
  • FIG. 5 illustrates a more detailed block diagram of computing architecture 510 suitable for implementing the mobile remote control 190, meeting consoles 110-1-m and/or the multimedia conference server 130. In a basic configuration, computing architecture 510 typically includes at least one processing unit 532 and memory 534. Memory 534 may be implemented using any machine-readable or computer-readable media capable of storing data, including both volatile and non-volatile memory. For example, memory 534 may include read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information. As shown in FIG. 5, memory 534 may store various software programs, such as one or more application programs 536-1-t and accompanying data. Depending on the implementation, examples of application programs 536-1-t may include server meeting component 132, client meeting components 112-1-n, or the mobile remote control component 206.
  • Computing architecture 510 may also have additional features and/or functionality beyond its basic configuration. For example, computing architecture 510 may include removable storage 538 and non-removable storage 540, which may also comprise various types of machine-readable or computer-readable media as previously described. Computing architecture 510 may also have one or more input devices 544 such as a keyboard, mouse, pen, voice input device, touch input device, measurement devices, sensors, and so forth. Computing architecture 510 may also include one or more output devices 542, such as displays, speakers, printers, and so forth.
  • Computing architecture 510 may further include one or more communications connections 546 that allow computing architecture 510 to communicate with other devices. Communications connections 546 may be representative of, for example, the communications interfaces for the communications components 116-1-v. Communications connections 546 may include various types of standard communication elements, such as one or more communications interfaces, network interfaces, network interface cards (NIC), radios, wireless transmitters/receivers (transceivers), wired and/or wireless communication media, physical connectors, and so forth. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired communications media and wireless communications media. Examples of wired communications media may include a wire, cable, metal leads, printed circuit boards (PCB), backplanes, switch fabrics, semiconductor material, twisted-pair wire, co-axial cable, fiber optics, a propagated signal, and so forth. Examples of wireless communications media may include acoustic, radio-frequency (RF) spectrum, infrared and other wireless media. The terms machine-readable media and computer-readable media as used herein are meant to include both storage media and communications media.
  • FIG. 6 illustrates a diagram of an article of manufacture 600 suitable for storing logic for the various embodiments, including the logic flow 400. As shown, the article 600 may comprise a storage medium 602 to store logic 604. Examples of the storage medium 602 may include one or more types of computer-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of the logic 604 may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof.
  • In one embodiment, for example, the article 600 and/or the computer-readable storage medium 602 may store logic 604 comprising executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described embodiments. The executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like. The executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain function. The instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language, such as C, C++, Java, BASIC, Perl, Matlab, Pascal, Visual BASIC, assembly language, and others.
  • Various embodiments may be implemented using hardware elements, software elements, or a combination of both. Examples of hardware elements may include any of the examples as previously provided for a logic device, and further including microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth. Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
  • Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
  • It is emphasized that the Abstract of the Disclosure is provided to comply with 37 C.F.R. Section 1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the terms “comprising” and “wherein,” respectively. Moreover, the terms “first,” “second,” “third,” and so forth, are used merely as labels, and are not intended to impose numerical requirements on their objects.
  • Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (21)

1.-20. (canceled)
21. An apparatus, comprising:
a display;
a communications component, executing on at least one processor, operative to establish a wireless connection between a mobile remote control and a multimedia conference server hosting a multimedia conference event; and
a mobile remote control component, executing on the at least one processor, communicatively coupled to the communications component and the display, the mobile remote control component operative to manage one or more media resources provided by the multimedia conference server for the multimedia conference event, the mobile remote control component comprising a dynamic grammar generator module to generate a grammar for context related voice commands.
22. The apparatus of claim 21, the mobile remote control component further operative to communicate control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection.
23. The apparatus of claim 21, the mobile remote control component further operative to communicate control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection using at least one of a push model, a pull model, or a hybrid push-pull model.
24. The apparatus of claim 21, the mobile remote control component further operative to communicate control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection using a pull model, wherein the control information or the media information is pulled from the multimedia conference server when the mobile remote control is ready to receive the control information or the media information.
25. The apparatus of claim 21, the mobile remote control component further operative to communicate control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection using a push model, wherein the control information or the media information is pushed to the multimedia conference server or the mobile remote control when the control information or the media information is ready for transmission.
26. The apparatus of claim 21, the mobile remote control component comprising a command generator module operative to generate control directives for the multimedia conference server from operator input commands or operator voice commands.
27. The apparatus of claim 21, the mobile remote control component comprising a remote user interface module, the remote user interface module including an audio interface module operative to receive operator voice commands, and reproduce narrations for media content received from the multimedia conference server.
28. The apparatus of claim 21, the mobile remote control component comprising a command parser module operative to parse control directives for the mobile remote control received from the multimedia conference server.
29. The apparatus of claim 21, the mobile remote control component comprising a dynamic toolbar customizer module operative to update a context-sensitive toolbar of a graphical user interface (GUI) view on the display based on a current context for the multimedia conference event.
30. The apparatus of claim 21, the mobile remote control component comprising a remote user interface module, the remote user interface module including a graphical user interface module operative to receive operator input commands and display media content from the multimedia conference server in the GUI view.
31. The apparatus of claim 30, the display comprising a touch screen display, the touch screen display operative to display media content received from the multimedia conference server in a preview pane of the GUI view.
32. An article of manufacture comprising a non-transitory storage medium containing instructions that if executed enable a system to:
establish a wireless connection between a mobile remote control and a multimedia conference server hosting a multimedia conference event;
manage, from the mobile remote control, one or more media resources provided by the multimedia conference server for the multimedia conference event, by communicating control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection using at least one of a push model or a pull model;
generate grammar for context related voice commands;
update a voice commands shell of an audio interface with the grammar; and
update a context-sensitive toolbar of a graphical user interface (GUI) view on a display of the mobile remote control based on a current context for the multimedia conference event.
33. The article of manufacture of claim 32, further comprising instructions that if executed enable the system to at least one of pull the control information or the media information from the multimedia conference server when the mobile remote control is ready to receive the control information or the media information or push the control information or the media information to the multimedia conference server or the mobile remote control when the control information or the media information is ready for transmission.
34. The article of manufacture of claim 32, further comprising instructions that if executed enable the system to map client control directives from the mobile remote control to server control directives for the multimedia conference server, and server control directives from the multimedia conference server to client control directives for the mobile remote control.
35. The article of manufacture of claim 32, further comprising instructions that if executed enable the system to manage the multimedia conference event from the mobile remote control using operator voice commands.
36. A method, comprising:
establishing a wireless connection between a mobile remote control and a multimedia conference server hosting a multimedia conference event; and
managing one or more media resources provided by the multimedia conference server for the multimedia conference event, by communicating control information and media information with the multimedia conference server for the multimedia conference event over the wireless connection using a hybrid push-pull model.
37. The method of claim 36, comprising updating a context-sensitive toolbar of a graphical user interface (GUI) view on a display of a mobile remote control based on a current context for the multimedia conference event.
38. The method of claim 36, comprising generating control directives for the multimedia conference server from operator input commands received from a GUI module.
39. The method of claim 36, comprising at least one of pulling the control information or the media information from the multimedia conference server when the mobile remote control is ready to receive the control information or the media information or pushing the control information or the media information to the multimedia conference server or the mobile remote control when the control information or the media information is ready for transmission.
40. The method of claim 36, comprising generating grammar for context related voice commands at run time for media content currently being viewed by an operator.
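Claims 23-25, 33, 36 and 39 distinguish a pull model (the mobile remote control fetches information when it is ready to receive it) from a push model (information is sent as soon as it is ready for transmission). The sketch below illustrates the two exchanges; the class names, directive strings, and in-memory "outbox"/"inbox" queues are assumptions for illustration, not structures recited in the patent.

```python
class ConferenceServer:
    """Holds control/media information for a conference event until delivery."""

    def __init__(self):
        self.outbox = []  # information queued for transmission

    def queue(self, info):
        self.outbox.append(info)

    def pull(self):
        # Pull model: the client asks when *it* is ready to receive.
        return self.outbox.pop(0) if self.outbox else None

    def push(self, client):
        # Push model: send as soon as the information is ready.
        while self.outbox:
            client.receive(self.outbox.pop(0))


class MobileRemote:
    """The receiving side of the mobile remote control."""

    def __init__(self):
        self.inbox = []

    def receive(self, info):
        self.inbox.append(info)


server = ConferenceServer()
server.queue("control:mute-participant")
server.queue("media:slide-3")

remote = MobileRemote()
remote.receive(server.pull())  # client-initiated (pull)
server.push(remote)            # server-initiated (push)
# remote.inbox now holds both pieces of information, in order
```

A hybrid push-pull model, as in claim 36, would combine the two paths, for example pushing time-critical control directives while letting the remote pull bulk media information at its own pace.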
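Claims 21, 32 and 40 describe generating a grammar for context-related voice commands at run time, based on the media content currently being viewed, and loading it into a voice-commands shell. A hedged sketch of that idea follows; the base commands, media item names, and the use of a plain set as the "shell" are all illustrative assumptions.

```python
# Commands that are valid regardless of context (invented for illustration).
BASE_COMMANDS = ["mute", "unmute", "leave meeting"]

def generate_grammar(current_media):
    """Build context-related voice commands for the content being viewed."""
    grammar = list(BASE_COMMANDS)
    for item in current_media:
        grammar.append(f"go to {item}")  # one navigation command per item
    return grammar

def update_voice_shell(shell, grammar):
    """Replace the shell's active vocabulary with the new grammar."""
    shell.clear()
    shell.update(grammar)
    return shell

# Regenerate the grammar whenever the viewed media content changes.
grammar = generate_grammar(["slide 1", "slide 2"])
voice_shell = update_voice_shell(set(), grammar)
```

In this picture, a speech recognizer would only accept utterances present in `voice_shell`, so the accepted commands track the content on screen.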
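Claims 29, 32 and 37 recite updating a context-sensitive toolbar of the GUI view based on the current context for the conference event. One minimal way to picture the dynamic toolbar customizer, with context names and toolbar entries invented for illustration:

```python
# Mapping from conference context to toolbar entries (illustrative only).
TOOLBAR_COMMANDS = {
    "presenting": ["next slide", "previous slide", "end presentation"],
    "viewing": ["raise hand", "request control"],
    "idle": ["join event", "schedule event"],
}

def update_toolbar(current_context):
    """Return the toolbar entries for the current conference context."""
    return TOOLBAR_COMMANDS.get(current_context, TOOLBAR_COMMANDS["idle"])
```

A GUI layer would call `update_toolbar` each time the event context changes and redraw the toolbar with the returned entries.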
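Claim 34 recites mapping client control directives from the mobile remote control to server control directives, and the reverse. The patent does not define a concrete directive format, so the names below are hypothetical; the sketch only shows the two-way table lookup.

```python
# Hypothetical client-to-server directive table (names are illustrative).
CLIENT_TO_SERVER = {
    "tap:mute": "conference.mute_participant",
    "tap:next-slide": "presentation.advance",
}
# Invert the table to map server directives back to client directives.
SERVER_TO_CLIENT = {server: client for client, server in CLIENT_TO_SERVER.items()}

def to_server_directive(client_directive):
    """Map a client control directive to its server control directive."""
    return CLIENT_TO_SERVER[client_directive]

def to_client_directive(server_directive):
    """Map a server control directive to its client control directive."""
    return SERVER_TO_CLIENT[server_directive]
```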
US14/931,297 2008-04-04 2015-11-03 Techniques to remotely manage a multimedia conference event Abandoned US20160070534A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/931,297 US20160070534A1 (en) 2008-04-04 2015-11-03 Techniques to remotely manage a multimedia conference event

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/062,536 US9201527B2 (en) 2008-04-04 2008-04-04 Techniques to remotely manage a multimedia conference event
US14/931,297 US20160070534A1 (en) 2008-04-04 2015-11-03 Techniques to remotely manage a multimedia conference event

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/062,536 Continuation US9201527B2 (en) 2008-04-04 2008-04-04 Techniques to remotely manage a multimedia conference event

Publications (1)

Publication Number Publication Date
US20160070534A1 true US20160070534A1 (en) 2016-03-10

Family

ID=41134379

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/062,536 Expired - Fee Related US9201527B2 (en) 2008-04-04 2008-04-04 Techniques to remotely manage a multimedia conference event
US14/931,297 Abandoned US20160070534A1 (en) 2008-04-04 2015-11-03 Techniques to remotely manage a multimedia conference event

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/062,536 Expired - Fee Related US9201527B2 (en) 2008-04-04 2008-04-04 Techniques to remotely manage a multimedia conference event

Country Status (1)

Country Link
US (2) US9201527B2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106951171A (en) * 2017-03-14 2017-07-14 北京小米移动软件有限公司 Control method and device for a virtual reality helmet
CN107147552A (en) * 2016-11-16 2017-09-08 北京和利时系统工程有限公司 Method and apparatus for reply readback in remote control
CN110381285A (en) * 2019-07-19 2019-10-25 视联动力信息技术股份有限公司 Conference initiation method and device
US11366583B1 (en) 2021-02-02 2022-06-21 Bank Of America Corporation Computer-to-computer users# edit and event transfer and synchronization

Families Citing this family (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8675847B2 (en) 2007-01-03 2014-03-18 Cisco Technology, Inc. Scalable conference bridge
US9311115B2 (en) 2008-05-13 2016-04-12 Apple Inc. Pushing a graphical user interface to a remote device with display rules provided by the remote device
US9870130B2 (en) * 2008-05-13 2018-01-16 Apple Inc. Pushing a user interface to a remote device
US8970647B2 (en) 2008-05-13 2015-03-03 Apple Inc. Pushing a graphical user interface to a remote device with display rules provided by the remote device
US8340272B2 (en) 2008-05-14 2012-12-25 Polycom, Inc. Method and system for initiating a conference based on the proximity of a portable communication device
US20100023876A1 (en) * 2008-07-28 2010-01-28 International Business Machines Corporation System and Method for Reducing Bandwidth Requirements of a Presentation Transmitted Across a Network
US8402391B1 (en) 2008-09-25 2013-03-19 Apple, Inc. Collaboration system
CN102044141B (en) * 2009-10-20 2015-04-22 张学荣 Observing remote control equipment
US8881014B2 (en) * 2009-11-04 2014-11-04 At&T Intellectual Property I, Lp Web based sales presentation method and system with synchronized display
US20110196928A1 (en) * 2010-02-09 2011-08-11 Inxpo, Inc. System and method for providing dynamic and interactive web content and managing attendees during webcasting events
US20110238731A1 (en) * 2010-03-23 2011-09-29 Sony Corporation Method to provide an unlimited number of customized user interfaces
US20110246875A1 (en) * 2010-04-02 2011-10-06 Symantec Corporation Digital whiteboard implementation
US8990702B2 (en) * 2010-09-30 2015-03-24 Yahoo! Inc. System and method for controlling a networked display
US9021354B2 (en) * 2010-04-09 2015-04-28 Apple Inc. Context sensitive remote device
US20110296313A1 (en) * 2010-05-25 2011-12-01 Sony Corporation Translating input from devices to appropriate rui commands
WO2012102416A1 (en) * 2011-01-24 2012-08-02 Lg Electronics Inc. Data sharing between smart devices
EP3352439B1 (en) * 2011-04-20 2019-11-13 Huawei Device Co., Ltd. Data interface configuration method and terminal device
US10225354B2 (en) 2011-06-06 2019-03-05 Mitel Networks Corporation Proximity session mobility
US20120311038A1 (en) * 2011-06-06 2012-12-06 Trinh Trung Tim Proximity Session Mobility Extension
TWI488503B (en) * 2012-01-03 2015-06-11 國際洋行股份有限公司 Conference photography device and the method thereof
US9250768B2 (en) * 2012-02-13 2016-02-02 Samsung Electronics Co., Ltd. Tablet having user interface
US20140122600A1 (en) * 2012-10-26 2014-05-01 Foundation Of Soongsil University-Industry Cooperation Conference server in a system for providing a conference service in rtcweb
US20140129944A1 (en) * 2012-11-05 2014-05-08 International Business Machines Corporation Method and system for synchronization and management of system activities with locally installed applications
US9086725B2 (en) * 2012-11-06 2015-07-21 International Business Machines Corporation Method and system for synchronization and management of system activities with locally installed applications
US20140244740A1 (en) * 2013-02-25 2014-08-28 International Business Machines Corporation Method for Synchronizing, Monitoring and Capturing of System Host Activities Occurring at Locally Installed Applications
US20140244579A1 (en) * 2013-02-25 2014-08-28 International Business Machines Corporation Method for synchronization and management of system activities with locally installed applications
US8966548B2 (en) * 2013-05-20 2015-02-24 Verizon Patent And Licensing Inc. Alternative media presentation device recommendation
JP6488547B2 (en) * 2014-03-17 2019-03-27 株式会社リコー Conference terminal control system and conference terminal control method
JP2015177464A (en) * 2014-03-17 2015-10-05 株式会社リコー Apparatus control system, apparatus control device, apparatus control method and program
US20160001785A1 (en) * 2014-07-07 2016-01-07 Chin-Jung Hsu Motion sensing system and method
US10291597B2 (en) 2014-08-14 2019-05-14 Cisco Technology, Inc. Sharing resources across multiple devices in online meetings
KR20160032880A (en) * 2014-09-17 2016-03-25 삼성전자주식회사 Display apparatus and method for controlling thereof
US10089604B2 (en) * 2014-11-06 2018-10-02 Comigo Ltd. Method and apparatus for managing a joint slide show with one or more remote user terminals
US10542126B2 (en) 2014-12-22 2020-01-21 Cisco Technology, Inc. Offline virtual participation in an online conference meeting
CN104679524A (en) * 2015-03-17 2015-06-03 郑波 Method for controlling computer through mobile phone
KR101577986B1 (en) * 2015-03-24 2015-12-16 (주)해든브릿지 System for generating two way virtual reality
US10380556B2 (en) * 2015-03-26 2019-08-13 Microsoft Technology Licensing, Llc Changing meeting type depending on audience size
US9948786B2 (en) 2015-04-17 2018-04-17 Cisco Technology, Inc. Handling conferences using highly-distributed agents
US20170104763A1 (en) * 2015-10-09 2017-04-13 Microsoft Technology Licensing, Llc Presentation device and presentation device coordination
US10291762B2 (en) 2015-12-04 2019-05-14 Cisco Technology, Inc. Docking station for mobile computing devices
US10409550B2 (en) * 2016-03-04 2019-09-10 Ricoh Company, Ltd. Voice control of interactive whiteboard appliances
US10516703B2 (en) * 2016-03-07 2019-12-24 Precision Biometrics, Inc. Monitoring and controlling the status of a communication session
US10574609B2 (en) 2016-06-29 2020-02-25 Cisco Technology, Inc. Chat room access control
US20180063803A1 (en) * 2016-08-23 2018-03-01 Maik Andre Lindner System and method for production and synchronization of group experiences using mobile devices
US10592867B2 (en) 2016-11-11 2020-03-17 Cisco Technology, Inc. In-meeting graphical user interface display using calendar information and system
US10516707B2 (en) 2016-12-15 2019-12-24 Cisco Technology, Inc. Initiating a conferencing meeting using a conference room device
FR3060792B1 (en) * 2016-12-19 2018-12-07 Safran Electronics & Defense DATA LOADING DEVICE IN COMPUTERIZED DATA PROCESSING UNITS FROM A DATA SOURCE
US10515117B2 (en) 2017-02-14 2019-12-24 Cisco Technology, Inc. Generating and reviewing motion metadata
US9942519B1 (en) 2017-02-21 2018-04-10 Cisco Technology, Inc. Technologies for following participants in a video conference
US10440073B2 (en) 2017-04-11 2019-10-08 Cisco Technology, Inc. User interface for proximity based teleconference transfer
US10375125B2 (en) 2017-04-27 2019-08-06 Cisco Technology, Inc. Automatically joining devices to a video conference
US10404481B2 (en) 2017-06-06 2019-09-03 Cisco Technology, Inc. Unauthorized participant detection in multiparty conferencing by comparing a reference hash value received from a key management server with a generated roster hash value
US10375474B2 (en) 2017-06-12 2019-08-06 Cisco Technology, Inc. Hybrid horn microphone
US10477148B2 (en) 2017-06-23 2019-11-12 Cisco Technology, Inc. Speaker anticipation
US10516709B2 (en) 2017-06-29 2019-12-24 Cisco Technology, Inc. Files automatically shared at conference initiation
US10706391B2 (en) 2017-07-13 2020-07-07 Cisco Technology, Inc. Protecting scheduled meeting in physical room
US10091348B1 (en) 2017-07-25 2018-10-02 Cisco Technology, Inc. Predictive model for voice/video over IP calls
US10771621B2 (en) 2017-10-31 2020-09-08 Cisco Technology, Inc. Acoustic echo cancellation based sub band domain active speaker detection for audio and video conferencing applications
US10936281B2 (en) 2018-12-19 2021-03-02 International Business Machines Corporation Automatic slide page progression based on verbal and visual cues
CN111752449B (en) * 2020-05-28 2022-04-01 维沃移动通信有限公司 Screen control method and device
AU2021221388B2 (en) * 2021-08-23 2023-06-22 Canva Pty Ltd Systems and methods for remotely controlling electronic presentations
US12095828B2 (en) * 2021-12-30 2024-09-17 Harman International Industries, Incorporated In-vehicle communications and media mixing
CN115328372B (en) * 2022-07-30 2024-01-09 深圳乐播科技有限公司 Synchronous display method, synchronous display device, electronic equipment and storage medium
US20240283887A1 (en) * 2023-02-22 2024-08-22 Qualcomm Incorporated Speech-based visual indicator during communication session

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6339706B1 (en) * 1999-11-12 2002-01-15 Telefonaktiebolaget L M Ericsson (Publ) Wireless voice-activated remote control device
US7542068B2 (en) * 2000-01-13 2009-06-02 Polycom, Inc. Method and system for controlling multimedia video communication

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5767897A (en) * 1994-10-31 1998-06-16 Picturetel Corporation Video conferencing system
US6091408A (en) * 1997-08-13 2000-07-18 Z-Axis Corporation Method for presenting information units on multiple presentation units
US6185535B1 (en) * 1998-10-16 2001-02-06 Telefonaktiebolaget Lm Ericsson (Publ) Voice control of a user interface to service applications
US7233321B1 (en) * 1998-12-15 2007-06-19 Intel Corporation Pointing device with integrated audio input
US6317141B1 (en) * 1998-12-31 2001-11-13 Flashpoint Technology, Inc. Method and apparatus for editing heterogeneous media objects in a digital imaging device
US6526381B1 (en) * 1999-09-30 2003-02-25 Intel Corporation Remote control with speech recognition
EP1187046A1 (en) * 2000-09-08 2002-03-13 TELEFONAKTIEBOLAGET LM ERICSSON (publ) Method and system to inform a user about scheduling information
US6901270B1 (en) * 2000-11-17 2005-05-31 Symbol Technologies, Inc. Apparatus and method for wireless communication
CA2368404C (en) * 2001-01-18 2005-08-09 Research In Motion Limited Unified messaging system and method
US7058891B2 (en) * 2001-05-25 2006-06-06 Learning Tree International, Inc. Interface for a system of method of electronic presentations having multiple display screens with remote input
US7133062B2 (en) * 2003-07-31 2006-11-07 Polycom, Inc. Graphical user interface for video feed on videoconference terminal
US20060075429A1 (en) * 2004-04-30 2006-04-06 Vulcan Inc. Voice control of television-related information
TWI334703B (en) * 2004-09-02 2010-12-11 Inventec Multimedia & Telecom Voice-activated remote control system
US20060248210A1 (en) * 2005-05-02 2006-11-02 Lifesize Communications, Inc. Controlling video display mode in a video conferencing system
TWI297870B (en) * 2006-04-24 2008-06-11 Asustek Comp Inc Telephone system integrated with an electric device
US8358327B2 (en) * 2007-07-19 2013-01-22 Trinity Video Communications, Inc. CODEC-driven touch screen video conferencing control system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Baartman US 2007/0064095 *


Also Published As

Publication number Publication date
US9201527B2 (en) 2015-12-01
US20090254839A1 (en) 2009-10-08

Similar Documents

Publication Publication Date Title
US9201527B2 (en) Techniques to remotely manage a multimedia conference event
US9705691B2 (en) Techniques to manage recordings for multimedia conference events
US8316089B2 (en) Techniques to manage media content for a multimedia conference event
US8713440B2 (en) Techniques to manage communications resources for a multimedia conference event
CA2711463C (en) Techniques to generate a visual composition for a multimedia conference event
US20090319916A1 (en) Techniques to auto-attend multimedia conference events
US20090210490A1 (en) Techniques to automatically configure resources for a multimedia confrence event
US8275197B2 (en) Techniques to manage a whiteboard for multimedia conference events
US20100205540A1 (en) Techniques for providing one-click access to virtual conference events
US20090210491A1 (en) Techniques to automatically identify participants for a multimedia conference event
US20130063542A1 (en) System and method for configuring video data
US20160344780A1 (en) Method and system for controlling communications for video/audio-conferencing
Yang et al. Webdove: a Web-Based Collaboration System for Physical Tasks
Xiao et al. ErmdClime: enabling real-time multimedia discussion for collaborative learning in mobile environment

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:045111/0507

Effective date: 20141014

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION