US11956290B2 - Multi-media collaboration cursor/annotation control - Google Patents


Info

Publication number
US11956290B2
Authority
US
United States
Prior art keywords
cursor
user
multimedia conference
event
conference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/638,960
Other versions
US20160259522A1 (en)
Inventor
Ignacio Miranda Gonzalez
Bryan Solan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avaya Inc
Original Assignee
Avaya Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Avaya Inc filed Critical Avaya Inc
Assigned to AVAYA INC. reassignment AVAYA INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GONZALEZ, IGNACIO MIRANDA, SOLAN, BRYAN
Priority to US14/638,960
Publication of US20160259522A1
Assigned to CITIBANK, N.A., AS ADMINISTRATIVE AGENT reassignment CITIBANK, N.A., AS ADMINISTRATIVE AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVAYA INC., AVAYA INTEGRATED CABINET SOLUTIONS INC., OCTEL COMMUNICATIONS CORPORATION, VPNET TECHNOLOGIES, INC.
Assigned to AVAYA INTEGRATED CABINET SOLUTIONS INC., AVAYA INC., VPNET TECHNOLOGIES, INC., OCTEL COMMUNICATIONS LLC (FORMERLY KNOWN AS OCTEL COMMUNICATIONS CORPORATION) reassignment AVAYA INTEGRATED CABINET SOLUTIONS INC. BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001 Assignors: CITIBANK, N.A.
Assigned to GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT reassignment GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVAYA INC., AVAYA INTEGRATED CABINET SOLUTIONS LLC, OCTEL COMMUNICATIONS LLC, VPNET TECHNOLOGIES, INC., ZANG, INC.
Assigned to CITIBANK, N.A., AS COLLATERAL AGENT reassignment CITIBANK, N.A., AS COLLATERAL AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVAYA INC., AVAYA INTEGRATED CABINET SOLUTIONS LLC, OCTEL COMMUNICATIONS LLC, VPNET TECHNOLOGIES, INC., ZANG, INC.
Assigned to WILMINGTON TRUST, NATIONAL ASSOCIATION reassignment WILMINGTON TRUST, NATIONAL ASSOCIATION SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVAYA INC., AVAYA INTEGRATED CABINET SOLUTIONS LLC, AVAYA MANAGEMENT L.P., INTELLISIST, INC.
Assigned to WILMINGTON TRUST, NATIONAL ASSOCIATION, AS COLLATERAL AGENT reassignment WILMINGTON TRUST, NATIONAL ASSOCIATION, AS COLLATERAL AGENT INTELLECTUAL PROPERTY SECURITY AGREEMENT Assignors: AVAYA CABINET SOLUTIONS LLC, AVAYA INC., AVAYA MANAGEMENT L.P., INTELLISIST, INC.
Assigned to AVAYA INTEGRATED CABINET SOLUTIONS LLC, AVAYA INC., AVAYA HOLDINGS CORP., AVAYA MANAGEMENT L.P. reassignment AVAYA INTEGRATED CABINET SOLUTIONS LLC RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026 Assignors: CITIBANK, N.A., AS COLLATERAL AGENT
Assigned to WILMINGTON SAVINGS FUND SOCIETY, FSB [COLLATERAL AGENT] reassignment WILMINGTON SAVINGS FUND SOCIETY, FSB [COLLATERAL AGENT] INTELLECTUAL PROPERTY SECURITY AGREEMENT Assignors: AVAYA INC., AVAYA MANAGEMENT L.P., INTELLISIST, INC., KNOAHSOFT INC.
Assigned to CITIBANK, N.A., AS COLLATERAL AGENT reassignment CITIBANK, N.A., AS COLLATERAL AGENT INTELLECTUAL PROPERTY SECURITY AGREEMENT Assignors: AVAYA INC., AVAYA MANAGEMENT L.P., INTELLISIST, INC.
Assigned to INTELLISIST, INC., AVAYA MANAGEMENT L.P., AVAYA INC., AVAYA INTEGRATED CABINET SOLUTIONS LLC reassignment INTELLISIST, INC. RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386) Assignors: WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT
Assigned to AVAYA INTEGRATED CABINET SOLUTIONS LLC, INTELLISIST, INC., AVAYA INC., AVAYA MANAGEMENT L.P. reassignment AVAYA INTEGRATED CABINET SOLUTIONS LLC RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436) Assignors: WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT
Assigned to AVAYA MANAGEMENT L.P., OCTEL COMMUNICATIONS LLC, AVAYA INC., VPNET TECHNOLOGIES, INC., INTELLISIST, INC., CAAS TECHNOLOGIES, LLC, ZANG, INC. (FORMER NAME OF AVAYA CLOUD INC.), HYPERQUALITY II, LLC, HYPERQUALITY, INC., AVAYA INTEGRATED CABINET SOLUTIONS LLC reassignment AVAYA MANAGEMENT L.P. RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001) Assignors: GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT
Assigned to AVAYA LLC reassignment AVAYA LLC (SECURITY INTEREST) GRANTOR'S NAME CHANGE Assignors: AVAYA INC.
Publication of US11956290B2
Application granted granted Critical
Assigned to WILMINGTON SAVINGS FUND SOCIETY, FSB, AS COLLATERAL AGENT reassignment WILMINGTON SAVINGS FUND SOCIETY, FSB, AS COLLATERAL AGENT INTELLECTUAL PROPERTY SECURITY AGREEMENT – SUPPLEMENT NO. 6 Assignors: AVAYA LLC, AVAYA MANAGEMENT L.P.
Assigned to CITIBANK, N.A., AS COLLATERAL AGENT reassignment CITIBANK, N.A., AS COLLATERAL AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVAYA LLC, AVAYA MANAGEMENT L.P.
Active legal-status Critical Current
Adjusted expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/403Arrangements for multi-party communication, e.g. for conferences
    • H04L65/4038Arrangements for multi-party communication, e.g. for conferences with floor control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/166Editing, e.g. inserting or deleting
    • G06F40/169Annotation, e.g. comment data or footnotes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/1066Session management
    • H04L65/1083In-session procedures
    • H04L65/1089In-session procedures by adding media; by removing media
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/401Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
    • H04L65/4015Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/403Arrangements for multi-party communication, e.g. for conferences

Definitions

  • the systems and methods disclosed herein relate to multimedia systems and in particular to control systems for multimedia communications.
  • the mouse pointer and annotations are the instinctive virtual equivalents of the laser pointer and a whiteboard marker.
  • the presenter's cursor is visible to other users and only the presenter can control the cursor or annotate a document in the presentation.
  • the remaining participants in the multimedia communication currently do not have the ability to be involved in the same manner as in a face-to-face communication.
  • a first cursor controlled by a first user in a first location is provided.
  • a first event associated with the multimedia conference is detected.
  • a user e.g., the first user or a user other than the first user
  • control of one or more cursors in the multimedia conference can be handled in various ways to enhance the multimedia conference.
  • control of the first cursor can be switched from the first user to a second user at a second location (which may or may not be the same as the first location), a second cursor can be provided to the multimedia conference that is controlled by the second user, or control of the first cursor can be merged so that the first cursor can be controlled by both the first user and the second user.
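The three control outcomes described above (switching control, adding a second cursor, and merging control) can be sketched as follows. This is an illustrative sketch only; names such as `CursorState`, `switch_control`, `add_cursor`, and `merge_control` are assumptions, not terms from the patent.

```python
class CursorState:
    """Minimal model of a conference cursor: the set of users who may move it."""

    def __init__(self, owner):
        self.owners = {owner}


def switch_control(cursor, new_user):
    """Transfer exclusive control of an existing cursor to another user."""
    cursor.owners = {new_user}
    return cursor


def add_cursor(cursors, user):
    """Provide a new cursor to the conference, controlled by the given user."""
    cursors[user] = CursorState(user)
    return cursors[user]


def merge_control(cursor, extra_user):
    """Merge control so that an additional user can also move the cursor."""
    cursor.owners.add(extra_user)
    return cursor
```

For example, switching a first user's cursor to a second user leaves the second user as the sole owner, while merging leaves both users able to control it.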
  • FIG. 1 is a block diagram of a first illustrative system for controlling a cursor in a multimedia conference.
  • FIG. 2 is a diagram of a first view of a multimedia conference that uses a single cursor.
  • FIG. 3 is a diagram of a second view of a multimedia conference that uses two or more cursors.
  • FIG. 4 is a flow diagram of a process for controlling a cursor in a multimedia conference.
  • FIG. 5 is a flow diagram of privately controlling a cursor in a multimedia conference based on a sidebar communication.
  • FIG. 1 is a block diagram of a first illustrative system 100 for controlling a cursor in a multimedia conference.
  • the first illustrative system 100 comprises communication endpoints 101 A- 101 N, a network 110 , and a communication system 120 .
  • the communication endpoints 101 A- 101 N can be or may include any device that can communicate on the network 110 and is an endpoint in a communication session, such as, a Personal Computer (PC), a telephone, a video phone, a cellular telephone, a Personal Digital Assistant (PDA), a tablet device, a notebook device, and/or the like. As shown in FIG. 1 , any number of communication endpoints 101 A- 101 N may be connected to the network 110 . In addition, one or more of the communication endpoints 101 A- 101 N may be directly connected to the communication system 120 . The communication endpoints 101 A- 101 N are typically at different locations.
  • the communication endpoint 101 A can be located in Denver, the communication endpoint 101 B can be located in New Jersey, and the communication endpoint 101 N can be located in Spain.
  • a communication endpoint 101 may have a single user associated therewith or multiple users associated therewith.
  • the latter type of communication endpoint 101 may also be referred to as a shared endpoint due to the fact that multiple users share input and/or output devices of the communication endpoint 101 .
  • the network 110 can be or may include any collection of communication equipment that can send and receive electronic communications, such as the Internet, a Wide Area Network (WAN), a Local Area Network (LAN), a Voice over IP Network (VoIP), the Public Switched Telephone Network (PSTN), a packet switched network, a circuit switched network, a cellular network, a video network, a multimedia network, a combination of these, and the like.
  • the network 110 can use a variety of electronic protocols, such as Ethernet, Internet Protocol (IP), Session Initiation Protocol (SIP), Integrated Services Digital Network (ISDN), video protocols, Extensible Markup Language (XML) Hyper Text Markup Language (HTML) Requests, web sockets, Real Time Protocol (RTP), web protocols, and/or the like.
  • the network 110 is an electronic communication network that allows for sending of messages via packets and/or circuit switched communications.
  • the network 110 comprises the PSTN (or other voice network) and a data network, such as the Internet, corporate network, and/or LAN.
  • the PSTN is used for the audio portion of a multimedia conference.
  • the Internet/corporate network/LAN is used for the multimedia portion of the multimedia conference. For example, a user may call via a telephone for the audio portion using the PSTN. The user then may use their PC for the multimedia portion/cursor control via the Internet or corporate network by accessing a web URL.
  • the communication system 120 can be or may include any hardware/software that can establish and control a multimedia conference, such as a Private Branch Exchange (PBX), a communication manager, a video switch, a session manager, and/or the like.
  • the communication system 120 further comprises a mixer 121 , a conferencing application 122 , a conference profile 124 , and a web server 125 .
  • the communication system 120 can include custom hardware that can be used in management and control of the multimedia conference.
  • the communication system 120 may comprise a field programmable gate array or application specific processor, such as a digital signaling processor.
  • the mixer 121 can be or may include any hardware/software that can mix voice, video, multimedia, and/or text communications.
  • the mixer 121 may mix audio signals/streams or video signals/streams for a discussion in a multimedia conference between users of the communication endpoints 101 A- 101 N.
  • the mixer 121 is shown as part of the communication system 120 .
  • the mixer 121 may be distributed between the communication system 120 and the communication endpoints 101 A- 101 N.
  • the conferencing application 122 can be any software/hardware that can manage a multimedia conference between the communication endpoints 101 A- 101 N.
  • a multimedia conference is a conference that provides multimedia (e.g., display of documents) to two or more users.
  • the conferencing application 122 is shown as part of the communication system 120 . However, in another embodiment, the conferencing application 122 may be distributed between the communication system 120 and the communication endpoints 101 A- 101 N.
  • the conferencing application 122 also comprises a cursor control module 123 .
  • the cursor control module 123 can be or may include any hardware/software that can control a cursor in a multimedia conference or enable various users to control the cursor in the multimedia conference.
  • the conference profile 124 can be any file or set of information that contains information for controlling aspects of a multimedia conference, such as an American Standard Code for Information Interchange (ASCII) file, an Extensible Markup Language (XML) file, a text file, a binary file, and/or the like.
  • the conference profile 124 can include user preferences and/or administrator preferences on how to control one or more cursors in a multimedia conference.
  • the web server 125 can be or may include any hardware/software that can provide web services for the multimedia conference, such as an Apache™ web server, an Nginx™ web server, a Microsoft IIS™ web server, and/or the like.
  • the web server 125 can use a variety of protocols, such as XML HTTP Requests, web sockets, Java Server pages, and/or the like.
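One plausible way the web server 125 could distribute cursor positions to the conference endpoints, e.g. over web sockets as mentioned above, is a simple fan-out to every endpoint except the sender. This sketch stubs the transport with in-memory message queues; the `ConferenceHub` class and its methods are illustrative assumptions, not part of the patent.

```python
class ConferenceHub:
    """Fan out one user's cursor position to all other conference endpoints."""

    def __init__(self):
        self.clients = {}  # user -> outbound message queue (stands in for a socket)

    def join(self, user):
        self.clients[user] = []

    def publish_cursor(self, user, x, y):
        # Broadcast the new position to every endpoint except the sender,
        # so each endpoint can render the shared cursor at (x, y).
        msg = {"user": user, "x": x, "y": y}
        for other, queue in self.clients.items():
            if other != user:
                queue.append(msg)
```

In a real deployment each queue would be a web-socket connection served by the web server 125.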
  • these elements 121 - 125 may be in one or more of the communication endpoints 101 A- 101 N.
  • a peer-to-peer environment that does not have a centralized communication system 120 .
  • FIG. 2 is a diagram of a first view 200 of a multimedia conference that uses a single cursor 220 A.
  • the first view 200 is a first example of a document 210 being displayed as part of a multimedia conference.
  • the document 210 is an individual slide in a presentation, which is an overview of revenue for the company ABC in 2014.
  • the first view 200 includes a single cursor 220 A.
  • the single cursor 220 A is shown as an arrow shaped movable icon.
  • movement of the single cursor 220 A can be controlled by two or more users of the communication endpoints 101 A- 101 N who are in the multimedia conference.
  • a first user at the communication endpoint 101 A can control the single cursor 220 A for a first period of time in the multimedia conference.
  • the second user at the communication endpoint 101 B can control the single cursor 220 A for a second period of time (that follows or coincides with the first period of time) in the multimedia conference.
  • the displayed icon for the single cursor 220 A may change during the multimedia conference.
  • the single cursor 220 A may also include a name (not shown) for the user currently controlling the single cursor 220 A (similar to the cursors 220 B and 220 C in FIG. 3 ).
  • For example, when the first user controls the single cursor 220 A, the single cursor 220 A looks as shown with the arrow icon.
  • When control switches to a different user, the shape of the single cursor 220 A can change to a different shaped icon, such as a diamond shaped icon, or the arrow may change from an upward pointing arrow to a downward pointing arrow.
  • a second icon of the single cursor 220 A may be displayed when two users can control the single cursor 220 A at the same time.
  • the single cursor 220 A's icon may change to a square or a two-headed arrow when two users can control the single cursor 220 A at the same time.
  • When the first user moves the single cursor 220 A, the first user's name may be displayed by the square single cursor 220 A.
  • When the second user moves the single cursor 220 A, the second user's name may be displayed by the square single cursor 220 A.
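The shared-cursor behavior above (the displayed name follows whichever authorized user moved the cursor last) can be sketched as follows. The `SharedCursor` class is a hypothetical illustration, not an implementation from the patent.

```python
class SharedCursor:
    """A single cursor 220 that two or more users may control at the same time."""

    def __init__(self, users):
        self.users = set(users)  # users allowed to move this cursor
        self.label = None        # name displayed next to the cursor icon
        self.pos = (0, 0)

    def move(self, user, x, y):
        if user not in self.users:
            raise PermissionError(f"{user} cannot control this cursor")
        self.pos = (x, y)
        self.label = user  # the label follows the last user who moved it
```

Moving the cursor as the first user shows the first user's name; a subsequent move by the second user replaces the label with the second user's name.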
  • FIG. 3 is a diagram of a second view of a multimedia conference that uses two or more cursors 220 .
  • the second view 300 is a second example of the document 210 being displayed as part of the multimedia conference.
  • two cursors 220 B and 220 C are provided as part of the multimedia conference. Any of the cursors 220 A, 220 B, and/or 220 C may correspond to annotators.
  • An annotator is a cursor 220 that allows a user to annotate a view of the multimedia conference 200 / 300 .
  • the cursors 220 B and 220 C have a circle for an icon.
  • the second view 300 includes a multimedia conference control section 330 .
  • the multimedia conference control section 330 includes an add cursor button 331 , a remove cursor button 332 , and a conference participants section 333 .
  • the conference participants section 333 shows each of the participants in the multimedia conference.
  • the multimedia conference has the participants John, Sally, Sue, Fred, and Jim. The black dot by John indicates that John is the currently speaking participant.
  • the add cursor button 331 allows an individual user to add a cursor 220 .
  • the remove cursor button 332 allows the individual user to remove a cursor 220 . For example, if the second view 300 was for the user John at the communication endpoint 101 A, John could add a cursor 220 that he controls by clicking the add cursor button 331 . John could remove the cursor 220 that he controls by clicking on the remove cursor button 332 .
  • the adding and removing of a cursor 220 may be accomplished in various ways in addition to using the add cursor button 331 and the remove cursor button 332 .
  • a user or moderator could select a user name in the conference participants section 333 and drag the user name onto the document 210 to add a cursor 220 for control by the selected user.
  • a user could select a menu to add and/or remove a cursor 220 for an individual user.
  • a moderator of the multimedia conference may control the cursors 220 for each participant in the multimedia conference. For example, if the moderator was John, John can add and/or remove cursors 220 for Sally, Sue, Fred, and/or Jim to use in the multimedia conference.
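The moderator behavior above can be sketched as a simple permission check: the moderator may add or remove a cursor for any participant, while an ordinary user (via the add cursor button 331 or remove cursor button 332) manages only their own. The `CursorManager` class and its method names are illustrative assumptions.

```python
class CursorManager:
    """Moderator-managed cursors for a multimedia conference."""

    def __init__(self, moderator):
        self.moderator = moderator
        self.cursors = {}  # user -> cursor identifier

    def _check(self, requester, for_user):
        if requester != self.moderator and requester != for_user:
            raise PermissionError("only the moderator may manage cursors for others")

    def add_cursor(self, requester, for_user):
        self._check(requester, for_user)
        self.cursors[for_user] = f"cursor-{for_user}"

    def remove_cursor(self, requester, for_user):
        self._check(requester, for_user)
        self.cursors.pop(for_user, None)
```

For example, if John is the moderator, `add_cursor("John", "Sally")` succeeds, while Sue attempting to add a cursor for Fred raises a permission error.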
  • Although FIG. 3 only shows two cursors 220 B and 220 C, additional cursors 220 may be presented and controlled by other users in the multimedia conference.
  • the icon for the cursors 220 B and 220 C may be displayed differently depending on implementation.
  • the icon for the cursor 220 B may be different from the icon for the cursor 220 C based on defined user preferences.
  • the functionality of the cursors 220 B and 220 C may be different.
  • the cursor 220 B may always be controlled by a single user (John), while the cursor 220 C may be controlled by multiple users.
  • the cursor 220 C may be controlled by the user Sally during the first ten slides of a presentation and then controlled by a third user Fred for the last ten slides of the presentation.
  • the control of the cursor 220 B and 220 C may be defined in the conference profile 124 or administered at the start of the multimedia conference.
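The per-slide control described above (e.g. Sally for the first ten slides, Fred for the last ten) could be expressed in the conference profile 124 as a schedule mapping slide ranges to controlling users. The concrete format below is an assumption; the patent only states that control may be defined in the conference profile.

```python
# Hypothetical profile entries: slide ranges mapped to the controlling user.
profile = [
    {"slides": range(1, 11), "controller": "Sally"},   # slides 1-10
    {"slides": range(11, 21), "controller": "Fred"},   # slides 11-20
]


def controller_for_slide(profile, slide):
    """Return who controls the cursor on the given slide, or None."""
    for entry in profile:
        if slide in entry["slides"]:
            return entry["controller"]
    return None
```

The conferencing application 122 could consult such a schedule each time the presentation advances to decide whether control of the cursor 220 should switch.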
  • FIG. 4 is a flow diagram of a process for controlling one or multiple cursors 220 in a multimedia conference.
  • the communication endpoints 101 A- 101 N, the communication system 120 , the mixer 121 , the conferencing application 122 , the cursor control module 123 , and the web server 125 are stored-program-controlled entities, such as a computer or processor, which perform the methods of FIGS. 4 - 5 and the processes described herein by executing program instructions stored in a tangible computer readable storage medium, such as a memory or disk.
  • the process starts in step 400 .
  • the mixer 121 establishes voice, video, and/or text portion of the multimedia conference between the communication endpoints 101 A- 101 N in step 402 .
  • the multimedia portion of the conference is established by the conferencing application 122 .
  • the process of establishing the voice/video/text and multimedia can occur at the same time or separately.
  • the communication endpoints 101 A- 101 N can all go to a Uniform Resource Identifier (URI) provided by the web server 125 to be connected to the multimedia conference (for both audio/video and multimedia).
  • the communication endpoints 101 A- 101 N can make a voice call to a conferencing telephone number to have the mixer 121 establish a voice conference.
  • the users can then receive the multimedia portion of the conference via the URI served by the web server 125 .
  • the cursor control module 123 provides a first cursor 220 in the multimedia conference controlled by a first user at the communication endpoint 101 A in step 404 (e.g., as shown in FIG. 2 ).
  • the conferencing application 122 determines in step 406 if an event was detected.
  • An event may correspond to one or a series of actions or occurrences associated with the multimedia conference.
  • the event may include one or more events that are defined in the conference profile 124 . For example, the event can be based on detection of who is currently speaking in the multimedia conference.
  • the conferencing application 122 will detect the event (via the mixer 121 ) in step 406 .
  • the conferencing application 122 can detect who is currently speaking in various ways. For instance, the conferencing application 122 can analyze voice data received via Real Time Protocol (RTP) or via the Public Switched Telephone Network (PSTN) from a user to determine if the user is speaking over a defined threshold.
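One common way to decide whether a user "is speaking over a defined threshold" is to compare the short-term energy of an audio frame against a threshold; the patent does not prescribe a specific algorithm, so this is an illustrative sketch. Samples are assumed to be normalized floats in [-1.0, 1.0], and the threshold value is arbitrary.

```python
def is_speaking(samples, threshold=0.02):
    """Return True if the mean squared energy of an audio frame exceeds
    the defined threshold, indicating the user is likely speaking."""
    if not samples:
        return False
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold
```

The conferencing application 122 would run a check like this on audio frames received via RTP (or decoded from the PSTN leg) for each participant.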
  • RTP Real Time Protocol
  • PSTN Public Switched Telephone Network
  • the conference profile 124 may contain individual user profiles.
  • the individual user profiles can be defined by individual users. For example, a user may be able to define different communication endpoints 101 that will be used during the multimedia conference, such as a PC being used for the multimedia portion/mouse control and a telephone being used for the voice portion of the multimedia conference.
  • the conferencing application 122 can detect the user's voice level (e.g., via the PSTN) to provide cursor 220 control on a separate communication device (e.g., the user's PC) connected via a different network, such as the Internet.
  • a period of time may be associated with an event.
  • the event may be to switch the cursor 220 to the currently speaking user.
  • for example, the system may wait a defined number of seconds after a user speaks before switching control of the cursor 220 to the new user.
  • the event can be that a user is speaking a defined percentage of time in the multimedia conference. For example, if the user at the communication endpoint 101 B is speaking 30% of the time in the multimedia conference, the user of communication endpoint 101 B will be provided a second cursor 220 (e.g., as shown in FIG. 3 ).
  • the event can be based on who is speaking the most in a defined period. For example, if John has spoken the most over the last five minutes, then John will control the cursor 220 .
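Finding who has spoken the most over a recent window (e.g. the last five minutes) can be sketched as summing each user's talk time inside the window. The function name and the `(user, start, end)` event representation are assumptions for illustration.

```python
from collections import defaultdict


def dominant_speaker(speech_events, now, window=300.0):
    """speech_events: list of (user, start, end) talk spurts in seconds.
    Return the user with the most talk time within [now - window, now]."""
    cutoff = now - window
    totals = defaultdict(float)
    for user, start, end in speech_events:
        # Count only the portion of each spurt that overlaps the window.
        overlap = min(end, now) - max(start, cutoff)
        if overlap > 0:
            totals[user] += overlap
    return max(totals, key=totals.get) if totals else None
```

A conferencing application could re-evaluate this on a timer and switch cursor control when the dominant speaker changes.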
  • the event can be based on amplitude. For instance, the loudest (or quietest) person may control the cursor 220 .
  • the event may be a voice command spoken by a user in the multimedia conference.
  • the user at the communication endpoint 101 B can say “add cursor” or “remove cursor” to have a second cursor 220 added or removed from the multimedia conference.
  • the event may be a user entering one or more Dual Tone Multi-Frequency (DTMF) tones during the multimedia conference.
  • the user may enter *9 to add a cursor 220 controlled by the user to the multimedia conference or *8 to remove a cursor 220 .
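The DTMF sequences above (*9 to add a cursor, *8 to remove one) amount to a small command table. This is a minimal sketch; the table and handler names are illustrative, and a real system would collect digits from the telephony leg before looking them up.

```python
# Mapping of DTMF sequences to cursor actions, per the example above.
DTMF_COMMANDS = {
    "*9": "add_cursor",
    "*8": "remove_cursor",
}


def handle_dtmf(digits, user):
    """Translate a collected DTMF sequence into a cursor-control request,
    or return None for unrecognized sequences."""
    action = DTMF_COMMANDS.get(digits)
    if action is None:
        return None
    return {"action": action, "user": user}
```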
  • the event may be based on an agenda for the multimedia conference.
  • the agenda may be for a slide presentation in the multimedia conference.
  • the agenda may define a lecture period where only the moderator controls a first cursor 220 and a group discussion period where the group can control the first cursor 220 and/or a second cursor 220 .
  • Other events may include a gesture, such as a user raising their hand or shaking their head.
  • the event may be that the user is looking at a specific area of a presented document, the user pointing to the specific area of the presented document 210 , a stress level or mood of the user, the presence of the user being on camera (e.g., being in view of the user's camera), a command from the user, a sound level of the user (e.g., how loudly or softly the user is speaking), and/or the like.
  • Other examples of events that can control a cursor 220 can include a user walking out of a conference room or office, the user walking away from the multimedia conference, the user not paying attention to the multimedia conference, and/or the like.
  • the event may be based on monitoring brain activity of a user, such as a user with disabilities. Based on detection of the brain activity of the user, a cursor 220 can be added into the multimedia conference. All the above events can be associated with switching control of a cursor 220 , adding a cursor 220 , merging control of a cursor 220 , and/or removing a cursor 220 .
  • step 408 the process goes to step 408 .
  • the conferencing application 122 determines in step 408 if the multimedia conference is over (e.g., if the moderator ends the multimedia conference). If the multimedia conference is over in step 408 , the process ends in step 410 . Otherwise, if the multimedia conference is not over in step 408 , the process goes to step 406 .
  • If an event is detected in step 406 , the cursor control module 123 determines in step 412 if the event results in switching control of the cursor 220 to a different user. If the event results in switching control of the cursor 220 to the different user in step 412 , the cursor control module 123 switches control of the cursor 220 to the different user in step 414 .
  • the second user at the communication endpoint 101 B may speak a voice command, select a button, select a menu, or enter one or more DTMF tones to switch control of the cursor 220 to the second user. The process then goes back to step 406 to check for another event.
  • the cursor control module 123 determines if the event results in providing a new cursor 220 in step 416 . If the event results in providing a new cursor 220 in step 416 , the cursor control module 123 provides the new cursor 220 in step 418 and the process goes to step 406 .
  • the event that results in providing a new cursor 220 can be based on various criteria. For example, the cursor control module 123 can provide a new cursor 220 for a second user based on the second user speaking more loudly or becoming agitated.
  • the cursor control module 123 determines, in step 420 , if the event results in removing one or more cursors 220 . If the event does result in removing the one or more cursors 220 , in step 420 , the cursor control module 123 removes the one or more cursors 220 in step 422 .
  • the multimedia conference may be in a group discussion mode where each user in the multimedia conference can control an individual cursor 220 . Based on a command from the presenter, the cursor control module 123 can remove all the cursors 220 for the group in step 422 and only leave the cursor 220 of the moderator. The process then goes to step 406 .
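The "end group discussion" behavior above (on the presenter's command, every cursor except the moderator's is removed) can be sketched in a few lines. The function name and cursor representation are assumptions.

```python
def end_group_discussion(cursors, moderator):
    """Remove all cursors from the conference except the moderator's.

    cursors: dict mapping user name -> cursor object; modified in place."""
    for user in list(cursors):  # copy keys since we mutate during iteration
        if user != moderator:
            del cursors[user]
    return cursors
```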
  • the cursor control module 123 manages the cursor 220 as defined in step 424 and then goes to step 406 .
  • the cursor control module 123 can merge control of inputs received from two cursors 220 into a single cursor 220 .
  • the cursor control module 123 may remove a second cursor 220 controlled by a second user and make the first cursor 220 controllable by the first and second users at the same time.
  • the cursor control module 123 can add a second user to control a single cursor 220 .
  • Other types of events may be to unmerge control of a cursor 220 by presenting a new cursor 220 and changing control of the previously merged cursor 220 .
  • an event may be a command to erase all annotations associated with one or more cursors 220 .
  • an individual user may send a command via DTMF or voice to erase all annotations made by the user.
  • a moderator may send a command via DTMF or voice to erase all annotations made by all users in the multimedia conference.
  • the event may result in highlighting a particular user's annotations or changing colors of annotations.
  • Other events may be to change a cursor 220 , a color of a cursor 220 , and/or to animate a cursor 220 .
  • Still over events may be to record the annotations and/or cursor 220 movements as part of the multimedia conference. When the annotations and/or cursor 220 movements are recorded, information regarding who is controlling the cursor can also be captured and made available during playback of the multimedia conference.
  • an event may results in two or more of the steps 414 , 418 , 422 , and 424 being implemented based on a single event.
  • an event in an agenda e.g., going to group discussion mode
  • FIG. 5 is a flow diagram of privately controlling a cursor 220 in a multimedia conference based on a sidebar communication. The process of FIG. 5 goes between steps 406 and 408 of FIG. 4 . If an event is not detected in step 406 , the conferencing application 122 checks, in step 502 , if a request to establish a sidebar communication has been received.
  • a sidebar communication can be a separate communication that occurs during the multimedia conference between two or more users that are involved in the multimedia conference. For instance, a user at the communication endpoint 101 A may have an Instant Messaging sidebar communication session with a user at the communication endpoint 101 B during a multimedia conference between the users at the communication endpoints 101 A- 101 N.
  • the sidebar communication may be a voice communication, a video communication, a text communication, and/or the like.
  • step 502 If a request to establish a sidebar communication is not received in step 502 , the process goes to step 408 . Otherwise, if a request to establish a sidebar communication has been received in step 502 , the conferencing application 122 establishes the sidebar communication session in step 504 .
  • the conferencing application 122 determines in step 506 if a request to setup private control of the cursor(s) 220 has been received.
  • a request to setup private control of the cursor(s) 220 can be any type of request, such as a DTMF tone(s), a voice command, a command from a graphical user interface, and/or the like.
  • a user via a graphical user interface, may indicate to provide private control when establishing the sidebar communication.
  • Private control of a cursor 220 can be for an individual user or two or more users to control one or more cursors 220 where at least one other participant in the conference cannot see the cursor(s) 220 and/or annotations associated with the cursor(s) 220 .
  • step 506 If a request to setup private control of the cursor(s) 220 has not been received in step 506 , the process goes to step 408 . Otherwise, if a request to setup private control of the cursor(s) 220 has been received in step 506 , the cursor control module 123 sets up private control of the cursor(s) 220 per defined rules (or user commands) in step 508 .
  • the rules can be stored in the conference profile 124 .
  • the rules can be based on a variety of conditions, such as, which users can see the cursor(s) 220 (i.e., only those in the sidebar communication), which users can annotate, if two or more users can control a single cursor 220 , if each user has their own cursor 220 , and/or the like.
  • Step 510 is a separate thread that is spun off and checks for the sidebar communication to end. Once the sidebar session ends in step 510 , the process goes to step 408 .
  • the private control of the sidebar communication may end before the sidebar communication ends.
  • the private control may be terminated by a user, even though the sidebar communication is still ongoing.
  • private control of one or more cursors 220 may exist while one or more other cursors 220 may not be under private control.
  • one cursor is under private control
  • another cursor controlled by a different or the same user may be displayed to all users in the multimedia conference.

Abstract

When a multimedia conference between multiple users at multiple locations is established, a first cursor controlled by a first user in a first location is provided. A first event associated with the multimedia conference is detected. In response to detecting the first event associated with the multimedia conference, control of cursors in the multimedia conference can be handled in various ways to enhance the multimedia conference. For example, control of the first cursor can be switched from the first user to a second user at a second location, a second cursor can be provided to the multimedia conference that is controlled by the second user, or control of the first cursor can be merged so that the first cursor can be controlled by both the first user and the second user. This allows for enhanced control of the cursor in the multimedia conference that more closely resembles an in-person conference.

Description

TECHNICAL FIELD
The systems and methods disclosed herein relate to multimedia systems and in particular to control systems for multimedia communications.
BACKGROUND
In face-to-face communications, people naturally and instinctively use not only voice, but also gestures and movements, such as pointing, to direct a listener's focus while speaking. For example, each person involved in a face-to-face communication may use their hand or a laser pointer to draw attention to documents or diagrams being discussed during a presentation.
In multimedia conferences, the mouse pointer and annotations are the instinctive virtual equivalents of the laser pointer and a whiteboard marker. However, in current systems, only the presenter's cursor is visible to other users and only the presenter can control the cursor or annotate a document in the presentation. The remaining participants in the multimedia communication currently do not have the ability to be involved in the same manner as in a face-to-face communication.
SUMMARY
Systems and methods are provided to solve these and other problems and disadvantages of the prior art. When a multimedia conference between multiple users at multiple locations is established, a first cursor controlled by a first user in a first location is provided. A first event associated with the multimedia conference is detected. For example, a user (e.g., the first user or a user other than the first user) may issue a command via a menu in the multimedia conference to control the cursor. In response to detecting the first event associated with the multimedia conference, control of one or more cursors in the multimedia conference can be handled in various ways to enhance the multimedia conference. For example, control of the first cursor can be switched from the first user to a second user at a second location (which may or may not be the same as the first location), a second cursor can be provided to the multimedia conference that is controlled by the second user, or control of the first cursor can be merged so that the first cursor can be controlled by both the first user and the second user. These options allow for enhanced control of the cursor in the multimedia conference that more closely resembles an in-person meeting.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a first illustrative system for controlling a cursor in a multimedia conference.
FIG. 2 is a diagram of a first view of a multimedia conference that uses a single cursor.
FIG. 3 is a diagram of a second view of a multimedia conference that uses two or more cursors.
FIG. 4 is a flow diagram of a process for controlling a cursor in a multimedia conference.
FIG. 5 is a flow diagram of privately controlling a cursor in a multimedia conference based on a sidebar communication.
DETAILED DESCRIPTION
FIG. 1 is a block diagram of a first illustrative system 100 for controlling a cursor in a multimedia conference. The first illustrative system 100 comprises communication endpoints 101A-101N, a network 110, and a communication system 120.
The communication endpoints 101A-101N can be or may include any device that can communicate on the network 110 and is an endpoint in a communication session, such as, a Personal Computer (PC), a telephone, a video phone, a cellular telephone, a Personal Digital Assistant (PDA), a tablet device, a notebook device, and/or the like. As shown in FIG. 1 , any number of communication endpoints 101A-101N may be connected to the network 110. In addition, one or more of the communication endpoints 101A-101N may be directly connected to the communication system 120. The communication endpoints 101A-101N are typically at different locations. For example, the communication endpoint 101A can be located in Denver, the communication endpoint 101B can be located in New Jersey, and the communication endpoint 101N can be located in Spain. A communication endpoint 101 may have a single user associated therewith or multiple users associated therewith. The latter type of communication endpoint 101 may also be referred to as a shared endpoint due to the fact that multiple users share input and/or output devices of the communication endpoint 101.
The network 110 can be or may include any collection of communication equipment that can send and receive electronic communications, such as the Internet, a Wide Area Network (WAN), a Local Area Network (LAN), a Voice over IP (VoIP) network, the Public Switched Telephone Network (PSTN), a packet switched network, a circuit switched network, a cellular network, a video network, a multimedia network, a combination of these, and the like. The network 110 can use a variety of electronic protocols, such as Ethernet, Internet Protocol (IP), Session Initiation Protocol (SIP), Integrated Services Digital Network (ISDN), video protocols, Extensible Markup Language (XML) requests, Hyper Text Markup Language (HTML) requests, web sockets, Real Time Protocol (RTP), web protocols, and/or the like. Thus, the network 110 is an electronic communication network that allows for sending of messages via packets and/or circuit switched communications.
In one embodiment, the network 110 comprises the PSTN (or other voice network) and a data network, such as the Internet, corporate network, and/or LAN. The PSTN is used for the audio portion of a multimedia conference. The Internet/corporate network/LAN is used for the multimedia portion of the multimedia conference. For example, a user may call via a telephone for the audio portion using the PSTN. The user then may use their PC for the multimedia portion/cursor control via the Internet or corporate network by accessing a web URL.
The communication system 120 can be or may include any hardware/software that can establish and control a multimedia conference, such as a Private Branch Exchange (PBX), a communication manager, a video switch, a session manager, and/or the like. The communication system 120 further comprises a mixer 121, a conferencing application 122, a conference profile 124, and a web server 125. The communication system 120 can include custom hardware that can be used in management and control of the multimedia conference. For example, the communication system 120 may comprise a field programmable gate array or an application specific processor, such as a digital signal processor.
The mixer 121 can be or may include any hardware/software that can mix voice, video, multimedia, and/or text communications. For example, the mixer 121 may mix audio signals/streams or video signals/streams for a discussion in a multimedia conference between users of the communication endpoints 101A-101N. In FIG. 1 , the mixer 121 is shown as part of the communication system 120. However, in another embodiment, the mixer 121 may be distributed between the communication system 120 and the communication endpoints 101A-101N.
The conferencing application 122 can be any software/hardware that can manage a multimedia conference between the communication endpoints 101A-101N. A multimedia conference is a conference that provides multimedia (e.g., display of documents) to two or more users. The conferencing application 122 is shown as part of the communication system 120. However, in another embodiment, the conferencing application 122 may be distributed between the communication system 120 and the communication endpoints 101A-101N.
The conferencing application 122 also comprises a cursor control module 123. The cursor control module 123 can be or may include any hardware/software that can control a cursor in a multimedia conference or enable various users to control the cursor in the multimedia conference.
The conference profile 124 can be any file or set of information that contains information for controlling aspects of a multimedia conference, such as an American Standard Code for Information Exchange (ASCII) file, an Extensible Markup Language (XML) file, a text file, a binary file, and/or the like. The conference profile 124 can include user preferences and/or administrator preferences on how to control one or more cursors in a multimedia conference.
The web server 125 can be or may include any hardware/software that can provide web services for the multimedia conference, such as an Apache™ web server, a Nginx™ web server, a Microsoft-IIS™ web server, and/or the like. The web server 125 can use a variety of protocols, such as XML HTTP Requests, web sockets, Java Server Pages, and/or the like.
The above descriptions assume that the mixer 121, the conferencing application 122, the cursor control module 123, the conference profile 124, and the web server 125 are part of a centralized communication system 120. However, in another embodiment, these elements 121-125 may reside in one or more of the communication endpoints 101A-101N, for example, in a peer-to-peer environment that does not have a centralized communication system 120.
FIG. 2 is a diagram of a first view 200 of a multimedia conference that uses a single cursor 220A. The first view 200 is a first example of a document 210 being displayed as part of a multimedia conference. In this example, the document 210 is an individual slide in a presentation, which is an overview of revenue for the company ABC in 2014. The first view 200 includes a single cursor 220A. The single cursor 220A is shown as an arrow shaped movable icon. In this embodiment, movement of the single cursor 220A can be controlled by two or more users of the communication endpoints 101A-101N who are in the multimedia conference. For example, a first user at the communication endpoint 101A can control the single cursor 220A for a first period of time in the multimedia conference. Based on a defined event, such as a second user starting to speak, the second user at the communication endpoint 101B can control the single cursor 220A for a second period of time (that follows or coincides with the first period of time) in the multimedia conference.
The displayed icon for the single cursor 220A may change during the multimedia conference. For example, if the single cursor 220A can only be controlled by one user at a time, the single cursor 220A may also include a name (not shown) for the user currently controlling the single cursor 220A (similar to the cursors 220B and 220C in FIG. 3 ). Alternatively, when the first user controls the single cursor 220A, the single cursor 220A looks as shown with the arrow icon. When the second user controls the single cursor 220A, the shape of the single cursor 220A can change to a different shaped icon, such as a diamond shaped icon or the arrow may change from an upward pointing arrow to a downward pointing arrow.
In another embodiment, a second icon of the single cursor 220A may be displayed when two users can control the single cursor 220A at the same time. For example, the single cursor 220A's icon may change to a square or a two-headed arrow when two users can control the single cursor 220A at the same time. When a first user moves the single cursor 220A, the first user's name may be displayed by the square single cursor 220A. When the second user moves the single cursor 220A, the second user's name may be displayed by the square single cursor 220A.
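The icon behavior described above can be sketched as a small selection routine. This is an illustrative sketch only: the function name, the icon identifiers, and the dictionary shape are assumptions, not part of the disclosed system.

```python
def cursor_icon(controllers, active_controller=None):
    """Pick an icon and label for a shared cursor 220 (illustrative sketch).

    controllers: list of user names that can currently move the cursor.
    active_controller: the user who most recently moved the cursor.
    """
    if len(controllers) > 1:
        # Two or more users share control: show a merged-control icon
        # labeled with whoever moved the cursor last.
        return {"shape": "two_headed_arrow", "label": active_controller or ""}
    if controllers:
        # A single controller: plain arrow labeled with that user's name.
        return {"shape": "arrow", "label": controllers[0]}
    # No controller: the cursor is not displayed.
    return {"shape": "hidden", "label": ""}
```

A client could call this each time control changes, so the displayed icon tracks the current controller(s) as described for the single cursor 220A.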
FIG. 3 is a diagram of a second view of a multimedia conference that uses two or more cursors 220. The second view 300 is a second example of the document 210 being displayed as part of the multimedia conference. In this embodiment, two cursors 220B and 220C are provided as part of the multimedia conference. Any of the cursors 220A, 220B, and/or 220C may correspond to annotators. An annotator is a cursor 220 that allows a user to annotate a view of the multimedia conference 200/300. In FIG. 3, the cursors 220B and 220C have a circle for an icon.
In addition, the second view 300 includes a multimedia conference control section 330. The multimedia conference control section 330 includes an add cursor button 331, a remove cursor button 332, and a conference participants section 333. The conference participants section 333 shows each of the participants in the multimedia conference. The multimedia conference has the participants John, Sally, Sue, Fred, and Jim. The black dot by John indicates that John is the currently speaking participant.
The add cursor button 331 allows an individual user to add a cursor 220. The remove cursor button 332 allows the individual user to remove a cursor 220. For example, if the second view 300 was for the user John at the communication endpoint 101A, John could add a cursor 220 that he controls by clicking the add cursor button 331. John could remove the cursor 220 that he controls by clicking on the remove cursor button 332.
The adding and removing of a cursor 220 may be accomplished in various ways in addition to using the add cursor button 331 and the remove cursor button 332. For example, a user or moderator could select a user name in the conference participants section 333 and drag the user name onto the document 210 to add a cursor 220 for control by the selected user. Alternatively, a user could select a menu to add and/or remove a cursor 220 for an individual user.
In one embodiment, a moderator of the multimedia conference may control the cursors 220 for each participant in the multimedia conference. For example, if the moderator was John, John can add and/or remove cursors 220 for Sally, Sue, Fred, and/or Jim to use in the multimedia conference.
In FIG. 3 , the cursor 220B is controlled by John (as indicated by John's name by the cursor 220B). John has annotated the item “COMPUTER REVENUE FOR 2014=$1.3 BILLION” in the document 210 as indicated by annotation 221B. The cursor 220C is controlled by Sally (as indicated by Sally's name by the cursor 220C). Sally has drawn an annotation 221C for the item “TELEVISION REVENUE FOR 2014=$1.35 BILLION” that states that this is “Too Low.” Although FIG. 3 only shows two cursors 220B and 220C, additional cursors 220 may be presented and controlled by other users in the multimedia conference.
The icons for the cursors 220B and 220C may be displayed differently depending on implementation. For example, the icon for the cursor 220B may be different from the icon for the cursor 220C based on defined user preferences.
In addition, the functionality of the cursors 220B and 220C may be different. For example, the cursor 220B may always be controlled by a single user (John), while the cursor 220C may be controlled by multiple users. The cursor 220C may be controlled by the user Sally during the first ten slides of a presentation and then controlled by a third user Fred for the last ten slides of the presentation. The control of the cursor 220B and 220C may be defined in the conference profile 124 or administered at the start of the multimedia conference.
FIG. 4 is a flow diagram of a process for controlling one or multiple cursors 220 in a multimedia conference. Illustratively, the communication endpoints 101A-101N, the communication system 120, the mixer 121, the conferencing application 122, the cursor control module 123, and the web server 125 are stored-program-controlled entities, such as a computer or processor, which performs the method of FIGS. 4-5 and the processes described herein by executing program instructions stored in a tangible computer readable storage medium, such as a memory or disk. Although the methods described in FIGS. 4-5 are shown in a specific order, one of skill in the art would recognize that the steps in FIGS. 4-5 may be implemented in different orders and/or be implemented in a multi-threaded environment. Moreover, various steps may be omitted or added based on implementation.
The process starts in step 400. The mixer 121 establishes the voice, video, and/or text portion of the multimedia conference between the communication endpoints 101A-101N in step 402. The multimedia portion of the conference is established by the conferencing application 122. The processes of establishing the voice/video/text and multimedia portions can occur at the same time or separately. For example, the communication endpoints 101A-101N can all go to a Universal Resource Identifier (URI) provided by the web server 125 to be connected to the multimedia conference (for both audio/video and multimedia). Alternatively, the communication endpoints 101A-101N can make a voice call to a conferencing telephone number to have the mixer 121 establish a voice conference. The users, via the URI served by the web server 125, can then receive the multimedia portion of the conference via the web server 125.
The cursor control module 123 provides a first cursor 220 in the multimedia conference controlled by a first user at the communication endpoint 101A in step 404 (e.g., as shown in FIG. 2 ). The conferencing application 122 determines in step 406 if an event was detected. An event may correspond to one or a series of actions or occurrences associated with the multimedia conference. The event may include one or more events that are defined in the conference profile 124. For example, the event can be based on detection of who is currently speaking in the multimedia conference. When a new user speaks in the multimedia conference, the conferencing application 122 will detect the event (via the mixer 121) in step 406.
The conferencing application 122 can detect who is currently speaking in various ways. For instance, the conferencing application 122 can analyze voice data received via Real Time Protocol (RTP) or via the Public Switched Telephone Network (PSTN) from a user to determine if the user is speaking over a defined threshold.
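A minimal sketch of the "speaking over a defined threshold" check could compare the root-mean-square amplitude of decoded audio samples against a threshold. The function name, the PCM-sample input, and the threshold value are assumptions for illustration; a real conferencing application 122 would likely use a more robust voice-activity detector.

```python
import math

def is_speaking(samples, threshold=500.0):
    """Return True if the RMS amplitude of the PCM samples exceeds a
    defined threshold (a simple stand-in for the voice-activity check
    the conferencing application 122 might perform on received audio)."""
    if not samples:
        return False
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return rms > threshold
```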
The conference profile 124 may contain individual user profiles. The individual user profiles can be defined by individual users. For example, a user may be able to define different communication endpoints 101 that will be used during the multimedia conference, such as a PC being used for the multimedia portion/mouse control and a telephone being used for the voice portion of the multimedia conference. The conferencing application 122 can detect the user's voice level (e.g., via the PSTN) to provide cursor 220 control on a separate communication device (e.g., the user's PC) connected via a different network, such as the Internet.
A period of time may be associated with an event. For example, the event may be to switch the cursor 220 to the currently speaking user. However, control of the cursor 220 may not switch to the new user until that user has been speaking for a defined number of seconds.
The event can be that a user is speaking a defined percentage of time in the multimedia conference. For example, if the user at the communication endpoint 101B is speaking 30% of the time in the multimedia conference, the user of the communication endpoint 101B will be provided a second cursor 220 (e.g., as shown in FIG. 3).
The event can be based on who is speaking the most in a defined period. For example, if John has spoken the most over the last five minutes, then John will control the cursor 220. The event can also be based on amplitude. For instance, the loudest (or quietest) person may control the cursor 220.
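The "spoke the most in the last five minutes" criterion above can be sketched by tallying per-user speech intervals inside a trailing window. The interval representation and the window length are illustrative assumptions.

```python
from collections import defaultdict

def dominant_speaker(speech_events, window_end, window_secs=300):
    """Return the user with the most speaking time inside the trailing
    window (e.g., the last five minutes). speech_events is a list of
    (user, start, end) intervals in seconds; this shape is assumed."""
    window_start = window_end - window_secs
    totals = defaultdict(float)
    for user, start, end in speech_events:
        # Clip each interval to the window and accumulate the overlap.
        overlap = min(end, window_end) - max(start, window_start)
        if overlap > 0:
            totals[user] += overlap
    return max(totals, key=totals.get) if totals else None
```

The cursor control module 123 could then hand control of the cursor 220 to whichever user this returns after each detection pass.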
The event may be a voice command spoken by a user in the multimedia conference. For example, the user at the communication endpoint 101B can say “add cursor” or “remove cursor” to have a second cursor 220 added or removed from the multimedia conference.
The event may be a user entering one or more Dual Tone Multi-Frequency (DTMF) tones during the multimedia conference. For example, the user may enter *9 to add a cursor 220 controlled by the user to the multimedia conference or *8 to remove a cursor 220.
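The *9/*8 example above could be handled by a small DTMF dispatch routine. The *9 and *8 codes come from the text; the dictionary-based cursor store and the function name are illustrative assumptions.

```python
def handle_dtmf(digits, user, cursors):
    """Map a DTMF sequence from a user to a cursor command (sketch).

    cursors: dict mapping user name -> cursor record (assumed shape).
    """
    if digits == "*9":
        # *9 adds a cursor controlled by the entering user.
        cursors[user] = {"owner": user, "visible": True}
    elif digits == "*8":
        # *8 removes that user's cursor, if one exists.
        cursors.pop(user, None)
    # Unrecognized sequences leave the cursor table unchanged.
    return cursors
```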
In one embodiment, the event may be based on an agenda for the multimedia conference. For example, the agenda may be for a slide presentation in the multimedia conference. Alternatively, the agenda may define a lecture period where only the moderator controls a first cursor 220 and a group discussion period where the group can control the first cursor 220 and/or a second cursor 220.
Other events may include a gesture, such as a user raising their hand or shaking their head. The event may be that the user is looking at a specific area of a presented document 210, the user pointing to the specific area of the presented document 210, a stress level or mood of the user, the presence of the user being on camera (e.g., being in view of the user's camera), a command from the user, a sound level of the user (e.g., how loudly or softly the user is speaking), and/or the like.
Other examples of events that can control a cursor 220 can include a user walking out of a conference room or office, the user walking away from the multimedia conference, the user not paying attention to the multimedia conference, and/or the like.
In one embodiment, the event may be based on monitoring brain activity of a user, such as a user with disabilities. Based on detection of the brain activity of the user, a cursor 220 can be added into the multimedia conference. All the above events can be associated with switching control of a cursor 220, adding a cursor 220, merging control of a cursor 220, and/or removing a cursor 220.
If a defined event is not detected in step 406, the process goes to step 408. The conferencing application 122 determines in step 408 if the multimedia conference is over (e.g., if the moderator ends the multimedia conference). If the multimedia conference is over in step 408, the process ends in step 410. Otherwise, if the multimedia conference is not over in step 408, the process goes to step 406.
If an event is detected in step 406, the cursor control module 123 determines, in step 412, if the event results in switching control of the cursor 220 to a different user. If the event results in switching control of the cursor 220 to the different user in step 412, the cursor control module 123 switches control of the cursor 220 to the different user in step 414. For example, the second user at the communication endpoint 101B may speak a voice command, select a button, select a menu, or enter one or more DTMF tones to switch control of the cursor 220 to the second user. The process then goes back to step 406 to check for another event.
If the event does not result in switching control of the cursor 220 to a different user in step 412, the cursor control module 123 determines if the event results in providing a new cursor 220 in step 416. If the event results in providing a new cursor 220 in step 416, the cursor control module 123 provides the new cursor 220 in step 418 and the process goes to step 406. The event that results in providing a new cursor 220 can be based on various criteria. For example, the cursor control module 123 can provide a new cursor 220 for a second user based on the second user speaking more loudly or becoming agitated.
If the event does not result in providing a new cursor 220 in step 416, the cursor control module 123 determines, in step 420, if the event results in removing one or more cursors 220. If the event does result in removing the one or more cursors 220, in step 420, the cursor control module 123 removes the one or more cursors 220 in step 422. For example, the multimedia conference may be in a group discussion mode where each user in the multimedia conference can control an individual cursor 220. Based on a command from the presenter, the cursor control module 123 can remove all the cursors 220 for the group in step 422 and only leave the cursor 220 of the moderator. The process then goes to step 406.
If the event does not result in removing the one or more cursors 220, the cursor control module 123 manages the cursor 220 as defined in step 424 and then goes to step 406. For example, the cursor control module 123 can merge control of inputs received from two cursors 220 into a single cursor 220. To merge control, the cursor control module 123 may remove a second cursor 220 controlled by a second user and make the first cursor 220 controllable by the first and second users at the same time. Alternatively, the cursor control module 123 can add a second user to control a single cursor 220. Other types of events may be to unmerge control of a cursor 220 by presenting a new cursor 220 and changing control of the previously merged cursor 220.
Other events may be related to controlling the cursor 220 and/or managing usage of the cursor 220. For example, an event may be a command to erase all annotations associated with one or more cursors 220. For example, an individual user may send a command via DTMF or voice to erase all annotations made by the user. Alternatively, a moderator may send a command via DTMF or voice to erase all annotations made by all users in the multimedia conference. The event may result in highlighting a particular user's annotations or changing colors of annotations. Other events may be to change a cursor 220, a color of a cursor 220, and/or to animate a cursor 220. Still other events may be to record the annotations and/or cursor 220 movements as part of the multimedia conference. When the annotations and/or cursor 220 movements are recorded, information regarding who is controlling the cursor can also be captured and made available during playback of the multimedia conference.
The above descriptions describe the steps 412-424 as being implemented in series. However, in other embodiments, a single event may result in two or more of the steps 414, 418, 422, and 424 being implemented. For example, an event in an agenda (e.g., going to group discussion mode) may cause a first cursor 220 controlled by a first user (a presenter) to switch to a second user (a moderator) and also cause one or more additional cursors 220 (cursors for individuals in the group) to be provided in the multimedia conference.
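The switch/add/remove/merge branches of steps 412-424 can be sketched as a single dispatcher over a cursor table. The event dictionaries, the cursor representation, and the function name are illustrative assumptions rather than the disclosed implementation.

```python
def apply_event(event, cursors):
    """Apply one detected event to the cursor table, mirroring the
    branches of steps 412-424 (sketch; data shapes are assumed).

    cursors: dict mapping cursor id -> {"controllers": [user, ...]}.
    """
    kind = event["type"]
    if kind == "switch":                      # step 414: change controller
        cursors[event["cursor"]]["controllers"] = [event["to_user"]]
    elif kind == "add":                       # step 418: provide a new cursor
        cursors[event["cursor"]] = {"controllers": [event["user"]]}
    elif kind == "remove":                    # step 422: remove cursor(s)
        for cid in event["cursors"]:
            cursors.pop(cid, None)
    elif kind == "merge":                     # step 424: merge control
        merged = cursors.pop(event["from_cursor"])
        cursors[event["into_cursor"]]["controllers"] += merged["controllers"]
    return cursors
```

A compound agenda event, as in the example above, could simply be expanded into several of these event records and applied in sequence.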
FIG. 5 is a flow diagram of privately controlling a cursor 220 in a multimedia conference based on a sidebar communication. The process of FIG. 5 goes between steps 406 and 408 of FIG. 4 . If an event is not detected in step 406, the conferencing application 122 checks, in step 502, if a request to establish a sidebar communication has been received. A sidebar communication can be a separate communication that occurs during the multimedia conference between two or more users that are involved in the multimedia conference. For instance, a user at the communication endpoint 101A may have an Instant Messaging sidebar communication session with a user at the communication endpoint 101B during a multimedia conference between the users at the communication endpoints 101A-101N. The sidebar communication may be a voice communication, a video communication, a text communication, and/or the like.
If a request to establish a sidebar communication is not received in step 502, the process goes to step 408. Otherwise, if a request to establish a sidebar communication has been received in step 502, the conferencing application 122 establishes the sidebar communication session in step 504. The conferencing application 122 determines in step 506 if a request to set up private control of the cursor(s) 220 has been received. A request to set up private control of the cursor(s) 220 can be any type of request, such as a DTMF tone(s), a voice command, a command from a graphical user interface, and/or the like. For example, a user, via a graphical user interface, may indicate to provide private control when establishing the sidebar communication. Private control of a cursor 220 allows an individual user, or two or more users, to control one or more cursors 220 such that at least one other participant in the conference cannot see the cursor(s) 220 and/or annotations associated with the cursor(s) 220.
If a request to set up private control of the cursor(s) 220 has not been received in step 506, the process goes to step 408. Otherwise, if a request to set up private control of the cursor(s) 220 has been received in step 506, the cursor control module 123 sets up private control of the cursor(s) 220 per defined rules (or user commands) in step 508. The rules can be stored in the conference profile 124. The rules can be based on a variety of conditions, such as which users can see the cursor(s) 220 (i.e., only those in the sidebar communication), which users can annotate, whether two or more users can control a single cursor 220, whether each user has their own cursor 220, and/or the like.
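The rule-driven setup of step 508 can be sketched as a function that reads conditions from a profile. The rule keys (`restrict_visibility`, `all_may_annotate`, `per_user_cursor`) are assumed names for illustration; the patent does not define a rule schema.

```python
def setup_private_control(sidebar_users, profile_rules):
    """Apply conference-profile rules (step 508) when a sidebar is created.

    Rule names here are illustrative assumptions, not the patent's schema.
    """
    config = {
        # None means visible to all conference participants
        "visible_to": set(sidebar_users)
        if profile_rules.get("restrict_visibility", True) else None,
        "annotators": list(sidebar_users)
        if profile_rules.get("all_may_annotate", True) else [],
    }
    if profile_rules.get("per_user_cursor", False):
        # each sidebar user gets their own cursor
        config["cursors"] = {u: f"cursor-{u}" for u in sidebar_users}
    else:
        # two or more users share a single cursor
        config["cursors"] = {"shared": "cursor-shared"}
    return config

cfg = setup_private_control(["alice", "bob"], {"per_user_cursor": True})
print(sorted(cfg["cursors"]))  # ['alice', 'bob']
```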
The private control of the cursor 220 ends in step 510 when the sidebar communication ends. Step 510, in one embodiment, is a separate thread that is spun off and checks for the sidebar communication to end. Once the sidebar session ends in step 510, the process goes to step 408.
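The spun-off thread of step 510 can be sketched with a `threading.Event` standing in for the end of the sidebar communication. The callback name and event object are assumptions for illustration.

```python
import threading

def watch_sidebar(sidebar_ended: threading.Event, end_private_control):
    """Step 510 as a separate watcher thread: block until the sidebar
    communication ends, then tear down private control."""
    sidebar_ended.wait()
    end_private_control()

ended = threading.Event()
released = []
t = threading.Thread(target=watch_sidebar,
                     args=(ended, lambda: released.append(True)))
t.start()
ended.set()      # the sidebar communication ends
t.join()
print(released)  # [True]
```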
In one embodiment, the private control of the cursor(s) 220 may end before the sidebar communication ends. For example, the private control may be terminated by a user, even though the sidebar communication is still ongoing.
In another embodiment, private control of one or more cursors 220 may exist while one or more other cursors 220 may not be under private control. For example, while one cursor is under private control, another cursor controlled by a different or the same user may be displayed to all users in the multimedia conference.
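Mixing private and public cursors amounts to filtering the cursor list per viewer. The `private_to` field below is an assumed representation: `None` marks a cursor displayed to all users, while a set of user names marks a sidebar-private cursor.

```python
def visible_cursors(viewer, cursors):
    """Return the cursors a given viewer may see: public cursors plus any
    private cursor whose sidebar membership includes the viewer."""
    return [c["id"] for c in cursors
            if c.get("private_to") is None or viewer in c["private_to"]]

cursors = [
    {"id": "c1", "private_to": {"alice", "bob"}},  # sidebar-private cursor
    {"id": "c2", "private_to": None},              # public cursor
]
print(visible_cursors("alice", cursors))  # ['c1', 'c2']
print(visible_cursors("carol", cursors))  # ['c2']
```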
Of course, various changes and modifications to the illustrative embodiment described above will be apparent to those skilled in the art. These changes and modifications can be made without departing from the spirit and the scope of the system and method and without diminishing its attendant advantages. The following claims specify the scope of the invention. Those skilled in the art will appreciate that the features described above can be combined in various ways to form multiple variations of the invention. As a result, the invention is not limited to the specific embodiments described above, but only by the following claims and their equivalents.

Claims (18)

What is claimed is:
1. A method comprising:
providing, by a microprocessor, a first cursor in a multimedia conference, wherein the first cursor is shared by a plurality of users in the multimedia conference, wherein the multimedia conference comprises the plurality of users on a plurality of communication endpoints at a plurality of locations;
recording, by the microprocessor, annotations and cursor movements as part of the multimedia conference, wherein information regarding who is controlling the first cursor is also captured and made available during playback of the multimedia conference, and wherein a color of the annotations and cursor movements is changed based on a user associated with the annotations and cursor movements; and
detecting an event associated with a stress level or mood of a second user of the plurality of users speaking during the multimedia conference and in response to the detected event, determining whether the detected event results in switching control of the first cursor; and
when the detected event results in switching control of the first cursor, switch control of the first cursor to the second user;
when the detected event does not result in switching control of the first cursor, add a second cursor in the multimedia conference, wherein the second cursor is controlled by the second user.
2. The method of claim 1, further comprising:
detecting a second event associated with the multimedia conference and in response to detecting the second event, further enabling at least one of:
switching control of the first cursor in the multimedia conference to a different user;
removing control of the first cursor in the multimedia conference from a user of the plurality of users; or
adding an additional cursor in the multimedia conference.
3. The method of claim 1, further comprising associating a time period for a user speaking to control the first cursor in the multimedia conference after the user stops speaking.
4. The method of claim 1 further comprising:
recording the multimedia conference, wherein the recording the multimedia conference comprises recording control of the first cursor in the multimedia conference and individual users associated with controlling the first cursor in the multimedia conference.
5. The method of claim 1, wherein an event associated with the multimedia conference further comprises detecting a group discussion period to switch control of the first cursor in the multimedia conference.
6. The method of claim 2, wherein the second event is at least one of:
a user walking out of a conference room or office;
the user walking away from the multimedia conference;
the user not paying attention to the multimedia conference;
the user looking at or pointing to a specific area of a presented document;
one or more Dual Tone Multi-Frequency (DTMF) tones;
a stress level or mood of the user; and
presence of the user being off camera.
7. The method of claim 2, wherein the second event is at least one of:
a controlling user walking out of a conference room or office; the controlling user walking away from the multimedia conference; and the controlling user not paying attention to the multimedia conference.
8. The method of claim 2, wherein the second event is a stress level or mood of a controlling user.
9. The method of claim 2, wherein the second event is a presence of a third user being off camera.
10. The method of claim 6, wherein the second event is the user not paying attention to the multimedia conference.
11. The method of claim 6, wherein the second event is a third user looking at or pointing to the specific area of the presented document.
12. The method of claim 6, wherein the second event is the presence of the user being off camera.
13. The method of claim 6, wherein the second event is the stress level or mood of the user.
14. A system comprising:
a memory, a microprocessor in communication with the memory, the microprocessor executes software modules, the software modules comprising:
a conferencing application that provides, a first cursor in a multimedia conference shared by a plurality of users in the multimedia conference, wherein the multimedia conference comprises the plurality of users on a plurality of communication endpoints at a plurality of locations;
the conferencing application records annotations and cursor movements as part of the multimedia conference, wherein information regarding who is controlling the first cursor is also captured and made available during playback of the multimedia conference, and wherein a color of the annotations and cursor movements is changed based on a user associated with the annotations and cursor movements; and
detecting an event associated with a stress level or mood of a second user of the plurality of users speaking during the multimedia conference and in response to the event, determining whether the detected event results in switching control of the first cursor; and
when the detected event results in switching control of the first cursor, switch control of the first cursor to the second user;
when the detected event does not result in switching control of the first cursor, add a second cursor in the multimedia conference, wherein the second cursor is controlled by the second user.
15. The system of claim 14, wherein:
the conferencing application detects a second event associated with the multimedia conference and in response to detecting the event, does at least one of:
switches control of the first cursor in the multimedia conference to a different user;
removes control of the first cursor in the multimedia conference from a user; or
adds an additional cursor in the multimedia conference.
16. The system of claim 14, wherein only displaying the first cursor in the multimedia conference on a first communication endpoint and on a second communication endpoint comprises:
a mixer that establishes a sidebar communication between the first communication endpoint and the second communication endpoint.
17. The system of claim 14, wherein a user is speaking in the multimedia conference, and wherein a time period for the speaking user to control the first cursor in the multimedia conference after the speaking user stops speaking is determined.
18. The system of claim 15, wherein the second event is at least one of:
a user walking out of a conference room or office;
the user walking away from the multimedia conference;
the user not paying attention to the multimedia conference;
the user looking at or pointing to a specific area of a presented document;
a stress level or mood of the user;
presence of the user being on camera;
one or more Dual Tone Multi-Frequency (DTMF) tones; and
presence of the user being off camera.
US14/638,960 2015-03-04 2015-03-04 Multi-media collaboration cursor/annotation control Active 2037-07-27 US11956290B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/638,960 US11956290B2 (en) 2015-03-04 2015-03-04 Multi-media collaboration cursor/annotation control


Publications (2)

Publication Number Publication Date
US20160259522A1 US20160259522A1 (en) 2016-09-08
US11956290B2 true US11956290B2 (en) 2024-04-09

Family

ID=56850737

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/638,960 Active 2037-07-27 US11956290B2 (en) 2015-03-04 2015-03-04 Multi-media collaboration cursor/annotation control

Country Status (1)

Country Link
US (1) US11956290B2 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11935449B2 (en) * 2018-01-22 2024-03-19 Sony Corporation Information processing apparatus and information processing method
US11249715B2 (en) 2020-06-23 2022-02-15 Switchboard Visual Technologies, Inc. Collaborative remote interactive platform
US11863600B2 (en) * 2021-06-30 2024-01-02 Dropbox, Inc. Techniques for efficient communication during a video collaboration session
US11875311B2 (en) * 2021-12-30 2024-01-16 Salesforce, Inc. Communication platform document as a communication channel
US11461480B1 (en) 2022-05-24 2022-10-04 Switchboard Visual Technologies, Inc. Synchronizing private data with reduced trust
US20240012986A1 (en) * 2022-07-06 2024-01-11 Microsoft Technology Licensing, Llc Enhanced Spreadsheet Presentation Using Spotlighting and Enhanced Spreadsheet Collaboration Using Live Typing

Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5583980A (en) * 1993-12-22 1996-12-10 Knowledge Media Inc. Time-synchronized annotation method
US5920694A (en) * 1993-03-19 1999-07-06 Ncr Corporation Annotation of computer video displays
US20020109728A1 (en) * 2000-12-18 2002-08-15 International Business Machines Corporation Method and apparatus for variable density scroll area
US20060181608A1 (en) * 2000-12-29 2006-08-17 Cisco Technology, Inc., A California Corporation Method and System for Participant Control of Privacy During Multiparty Communication Sessions
US20070005752A1 (en) * 2005-06-29 2007-01-04 Jitendra Chawla Methods and apparatuses for monitoring attention of a user during a collaboration session
US20070100952A1 (en) * 2005-10-27 2007-05-03 Yen-Fu Chen Systems, methods, and media for playback of instant messaging session histrory
US20070123243A1 (en) * 2005-09-26 2007-05-31 Nec Corporation Remote control
US20080040137A1 (en) * 2004-09-03 2008-02-14 Jong-Gu Lee Internet-Based Discussion System And Method Thereof, Record Media Recorded Discussion Method
US20080096597A1 (en) * 2004-04-21 2008-04-24 Brahmananda Vempati Providing Push-to-Talk Communications in a Telecommunications Network
US20080103906A1 (en) * 2006-10-26 2008-05-01 Gurvinder Singh Online publishing of multimedia content
US20090210789A1 (en) * 2008-02-14 2009-08-20 Microsoft Corporation Techniques to generate a visual composition for a multimedia conference event
US20090210228A1 (en) * 2008-02-15 2009-08-20 George Alex K System for Dynamic Management of Customer Direction During Live Interaction
US20090292999A1 (en) * 2008-05-21 2009-11-26 Smart Technologies Ulc Desktop sharing method and system
US20100191799A1 (en) * 2009-01-26 2010-07-29 Fiedorowicz Jeff A Collaborative browsing and related methods and systems
US20100199191A1 (en) * 2009-02-03 2010-08-05 Seiko Epson Corporation Collaborative work apparatus and method of controlling collaborative work
US20100251177A1 (en) * 2009-03-30 2010-09-30 Avaya Inc. System and method for graphically managing a communication session with a context based contact set
US20100253689A1 (en) * 2009-04-07 2010-10-07 Avaya Inc. Providing descriptions of non-verbal communications to video telephony participants who are not video-enabled
US7954049B2 (en) * 2006-05-15 2011-05-31 Microsoft Corporation Annotating multimedia files along a timeline
US20120020641A1 (en) * 2010-07-23 2012-01-26 Hidenori Sakaniwa Content reproduction apparatus
US20120204118A1 (en) * 2011-02-08 2012-08-09 Lefar Marc P Systems and methods for conducting and replaying virtual meetings
US20120226997A1 (en) * 2011-03-02 2012-09-06 Cisco Technology, Inc. System and method for managing conversations for a meeting session in a network environment
US20120296914A1 (en) * 2011-05-19 2012-11-22 Oracle International Corporation Temporally-correlated activity streams for conferences
US20130022189A1 (en) * 2011-07-21 2013-01-24 Nuance Communications, Inc. Systems and methods for receiving and processing audio signals captured using multiple devices
US20130086155A1 (en) * 2011-09-30 2013-04-04 Calgary Scientific Inc. Uncoupled application extensions including interactive digital surface layer for collaborative remote application sharing and annotating
US20130198657A1 (en) * 2010-04-30 2013-08-01 American Teleconferencing Services, Ltd. Integrated Public/Private Online Conference
US20130249788A1 (en) * 2012-03-22 2013-09-26 Satoshi Mitsui Information processing apparatus, computer program product, and projection system
US20140059569A1 (en) * 2012-08-24 2014-02-27 Casio Computer Co., Ltd. Data processing apparatus including plurality of applications and method
US8843816B2 (en) * 2008-04-25 2014-09-23 Microsoft Corporation Document collaboration by transforming and reflecting a document object model
US20150058723A1 (en) * 2012-05-09 2015-02-26 Apple Inc. Device, Method, and Graphical User Interface for Moving a User Interface Object Based on an Intensity of a Press Input
US20150149404A1 (en) * 2013-11-27 2015-05-28 Citrix Systems, Inc. Collaborative online document editing
US20150162000A1 (en) * 2013-12-10 2015-06-11 Harman International Industries, Incorporated Context aware, proactive digital assistant
US9092533B1 (en) * 2011-03-01 2015-07-28 iBlinks Inc. Live, real time bookmarking and sharing of presentation slides
US20160057390A1 (en) * 2014-08-20 2016-02-25 Cisco Technology, Inc. Obtaining replay of audio during a conference session
US20160065625A1 (en) * 2014-08-26 2016-03-03 Cisco Technology, Inc. Notification of Change in Online Conferencing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9386272B2 (en) * 2014-06-27 2016-07-05 Intel Corporation Technologies for audiovisual communication using interestingness algorithms


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Bomgar: Use Annotations to Enhance Screen Sharing," Sep. 21, 2014, retrieved from https://web.archive.org/web/20140921052500/http://www.bomgar.com/products/features/annotations, 8 pages.
"Screenhero: Better than working in the same room," Sep. 27, 2014, retrieved from https://web.archive.org/web/20140927025147/https://screenhero.com/, 9 pages.

Also Published As

Publication number Publication date
US20160259522A1 (en) 2016-09-08

Similar Documents

Publication Publication Date Title
US11956290B2 (en) Multi-media collaboration cursor/annotation control
US10757050B2 (en) System and method for topic based segregation in instant messaging
TWI530191B (en) Techniques to manage media content for a multimedia conference event
US9329833B2 (en) Visual audio quality cues and context awareness in a virtual collaboration session
US9372543B2 (en) Presentation interface in a virtual collaboration session
TWI549518B (en) Techniques to generate a visual composition for a multimedia conference event
RU2488227C2 (en) Methods for automatic identification of participants for multimedia conference event
US9398059B2 (en) Managing information and content sharing in a virtual collaboration session
US10403287B2 (en) Managing users within a group that share a single teleconferencing device
KR101059681B1 (en) How to implement a computer to manage virtual meeting room communication sessions
US20150149540A1 (en) Manipulating Audio and/or Speech in a Virtual Collaboration Session
US9020120B2 (en) Timeline interface for multi-modal collaboration
US10218749B2 (en) Systems, methods, and computer programs for establishing a screen share session for a remote voice call
US20120017149A1 (en) Video whisper sessions during online collaborative computing sessions
US20130007635A1 (en) Teleconferencing adjunct and user interface to support temporary topic-based exclusions of specific participants
US9923982B2 (en) Method for visualizing temporal data
US20090319916A1 (en) Techniques to auto-attend multimedia conference events
US20120281057A1 (en) Collaboration appliance and methods thereof
US20160344780A1 (en) Method and system for controlling communications for video/audio-conferencing
TW202147834A (en) Synchronizing local room and remote sharing
US20130332832A1 (en) Interactive multimedia systems and methods
EP2618551A1 (en) Providing a roster and other information before joining a participant into an existing call
WO2017205227A1 (en) Monitoring network events
US20140040369A1 (en) Systems and Methods for Providing a Cue When a Participant Joins a Conference
JP2024531403A (en) Ambient ad-hoc multimedia collaboration in group-based communication systems

Legal Events

Date Code Title Description
AS Assignment

Owner name: AVAYA INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GONZALEZ, IGNACIO MIRANDA;SOLAN, BRYAN;SIGNING DATES FROM 20150225 TO 20150303;REEL/FRAME:035088/0699

AS Assignment

Owner name: CITIBANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS INC.;OCTEL COMMUNICATIONS CORPORATION;AND OTHERS;REEL/FRAME:041576/0001

Effective date: 20170124

AS Assignment

Owner name: AVAYA INTEGRATED CABINET SOLUTIONS INC., CALIFORNIA

Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531

Effective date: 20171128

Owner name: OCTEL COMMUNICATIONS LLC (FORMERLY KNOWN AS OCTEL COMMUNICATIONS CORPORATION), CALIFORNIA

Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531

Effective date: 20171128

Owner name: AVAYA INC., CALIFORNIA

Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531

Effective date: 20171128

Owner name: VPNET TECHNOLOGIES, INC., CALIFORNIA

Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531

Effective date: 20171128


AS Assignment

Owner name: GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045034/0001

Effective date: 20171215


AS Assignment

Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045124/0026

Effective date: 20171215

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

AS Assignment

Owner name: WILMINGTON TRUST, NATIONAL ASSOCIATION, MINNESOTA

Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA MANAGEMENT L.P.;INTELLISIST, INC.;AND OTHERS;REEL/FRAME:053955/0436

Effective date: 20200925

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: WILMINGTON TRUST, NATIONAL ASSOCIATION, AS COLLATERAL AGENT, DELAWARE

Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA INC.;INTELLISIST, INC.;AVAYA MANAGEMENT L.P.;AND OTHERS;REEL/FRAME:061087/0386

Effective date: 20220712

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

AS Assignment

Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001

Effective date: 20230403

Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001

Effective date: 20230403

Owner name: AVAYA INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001

Effective date: 20230403

Owner name: AVAYA HOLDINGS CORP., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001

Effective date: 20230403

AS Assignment

Owner name: WILMINGTON SAVINGS FUND SOCIETY, FSB (COLLATERAL AGENT), DELAWARE

Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA MANAGEMENT L.P.;AVAYA INC.;INTELLISIST, INC.;AND OTHERS;REEL/FRAME:063742/0001

Effective date: 20230501

AS Assignment

Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK

Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA INC.;AVAYA MANAGEMENT L.P.;INTELLISIST, INC.;REEL/FRAME:063542/0662

Effective date: 20230501

AS Assignment

Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: CAAS TECHNOLOGIES, LLC, NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: HYPERQUALITY II, LLC, NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: HYPERQUALITY, INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: ZANG, INC. (FORMER NAME OF AVAYA CLOUD INC.), NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: VPNET TECHNOLOGIES, INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: OCTEL COMMUNICATIONS LLC, NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: INTELLISIST, INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: AVAYA INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622

Effective date: 20230501

Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023

Effective date: 20230501

Owner name: INTELLISIST, INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023

Effective date: 20230501

Owner name: AVAYA INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023

Effective date: 20230501

Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023

Effective date: 20230501

Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359

Effective date: 20230501

Owner name: INTELLISIST, INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359

Effective date: 20230501

Owner name: AVAYA INC., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359

Effective date: 20230501

Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359

Effective date: 20230501

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: AVAYA LLC, DELAWARE

Free format text: (SECURITY INTEREST) GRANTOR'S NAME CHANGE;ASSIGNOR:AVAYA INC.;REEL/FRAME:065019/0231

Effective date: 20230501

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA LLC;AVAYA MANAGEMENT L.P.;REEL/FRAME:068554/0522

Effective date: 20240828