US20110196930A1 - Methods and apparatuses for reporting based on attention of a user during a collaboration session - Google Patents


Info

Publication number
US20110196930A1
Authority
US
Grant status
Application
Prior art keywords
collaboration session
content
participant
device
participant devices
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13090839
Inventor
Jitendra Chawla
David Knight
Edward Wong
Manish Kumar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cisco Technology Inc
Original Assignee
Cisco Webex LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network-specific arrangements or communication protocols supporting networked applications
    • H04L 67/22 Tracking the activity of the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/10 Office automation, e.g. computer aided management of electronic mail or groupware; Time management, e.g. calendars, reminders, meetings or time accounting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network-specific arrangements or communication protocols supporting networked applications
    • H04L 67/24 Presence management
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements or protocols for real-time communications
    • H04L 65/40 Services or applications
    • H04L 65/4007 Services involving a main real-time session and one or more additional parallel sessions
    • H04L 65/4015 Services involving a main real-time session and one or more additional parallel sessions where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference

Abstract

In one embodiment, a collaboration session is conducted between a plurality of participant devices. The content shared during the collaboration session is recorded. The recorded content indicates content that was shared at various times during the collaboration session. An activity status of at least some participant devices is recorded at various times during the collaboration session. The activity status of the at least some participant devices is correlated with the content that was being shared at each of a plurality of times during the collaboration session. For a selected time of the plurality of times during the collaboration session, the content that was being shared in the collaboration session at the selected time and an indication of the activity status of the at least some participant devices at the selected time is displayed.

Description

    RELATED APPLICATIONS
  • This Application is a continuation of U.S. patent application Ser. No. 11/172,184 filed on Jun. 29, 2005 by Jitendra Chawla et al., entitled “Methods and Apparatuses for Reporting Based on Attention of a User During a Collaboration Session”, the contents of which are incorporated by reference herein in their entirety.
  • Application Ser. No. 11/172,184 claims the benefit of Provisional Application 60/872,400 filed on Sep. 20, 2004 by Jitendra Chawla et al., entitled “Attention Indicator for Use in a Web Conference”, the contents of which are incorporated by reference herein in their entirety as well.
  • FIELD OF INVENTION
  • The present invention relates generally to monitoring attention of a user and, more particularly, to reporting based on attention of a user during a collaboration session.
  • BACKGROUND
  • There has been increased use of Internet- or web-based collaboration sessions to communicate with employees, vendors, and clients. During these collaboration sessions, information is typically exchanged among multiple participants. This exchanged information may include audio, graphical, and/or textual information. Oftentimes the ongoing effectiveness of the collaboration session depends on the participants paying attention to the collaboration session. There are many reasons why a participant may fail to pay attention during the collaboration session. Several examples include exchanged information that is poorly designed, participants falling behind during the collaboration session, lack of interest on the part of the participant, and technical difficulties that prevent the participant from following the collaboration session.
  • SUMMARY
  • In one embodiment, the methods and apparatuses view content that corresponds with a particular time period during a collaboration session; identify a plurality of attendee devices participating in the collaboration session at the particular time period; determine an attention status of each attendee device at the particular time period; and display the attention status of the plurality of attendee devices at the particular time period.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate and explain one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session. In the drawings,
  • FIG. 1 is a diagram illustrating an environment within which the methods and apparatuses for reporting based on attention of a user during a collaboration session are implemented;
  • FIG. 2 is a simplified block diagram illustrating one embodiment in which the methods and apparatuses for reporting based on attention of a user during a collaboration session are implemented;
  • FIG. 3 is a simplified block diagram illustrating a system, consistent with one embodiment of the methods and apparatuses reporting based on attention of a user during a collaboration session;
  • FIG. 4 is an exemplary record for use with the methods and apparatuses for reporting based on attention of a user during a collaboration session;
  • FIG. 5 is a flow diagram consistent with one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session;
  • FIG. 6 is a flow diagram consistent with one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session;
  • FIG. 7 is a flow diagram consistent with one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session;
  • FIG. 8 is a flow diagram consistent with one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session;
  • FIG. 9 illustrates an exemplary screen shot showing one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session; and
  • FIG. 10 illustrates an exemplary screen shot showing one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session.
  • DETAILED DESCRIPTION
  • The following detailed description of the methods and apparatuses for reporting based on attention of a user during a collaboration session refers to the accompanying drawings. The detailed description is not intended to limit the methods and apparatuses for reporting based on attention of a user during a collaboration session. Instead, the scope of the methods and apparatuses for reporting based on attention of a user during a collaboration session is defined by the appended claims and equivalents. Those skilled in the art will recognize that many other implementations are possible, consistent with the present invention.
  • References to a “device” include a device utilized by a user such as a desktop computer, a portable computer, a personal digital assistant, a video phone, a landline telephone, a cellular telephone, and a device capable of receiving/transmitting an electronic signal.
  • References to a desktop are directed to an entire portion of a display area of a corresponding device.
  • References to a collaboration session include a plurality of devices that are configured to view content submitted by one of the devices.
  • References to a participant device include devices that are participating in the collaboration session.
  • References to a presenter device include a device that is a participant and shares content with the other participants.
  • References to an attendee device include a device that is a participant and receives content shared by another participant device. The attendees are capable of viewing content that is offered by the presenter device. In some instances, the attendee devices are capable of modifying the content shared by the presenter device.
  • In one embodiment, the methods and apparatuses for reporting based on attention of a user during a collaboration session detect the attention of an attendee device during a collaboration session. In one embodiment, different attributes of the attendee device that reflect the attention of the user of the attendee device are detected, recorded, and analyzed to determine whether the user of the attendee device is paying attention to the content associated with the collaboration session. In one embodiment, by indicating the attention of the user of the attendee device, the user may be prompted to pay attention. Further, the presenter device may be alerted to the attention status of the attendee devices. Additionally, the attention status for each of the attendee devices may be correlated to the content presented during the collaboration session at predetermined times such that the attention of the attendee devices may be determined at any point during the collaboration session.
  • FIG. 1 is a diagram illustrating an environment within which the methods and apparatuses for reporting based on attention of a user during a collaboration session are implemented. The environment includes an electronic device 110 (e.g., a computing platform configured to act as a client device, such as a computer, a personal digital assistant, and the like), a user interface 115, a network 120 (e.g., a local area network, a home network, the Internet), and a server 130 (e.g., a computing platform configured to act as a server).
  • In one embodiment, one or more user interface 115 components are made integral with the electronic device 110 (e.g., keypad and video display screen input and output interfaces in the same housing such as a personal digital assistant). In other embodiments, one or more user interface 115 components (e.g., a keyboard, a pointing device such as a mouse, a trackball, etc.), a microphone, a speaker, a display, a camera are physically separate from, and are conventionally coupled to, electronic device 110. In one embodiment, the user utilizes interface 115 to access and control content and applications stored in electronic device 110, server 130, or a remote storage device (not shown) coupled via network 120.
  • In accordance with the invention, embodiments of reporting based on attention of a user during a collaboration session described below are executed by an electronic processor in electronic device 110, in server 130, or by processors in electronic device 110 and in server 130 acting together. Server 130 is illustrated in FIG. 1 as a single computing platform, but in other instances is two or more interconnected computing platforms that act as a server.
  • FIG. 2 is a simplified diagram illustrating an exemplary architecture in which the methods and apparatuses for reporting based on attention of a user during a collaboration session are implemented. The exemplary architecture includes a plurality of electronic devices 202, a server device 210, and a network 201 connecting electronic devices 202 to server 210 and each electronic device 202 to each other. The plurality of electronic devices 202 are each configured to include a computer-readable medium 209, such as random access memory, coupled to an electronic processor 208. Processor 208 executes program instructions stored in the computer-readable medium 209. In one embodiment, a unique user operates each electronic device 202 via an interface 115 as described with reference to FIG. 1.
  • The server device 210 includes a processor 211 coupled to a computer-readable medium 212. In one embodiment, the server device 210 is coupled to one or more additional external or internal devices, such as, without limitation, a secondary data storage element, such as database 240.
  • In one instance, processors 208 and 211 are manufactured by Intel Corporation, of Santa Clara, Calif. In other instances, other microprocessors are used.
  • In one embodiment, the plurality of client devices 202 and the server 210 include instructions for a customized application for reporting based on attention of a user during a collaboration session. In one embodiment, the plurality of computer-readable media 209 and 212 contain, in part, the customized application. Additionally, the plurality of client devices 202 and the server 210 are configured to receive and transmit electronic messages for use with the customized application. Similarly, the network 201 is configured to transmit electronic messages for use with the customized application.
  • One or more user applications are stored in media 209, in media 212, or a single user application is stored in part in one media 209 and in part in media 212. In one instance, a stored user application, regardless of storage location, is made customizable based on reporting based on attention of a user during a collaboration session as determined using embodiments described below.
  • FIG. 3 illustrates one embodiment of a system 300. In one embodiment, the system 300 is embodied within the server 130. In another embodiment, the system 300 is embodied within the electronic device 110. In yet another embodiment, the system 300 is embodied within both the electronic device 110 and the server 130.
  • In one embodiment, the system 300 includes a content detection module 310, an interface detection module 320, a storage module 330, an interface module 340, a control module 350, a duration module 360, a scoring module 370, and a record module 380.
  • In one embodiment, the control module 350 communicates with the content detection module 310, the interface detection module 320, the storage module 330, the interface module 340, the duration module 360, the scoring module 370, and the record module 380. In one embodiment, the control module 350 coordinates tasks, requests, and communications between the content detection module 310, the interface detection module 320, the storage module 330, the interface module 340, the duration module 360, the scoring module 370, and the record module 380.
  • In one embodiment, the content detection module 310 detects content that is utilized by the user in connection with the device. In one embodiment, the content is utilized in connection with multiple devices within a collaboration session between multiple parties. For example, the collaboration session may include a data conference or a video conference through a network, a phone line, and/or the Internet.
  • In one embodiment, the content is a document utilized within a collaboration session. In another embodiment, the content is audio visual media that is utilized within a collaboration session.
  • In one embodiment, the content detection module 310 detects the location of the content within each device. For example, the content detection module 310 is capable of detecting whether the content is currently being displayed on the device. Further, when there are multiple pieces of content on a particular device, the content detection module 310 is capable of detecting whether the content that is the subject of the collaboration session is the priority content, e.g. whether it is currently on top and shown on the display relative to the other pieces of content.
  • In another embodiment, the content detection module 310 detects the size of the content that is being displayed on the device. For example, the content detection module 310 is capable of determining the size of the window that displays the content being utilized within the collaboration session.
  • In yet another embodiment, the content detection module 310 detects the displayed location of the content that is the subject of the collaboration session. For example, centrally positioned locations on a display device may be considered better locations for a user to view the content that is the subject of the collaboration session. Similarly, locations that are on the periphery of the display device may be considered less desirable.
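  • As an illustration of the window-priority, size, and position heuristics described above, the following Python sketch combines the three signals into a single visibility score. All names, weights, and the screen-area normalization are assumptions for illustration; the patent does not specify a formula.

```python
# Hypothetical sketch of the content-detection heuristics: window priority,
# window size, and how centrally the content is positioned on the display.
from dataclasses import dataclass

@dataclass
class WindowState:
    is_topmost: bool      # is the shared content the front-most window?
    width: int            # window size in pixels
    height: int
    center_x: float       # window center, normalized to [0, 1]
    center_y: float
    screen_area: int = 1920 * 1080  # assumed display size

def content_visibility_score(w: WindowState) -> float:
    """Combine priority, size, and position into a 0..1 visibility score."""
    priority = 1.0 if w.is_topmost else 0.3
    size = min(1.0, (w.width * w.height) / w.screen_area)
    # Centrally positioned content scores higher than peripheral content.
    centrality = 1.0 - (abs(w.center_x - 0.5) + abs(w.center_y - 0.5))
    return round(priority * (0.6 * size + 0.4 * centrality), 3)
```

A full-screen, front-most, centered window scores 1.0; the same window pushed behind other content scores much lower, matching the intuition that backgrounded content suggests inattention.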
  • In one embodiment, the interface detection module 320 monitors input from various interface devices connected to devices that are participating in the collaboration session. These various interface devices include a keyboard, a pointing device, a microphone, a telephone, a video camera, and the like. In one embodiment, the interface detection module 320 detects when the cursor is moved within the device participating in the collaboration session. In another embodiment, the interface detection module 320 monitors the voice transmissions originating from the device participating in the collaboration session. In yet another embodiment, the interface detection module 320 detects any activity by the device participating in the collaboration session.
  • For example, in one embodiment, the presence of input from the device participating in a collaboration session indicates that the device is being actively utilized and that the user may be viewing the device.
  • In one embodiment, the storage module 330 stores a record including a list of attributes associated with each device participating in a collaboration session. An exemplary list of attributes is shown in a record 400 within FIG. 4.
  • In one embodiment, the interface module 340 receives a signal from one of the electronic devices 110. In one embodiment, the electronic devices 110 are participating in a collaboration session. In another embodiment, the interface module 340 delivers a signal to one of the electronic devices 110.
  • In one embodiment, the duration module 360 monitors the duration of the interaction of the devices participating in the collaboration session. For example, if one of the devices has the content located in a prominent location, then the duration module 360 monitors the length of time that the content is shown in the prominent location. In another example, the duration module 360 also monitors the duration of voice transmissions and/or use of other interface devices.
  • In another embodiment, the duration module 360 also monitors the frequency of interaction of the devices participating in the collaboration session. If the device participating in the collaboration session is frequently being utilized during the duration of the collaboration session, there is a higher chance that a user of the device is viewing the device during the collaboration session.
  • In one embodiment, the scoring module 370 receives information from the content detection module 310, the interface detection module 320, and the duration module 360. In one embodiment, the scoring module 370 determines the level of participation of the user of each device participating in the collaboration session. In one embodiment, if the level of participation exceeds an arbitrary threshold, then the device is considered “active”. Similarly, if the level of participation is below the threshold, then the device is considered “inactive”.
  • In one embodiment, the level of participation among each device is utilized to determine the level of participation for all participants within the collaboration session.
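  • A minimal sketch of the scoring step, assuming the three detection signals are each normalized to a 0..1 range: the weights, function names, and the default threshold below are illustrative choices, not taken from the patent.

```python
# Combine content, interface, and duration signals into a participation
# level, then classify against a threshold ("active" vs "inactive").
def participation_level(content_score: float,
                        interface_score: float,
                        duration_score: float) -> float:
    """Weighted combination of the three detection signals (each 0..1)."""
    return 0.4 * content_score + 0.4 * interface_score + 0.2 * duration_score

def classify(level: float, threshold: float = 0.5) -> str:
    """A device above the threshold is 'active', otherwise 'inactive'."""
    return "active" if level > threshold else "inactive"
```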
  • In one embodiment, the record module 380 stores the collaboration session. In one embodiment, the record module 380 stores the content that is presented during the collaboration session. In another embodiment, the record module 380 stores the annotations and comments produced by the participants of the collaboration session. In yet another embodiment, the record module 380 stores the activity levels of each of the devices participating in the collaboration session.
  • In another embodiment, the record module 380 stores the participation values for each device participating in the collaboration session and corresponding content that is the subject of the collaboration session. In one embodiment, the participation values for each device are stored for a particular moment that corresponds with the same moment during the collaboration session.
  • The system 300 in FIG. 3 is shown for exemplary purposes and is merely one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session. Additional modules may be added to the system 300 without departing from the scope of the methods and apparatuses for reporting based on attention of a user during a collaboration session. Similarly, modules may be combined or deleted without departing from the scope of the methods and apparatuses for reporting based on attention of a user during a collaboration session.
  • FIG. 4 illustrates an exemplary record 400 for use with the methods and apparatuses for reporting based on attention of a user during a collaboration session. In one embodiment, the record 400 illustrates an exemplary record associated with the participation of a particular device during a collaboration session.
  • In one embodiment, there are multiple records such that each record 400 is associated with a particular device. Further, each device corresponds with multiple records wherein each record 400 for a particular device corresponds to a particular collaboration session.
  • In one embodiment, the record 400 includes a content activity field 410, an interface activity field 420, a duration field 430, a scoring field 440, and an aggregated scoring field 450. In one embodiment, the record 400 resides within the storage module 330. In one embodiment, the record 400 describes attributes detected through the system 300.
  • In one embodiment, the content activity field 410 includes information related to the content activity associated with the particular device during the collaboration session. In one embodiment, the content activity field 410 indicates a time stamp when the content activity is detected through the content detection module 310. For example, a time stamp that describes when the content that is the subject of the collaboration session is pushed to the background is recorded and stored within the content activity field 410.
  • In one embodiment, the interface activity field 420 includes information related to the interface activity associated with the particular device during the collaboration session. In one embodiment, the interface activity field 420 indicates a time stamp when the interface activity is detected through the interface detection module 320. For example, a time stamp that describes when the cursor is utilized by a user of the particular device during the collaboration session is recorded and stored within the interface activity field 420.
  • In one embodiment, the duration field 430 includes the length of time that corresponds to activity attributed from the content and/or the interface device associated with the particular device during the collaboration session. In one embodiment, the duration field 430 receives information through the duration module 360.
  • In one embodiment, the scoring field 440 includes a score that represents a rating of both the duration and the activity level of the particular device during the collaboration session. In one embodiment, the scoring field 440 receives information through the scoring module 370.
  • In one embodiment, the aggregated scoring field 450 includes a score that represents an overall rating of all the devices during the collaboration session. In one embodiment, the overall rating is based on both the duration and the activity level of all the devices during the collaboration session. In one embodiment, the aggregated scoring field 450 receives information through the scoring module 370.
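  • The fields of record 400 described above can be sketched as a simple data structure. The patent only names the fields; the types, the (timestamp, event) representation, and the field names below are assumptions.

```python
# Sketch of record 400: one record per device per collaboration session.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class AttentionRecord:
    device_id: str
    # Content activity field 410: (timestamp, event) pairs,
    # e.g. ("00:02:15", "content pushed to background")
    content_activity: List[Tuple[str, str]] = field(default_factory=list)
    # Interface activity field 420: (timestamp, event) pairs,
    # e.g. ("00:02:40", "cursor moved")
    interface_activity: List[Tuple[str, str]] = field(default_factory=list)
    duration_seconds: float = 0.0   # duration field 430
    score: float = 0.0              # scoring field 440 (this device)
    aggregated_score: float = 0.0   # aggregated scoring field 450 (all devices)
```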
  • The flow diagrams as depicted in FIGS. 5, 6, 7 and 8 are one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session. The blocks within the flow diagrams can be performed in a different sequence without departing from the spirit of the methods and apparatuses for reporting based on attention of a user during a collaboration session. Further, blocks can be deleted, added, or combined without departing from the spirit of the methods and apparatuses for reporting based on attention of a user during a collaboration session.
  • The flow diagram in FIG. 5 illustrates capturing activities occurring on a device that is participating on a collaboration session according to one embodiment of the invention.
  • In Block 510, the content is detected. In one embodiment, the content is one of the items utilized within the collaboration session. In one embodiment, the content is detected through the content detection module 310. For example, the manner in which the content is displayed on the device that is participating in the collaboration session is detected. The manner in which the content is displayed includes the size of the content being displayed, the priority of the content (whether the content is displayed on the top-most window), and/or the position of the content within the display area.
  • In Block 520, the interface is detected. In one embodiment, the interface input is detected from one of the devices participating in the collaboration session. In one embodiment, the interface input is detected through the interface detection module 320. For example, movement of the cursor or selection of items by devices during the collaboration session is detected.
  • In Block 530, if interface input is detected, then the input activity is recorded within the Block 540. If no interface input is detected, then the absence of input activity is recorded within the Block 550.
  • In Block 530, if a change in the display of the content is detected, then the content activity is recorded within the Block 540. If no change in the display of the content is detected, then the absence of content activity is recorded within the Block 550.
  • In Block 560, a time stamp is applied to the recordation of either activity or absence of activity. In one embodiment, the time stamp is accorded a real time of day such as 1:30 PM. In another embodiment, the time stamp is accorded a time relative to the collaboration session such as 2 minutes 15 seconds into the collaboration session. In either embodiment, the granularity of the time stamp may be adjustable. For example, the time stamp can occur every minute, every second, or every fraction of a second depending on the collaboration session.
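  • The session-relative variant of the time stamp with adjustable granularity can be sketched as follows; the function name and the rounding-down behavior are illustrative assumptions.

```python
# Sketch of Block 560's session-relative time stamp: seconds since the
# session start, quantized to an adjustable granularity (e.g. 1 s, 60 s).
def relative_timestamp(now: float, session_start: float,
                       granularity: float = 1.0) -> float:
    """Seconds into the session, rounded down to the chosen granularity."""
    elapsed = now - session_start
    return elapsed - (elapsed % granularity)
```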
  • In one embodiment, the activity information that arises from the content detection module 310 and the interface detection module 320 is recorded within a record 400 of the storage module 330.
  • The flow diagram in FIG. 6 illustrates determining levels of activity from devices participating in a collaboration session according to one embodiment of the invention.
  • In Block 610, activity information is received. In one embodiment, the activity information is recorded within FIG. 5.
  • In Block 620, the activity information associated with a device participating in the collaboration session is compared against a predetermined threshold. In one embodiment, the predetermined threshold includes parameters from the content detection module 310, the interface detection module 320, and the duration module 360. In another embodiment, the predetermined threshold is measured in terms of the parameter corresponding with the scoring module 370.
  • In one embodiment, the predetermined threshold is assigned to differentiate between devices that show indication that they are active during the collaboration session versus devices that show indication that they are inactive during the collaboration session.
  • In one embodiment, the predetermined threshold is based on a cumulative score over the period of the collaboration session. In another embodiment, the predetermined threshold is based on a periodic score that is reset at particular intervals.
  • In Block 630, if the activity from a particular device participating in the collaboration session exceeds the predetermined threshold, then the activity associated with the particular device is considered active and recorded in the Block 640.
  • In Block 630, if the activity from a particular device participating in the collaboration session falls below the predetermined threshold, then the activity associated with the particular device is considered inactive and recorded in the Block 650.
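  • The two threshold variants described above, a cumulative score over the whole session versus a periodic score that resets at intervals, can be sketched as follows. The function names and the fixed-size-interval handling are assumptions for illustration.

```python
# Sketch of the Block 630 decision under the two threshold embodiments.
from typing import List

def cumulative_active(samples: List[float], threshold: float) -> bool:
    """Active if the cumulative score over the session exceeds the threshold."""
    return sum(samples) > threshold

def periodic_active(samples: List[float], threshold: float,
                    interval: int) -> List[bool]:
    """Per-interval activity: the running score resets every `interval` samples."""
    results = []
    for i in range(0, len(samples), interval):
        results.append(sum(samples[i:i + interval]) > threshold)
    return results
```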
  • In Block 660, the attendee device that falls below the predetermined threshold is notified. In one embodiment, the notification includes one of a buzzer sound from the attendee device, a visual indication on the attendee device, and the content that is the subject of the collaboration session being placed in the forefront. In another embodiment, the attendee device is notified by a pop-up window notifying the user of the attendee device that additional attention is requested.
  • In another embodiment, the presenter device is also notified when the attendee device falls below the predetermined threshold.
  • The flow diagram in FIG. 7 illustrates displaying the status of the devices participating in a collaboration session according to one embodiment of the invention.
  • In Block 710, the current activity status of each device participating in the collaboration session is received. In one embodiment, the current activity status is received during the collaboration session. In one embodiment, the current activity status of a device may be either active or inactive. In one embodiment, the determination of whether the device is either active or inactive depends on factors such as content detection, interface detection, and duration of the content and/or interface detection. An exemplary determination of the current activity status is found in FIG. 6.
  • In Block 720, the current activity status of each device participating in the collaboration session is displayed to one of the devices participating in the collaboration session. In one embodiment, the device that receives the displayed current activity status is the host or originator of the collaboration session. In another embodiment, the device that receives the displayed current activity status is the current presenter of the collaboration session; the presenter of the collaboration session may change multiple times during the collaboration session.
  • An exemplary screen shot illustrating a display of the current activity status for devices participating in the collaboration session is shown in FIG. 10.
  • In Block 730, notification regarding the current activity status is received by one of the devices participating in the collaboration session. In one embodiment, the notification indicates that a predetermined number of devices within the collaboration session are inactive. In another embodiment, the notification indicates that a predetermined percentage of devices within the collaboration session are inactive.
  • In one embodiment, the notification is presented to the device associated with the current presenter in the collaboration session. In another embodiment, the notification is presented to the device associated with the host or originator of the collaboration session.
  • In one instance, when the device associated with the current presenter receives a notification, the current presenter is able to modify the presentation in the hope of increasing the number of devices that are actively participating in the collaboration session.
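The Block 730 aggregate check might be sketched as follows. The function name and the count/percentage parameters are illustrative assumptions about how the predetermined thresholds could be expressed:

```python
# Hypothetical sketch of the FIG. 7 notification trigger (Block 730):
# notify the presenter (or host) when a predetermined number or a
# predetermined percentage of devices is inactive.

def should_notify_presenter(statuses, max_inactive_count=None,
                            max_inactive_pct=None):
    """statuses: dict mapping device id -> 'active' or 'inactive'.

    Returns True when the inactive count or percentage reaches the
    configured threshold (whichever thresholds are supplied).
    """
    inactive = sum(1 for s in statuses.values() if s == "inactive")
    if max_inactive_count is not None and inactive >= max_inactive_count:
        return True
    if max_inactive_pct is not None and statuses:
        if 100.0 * inactive / len(statuses) >= max_inactive_pct:
            return True
    return False
```

Either threshold style (absolute count or percentage) corresponds to one of the two embodiments described above.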
  • The flow diagram in FIG. 8 illustrates displaying the status of the devices participating in a collaboration session according to one embodiment of the invention.
  • In Block 810, the collaboration session is recorded. In one embodiment, the comments made by various devices participating in the collaboration session, as well as the content presented during the collaboration session, are recorded. In one embodiment, a time is recorded simultaneously with the collaboration session. For example, a time stamp is recorded periodically as the collaboration session is also recorded. As an exemplary scenario, the time stamp periodically occurs every second. In one embodiment, the time stamp refers to the actual time such as 1:25:33 PM. In another embodiment, the time stamp is relative to the start time of the collaboration session.
  • In Block 820, the activity status and the time stamp corresponding to the activity status are recorded. In one embodiment, the activity status of each device participating in the collaboration session is recorded. An exemplary determination of the current activity status is found in FIG. 6.
  • In one embodiment, the activity status may change over the course of the collaboration session. The activity status for each device may be recorded at a predetermined interval. The predetermined interval may range from a fraction of a second to multiple seconds.
  • In Block 830, the activity status for each device at a given time is correlated to the collaboration session. In one embodiment, the time stamps corresponding to the activity status are matched with time stamps corresponding to the recorded collaboration session. For example, the activity status for device # 3 at time t1 is matched to the content presented at the collaboration session at the same time t1.
  • In Block 840, the activity status and content presented at the collaboration session are displayed.
  • An exemplary screen shot illustrating the display of the current activity status for devices participating in the collaboration session and the content presented at the collaboration session are shown in FIG. 9.
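The Block 820-830 correlation might be sketched as follows. The log formats and function name are illustrative assumptions; the point is the time-stamp matching step of Block 830, here done by finding the most recent content snapshot at or before each status sample:

```python
# Hypothetical sketch of Block 830: correlate recorded activity-status
# samples with the recorded content by matching time stamps.
import bisect

def correlate(content_log, status_log):
    """content_log: list of (time, content), sorted by time.
    status_log: list of (time, {device id: 'active' | 'inactive'}).

    Returns a list of (time, content, statuses) tuples, pairing each
    status sample with the content shown at that moment.
    """
    times = [t for t, _ in content_log]
    correlated = []
    for t, statuses in status_log:
        i = bisect.bisect_right(times, t) - 1   # latest content at or before t
        if i >= 0:
            correlated.append((t, content_log[i][1], statuses))
    return correlated
```

With this pairing in hand, a display such as FIG. 9 can show, for any selected time, both the content and the activity status recorded at that time.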
  • FIG. 9 illustrates an exemplary screen shot 900 that shows one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session. In one embodiment, the screen shot 900 includes a visual display area 910, a time line 920, and an activity display 930.
  • In one embodiment, the visual display area 910 displays content utilized during the collaboration session. In one embodiment, the content is textual, graphical, audio, and/or video. Further, the content may be annotations, conversations, and content presented during the collaboration session.
  • In one embodiment, the time line 920 graphically displays a timing diagram that shows the duration of at least a portion of the collaboration session. In one embodiment, different times are shown on the time line 920 such as t0, t1, and t2. For example, t0 may represent the beginning of the collaboration session, and t2 may represent the termination of the collaboration session. In one embodiment, a marker 921 shows a current time location of the collaboration session. For example, the marker 921 associates the content displayed within the visual display area 910 with a location of the collaboration session.
  • In one embodiment, the activity display 930 graphically illustrates the percentage of devices that are actively participating in the collaboration session. In this example, gradations of 100%, 50%, and 0% are shown. However, any number of intervals may be utilized without departing from the spirit or scope of the invention. In one embodiment, at times t0, t1, and t2, the activity display 930 highlights those participation percentages as milestone time periods. In one embodiment, the marker 921 shows the percentage of device participation at a particular time.
  • In another embodiment, the activity display 930 graphically illustrates the actual number of devices that are actively participating in the collaboration session.
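The percentage curve shown in the activity display 930 might be computed as follows. The function name and log format are illustrative assumptions; the output is a series of (time, percentage) points suitable for plotting against the time line 920:

```python
# Hypothetical sketch of the data behind activity display 930: the
# percentage of devices considered active at each recorded time.

def active_percentages(status_log):
    """status_log: list of (time, {device id: 'active' | 'inactive'}).

    Returns a list of (time, percent_active) points.
    """
    points = []
    for t, statuses in status_log:
        if statuses:  # skip samples with no devices recorded
            active = sum(1 for s in statuses.values() if s == "active")
            points.append((t, 100.0 * active / len(statuses)))
    return points
```

Replacing the percentage with the raw `active` count would yield the alternative embodiment that displays the actual number of actively participating devices.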
  • FIG. 10 illustrates an exemplary screen shot 1000 that shows one embodiment of the methods and apparatuses for reporting based on attention of a user during a collaboration session. In one embodiment, the screen shot 1000 includes a visual display area 1010, a time line 1020, and an activity display 1030.
  • In one embodiment, the visual display area 1010 displays content utilized during the collaboration session. In one embodiment, the content is textual, graphical, audio, and/or video. Further, the content may be annotations, conversations, and content presented during the collaboration session.
  • In one embodiment, the time line 1020 graphically displays a timing diagram that shows the duration of at least a portion of the collaboration session. In one embodiment, different times are shown on the time line 1020 such as t0, t1, and t2. For example, t0 may represent the beginning of the collaboration session, and t2 may represent the termination of the collaboration session. In one embodiment, a marker 1021 shows a current time location of the collaboration session. For example, the marker 1021 associates the content displayed within the visual display area 1010 with a location of the collaboration session.
  • In one embodiment, the activity display 1030 graphically identifies devices that are actively participating in the collaboration session. In this example, pictorial representations 1031 of each device are shown. In one embodiment, a different color of the pictorial representations 1031 represents whether the device is actively participating or inactive. In another embodiment, a highlighted pictorial representation 1031 represents that the device is actively participating in the collaboration session. However, any number of pictorial and/or textual representations may be utilized without departing from the spirit or scope of the invention.
  • In one embodiment, at times t0, t1, and t2, the activity display 1030 highlights those devices that are actively participating as milestone time periods. In one embodiment, the marker 1021 shows the devices that are actively participating at a particular time.
  • In another embodiment, the activity display 1030 graphically illustrates the actual number of devices that are actively participating in the collaboration session.
  • The foregoing descriptions of specific embodiments of the invention have been presented for purposes of illustration and description. The invention may be applied to a variety of other applications.
  • They are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed, and naturally many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the Claims appended hereto and their equivalents.

Claims (20)

1. A method comprising:
conducting a collaboration session between a plurality of participant devices, at least one participant device operating as a presenter device that shares content during the collaboration session, and at least one participant device operating as an attendee device that receives and displays the content shared during the collaboration session;
recording the content shared during the collaboration session, the recorded content indicating content that was shared at various times during the collaboration session;
recording an activity status of at least some participant devices at various times during the collaboration session;
correlating the recorded activity status of the at least some participant devices with the content that was being shared, at each of a plurality of times during the collaboration session; and
displaying, on an electronic device, for a selected time of the plurality of times during the collaboration session, the content that was being shared in the collaboration session at the selected time and an indication of the activity status of the at least some participant devices at the selected time.
2. The method of claim 1, wherein the activity status indicates whether the participant device is considered active or inactive in the collaboration session.
3. The method of claim 1, wherein the indication of the activity status of the at least some participant devices at the selected time is a graphic indication of a percentage of the at least some participant devices that are considered active in the collaboration session at the selected time.
4. The method of claim 1, wherein the indication of the activity status of the at least some participant devices is a graph indicating a percentage of the at least some participant devices that are considered active in the collaboration session, with a marker in the graph indicating the selected time.
5. The method of claim 1, wherein the displaying further comprises:
displaying a timing diagram showing a duration of the collaboration session, the timing diagram including a marker indicating the selected time.
6. The method of claim 1 wherein the indication of the activity status of the at least some participant devices includes one or more pictorial representations of participant devices, each pictorial representation indicating whether a corresponding participant device is considered active, or indicating whether a corresponding participant device is considered inactive, at the selected time.
7. The method of claim 1, further comprising:
detecting a manner in which the content is displayed on the at least some participant devices during the collaboration; and
determining the activity status of each participant device of the at least some participant devices at various times during the collaboration session based on the manner in which the content is displayed on the participant device at the times.
8. The method of claim 1 further comprising:
detecting interface input on the at least some participant devices during the collaboration session; and
determining the activity status of each participant device of the at least some participant devices at various times during the collaboration session based on any interface input detected on the participant device at the times.
9. The method of claim 1, further comprising:
detecting a manner in which the content is displayed on the at least some participant devices during the collaboration session;
detecting interface input on the at least some participant devices during the collaboration session; and
determining the activity status of each participant device of the at least some participant devices at various times during the collaboration session based both on the manner in which the content is displayed on the participant device at the times and any interface input detected on the participant device at the times.
10. The method of claim 1, further comprising:
determining a level of participation of each of the at least some participant devices at various times during the collaboration session; and
determining whether each of the at least some participant devices is active or inactive in the collaboration session at the various times based on whether the participant device's level of participation exceeds a threshold.
11. The method of claim 1, wherein the recording the content shared during the collaboration session comprises periodically recording corresponding time stamps in association with recorded content, and the recording an activity status comprises recording corresponding time stamps in association with the activity status, and the correlating comprises matching time stamps corresponding to the activity status with time stamps corresponding to the recorded content.
12. The method of claim 1 wherein the electronic device is the presenter device.
13. An apparatus, comprising:
a processor; and
a memory configured to store instructions for execution on the processor, the instructions including instructions that when executed are operable to
record content shared during a collaboration session among a plurality of participant devices, the recorded content to provide a record of what content was shared at what times during the collaboration session,
record an activity status of at least some participant devices at a number of times during a collaboration session, the recorded activity status to provide a record of what participant devices are considered active at what times during the collaboration session,
correlate the recorded activity status of the at least some participant devices with the content that was shared, at each of a plurality of times during the collaboration session, and
display, for a selected time of the plurality of times, the content that was shared in the collaboration session at the selected time and an indication of the activity status of the at least some participant devices at the selected time.
14. The apparatus of claim 13, wherein the indication of the activity status of the at least some participant devices at the selected time is a graphic indication of a percentage of the at least some participant devices that are considered active in the collaboration session at the selected time.
15. The apparatus of claim 13, wherein the indication of the activity status of the at least some participant devices is a graph indicating a percentage of the at least some participant devices that are considered active in the collaboration session, with a marker in the graph indicating the selected time.
16. The apparatus of claim 13, wherein the instructions further include instructions that when executed are operable to display a timing diagram showing a duration of the collaboration session, the timing diagram including a marker indicating the selected time.
17. The apparatus of claim 13, wherein the indication of the activity status of the at least some participant devices includes one or more pictorial representations of participant devices, each pictorial representation indicating whether a corresponding participant device is considered active, or indicating whether a corresponding participant device is considered inactive, at the selected time.
18. The apparatus of claim 13, wherein the instructions further include instructions that when executed are operable to determine a level of participation of each of the at least some participant devices during the collaboration session, and determine whether each of the at least some participant devices is active or inactive in the collaboration session based on whether the participant device's level of participation exceeds a threshold.
19. The apparatus of claim 13, wherein the instructions that when executed are operable to record content shared during the collaboration session comprise instructions that when executed are operable to periodically record corresponding time stamps in association with recorded content, and the instructions that when executed are operable to record an activity status comprise instructions that when executed are operable to record corresponding time stamps in association with the activity status, and the instructions that when executed are operable to correlate comprise instructions that when executed are operable to match time stamps corresponding to the activity status with time stamps corresponding to the recorded content.
20. An apparatus, comprising:
means for conducting a collaboration session between a plurality of participant devices, at least one participant device operating as a presenter device that shares content during the collaboration session, and at least one participant device operating as an attendee device that receives and displays the content shared during the collaboration session;
means for recording the content shared during the collaboration session, the recorded content indicating content that was shared at various times during the collaboration session;
means for recording an activity status of at least some participant devices at various times during the collaboration session;
means for correlating the recorded activity status of the at least some participant devices with the content that was being shared, at each of a plurality of times during the collaboration session; and
means for displaying for a selected time of the plurality of times during the collaboration session, the content that was being shared in the collaboration session at the selected time and an indication of the activity status of the at least some participant devices at the selected time.
US13090839 2004-09-20 2011-04-20 Methods and apparatuses for reporting based on attention of a user during a collaboration session Abandoned US20110196930A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US87240004 true 2004-09-20 2004-09-20
US11172184 US7945619B1 (en) 2004-09-20 2005-06-29 Methods and apparatuses for reporting based on attention of a user during a collaboration session
US13090839 US20110196930A1 (en) 2004-09-20 2011-04-20 Methods and apparatuses for reporting based on attention of a user during a collaboration session

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13090839 US20110196930A1 (en) 2004-09-20 2011-04-20 Methods and apparatuses for reporting based on attention of a user during a collaboration session

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11172184 Continuation US7945619B1 (en) 2004-09-20 2005-06-29 Methods and apparatuses for reporting based on attention of a user during a collaboration session

Publications (1)

Publication Number Publication Date
US20110196930A1 true true US20110196930A1 (en) 2011-08-11

Family

ID=43981644

Family Applications (2)

Application Number Title Priority Date Filing Date
US11172184 Active 2028-02-16 US7945619B1 (en) 2004-09-20 2005-06-29 Methods and apparatuses for reporting based on attention of a user during a collaboration session
US13090839 Abandoned US20110196930A1 (en) 2004-09-20 2011-04-20 Methods and apparatuses for reporting based on attention of a user during a collaboration session

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11172184 Active 2028-02-16 US7945619B1 (en) 2004-09-20 2005-06-29 Methods and apparatuses for reporting based on attention of a user during a collaboration session

Country Status (1)

Country Link
US (2) US7945619B1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130100142A1 (en) * 2007-10-24 2013-04-25 Social Communications Company Interfacing with a spatial virtual communication environment
US20130132480A1 (en) * 2011-11-17 2013-05-23 Hitachi, Ltd. Event Data Processing Apparatus
US20160337213A1 (en) * 2015-05-15 2016-11-17 General Electric Company System and method for integrating collaboration modes
USRE46309E1 (en) 2007-10-24 2017-02-14 Sococo, Inc. Application sharing
US20170093993A1 (en) * 2015-09-24 2017-03-30 International Business Machines Corporation Determining and displaying user awareness of information
US9755966B2 (en) 2007-10-24 2017-09-05 Sococo, Inc. Routing virtual area based communications
US9762641B2 (en) 2007-10-24 2017-09-12 Sococo, Inc. Automated real-time data stream switching in a shared virtual area communication environment
US9853922B2 (en) 2012-02-24 2017-12-26 Sococo, Inc. Virtual area communications
US10003624B2 (en) 2009-01-15 2018-06-19 Sococo, Inc. Realtime communications and network browsing client
US10158689B2 (en) 2007-10-24 2018-12-18 Sococo, Inc. Realtime kernel

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8918458B2 (en) * 2005-04-20 2014-12-23 International Business Machines Corporation Utilizing group statistics for groups of participants in a human-to-human collaborative tool
US8281003B2 (en) * 2008-01-03 2012-10-02 International Business Machines Corporation Remote active window sensing and reporting feature
US20090210352A1 (en) * 2008-02-20 2009-08-20 Purplecomm, Inc., A Delaware Corporation Website presence marketplace
US8539057B2 (en) * 2008-02-20 2013-09-17 Purplecomm, Inc. Website presence
US9336527B2 (en) * 2008-02-20 2016-05-10 Purplecomm, Inc. Collaborative website presence
US8296374B2 (en) 2008-10-29 2012-10-23 International Business Machines Corporation Controlling the presence information of activity participants
US20100198742A1 (en) * 2009-02-03 2010-08-05 Purplecomm, Inc. Online Social Encountering
US8373741B2 (en) * 2009-11-20 2013-02-12 At&T Intellectual Property I, Lp Apparatus and method for collaborative network in an enterprise setting
US20110271208A1 (en) * 2010-04-30 2011-11-03 American Teleconferencing Services Ltd. Location-Aware Conferencing With Entertainment Options
US8655823B1 (en) * 2011-03-23 2014-02-18 Emc Corporation Event management system based on machine logic
JP6191248B2 (en) * 2013-06-04 2017-09-06 富士通株式会社 Information processing apparatus and an information processing program
US9477934B2 (en) 2013-07-16 2016-10-25 Sap Portals Israel Ltd. Enterprise collaboration content governance framework

Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6002397A (en) * 1997-09-30 1999-12-14 International Business Machines Corporation Window hatches in graphical user interface
US6148328A (en) * 1998-01-29 2000-11-14 International Business Machines Corp. Method and system for signaling presence of users in a networked environment
US6301246B1 (en) * 1998-08-17 2001-10-09 Siemens Information And Communication Networks, Inc. Silent monitoring in a voice-over-data-network environment
US20020049786A1 (en) * 2000-01-25 2002-04-25 Autodesk, Inc Collaboration framework
US6400392B1 (en) * 1995-04-11 2002-06-04 Matsushita Electric Industrial Co., Ltd. Video information adjusting apparatus, video information transmitting apparatus and video information receiving apparatus
US20020138624A1 (en) * 2001-03-21 2002-09-26 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Collaborative web browsing
US20020143916A1 (en) * 2000-05-11 2002-10-03 Dennis Mendiola Method and system for tracking the online status of active users of an internet-based instant messaging system
US20020143876A1 (en) * 2001-02-06 2002-10-03 Boyer David Gray Apparatus and method for use in collaboration services
US6519639B1 (en) * 1999-07-21 2003-02-11 Microsoft Corporation System and method for activity monitoring and reporting in a computer network
US20030055897A1 (en) * 2001-09-20 2003-03-20 International Business Machines Corporation Specifying monitored user participation in messaging sessions
US20030052911A1 (en) * 2001-09-20 2003-03-20 Koninklijke Philips Electronics N.V. User attention-based adaptation of quality level to improve the management of real-time multi-media content delivery and distribution
US20030101219A1 (en) * 2000-10-06 2003-05-29 Tetsujiro Kondo Communication system, communication device, seating-order determination device, communication method, recording medium, group-determination-table generating method, and group-determination-table generating device
US20030223182A1 (en) * 2002-05-29 2003-12-04 Kabushiki Kaisha Toshiba Information processing apparatus and window size control method used in the same unit
US6691162B1 (en) * 1999-09-21 2004-02-10 America Online, Inc. Monitoring users of a computer network
WO2004014059A2 (en) * 2002-08-02 2004-02-12 Collabo-Technology, Inc. Method and apparatus for processing image-based events in a meeting management system
US20040128359A1 (en) * 2000-03-16 2004-07-01 Horvitz Eric J Notification platform architecture
US6782350B1 (en) * 2001-04-27 2004-08-24 Blazent, Inc. Method and apparatus for managing resources
US20050055412A1 (en) * 2003-09-04 2005-03-10 International Business Machines Corporation Policy-based management of instant message windows
US20050099492A1 (en) * 2003-10-30 2005-05-12 Ati Technologies Inc. Activity controlled multimedia conferencing
US6917567B2 (en) * 2001-06-15 2005-07-12 Fujitsu Limited Coil driving system, information storage apparatus, and driving method
US20070100393A1 (en) * 2002-05-24 2007-05-03 Whitehurst Todd K Treatment of movement disorders by brain stimulation
US7373608B2 (en) * 2004-10-07 2008-05-13 International Business Machines Corporation Apparatus, system and method of providing feedback to an e-meeting presenter
US7386473B2 (en) * 1996-09-03 2008-06-10 Nielsen Media Research, Inc. Content display monitoring by a processing system
US7507091B1 (en) * 2003-05-19 2009-03-24 Microsoft Corporation Analyzing cognitive involvement

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6917587B1 (en) 2001-02-01 2005-07-12 Cisco Technology, Inc. Method and apparatus for recovering a call resource from a call session
US7139797B1 (en) * 2002-04-10 2006-11-21 Nortel Networks Limited Presence information based on media activity

Patent Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6400392B1 (en) * 1995-04-11 2002-06-04 Matsushita Electric Industrial Co., Ltd. Video information adjusting apparatus, video information transmitting apparatus and video information receiving apparatus
US7386473B2 (en) * 1996-09-03 2008-06-10 Nielsen Media Research, Inc. Content display monitoring by a processing system
US6002397A (en) * 1997-09-30 1999-12-14 International Business Machines Corporation Window hatches in graphical user interface
US6148328A (en) * 1998-01-29 2000-11-14 International Business Machines Corp. Method and system for signaling presence of users in a networked environment
US6301246B1 (en) * 1998-08-17 2001-10-09 Siemens Information And Communication Networks, Inc. Silent monitoring in a voice-over-data-network environment
US6519639B1 (en) * 1999-07-21 2003-02-11 Microsoft Corporation System and method for activity monitoring and reporting in a computer network
US6631412B1 (en) * 1999-07-21 2003-10-07 Microsoft Corporation System and method for activity monitoring and reporting in a computer network
US6691162B1 (en) * 1999-09-21 2004-02-10 America Online, Inc. Monitoring users of a computer network
US20020049786A1 (en) * 2000-01-25 2002-04-25 Autodesk, Inc Collaboration framework
US20040128359A1 (en) * 2000-03-16 2004-07-01 Horvitz Eric J Notification platform architecture
US20020143916A1 (en) * 2000-05-11 2002-10-03 Dennis Mendiola Method and system for tracking the online status of active users of an internet-based instant messaging system
US20030101219A1 (en) * 2000-10-06 2003-05-29 Tetsujiro Kondo Communication system, communication device, seating-order determination device, communication method, recording medium, group-determination-table generating method, and group-determination-table generating device
US20020143876A1 (en) * 2001-02-06 2002-10-03 Boyer David Gray Apparatus and method for use in collaboration services
US20020138624A1 (en) * 2001-03-21 2002-09-26 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Collaborative web browsing
US6782350B1 (en) * 2001-04-27 2004-08-24 Blazent, Inc. Method and apparatus for managing resources
US6917567B2 (en) * 2001-06-15 2005-07-12 Fujitsu Limited Coil driving system, information storage apparatus, and driving method
US20030052911A1 (en) * 2001-09-20 2003-03-20 Koninklijke Philips Electronics N.V. User attention-based adaptation of quality level to improve the management of real-time multi-media content delivery and distribution
US20030055897A1 (en) * 2001-09-20 2003-03-20 International Business Machines Corporation Specifying monitored user participation in messaging sessions
US20070100393A1 (en) * 2002-05-24 2007-05-03 Whitehurst Todd K Treatment of movement disorders by brain stimulation
US20030223182A1 (en) * 2002-05-29 2003-12-04 Kabushiki Kaisha Toshiba Information processing apparatus and window size control method used in the same unit
WO2004014059A2 (en) * 2002-08-02 2004-02-12 Collabo-Technology, Inc. Method and apparatus for processing image-based events in a meeting management system
US7507091B1 (en) * 2003-05-19 2009-03-24 Microsoft Corporation Analyzing cognitive involvement
US20050055412A1 (en) * 2003-09-04 2005-03-10 International Business Machines Corporation Policy-based management of instant message windows
US20050099492A1 (en) * 2003-10-30 2005-05-12 Ati Technologies Inc. Activity controlled multimedia conferencing
US7373608B2 (en) * 2004-10-07 2008-05-13 International Business Machines Corporation Apparatus, system and method of providing feedback to an e-meeting presenter

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE46309E1 (en) 2007-10-24 2017-02-14 Sococo, Inc. Application sharing
US20130104057A1 (en) * 2007-10-24 2013-04-25 Social Communications Company Interfacing with a spatial virtual communication environment
US9762641B2 (en) 2007-10-24 2017-09-12 Sococo, Inc. Automated real-time data stream switching in a shared virtual area communication environment
US9755966B2 (en) 2007-10-24 2017-09-05 Sococo, Inc. Routing virtual area based communications
US9411489B2 (en) * 2007-10-24 2016-08-09 Sococo, Inc. Interfacing with a spatial virtual communication environment
US9483157B2 (en) * 2007-10-24 2016-11-01 Sococo, Inc. Interfacing with a spatial virtual communication environment
US20130100142A1 (en) * 2007-10-24 2013-04-25 Social Communications Company Interfacing with a spatial virtual communication environment
US10158689B2 (en) 2007-10-24 2018-12-18 Sococo, Inc. Realtime kernel
US10003624B2 (en) 2009-01-15 2018-06-19 Sococo, Inc. Realtime communications and network browsing client
US9111242B2 (en) * 2011-11-17 2015-08-18 Hitachi, Ltd. Event data processing apparatus
US20130132480A1 (en) * 2011-11-17 2013-05-23 Hitachi, Ltd. Event Data Processing Apparatus
US9853922B2 (en) 2012-02-24 2017-12-26 Sococo, Inc. Virtual area communications
US20160337213A1 (en) * 2015-05-15 2016-11-17 General Electric Company System and method for integrating collaboration modes
US20170093993A1 (en) * 2015-09-24 2017-03-30 International Business Machines Corporation Determining and displaying user awareness of information

Also Published As

Publication number Publication date Type
US7945619B1 (en) 2011-05-17 grant

Similar Documents

Publication Publication Date Title
Tyler et al. When can I expect an email response? A study of rhythms in email usage
Khalil et al. Context-aware telephony: privacy preferences and sharing patterns
US7719975B2 (en) Method and system for communication session under conditions of bandwidth starvation
US7444379B2 (en) Method for automatically setting chat status based on user activity in local environment
US20090323916A1 (en) Notification to absent teleconference invitees
US20070239869A1 (en) User interface for user presence aggregated across multiple endpoints
US20050050151A1 (en) Scalable instant messaging architecture
US7062533B2 (en) Specifying monitored user participation in messaging sessions
US8103725B2 (en) Communication using delegates
US20110029622A1 (en) Systems and methods for group communications
US7502797B2 (en) Supervising monitoring and controlling activities performed on a client device
US20080320082A1 (en) Reporting participant attention level to presenter during a web-based rich-media conference
US20110161987A1 (en) Scaling notifications of events in a social networking system
US20100088246A1 (en) System for, and method of, managing a social network
US20060242232A1 (en) 2005-04-20 2006-10-26 Automatically limiting requests for additional chat sessions received by a particular user
US20140087687A1 (en) Method and system for collecting and presenting historical communication data for a mobile device
US20100049852A1 (en) Resource management of social network applications
US7894849B2 (en) Mobile personal services platform for providing feedback
US20030004773A1 (en) Scheduling system with methods for polling to determine best date and time
US8943145B1 (en) Customer support via social network
US20100138492A1 (en) Method and apparatus for multimedia collaboration using a social network system
US20090222330A1 (en) System and method for determining like-mindedness
US20070100939A1 (en) Method for improving attentiveness and participation levels in online collaborative operating environments
US6731308B1 (en) Mechanism for reciprocal awareness of intent to initiate and end interaction among remote users
US20070100986A1 (en) Methods for improving interactive online collaboration using user-defined sensory notification or user-defined wake-ups

Legal Events

Date Code Title Description
AS Assignment
Owner name: CISCO WEBEX LLC, DELAWARE
Free format text: CHANGE OF NAME;ASSIGNOR:WEBEX COMMUNICATIONS, INC.;REEL/FRAME:027033/0756
Effective date: 20091005

Owner name: CISCO TECHNOLOGY, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CISCO WEBEX LLC;REEL/FRAME:027033/0764
Effective date: 20111006