US20210367802A1 - Meeting summary generation - Google Patents

Meeting summary generation

Info

Publication number
US20210367802A1
US20210367802A1 (Application US17/308,640)
Authority
US
United States
Prior art keywords
meeting
participant
participants
content
during
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/308,640
Inventor
Krishna Yarlagadda
Harish Rajamani
Nava DAVULURI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huddl Inc
Original Assignee
Huddl Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huddl Inc filed Critical Huddl Inc
Priority to US17/308,640
Assigned to HUDDL Inc. — Assignment of assignors interest (see document for details). Assignors: YARLAGADDA, KRISHNA; RAJAMANI, HARISH; DAVULURI, NAVA
Publication of US20210367802A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/34Browsing; Visualisation therefor
    • G06F16/345Summarisation for human users
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/954Navigation, e.g. using categorised browsing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/109Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q10/1093Calendar-based scheduling for persons or groups
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/109Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q10/1097Time management, e.g. calendars, reminders, meetings or time accounting using calendar-based scheduling for task assignment
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/08Speech classification or search
    • G10L15/18Speech classification or search using natural language modelling
    • G10L15/1815Semantic context, e.g. disambiguation of the recognition hypotheses based on word meaning
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/26Speech to text systems
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • G10L25/57Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for processing of video signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1818Conference organisation arrangements, e.g. handling schedules, setting up parameters needed by nodes to attend a conference, booking network resources, notifying involved parties
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1831Tracking arrangements for later retrieval, e.g. recording contents, participants activities or behavior, network status
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/52User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/1066Session management
    • H04L65/1069Session establishment or de-establishment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/1066Session management
    • H04L65/1096Supplementary features, e.g. call forwarding or call holding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/401Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
    • H04L65/4015Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/535Tracking the activity of the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • H04N7/155Conference systems involving storage of or access to video conference sessions
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/403Arrangements for multi-party communication, e.g. for conferences
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8549Creating video summaries, e.g. movie trailer

Definitions

  • the presently disclosed embodiments are related, in general, to a summary generation of an online meeting. More particularly, the presently disclosed embodiments are related to a method and a system for generating a summary of one or more action points for each of a plurality of participants in the online meeting based on content consumed by the plurality of participants during the online meeting.
  • Meetings may be a common everyday occurrence especially for members of an organization. Groups of people may assemble often to discuss one or more determined topics. By way of example, there may be a status meeting, a budget meeting, a staff meeting, a product development meeting, a patent disclosure meeting, a board meeting, and the like. Meetings may be viewed by organizations as an important vehicle for facilitating communication amongst group members for the purpose of disseminating knowledge, problem solving, brainstorming and/or the like. Accordingly, many users of the organization may spend a large portion of their time in one or more meetings.
  • a method and a system for generating a summary of one or more action points for each of a plurality of participants in a meeting is provided substantially as shown in, and/or described in connection with, at least one of the figures, as set forth more completely in the claims.
  • FIG. 1 is a block diagram that illustrates a system environment for generating a summary of one or more action points for each of a plurality of participants in a meeting, in accordance with at least one exemplary embodiment of the disclosure
  • FIG. 2 is a block diagram that illustrates a central server configured to generate a summary of one or more action points for each of a plurality of participants in the meeting, in accordance with at least one exemplary embodiment of the disclosure;
  • FIG. 3 is a flowchart that illustrates a method for displaying the generated summary of one or more action points for each of a plurality of participants in the meeting, in accordance with at least one exemplary embodiment of the disclosure.
  • FIG. 4 is a block diagram of an exemplary computer system for generating a summary of one or more action points for each of a plurality of participants in the meeting, in accordance with various exemplary embodiments of the present disclosure.
  • the illustrated embodiments provide a method and a system for generating a summary of one or more action points for each of a plurality of participants in a first meeting.
  • the method may be implemented by a central server including one or more processors.
  • the method may include monitoring in real time, content consumed by the plurality of participants during the first meeting and for a first defined time before the first meeting.
  • the monitored content may be associated with each of the plurality of participants.
  • the method may include tracking one or more updates performed by the plurality of participants to the monitored content during and post the first meeting.
  • the method may include identifying a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants.
  • the method may include generating the summary of the one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting.
  • the method may include displaying the summary to each of the plurality of participants before a second defined time of the start time of the second meeting.
  • displaying the summary may include opening content for which the one or more updates may have been performed by the plurality of participants.
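Taken together, the claimed steps can be sketched as a minimal server-side flow. All function and field names below are illustrative assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Participant:
    name: str
    consumed_content: list = field(default_factory=list)  # content seen during/before the first meeting
    updates: list = field(default_factory=list)           # edits tracked during/after the first meeting

def generate_summary(participant: Participant) -> list:
    """Derive action points from the content a participant consumed (illustrative heuristic)."""
    return [f"Follow up on {c}" for c in participant.consumed_content]

def display_summaries(participants, second_meeting_start, lead_minutes=10):
    """Show each participant their summary `lead_minutes` before the second meeting starts.

    Times are minutes since midnight, for simplicity.
    """
    display_at = second_meeting_start - lead_minutes
    return {p.name: (display_at, generate_summary(p)) for p in participants}

participants = [Participant("A", ["budget.xlsx"]), Participant("B", ["roadmap.doc"])]
result = display_summaries(participants, second_meeting_start=600)
```

The "second defined time" of the claims corresponds to `lead_minutes` here; the patent leaves its value and units unspecified.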
  • FIG. 1 is a block diagram that illustrates a system environment for generating a summary of one or more action points for each of a plurality of participants in a meeting, in accordance with at least one exemplary embodiment of the disclosure.
  • the system environment 100 may include a plurality of electronic devices, such as 102 , 104 and 106 , which are associated with a plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , a communication network 108 , a database server 110 , and a central server 112 .
  • Each of the plurality of electronic devices 102 , 104 , and 106 that are associated with the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , may be communicatively coupled with the database server 110 , and the central server 112 , via the communication network 108 .
  • the plurality of electronic devices may refer to a computing device used by a participant who has joined an online meeting to collaboratively work with a remaining plurality of participants.
  • the plurality of electronic devices, such as electronic device 102 , 104 and 106 , may comprise one or more processors and one or more memories.
  • the one or more memories may include computer readable code that may be executable by the one or more processors to perform predetermined operations.
  • the plurality of electronic devices, such as electronic device 102 , 104 and 106 may present a user-interface to the participant for performing one or more interactions on the electronic device.
  • Examples of the plurality of electronic devices may include, but are not limited to, a personal computer, a laptop, a personal digital assistant (PDA), a mobile device, a tablet, or any other computing device.
  • the plurality of participants may be utilizing the electronic device 102 , the electronic device 104 and the electronic device 106 , respectively as shown in FIG. 1 .
  • the plurality of participants such as Participant A 102 a , Participant B 104 a , and Participant C 106 a may interact with the plurality of electronic devices, such as electronic device 102 , 104 and 106 by performing one or more interactions on the user-interface presented to each of the respective participants of the associated electronic device.
  • the communication network 108 may include a communication medium through which each of the plurality of electronic devices, such as 102 , 104 and 106 , the database server 110 , and the central server 112 may communicate with each other. Such a communication may be performed, in accordance with various wired and wireless communication protocols. Examples of such wired and wireless communication protocols include, but are not limited to, Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), ZigBee, EDGE, infrared (IR), IEEE 802.11, 802.16, 2G, 3G, 4G, 5G, 6G cellular communication protocols, and/or Bluetooth (BT) communication protocols.
  • the communication network 108 may include, but is not limited to, the Internet, a cloud network, a Wireless Fidelity (Wi-Fi) network, a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a telephone line (POTS), and/or a Metropolitan Area Network (MAN).
  • the system environment 100 may include a database server 110 .
  • the database server 110 may refer to a computing device that may be configured to store files associated with one or more applications installed on the electronic device.
  • the database server 110 may be configured to store information, such as, but not limited to, content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during a first meeting and for a first defined time before the first meeting, calendar information associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , meeting data, the one or more interactions, and/or the summary comprising of one or more action points.
  • the plurality of electronic devices may communicate with the database server 110 using one or more protocols such as, but not limited to, Open Database Connectivity (ODBC) protocol and Java Database Connectivity (JDBC) protocol.
  • the database server 110 may include a special purpose operating system specifically configured to perform one or more database operations on at least one of the meeting data, the calendar information, the summary, the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , and the like. Examples of database operations may include, but are not limited to, Select, Insert, Update, and Delete.
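The four database operations named above map directly onto SQL statements. A minimal sketch against an in-memory SQLite store (the table and column names are assumptions; the patent does not specify a schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical table for content consumed during a meeting.
cur.execute("CREATE TABLE consumed_content (participant TEXT, item TEXT)")

# Insert: record that Participant A consumed a file.
cur.execute("INSERT INTO consumed_content VALUES (?, ?)", ("Participant A", "budget.xlsx"))

# Update: correct the item name.
cur.execute("UPDATE consumed_content SET item = ? WHERE participant = ?",
            ("budget_v2.xlsx", "Participant A"))

# Select: read back what Participant A consumed.
rows = cur.execute("SELECT item FROM consumed_content WHERE participant = ?",
                   ("Participant A",)).fetchall()

# Delete: purge the record after summarization.
cur.execute("DELETE FROM consumed_content WHERE participant = ?", ("Participant A",))
remaining = cur.execute("SELECT COUNT(*) FROM consumed_content").fetchone()[0]
```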
  • the database server 110 may include hardware that may be configured to perform one or more predetermined operations.
  • the database server 110 may be realized through various technologies such as, but not limited to, Microsoft® SQL Server, Oracle®, IBM DB2®, Microsoft Access®, PostgreSQL®, MySQL® and SQLite®, and the like.
  • the scope of the disclosure is not limited to realizing the plurality of electronic devices, such as electronic device 102 , 104 and 106 and the database server 110 as separate entities.
  • the database server 110 may be realized as an application program installed on and/or running on the electronic device without departing from the scope of the disclosure.
  • the central server 112 may refer to a computing device or a software framework hosting an application or a software service.
  • the central server 112 may be implemented to execute procedures such as, but not limited to, programs, routines, or scripts stored in one or more memories for supporting the hosted application or the software service.
  • the hosted application or the software service may be configured to perform one or more predetermined operations.
  • the central server 112 may be realized through various types of application servers such as, but not limited to, a Java application server, a .NET framework application server, a Base4 application server, a PHP framework application server, or any other application server framework.
  • the “meeting” mentioned in the disclosure herein refers to an online meeting conducted via one or more video conferencing tools.
  • the “meeting” may involve the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a interacting with each other and/or discussing one or more topics.
  • the meeting may be organized face to face or on a virtual platform over the communication network 108 .
  • Meeting on the virtual platform may involve the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , joining a meeting session (created by the central server 112 in the communication network 108 ) using the respective plurality of electronic devices, such as 102 , 104 and 106 .
  • each meeting may have an associated metadata.
  • the metadata may include at least a type of the meeting. Examples of types of meetings may include Status Update Meetings, Decision-Making Meetings, Problem-Solving Meetings, Team-Building Meetings, Idea-Sharing Meetings, Innovation Meetings, and the like.
  • the central server 112 may be configured to generate the meeting data.
  • the meeting data may include data associated with the one or more interactions performed by the participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , during the meeting.
  • the one or more interactions may include events performed by the participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , by using the electronic devices or one or more device peripherals attached to the electronic devices.
  • the events may comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
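The interaction events enumerated above could be represented as a small tagged record; the enum values and field names here are illustrative, not from the patent:

```python
from dataclasses import dataclass
from enum import Enum

class EventType(Enum):
    MOUSE_CLICK = "mouse_click"
    SCROLL = "scroll"
    TYPING = "typing"
    HOVER = "hover"

@dataclass
class InteractionEvent:
    participant: str
    event_type: EventType
    target: str        # e.g. the file or UI element the event applies to
    timestamp: float   # seconds since meeting start

def events_for(events, participant):
    """Filter a meeting's event stream down to one participant's interactions."""
    return [e for e in events if e.participant == participant]

log = [
    InteractionEvent("Participant A", EventType.MOUSE_CLICK, "slides.pptx", 12.5),
    InteractionEvent("Participant B", EventType.TYPING, "notes.txt", 40.0),
]
```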
  • the first meeting may correspond to a meeting that may involve the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a interacting with each other and/or discussing the one or more topics.
  • the second meeting may correspond to a meeting that may involve the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a interacting with each other and/or discussing the one or more topics.
  • the second meeting may have the same plurality of participants as that of the first meeting and may also include one or more new participants.
  • the agenda of either the first meeting or the second meeting may form a subset of the other's agenda. In other words, one or more points in the agenda of the first meeting may also be present in the agenda of the second meeting and thus, the second meeting may be scheduled in continuation to the first meeting at a later point in time.
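The subset relationship between the two agendas can be checked mechanically. A sketch, treating each agenda as a set of topic strings (an assumption about representation; the patent does not fix one):

```python
def is_continuation(first_agenda, second_agenda):
    """True if either meeting's agenda is a subset of the other's,
    i.e. the second meeting plausibly continues the first."""
    a, b = set(first_agenda), set(second_agenda)
    return a <= b or b <= a

first = {"budget review", "Q3 roadmap"}
second = {"budget review", "Q3 roadmap", "hiring plan"}
```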
  • the meeting may be conducted on a video conferencing tool.
  • Video conferencing tools may enable online communication for audio meetings, video meetings, and seminars between the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • the video conferencing tools may have one or more built-in features such as chat, screen sharing, recording, and the like.
  • the video conferencing tools used for hosting the meeting may help to enhance collaboration among employees in an organization.
  • the employees may host or attend virtual meetings with fellow employees, company partners, and/or customers. Examples of such video conferencing tools that may be utilized to conduct the meeting may include Skype®, Zoom®, Microsoft Teams®, Cisco Webex Meetings®, and the like.
  • the central server 112 may be configured to create a meeting session.
  • the central server 112 may receive a request from an electronic device, such as 102 associated with a participant A 102 a such as an organizer of the first meeting, to create the meeting session.
  • the organizer of the first meeting may be one of the participants, such as Participant A 102 a .
  • Such meeting joining information associated with the first meeting may include at least an agenda of the first meeting, one or more topics to be discussed during the first meeting, a time duration of the first meeting, a schedule of the first meeting, meeting notes carried forward from previous meetings, and/or the like.
  • the central server 112 may be configured to create the meeting session. Additionally, the central server 112 may share the meeting joining information with the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a.
  • the plurality of electronic devices may enable the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a to join the first meeting based on the received meeting joining information. Further, during the first meeting, the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , may speak or share their respective video feeds. Additionally, or alternatively, the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , may share other content amongst each other in order to facilitate the discussions in the first meeting. The other content may include, but is not limited to, presentation content, screen sharing content, file sharing content, and/or the like.
  • each of the plurality of electronic devices may enable the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , to consume the content shared during the first meeting. Further, each of the plurality of electronic devices, such as 102 , 104 and 106 may enable the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , to perform the one or more interactions on the content being consumed.
  • the meeting data comprises data associated with the one or more interactions performed by the participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting.
  • the one or more interactions may include events performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a by using the plurality of electronic devices, such as 102 , 104 and 106 or one or more device peripherals attached to the plurality of electronic devices, such as 102 , 104 and 106 .
  • the events may include at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
  • each of the plurality of electronic devices may transmit the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting to the central server 112 .
  • the consumed content may include at least one of one or more applications, one or more files, one or more libraries, audio content, video content, meeting notes, presentation content, screen sharing content, and/or file sharing content that may be accessed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting.
  • the content may be associated with a plurality of attributes comprising an author name, a last modified time, a last modified date, a last modified author, a version number, a title, a category, a total editing time, a file size, a creation date, a last accessed date, a last accessed author name, a sharing status, an online status, and the like.
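The attribute list above amounts to document metadata. One way to model it; the field names mirror the attributes named in the text, while the types and defaults are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContentAttributes:
    author_name: str
    title: str
    category: str
    version_number: str
    file_size: int                  # bytes
    creation_date: str              # ISO 8601 date
    last_modified_date: str
    last_modified_time: str
    last_modified_author: str
    last_accessed_date: Optional[str] = None
    last_accessed_author_name: Optional[str] = None
    total_editing_time: int = 0     # minutes
    sharing_status: str = "private"
    online_status: bool = False

doc = ContentAttributes(
    author_name="Participant A", title="Q3 budget", category="spreadsheet",
    version_number="1.2", file_size=48_213, creation_date="2021-05-01",
    last_modified_date="2021-05-04", last_modified_time="14:02",
    last_modified_author="Participant B",
)
```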
  • the central server 112 may be configured to receive the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting.
  • the first defined time may be user configurable.
  • the central server 112 may monitor in real time, the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting.
  • the monitored content may be associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a.
  • the central server 112 may be configured to track one or more updates performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a to the monitored content during and post the first meeting.
  • the tracked one or more updates may be stored in the central server 112 or alternatively may be transmitted to the database server 110 for storing.
  • the central server 112 may be configured to identify a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • the first meeting and the second meeting may have the same plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , and the agenda of either of the first meeting and the second meeting may form a subset of the other.
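The continuation test described above (same participants, one agenda a subset of the other) can be expressed as a small predicate. The dictionary shape and field names here are hypothetical; the disclosure does not define a concrete calendar-record format.

```python
from datetime import datetime

def is_continuation(first: dict, second: dict) -> bool:
    """A second meeting continues the first when both share the same
    participants and either agenda is a subset of the other.
    (Hypothetical record shape: 'participants', 'agenda', 'start'.)"""
    same_participants = set(first["participants"]) == set(second["participants"])
    a, b = set(first["agenda"]), set(second["agenda"])
    agenda_overlap = a <= b or b <= a
    return same_participants and agenda_overlap

# Illustrative calendar entries for the two meetings in the example.
first = {"participants": ["A", "B", "C"],
         "agenda": ["status update", "hiring"],
         "start": datetime(2020, 12, 30, 11)}
second = {"participants": ["A", "B", "C"],
          "agenda": ["status update"],
          "start": datetime(2021, 1, 1, 11)}

if is_continuation(first, second):
    # The identified start time of the follow-up meeting.
    second_start = second["start"]
```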
  • the central server 112 may be configured to generate the summary of one or more action points for each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a based on the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting.
  • the central server 112 in order to generate the summary of the one or more action points, may be configured to generate the meeting data associated with each participant of the first meeting.
  • the meeting data may include data associated with one or more interactions performed by each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting.
  • the one or more interactions may include events performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , by using the electronic device or one or more device peripherals attached to the electronic device.
  • the events may include at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
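The four event types named above can be represented as an enumeration, with each interaction tagged by participant, target, and time. The `InteractionEvent` structure is an illustrative assumption, not a schema from the disclosure.

```python
from enum import Enum, auto
from dataclasses import dataclass
from datetime import datetime

class EventType(Enum):
    """The interaction events named in the disclosure."""
    MOUSE_CLICK = auto()
    SCROLL = auto()
    TYPING = auto()
    HOVER = auto()

@dataclass
class InteractionEvent:
    participant: str
    event_type: EventType
    target: str          # e.g. the file or application acted on
    timestamp: datetime

# Hypothetical event: Participant A hovers over File 1 during the meeting.
event = InteractionEvent("Participant A", EventType.HOVER, "File 1",
                         datetime(2020, 12, 30, 11, 5))
```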
  • the meeting data may further include a transcript of the audio generated during the first meeting, the content shared during the first meeting, and/or meeting notes input by each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a.
  • the central server 112 may be configured to train a machine learning (ML) model associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a based on the meeting data associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a , respectively.
  • the central server 112 may be configured to further train the ML model based on metadata associated with the first meeting.
  • the metadata associated with the first meeting may include at least a type of the first meeting. Examples of types of meetings may include Status Update Meetings, Decision-Making Meetings, Problem-Solving Meetings, Team-Building Meetings, Idea-Sharing Meetings, Innovation Meetings, and the like.
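The meeting types listed above can serve as categorical metadata fed to the per-participant model. A minimal sketch, assuming a simple enumeration; the disclosure names the types but does not fix a representation.

```python
from enum import Enum

class MeetingType(Enum):
    """Meeting types named in the disclosure, usable as model metadata."""
    STATUS_UPDATE = "Status Update Meeting"
    DECISION_MAKING = "Decision-Making Meeting"
    PROBLEM_SOLVING = "Problem-Solving Meeting"
    TEAM_BUILDING = "Team-Building Meeting"
    IDEA_SHARING = "Idea-Sharing Meeting"
    INNOVATION = "Innovation Meeting"

# Hypothetical metadata for the first meeting in the running example.
metadata = {"meeting_type": MeetingType.STATUS_UPDATE}
```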
  • the central server 112 may be configured to determine the one or more action points to be performed by each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a before the start time of the second meeting based on the trained ML model. Further, the central server 112 may be configured to display the summary including the one or more action points to each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a before a second defined time of the start time of the second meeting.
  • displaying the summary may include opening content for which the one or more updates may have been performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • the second defined time may be user configurable.
  • the central server 112 may be configured to transmit the generated summary of the one or more action points to each of the plurality of electronic devices, such as electronic device 102 , 104 and 106 . Further, in an exemplary embodiment, the central server 112 may be configured to open and/or initiate content on each of the plurality of electronic devices, such as electronic device 102 , 104 and 106 before the second defined time of the start time of the second meeting.
  • the content that may be opened and/or initiated may correspond to at least one of the one or more applications, the one or more files, the one or more libraries, the audio content, the video content, the meeting notes, the presentation content, the screen sharing content, and/or the file sharing content that was accessed by the plurality of participants during the first meeting and to which the one or more updates may have been performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a.
  • FIG. 2 is a block diagram that illustrates a central server configured to generate a summary of one or more action points for each of a plurality of participants in the meeting, in accordance with at least one exemplary embodiment of the disclosure.
  • FIG. 2 has been explained in conjunction with the elements of FIG. 1 .
  • the central server 112 which may include a processor 202 , a non-transitory computer readable medium 203 , a memory 204 , a transceiver 206 , an input/output unit 208 , a monitoring unit 210 , and a summary generation unit 212 .
  • the processor 202 may be communicatively coupled to the non-transitory computer readable medium 203 , the memory 204 , the transceiver 206 , the input/output unit 208 , the monitoring unit 210 , and the summary generation unit 212 and may operate in conjunction with each other to generate the summary. Further, the transceiver 206 may be communicatively coupled to the communication network 108 .
  • the processor 202 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to execute a set of instructions stored in the memory 204 .
  • the processor 202 may be implemented based on several processor technologies known in the art.
  • the processor 202 operates in coordination with the non-transitory computer readable medium 203 , the transceiver 206 , the input/output unit 208 , the monitoring unit 210 , and the summary generation unit 212 to generate the summary.
  • Examples of the processor 202 include, but are not limited to, an X86-based processor, a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other processors.
  • the non-transitory computer readable medium 203 may include any tangible or non-transitory storage media or memory media such as electronic, magnetic, or optical media—e.g., disk or CD/DVD-ROM coupled to processor 202 .
  • the memory 204 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to store the set of instructions, which are executed by the processor 202 .
  • the memory 204 may be configured to store one or more programs, routines, or scripts that are executed in coordination with the processor 202 .
  • the memory 204 may be implemented based on a Random Access Memory (RAM), a Read-Only Memory (ROM), a Hard Disk Drive (HDD), a storage server, and/or a Secure Digital (SD) card.
  • the transceiver 206 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to receive the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting, via the communication network 108 .
  • the transceiver 206 may be further configured to receive the meeting data and the metadata associated with the first meeting.
  • the transceiver 206 may be further configured to transmit the generated summary to each of the plurality of electronic devices, such as electronic device 102 , 104 and 106 .
  • the transceiver 206 may implement one or more known technologies to support wired or wireless communication with the communication network 108 .
  • the transceiver 206 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a Universal Serial Bus (USB) device, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and/or a local buffer.
  • the transceiver 206 may communicate via wireless communication with networks, such as the Internet, an Intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN).
  • the wireless communication may use any of a plurality of communication standards, protocols and technologies, such as: Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS).
  • the input/output unit 208 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to display the generated summary.
  • the input/output unit 208 comprises various input and output devices that are configured to communicate with the processor 202 .
  • Examples of the input devices include, but are not limited to, a keyboard, a mouse, a joystick, a touch screen, a microphone, a camera, and/or a docking station.
  • Examples of the output devices include, but are not limited to, a display screen and/or a speaker.
  • the display screen may be configured to display the generated summary.
  • the monitoring unit 210 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to monitor in real time, the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting.
  • the monitoring unit 210 may be further configured to track the one or more updates performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a to the monitored content during and post the first meeting.
  • the monitoring unit 210 may be further configured to identify the start time of the second meeting scheduled in continuation to the first meeting based on the calendar information associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a.
  • the summary generation unit 212 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to generate the summary of one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting.
  • the summary generation unit 212 may be configured to generate the meeting data associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • the summary generation unit 212 may be configured to train the ML model associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a based on the meeting data.
  • the plurality of electronic devices may enable the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a to connect to the first meeting for working collaboratively and to discuss one or more topics during the first meeting.
  • the first meeting (scheduled on Dec. 30, 2020 from 11:00 am to 12 pm) may be of type “Status Update Meeting” and the agenda of the meeting may include the below points:
  • the plurality of electronic devices may enable the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a to consume content during the first meeting.
  • the content may include at least one of the one or more applications, the one or more files, the one or more libraries, the audio content, the video content, the meeting notes, the presentation content, the screen sharing content, and/or the file sharing content that may be accessed by the plurality of participants during the first meeting and post the first meeting.
  • the content may be associated with the plurality of attributes comprising the author name, the last modified time, the last modified date, the last modified author, the version number, the title, the category, the total editing time, the file size, the creation date, the last accessed date, the last accessed author name, the sharing status, and the online status.
  • the Participant A 102 a may have opened a File 1. Further, Participant B 104 a may have said that "Participant A 102 a needs to update the File 1 before the next meeting to include the new software features". Additionally, Participant C 106 a may have initiated an Application 1 via a screen sharing session during the first meeting to explain one or more new features that need to be included in the software before the release. Further, Participant A 102 a may have said that "Let us discuss hiring of developers in the next meeting. Participant C 106 a must update the File 2 containing details of eligible candidates". Additionally, Participant B 104 a may have accessed an Application 2 within the first defined time before the first meeting. In an example, File 1 may have the below mentioned plurality of attributes.
  • each of the plurality of electronic devices may enable each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a to perform the one or more interactions on the content being consumed.
  • the one or more interactions may include events performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a by using the electronic device or one or more device peripherals attached to the electronic device.
  • the events may include at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
  • the meeting data comprises data associated with the one or more interactions performed by plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting.
  • each of the plurality of electronic devices such as electronic device 102 , 104 and 106 may be configured to transmit in real time the consumed content and the meeting data associated with the first meeting to the central server 112 .
  • the transceiver 206 may be configured to receive in real time the meeting data and the consumed content by plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting.
  • the first defined time may be user configurable. For example, the user configurable time may be 15 minutes before the scheduled time of the meeting.
  • the monitoring unit 210 may be configured to monitor the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting.
  • the monitored content is associated with each of the plurality of participants. For example, continuing with the previous example, the monitoring unit 210 may monitor and determine that File 1 was opened by Participant A 102 a and the Participant A 102 a may have performed a hover event. Further, the monitoring unit 210 may monitor and determine that the screen sharing session was initiated by the Participant C 106 a and the Application 1 was initiated during the meeting.
  • the monitoring unit 210 may monitor and determine that Participant B 104 a may have accessed an Application 2 ten minutes (the first defined time) before the first meeting. Additionally, the monitoring unit 210 may monitor and determine the audio of Participant A 102 a and also the click event performed by the Participant A 102 a on File 2.
  • the monitoring unit 210 may be configured to track the one or more updates performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a to the monitored content during the first meeting and post the first meeting.
  • Participant A 102 a may have made edits to File 2 during the meeting and the Participant C 106 a may have made one or more updates to the one or more libraries post the first meeting to include new software features.
  • the monitoring unit 210 may be configured to identify the start time of the second meeting scheduled in continuation to the first meeting based on the calendar information associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • calendar information for each of the Participant A 102 a , Participant B 104 a , and Participant C 106 a may show a follow-up "Status Update Meeting" with an agenda similar to that of the first meeting.
  • the first meeting and the second meeting may have the same plurality of participants, and the agenda of either of the first meeting and the second meeting may form a subset of the other.
  • the second meeting may be scheduled on Jan. 1, 2021 from 11:00 am to 12:00 pm and may include the same plurality of participants, i.e., Participant A 102 a , Participant B 104 a , and Participant C 106 a , and the agenda of the second meeting may be as below:
  • the summary generation unit 212 may be configured to generate the summary of the one or more action points for each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a based on the meeting data and the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting and for the first defined time before the first meeting.
  • the summary generation unit 212 may be configured to generate the meeting data associated with each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a of the first meeting.
  • the meeting data may include data associated with the one or more interactions performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting.
  • the meeting data may further include a transcript of the audio generated by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the first meeting, a transcript of the content shared by the participants during the first meeting, and/or meeting notes input by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a during the meeting.
  • the summary generation unit 212 may be configured to train the machine learning (ML) model associated with the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a based on the meeting data. Additionally, the summary generation unit 212 may be configured to train the ML model based on metadata associated with the first meeting. In an exemplary embodiment, the metadata associated with the first meeting may include at least a type of the first meeting.
  • the one or more interactions performed by each participant and the content consumed by each participant may be monitored and tracked to create the machine learning model associated with each participant.
  • the machine learning model may comprise one or more graphs including one or more nodes.
  • each of the one or more nodes may represent the content consumed by the participant and traversal amongst such one or more nodes may be based on the meeting data.
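The graph structure described above, where nodes are pieces of consumed content and traversal follows the meeting data, can be sketched with a plain adjacency map. This is only an illustration under that reading; the disclosure does not fix a concrete graph representation, and the class and method names are assumptions.

```python
from collections import defaultdict

class ParticipantContentGraph:
    """Per-participant graph: each node is a piece of consumed content,
    and each edge records that the participant moved from one piece of
    content to the next during the meeting (an illustrative sketch)."""

    def __init__(self) -> None:
        # Maps a content node to the list of nodes visited next.
        self.edges: defaultdict[str, list[str]] = defaultdict(list)

    def record_transition(self, from_content: str, to_content: str) -> None:
        """Record one traversal step derived from the meeting data."""
        self.edges[from_content].append(to_content)

    def next_content(self, content: str) -> list[str]:
        """Return the content nodes reachable in one step."""
        return self.edges.get(content, [])

# Hypothetical traversal for Participant C in the running example.
graph = ParticipantContentGraph()
graph.record_transition("File 1", "Application 1")
graph.record_transition("Application 1", "File 2")
```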
  • the ML model may be trained using one of: one or more supervised learning techniques, one or more semi-supervised learning techniques, or one or more unsupervised learning techniques.
  • the ML model may be trained based on the type of the meeting. Examples of types of meetings may include Status Update Meetings, Decision-Making Meetings, Problem-Solving Meetings, Team-Building Meetings, Idea-Sharing Meetings, Innovation Meetings, and the like.
  • the summary generation unit 212 may be configured to determine the one or more action points to be performed by the participant before the start time of the second meeting based on the trained ML model. In an exemplary embodiment, the summary generation unit 212 may be configured to train the ML model based on the tracked one or more updates associated with the consumed content and the meeting data.
  • the monitoring unit 210 may be configured to track one or more updates made by the Participant A 102 a to File 1 and determine changes in the plurality of attributes associated with File 1 before the second defined time of the start time of the second meeting. Further, the monitoring unit 210 may be configured to track one or more updates made by the Participant C 106 a to File 2 and determine one or more changes in the plurality of attributes associated with File 2 before the second defined time of the start time of the second meeting.
  • the Participant A 102 a may have made one or more edits to the File 1 to include new software features. Further, Participant C 106 a may have made one or more edits to the File 2 to include details of one or more eligible candidates for hiring in the development team. Such one or more updates may be tracked by the monitoring unit 210 and the plurality of attributes associated with each of the File 1 and File 2 may be changed. In an example, File 1 may have the below mentioned updated plurality of attributes.
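Determining which attributes changed between two snapshots of a file's metadata, as the monitoring unit does above, amounts to a simple dictionary diff. A minimal sketch; the snapshot keys and values below are illustrative assumptions.

```python
def attribute_changes(before: dict, after: dict) -> dict:
    """Return the attributes whose values changed between two
    snapshots of a file's metadata, as (old, new) pairs."""
    return {key: (before[key], after[key])
            for key in before
            if key in after and before[key] != after[key]}

# Hypothetical snapshots of File 1 taken before and after the edits.
before = {"version_number": "1.0",
          "last_modified_author": "Participant A",
          "file_size": 24_576}
after = {"version_number": "1.1",
         "last_modified_author": "Participant A",
         "file_size": 25_104}

changes = attribute_changes(before, after)
# changes == {'version_number': ('1.0', '1.1'), 'file_size': (24576, 25104)}
```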
  • the summary generation unit 212 may be configured to generate the summary of the one or more action points for each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting.
  • the summary generation unit 212 may be configured to generate the summary of one or more action points for plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • the generated summary of one or more action points may include the below points:
  • the input/output unit 208 may be configured to display the summary to each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a before the second defined time (15 minutes) of the start time of the second meeting.
  • displaying the summary may include opening content (one or more files, or one or more applications) for which the one or more updates have been performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • the second defined time may be user configurable.
  • the summary generation unit 212 may be configured to display the one or more updates performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a and may also display the change in the plurality of attributes associated with the content that was consumed during and post the first meeting before the second defined time of the start time of the second meeting.
  • the transceiver 206 may be configured to transmit the generated summary of the one or more action points to each of the plurality of electronic devices, such as electronic device 102 , 104 and 106 .
  • the plurality of electronic devices, such as electronic device 102 , 104 and 106 may be configured to display the summary to each of the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a before the second defined time (15 minutes) of the start time of the second meeting.
  • displaying the summary may include opening content (one or more files, or one or more applications) for which the one or more updates have been performed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • the second defined time may be user configurable.
  • the summary may be displayed as a pop-up to the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a .
  • the summary generation unit 212 may automatically understand the content consumed by the plurality of participants, such as Participant A 102 a , Participant B 104 a , and Participant C 106 a in the first meeting and provide the summary before the start of the second meeting to enable the participants to work in a more collaborative and effective manner.
  • FIG. 3 is a flowchart that illustrates a method for displaying the generated summary of one or more action points for each of a plurality of participants in the meeting, in accordance with at least one exemplary embodiment of the disclosure.
  • the method 300 starts at 302 and proceeds to 304 .
  • a central server 112 may be configured to monitor in real time, content consumed by a plurality of participants during a first meeting and for a first defined time before the first meeting.
  • the central server 112 may be configured to track one or more updates performed by the plurality of participants to the monitored content during and post the first meeting.
  • the central server 112 may be configured to identify a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants.
  • the central server 112 may be configured to generate a summary of one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting.
  • the central server 112 may be configured to display the summary to each of the plurality of participants before a second defined time of the start time of the second meeting.
  • displaying the summary may comprise opening content for which the one or more updates may have been performed by the plurality of participants. Control passes to end operation 314 .
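The flow of the method 300 described above (monitor, track, identify, generate, display) can be condensed into one sequential sketch. Everything here is illustrative: the input shape, helper logic, and the 15-minute default for the second defined time are assumptions taken from the running example, not a prescribed implementation.

```python
from datetime import datetime, timedelta

def run_summary_pipeline(meeting_data: dict,
                         second_meeting_start: datetime,
                         second_defined_time: timedelta = timedelta(minutes=15)) -> dict:
    """Sequential sketch of the method: monitor consumed content,
    track updates, then generate action points and a display time
    before the second defined time of the second meeting's start."""
    # Monitor: content each participant consumed during/before the meeting.
    consumed = meeting_data["consumed_content"]
    # Track: updates performed during and post the first meeting,
    # as hypothetical (participant, content) pairs.
    updates = meeting_data["updates"]
    # Generate: one action point per tracked update (illustrative rule).
    action_points = [f"{who}: review updates to {what}" for who, what in updates]
    # Display: schedule the summary before the second defined time,
    # and open the content that was updated.
    display_at = second_meeting_start - second_defined_time
    return {"consumed_content": consumed,
            "action_points": action_points,
            "open_content": sorted({what for _, what in updates}),
            "display_at": display_at}

# Hypothetical inputs matching the running example.
summary = run_summary_pipeline(
    {"consumed_content": ["File 1", "File 2", "Application 1"],
     "updates": [("Participant A", "File 1"), ("Participant C", "File 2")]},
    second_meeting_start=datetime(2021, 1, 1, 11, 0),
)
```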
  • FIG. 4 is a block diagram of an exemplary computer system for generating and displaying the summary of one or more action points, in accordance with various exemplary embodiments of the present disclosure.
  • an exemplary computer system 401 which may comprise a central processing unit (“CPU” or “processor”) 402 , an I/O interface 403 , an input device 404 , an output device 405 , a transceiver 406 , a network interface 407 , a communication network 408 , devices, such as 409 , 410 and 411 , storage interface 412 , one or more memory devices, such as RAM 413 , ROM 414 , and memory device 415 .
  • Computer system 401 may be used for generating and displaying the summary.
  • the computer system 401 may comprise a central processing unit (“CPU” or “processor”) 402 .
  • Processor 402 may comprise at least one data processor for executing program components for executing user- or system-generated requests.
  • a user may include a person, a person using a device such as those included in this disclosure, or such a device itself.
  • the processor 402 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
  • the processor 402 may include a microprocessor, such as AMD Athlon, Duron or Opteron, ARM's application, embedded or secure processors, IBM PowerPC, Intel's Core, Itanium, Xeon, Celeron or other line of processors, etc.
  • the processor 402 may be implemented using mainframe, distributed processor, multi-core, parallel, grid, or other architectures. Some exemplary embodiments may utilize embedded technologies like application-specific integrated circuits (ASICs), digital signal processors (DSPs), Field Programmable Gate Arrays (FPGAs), etc.
  • the processor 402 may be disposed in communication with one or more input/output (I/O) devices via I/O interface 403 .
  • the I/O interface 403 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.n/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
  • the computer system 401 may communicate with one or more I/O devices.
  • the input device 404 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, sensor (e.g., accelerometer, light sensor, GPS, gyroscope, proximity sensor, or the like), stylus, scanner, storage device, transceiver, video device/source, visors, etc.
  • Output device 405 may be a printer, fax machine, video display (e.g., cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED), plasma, or the like), audio speaker, etc.
  • a transceiver 406 may be disposed in connection with the processor 402 . The transceiver may facilitate various types of wireless transmission or reception.
  • the transceiver may include an antenna operatively connected to a transceiver chip (e.g., Texas Instruments WiLink WL1283, Broadcom BCM4750IUB8, Infineon Technologies X-Gold 618-PMB9800, or the like), providing IEEE 802.11a/b/g/n, Bluetooth, FM, global positioning system (GPS), 2G/3G HSDPA/HSUPA communications, etc.
  • the processor 402 may be disposed in communication with a communication network 408 via a network interface 407 .
  • the network interface 407 may communicate with the communication network 408 .
  • the network interface may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc.
  • the communication network 408 may include, for example, a direct interconnection, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, etc.
  • the computer system 401 may communicate with devices 409 , 410 , and 411 .
  • These devices may include, without limitation, personal computer(s), server(s), fax machines, printers, scanners, various mobile devices such as cellular telephones, smartphones (e.g., Apple iPhone®, Blackberry®, Android®-based phones, etc.), tablet computers, eBook readers (Amazon Kindle®, Nook, etc.), laptop computers, notebooks, gaming consoles (Microsoft Xbox®, Nintendo DS®, Sony PlayStation®, etc.), or the like.
  • the computer system 401 may itself embody one or more of these devices.
  • the processor 402 may be disposed in communication with one or more memory devices (e.g., RAM 413 , ROM 414 , etc.) via a storage interface 412 .
  • the storage interface may connect to memory devices including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as serial advanced technology attachment (SATA), integrated drive electronics (IDE), IEEE-1394, universal serial bus (USB), fiber channel, small computer systems interface (SCSI), etc.
  • the memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, redundant array of independent discs (RAID), solid-state memory devices, solid-state drives, etc.
  • the memory devices may store a collection of program or database components, including, without limitation, an operating system 416 , user interface application 417 , web browser 418 , mail server 419 , mail client 420 , user/application data 421 (e.g., any data variables or data records discussed in this disclosure), etc.
  • the operating system 416 may facilitate resource management and operation of the computer system 401 .
  • Operating systems include, without limitation, Apple Macintosh OS X, UNIX, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD, NetBSD, OpenBSD, etc.), Linux distributions (e.g., Red Hat, Ubuntu, Kubuntu, etc.), IBM OS/2, Microsoft Windows (XP, Vista/7/8, etc.), Apple iOS, Google Android, Blackberry OS, or the like.
  • User interface 417 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities.
  • user interfaces may provide computer interaction interface elements on a display system operatively connected to the computer system 401 , such as cursors, icons, check boxes, menus, scrollers, windows, widgets, etc.
  • Graphical user interfaces (GUIs) may be employed, including, without limitation, Apple Macintosh operating systems' Aqua, IBM OS/2, Microsoft Windows (e.g., Aero, Metro, etc.), Unix X-Windows, web interface libraries (e.g., ActiveX, Java, JavaScript, AJAX, HTML, Adobe Flash, etc.), or the like.
  • the computer system 401 may implement a web browser 418 stored program component.
  • the web browser may be a hypertext viewing application, such as Microsoft Internet Explorer, Google Chrome, Mozilla Firefox, Apple Safari, etc. Secure web browsing may be provided using HTTPS (secure hypertext transport protocol), secure sockets layer (SSL), Transport Layer Security (TLS), etc. Web browsers may utilize facilities such as AJAX, DHTML, Adobe Flash, JavaScript, Java, application programming interfaces (APIs), etc.
  • the computer system 401 may implement a mail server 419 stored program component.
  • the mail server may be an Internet mail server such as Microsoft Exchange, or the like.
  • the mail server may utilize facilities such as ASP, ActiveX, ANSI C++/C#, Microsoft .NET, CGI scripts, Java, JavaScript, PERL, PHP, Python, WebObjects, etc.
  • the mail server may utilize communication protocols such as internet message access protocol (IMAP), messaging application programming interface (MAPI), Microsoft Exchange, post office protocol (POP), simple mail transfer protocol (SMTP), or the like.
  • the computer system 401 may implement a mail client 420 stored program component.
  • the mail client may be a mail viewing application, such as Apple Mail, Microsoft Entourage, Microsoft Outlook, Mozilla Thunderbird, etc.
  • computer system 401 may store user/application data 421 , such as the data, variables, records, etc. as described in this disclosure.
  • databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
  • databases may be implemented using standardized data structures, such as an array, hash, linked list, struct, structured text file (e.g., XML), table, or as object-oriented databases (e.g., using ObjectStore, Poet, Zope, etc.).
  • Such databases may be consolidated or distributed, sometimes among the various computer systems discussed above in this disclosure. It is to be understood that the structure and operation of the computer or database component may be combined, consolidated, or distributed in any working combination.
  • a computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored.
  • a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform operations or stages consistent with the exemplary embodiments described herein.
  • the term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, nonvolatile memory, hard drives, Compact Disc Read-Only Memories (CD-ROMs), Digital Video Discs (DVDs), flash drives, disks, and any other known physical storage media.
  • the methods and systems may identify a start time of the second meeting scheduled in continuation to the first meeting and may generate a summary of one or more action points associated with each participant of the second meeting based on content consumed by the participants in the first meeting.
  • Disclosed methods and systems effectively monitor and track, in real time, one or more updates performed by the plurality of participants to the content consumed during and post the first meeting, and thus may generate an accurate summary of action points for each of the participants. This provides context for the second meeting and helps ensure that the action items discussed in the first meeting have been acted upon before the second meeting is initiated, making the second meeting more productive.
  • a trained machine learning model is used to generate the summary; hence, the generated summary of the action points for each of the participants is highly accurate.
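The disclosure does not identify a particular machine learning model. As a stand-in for illustration only, the Python sketch below ranks candidate action-point sentences by word overlap with the content a participant consumed; the function name and the frequency heuristic are assumptions, and a trained model as referenced above would replace the heuristic in practice.

```python
import re
from collections import Counter

def summarize_action_points(candidates, consumed_text, top_k=2):
    """Rank candidate action-point sentences by word overlap with the
    content a participant consumed.  A frequency-based stand-in for the
    trained machine learning model referenced above, not the model
    itself.
    """
    # Word frequencies of everything the participant consumed.
    vocab = Counter(re.findall(r"[a-z]+", consumed_text.lower()))

    def score(sentence):
        # Average consumed-content frequency of the sentence's words.
        words = re.findall(r"[a-z]+", sentence.lower())
        return sum(vocab[w] for w in words) / max(len(words), 1)

    return sorted(candidates, key=score, reverse=True)[:top_k]
```

A participant who spent the meeting on budget material would thus see budget-related action points ranked first.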
  • the present disclosure may be realized in hardware, or a combination of hardware and software.
  • the present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems.
  • a computer system or other apparatus adapted for carrying out the methods described herein may be suited.
  • a combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein.
  • the present disclosure may be realized in hardware that comprises a portion of an integrated circuit that also performs other functions.
  • any of the aforementioned operations and/or system modules may be suitably replaced, reordered, or removed, and additional operations and/or system modules may be inserted, depending on the needs of a particular application.
  • the systems of the aforementioned exemplary embodiments may be implemented using a wide variety of suitable processes and system modules, and are not limited to any particular computer hardware, software, middleware, firmware, microcode, and the like.
  • the claims can encompass exemplary embodiments for hardware and software, or a combination thereof.


Abstract

Provided is a method and a system for generating a summary of action points for participants in a meeting. The method is implemented by a central server including one or more processors. The method includes monitoring, in real time, content consumed by participants during a first meeting and for a first defined time before the first meeting. The method includes tracking updates performed by the participants to the content during and post the first meeting. The method includes identifying a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with the participants. The method includes generating a summary of action points for the participants based on the content consumed by the participants during the first meeting and for the first defined time before the first meeting. The method includes displaying the summary to the participants before a second defined time of the start time of the second meeting.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS/INCORPORATION BY REFERENCE
  • This application makes reference to, claims priority to, and claims benefit from U.S. Provisional Application Ser. No. 63/028,123, which was filed on May 21, 2020.
  • The above referenced application is hereby incorporated herein by reference in its entirety.
  • TECHNICAL FIELD
  • The presently disclosed embodiments are related, in general, to a summary generation of an online meeting. More particularly, the presently disclosed embodiments are related to a method and a system for generating a summary of one or more action points for each of a plurality of participants in the online meeting based on content consumed by the plurality of participants during the online meeting.
  • BACKGROUND
  • Meetings may be a common everyday occurrence especially for members of an organization. Groups of people may assemble often to discuss one or more determined topics. By way of example, there may be a status meeting, a budget meeting, a staff meeting, a product development meeting, a patent disclosure meeting, a board meeting, and the like. Meetings may be viewed by organizations as an important vehicle for facilitating communication amongst group members for the purpose of disseminating knowledge, problem solving, brainstorming and/or the like. Accordingly, many users of the organization may spend a large portion of their time in one or more meetings.
  • Conventionally, unproductive meetings may be incredibly expensive for the organization. This may be especially true when information and responsibilities derived from the meetings are not captured, tracked, or managed efficiently. Often, a person attending a meeting may collect a multitude of information. The information may include action items, contact information, data files (including audio and video files), key points, decisions, and the like. However, the meeting participants may often fail to properly organize and distribute the information after the meeting.
  • Further, corporate users may have very busy schedules and, as such, it may be quite hard to schedule meetings with a large audience. It may be especially difficult when the meeting has a long duration due to the large number of agenda items which need to be discussed with the large audience. For instance, if a meeting is to be 3 hours long and the invitee list has 20 or more invitees, then it is unlikely in today's business environment that a host of the meeting will be able to find a 3-hour span in which each and every invitee has no conflicts. Thus, the meeting may have to be conducted in shorter sessions, with the agenda divided across multiple meetings. However, for such related and subsequent meetings, the participants often may not have taken action on one or more items that were discussed during a previous meeting.
  • Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of described systems with some aspects of the present disclosure, as set forth in the remainder of the present application and with reference to the drawings.
  • SUMMARY
  • A method and a system for generating a summary of one or more action points for each of a plurality of participants in a meeting are provided substantially as shown in, and/or described in connection with, at least one of the figures, as set forth more completely in the claims.
  • These and other features and advantages of the present disclosure may be appreciated from a review of the following detailed description of the present disclosure, along with the accompanying figures in which like reference numerals refer to like parts throughout.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a block diagram that illustrates a system environment for generating a summary of one or more action points for each of a plurality of participants in a meeting, in accordance with at least one exemplary embodiment of the disclosure;
  • FIG. 2 is a block diagram that illustrates a central server configured to generate a summary of one or more action points for each of a plurality of participants in the meeting, in accordance with at least one exemplary embodiment of the disclosure;
  • FIG. 3 is a flowchart that illustrates a method for displaying the generated summary of one or more action points for each of a plurality of participants in the meeting, in accordance with at least one exemplary embodiment of the disclosure; and
  • FIG. 4 is a block diagram of an exemplary computer system for generating a summary of one or more action points for each of a plurality of participants in the meeting, in accordance with various exemplary embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • The illustrated embodiments provide a method and a system for generating a summary of one or more action points for each of a plurality of participants in a first meeting. The method may be implemented by a central server including one or more processors. The method may include monitoring in real time, content consumed by the plurality of participants during the first meeting and for a first defined time before the first meeting. In an exemplary embodiment, the monitored content may be associated with each of the plurality of participants. The method may include tracking one or more updates performed by the plurality of participants to the monitored content during and post the first meeting. The method may include identifying a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants. The method may include generating the summary of the one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting. The method may include displaying the summary to each of the plurality of participants before a second defined time of the start time of the second meeting. In an exemplary embodiment, displaying the summary may include opening content for which the one or more updates may have been performed by the plurality of participants.
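The sequence of operations above lends itself to a simple server-side sketch. The Python below is purely illustrative: the class name `MeetingSummaryServer`, its methods, and the data shapes are hypothetical stand-ins chosen for exposition, not part of the disclosed system; only the ordering of steps follows the description above.

```python
from datetime import datetime, timedelta

class MeetingSummaryServer:
    """Illustrative sketch of the central server's pipeline (all names
    and data shapes are hypothetical)."""

    def __init__(self, first_defined_time, second_defined_time):
        # Both time windows are configurable, per the disclosure.
        self.first_defined_time = first_defined_time    # monitoring window before meeting 1
        self.second_defined_time = second_defined_time  # display lead time before meeting 2
        self.consumed = {}  # participant -> content items consumed
        self.updates = {}   # participant -> content items updated

    def monitor(self, participant, content_item):
        # Step 1: monitor, in real time, content consumed during the
        # first meeting and for the first defined time before it.
        self.consumed.setdefault(participant, []).append(content_item)

    def track_update(self, participant, content_item):
        # Step 2: track updates performed during and post the first meeting.
        self.updates.setdefault(participant, []).append(content_item)

    def identify_second_start(self, calendars):
        # Step 3: from the participants' calendar information, take the
        # earliest slot common to all calendars as the second meeting's
        # start time (a deliberate simplification).
        common = set.intersection(*(set(slots) for slots in calendars.values()))
        return min(common)

    def generate_summaries(self):
        # Step 4: one action-point summary per participant, here reduced
        # to the consumed items that the participant later updated.
        return {p: [c for c in items if c in set(self.updates.get(p, []))]
                for p, items in self.consumed.items()}

    def display_time(self, second_start):
        # Step 5: summaries are displayed before the second defined time
        # of the second meeting's start.
        return second_start - self.second_defined_time
```

The intersection-of-calendars rule and the consumed-then-updated summary rule stand in for the richer logic the description leaves open.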
  • FIG. 1 is a block diagram that illustrates a system environment for generating a summary of one or more action points for each of a plurality of participants in a meeting, in accordance with at least one exemplary embodiment of the disclosure. Referring to FIG. 1, the system environment 100 may include a plurality of electronic devices, such as 102, 104 and 106, which are associated with a plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, a communication network 108, a database server 110, and a central server 112. Each of the plurality of electronic devices 102, 104, and 106 that are associated with the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, may be communicatively coupled with the database server 110, and the central server 112, via the communication network 108.
  • The plurality of electronic devices, such as electronic device 102, 104 and 106 may refer to a computing device used by a participant who has joined an online meeting to collaboratively work with a remaining plurality of participants. The plurality of electronic devices, such as electronic device 102, 104 and 106 may comprise one or more processors and one or more memories. The one or more memories may include computer-readable code that may be executable by the one or more processors to perform predetermined operations. In an exemplary embodiment, the plurality of electronic devices, such as electronic device 102, 104 and 106 may present a user-interface to the participant for performing one or more interactions on the electronic device. Examples of the plurality of electronic devices, such as electronic device 102, 104 and 106 may include, but are not limited to, a personal computer, a laptop, a personal digital assistant (PDA), a mobile device, a tablet, or any other computing device.
  • The plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a may be utilizing the electronic device 102, the electronic device 104 and the electronic device 106, respectively as shown in FIG. 1. The plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a may interact with the plurality of electronic devices, such as electronic device 102, 104 and 106 by performing one or more interactions on the user-interface presented to each of the respective participants of the associated electronic device.
  • In an exemplary embodiment, the communication network 108 may include a communication medium through which each of the plurality of electronic devices, such as 102, 104 and 106, the database server 110, and the central server 112 may communicate with each other. Such a communication may be performed, in accordance with various wired and wireless communication protocols. Examples of such wired and wireless communication protocols include, but are not limited to, Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), ZigBee, EDGE, infrared (IR), IEEE 802.11, 802.16, 2G, 3G, 4G, 5G, 6G cellular communication protocols, and/or Bluetooth (BT) communication protocols. The communication network 108 may include, but is not limited to, the Internet, a cloud network, a Wireless Fidelity (Wi-Fi) network, a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a telephone line (POTS), and/or a Metropolitan Area Network (MAN).
  • In an exemplary embodiment, the system environment 100 may include a database server 110. In an exemplary embodiment, the database server 110 may refer to a computing device that may be configured to store files associated with one or more applications installed on the electronic device. Further, the database server 110 may be configured to store information, such as, but not limited to, content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during a first meeting and for a first defined time before the first meeting, calendar information associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, meeting data, the one or more interactions, and/or the summary comprising one or more action points.
  • In an exemplary embodiment, the plurality of electronic devices, such as electronic device 102, 104 and 106 may communicate with the database server 110 using one or more protocols such as, but not limited to, Open Database Connectivity (ODBC) protocol and Java Database Connectivity (JDBC) protocol. In an exemplary embodiment, the database server 110 may include a special purpose operating system specifically configured to perform one or more database operations on at least one of the meeting data, the calendar information, the summary, the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, and the like. Examples of database operations may include, but are not limited to, Select, Insert, Update, and Delete. In an exemplary embodiment, the database server 110 may include hardware that may be configured to perform one or more predetermined operations. In an exemplary embodiment, the database server 110 may be realized through various technologies such as, but not limited to, Microsoft® SQL Server, Oracle®, IBM DB2®, Microsoft Access®, PostgreSQL®, MySQL® and SQLite®, and the like.
  • A person having ordinary skill in the art will appreciate that the scope of the disclosure is not limited to realizing the plurality of electronic devices, such as electronic device 102, 104 and 106 and the database server 110 as separate entities. In an exemplary embodiment, the database server 110 may be realized as an application program installed on and/or running on the electronic device without departing from the scope of the disclosure.
  • In an exemplary embodiment, the central server 112 may refer to a computing device or a software framework hosting an application or a software service. In an embodiment, the central server 112 may be implemented to execute procedures such as, but not limited to, programs, routines, or scripts stored in one or more memories for supporting the hosted application or the software service. In an embodiment, the hosted application or the software service may be configured to perform one or more predetermined operations. The central server 112 may be realized through various types of application servers such as, but are not limited to, a Java application server, a .NET framework application server, a Base4 application server, a PHP framework application server, or any other application server framework.
  • The “meeting” mentioned in the disclosure herein refers to an online meeting conducted via one or more video conferencing tools. The “meeting” may involve the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a interacting with each other and/or discussing one or more topics. In some examples, the meeting may be organized face to face or on a virtual platform over the communication network 108. A meeting on the virtual platform may involve the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, joining a meeting session (created by the central server 112 in the communication network 108) using the respective plurality of electronic devices, such as 102, 104 and 106.
  • Further, each meeting may have associated metadata. The metadata may include at least a type of the meeting. Examples of types of meetings may include Status Update Meetings, Decision-Making Meetings, Problem-Solving Meetings, Team-Building Meetings, Idea-Sharing Meetings, Innovation Meetings, and the like. Additionally, during each meeting, the central server 112 may be configured to generate the meeting data. The meeting data may include data associated with the one or more interactions performed by the participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, during the meeting. In an exemplary embodiment, the one or more interactions may include events performed by the participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, by using the electronic devices or one or more device peripherals attached to the electronic devices. In an exemplary embodiment, the events may comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
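The meeting data described above, interaction events captured per participant, can be modeled as minimal records. In the sketch below, the names `InteractionEvent` and `MeetingData` and all field names are hypothetical; only the four event kinds come from the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# The four event kinds named in the disclosure.
EVENT_KINDS = {"mouse_click", "scroll", "typing", "hover"}

@dataclass
class InteractionEvent:
    """One interaction performed via an electronic device or peripheral."""
    participant: str
    kind: str            # one of EVENT_KINDS
    timestamp: datetime
    target: str          # content item the event acted on

    def __post_init__(self):
        # Reject event kinds outside the disclosed set.
        if self.kind not in EVENT_KINDS:
            raise ValueError(f"unknown event kind: {self.kind}")

@dataclass
class MeetingData:
    """Per-meeting container of interaction events (hypothetical shape)."""
    meeting_type: str                      # e.g. "Status Update Meeting"
    events: List[InteractionEvent] = field(default_factory=list)

    def events_for(self, participant: str) -> List[InteractionEvent]:
        # All interactions a single participant performed in this meeting.
        return [e for e in self.events if e.participant == participant]
```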
  • In an exemplary embodiment, the first meeting may correspond to a meeting that may involve the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a interacting with each other and/or discussing the one or more topics.
  • In an exemplary embodiment, the second meeting may correspond to a meeting that may involve the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a interacting with each other and/or discussing the one or more topics. In an exemplary embodiment, the second meeting may have the same plurality of participants as that of the first meeting and may also include one or more new participants. Additionally, an agenda of either of the first meeting and the second meeting may form a subset of the other. In other words, one or more points that may be in the agenda of the first meeting may also be present in the agenda of the second meeting and thus, the second meeting may be scheduled in continuation to the first meeting at a later point in time.
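The agenda-subset relation that marks the second meeting as a continuation of the first can be expressed directly as a set check. The function name below is hypothetical and the rule is a simplification chosen for illustration; the disclosure does not prescribe an exact matching test.

```python
def is_continuation(first_agenda, second_agenda):
    """Return True when either meeting's agenda forms a subset of the
    other's, the continuation criterion described above.  Agenda items
    are compared as plain strings here, a deliberate simplification.
    """
    a, b = set(first_agenda), set(second_agenda)
    return a <= b or b <= a
```

Under this rule, a follow-up meeting that carries forward some first-meeting agenda items plus new ones still qualifies, while a meeting on disjoint topics does not.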
  • In an exemplary embodiment, the meeting (first meeting and second meeting) may be conducted on a video conferencing tool. Video conferencing tools may enable online communication for audio meetings, video meetings, and seminars between the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. The video conferencing tools may have one or more built-in features such as chat, screen sharing, recording, and the like. The video conferencing tools used for hosting the meeting may help to enhance collaboration within employees in an organization. The employees may host or attend virtual meetings with fellow employees, company partners, and/or customers. Examples of such video conferencing tools that may be utilized to conduct the meeting may include Skype®, Zoom®, Microsoft Teams®, Cisco Webex Meetings®, and the like.
  • In operation, before conducting the first meeting over the communication network 108 the central server 112 may be configured to create a meeting session. Prior to creation of the meeting session, the central server 112 may receive a request from an electronic device, such as 102 associated with a participant A 102 a, such as an organizer of the first meeting, to create the meeting session. Along with the request, the organizer of the first meeting (one of the participants, such as Participant A 102 a) may define meeting joining information associated with the first meeting. Such meeting joining information associated with the first meeting may include at least an agenda of the first meeting, one or more topics to be discussed during the first meeting, a time duration of the first meeting, a schedule of the first meeting, meeting notes carried forward from previous meetings, and/or the like. After receiving the meeting joining information, the central server 112 may be configured to create the meeting session. Additionally, the central server 112 may share the meeting joining information with the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a.
  • The plurality of electronic devices, such as 102, 104 and 106 may enable the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a to join the first meeting based on the received meeting joining information. Further, during the first meeting, the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, may speak or share their respective video feeds. Additionally, or alternatively, the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, may share other content amongst each other in order to facilitate the discussions in the first meeting. The other content may include, but is not limited to, presentation content, screen sharing content, file sharing content, and/or the like.
  • In an exemplary embodiment, each of the plurality of electronic devices, such as 102, 104 and 106 may enable the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, to consume the content shared during the first meeting. Further, each of the plurality of electronic devices, such as 102, 104 and 106 may enable the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, to perform the one or more interactions on the content being consumed. In an exemplary embodiment, the meeting data comprises data associated with the one or more interactions performed by the participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting. In an exemplary embodiment, the one or more interactions may include events performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a by using the plurality of electronic devices, such as 102, 104 and 106 or one or more device peripherals attached to the plurality of electronic devices, such as 102, 104 and 106. In an exemplary embodiment, the events may include at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
  • In an exemplary embodiment, each of the plurality of electronic devices, such as 102, 104 and 106 may transmit the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting to the central server 112. In an exemplary embodiment, the consumed content may include at least one of one or more applications, one or more files, one or more libraries, audio content, video content, meeting notes, presentation content, screen sharing content, and/or file sharing content that may be accessed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting. Further, the content may be associated with a plurality of attributes comprising an author name, a last modified time, a last modified date, a last modified author, a version number, a title, a category, a total editing time, a file size, a creation date, a last accessed date, a last accessed author name, a sharing status, an online status, and the like.
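The attribute list above maps naturally onto a simple record type. The following is a minimal sketch under that assumption; the field names mirror the attributes enumerated in the text, but the class and its helper method are hypothetical.

```python
# Illustrative record for the content attributes listed in the disclosure.
from dataclasses import dataclass
from datetime import date

@dataclass
class ContentAttributes:
    author_name: str
    title: str
    file_size_kb: int
    version_number: int
    creation_date: date
    last_modified_date: date
    last_modified_author: str
    last_accessed_date: date
    last_accessed_author_name: str
    sharing_status: str = "private"
    online_status: bool = False

    def modified_since(self, when: date) -> bool:
        # True if the content changed after the given date, e.g. since
        # the first meeting ended.
        return self.last_modified_date > when
```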
  • During the meeting, the central server 112 may be configured to receive the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting. The first defined time may be user configurable.
  • Additionally, the central server 112 may monitor in real time, the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting. In an exemplary embodiment, the monitored content may be associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a.
  • Based on the real time monitoring, the central server 112 may be configured to track one or more updates performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a to the monitored content during and post the first meeting. In an exemplary embodiment, the tracked one or more updates may be stored in the central server 112 or alternatively may be transmitted to the database server 110 for storing.
  • Further, the central server 112 may be configured to identify a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. In an exemplary embodiment, the first meeting and the second meeting may have the same plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, and the agenda of either meeting may form a subset of that of the other.
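The continuation criterion above — same participants, one agenda a subset of the other — can be sketched as a calendar scan. The meeting dictionaries, keys, and ISO-formatted start times below are assumptions for illustration, not part of the disclosure.

```python
# Hedged sketch: identify the earliest scheduled continuation meeting.
from typing import Dict, List, Optional

def is_continuation(first: Dict, second: Dict) -> bool:
    # Same participants, and one agenda is a subset of the other.
    same_participants = set(first["participants"]) == set(second["participants"])
    a1, a2 = set(first["agenda"]), set(second["agenda"])
    return same_participants and (a1 <= a2 or a2 <= a1)

def find_continuation_start(first: Dict, calendar: List[Dict]) -> Optional[str]:
    """Start time of the earliest later meeting that continues `first`."""
    candidates = [m for m in calendar
                  if m["start"] > first["start"] and is_continuation(first, m)]
    if not candidates:
        return None
    return min(candidates, key=lambda m: m["start"])["start"]
```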
  • Upon identification of the start time of the scheduled second meeting, the central server 112 may be configured to generate the summary of one or more action points for each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a based on the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting. In an exemplary embodiment, in order to generate the summary of the one or more action points, the central server 112 may be configured to generate the meeting data associated with each participant of the first meeting. In an exemplary embodiment, the meeting data may include data associated with one or more interactions performed by each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting. In an exemplary embodiment, the one or more interactions may include events performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a by using the electronic device or one or more device peripherals attached to the electronic device. In an exemplary embodiment, the events may include at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
  • In an exemplary embodiment, the meeting data may further include a transcript of each of the audio generated during the first meeting, the content shared during the first meeting, and/or meeting notes input by each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a.
  • After the meeting data is generated, the central server 112 may be configured to train a machine learning (ML) model associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a based on the meeting data associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a, respectively. In an exemplary embodiment, the central server 112 may be configured to further train the ML model based on metadata associated with the first meeting. In an exemplary embodiment, the metadata associated with the first meeting may include at least a type of the first meeting. Examples of types of meetings may include Status Update Meetings, Decision-Making Meetings, Problem-Solving Meetings, Team-Building Meetings, Idea-Sharing Meetings, Innovation Meetings, and the like.
  • After the ML model is trained, the central server 112 may be configured to determine the one or more action points to be performed by each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a before the start time of the second meeting based on the trained ML model. Further, the central server 112 may be configured to display the summary including the one or more action points to each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a before a second defined time of the start time of the second meeting. In an exemplary embodiment, displaying the summary may include opening content for which the one or more updates may have been performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. In an exemplary embodiment, the second defined time may be user configurable.
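The user-configurable "second defined time" amounts to a simple offset from the second meeting's start. A minimal sketch, assuming a 15-minute default (the value used in the examples later in the text):

```python
# Compute when the summary (and updated content) should be surfaced,
# relative to the second meeting's start. Names are illustrative.
from datetime import datetime, timedelta

def summary_display_time(second_meeting_start: datetime,
                         second_defined_time: timedelta = timedelta(minutes=15)) -> datetime:
    """Moment at which the summary should be displayed and content opened."""
    return second_meeting_start - second_defined_time
```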
  • Additionally, the central server 112 may be configured to transmit the generated summary of the one or more action points to each of the plurality of electronic devices, such as electronic device 102, 104 and 106. Further, in an exemplary embodiment, the central server 112 may be configured to open and/or initiate content on each of the plurality of electronic devices, such as electronic device 102, 104 and 106 before the second defined time of the start time of the second meeting. In an exemplary embodiment, the content that may be opened and/or initiated may correspond to at least one of the one or more applications, the one or more files, the one or more libraries, the audio content, the video content, the meeting notes, the presentation content, the screen sharing content, and/or the file sharing content that was accessed by the plurality of participants during the first meeting and to which the one or more updates may have been performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a.
  • FIG. 2 is a block diagram that illustrates a central server configured to generate a summary of one or more action points for each of a plurality of participants in the meeting, in accordance with at least one exemplary embodiment of the disclosure. FIG. 2 has been explained in conjunction with the elements of FIG. 1. Referring to FIG. 2, there is shown the central server 112, which may include a processor 202, a non-transitory computer readable medium 203, a memory 204, a transceiver 206, an input/output unit 208, a monitoring unit 210, and a summary generation unit 212. The processor 202 may be communicatively coupled to the non-transitory computer readable medium 203, the memory 204, the transceiver 206, the input/output unit 208, the monitoring unit 210, and the summary generation unit 212 and may operate in conjunction with each other to generate the summary. Further, the transceiver 206 may be communicatively coupled to the communication network 108.
  • The processor 202 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to execute a set of instructions stored in the memory 204. The processor 202 may be implemented based on several processor technologies known in the art. The processor 202 operates in coordination with the non-transitory computer readable medium 203, the transceiver 206, the input/output unit 208, the monitoring unit 210, and the summary generation unit 212 to generate the summary. Examples of the processor 202 include, but are not limited to, an X86-based processor, a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other processors.
  • The non-transitory computer readable medium 203 may include any tangible or non-transitory storage media or memory media such as electronic, magnetic, or optical media—e.g., disk or CD/DVD-ROM coupled to processor 202.
  • The memory 204 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to store the set of instructions, which are executed by the processor 202. In an exemplary embodiment, the memory 204 may be configured to store one or more programs, routines, or scripts that are executed in coordination with the processor 202. The memory 204 may be implemented based on a Random Access Memory (RAM), a Read-Only Memory (ROM), a Hard Disk Drive (HDD), a storage server, and/or a Secure Digital (SD) card.
  • The transceiver 206 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to receive the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting, via the communication network 108. The transceiver 206 may be further configured to receive the meeting data and the metadata associated with the first meeting. The transceiver 206 may be further configured to transmit the generated summary to each of the plurality of electronic devices, such as electronic device 102, 104 and 106. The transceiver 206 may implement one or more known technologies to support wired or wireless communication with the communication network 108. In an exemplary embodiment, the transceiver 206 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a Universal Serial Bus (USB) device, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and/or a local buffer. The transceiver 206 may communicate via wireless communication with networks, such as the Internet, an Intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN). The wireless communication may use any of a plurality of communication standards, protocols and technologies, such as: Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS).
  • The input/output unit 208 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to display the generated summary. The input/output unit 208 comprises various input and output devices that are configured to communicate with the processor 202. Examples of the input devices include, but are not limited to, a keyboard, a mouse, a joystick, a touch screen, a microphone, a camera, and/or a docking station. Examples of the output devices include, but are not limited to, a display screen and/or a speaker. The display screen may be configured to display the generated summary.
  • The monitoring unit 210 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to monitor in real time, the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting. The monitoring unit 210 may be further configured to track the one or more updates performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a to the monitored content during and post the first meeting. The monitoring unit 210 may be further configured to identify the start time of the second meeting scheduled in continuation to the first meeting based on the calendar information associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a.
  • The summary generation unit 212 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to generate the summary of one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting. The summary generation unit 212 may be configured to generate the meeting data associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. The summary generation unit 212 may be configured to train the ML model associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a based on the meeting data.
  • In operation, the plurality of electronic devices, such as 102, 104 and 106 may enable the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a to connect to the first meeting for working collaboratively and to discuss one or more topics during the first meeting. For example, the first meeting (scheduled on Dec. 30, 2020 from 11:00 am to 12 pm) may be of type “Status Update Meeting” and the agenda of the meeting may include the below points:
      • Decide new features to be included in the new release of the software
      • Check current software development status
      • Decide software update release date
      • Check updates regarding hiring of candidates in the development team
  • In an exemplary embodiment, the plurality of electronic devices, such as 102, 104 and 106, may enable the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a to consume content during the first meeting. In an exemplary embodiment, the content may include at least one of the one or more applications, the one or more files, the one or more libraries, the audio content, the video content, the meeting notes, the presentation content, the screen sharing content, and/or the file sharing content that may be accessed by the plurality of participants during the first meeting and post the first meeting. In an exemplary embodiment, the content may be associated with the plurality of attributes comprising the author name, the last modified time, the last modified date, the last modified author, the version number, the title, the category, the total editing time, the file size, the creation date, the last accessed date, the last accessed author name, the sharing status, and the online status.
  • For example, during the first meeting, the Participant A 102 a may have opened a File 1. Further, Participant B 104 a may have spoken that “Participant A 102 a needs to update the File 1 before the next meeting to include the new software features”. Additionally, Participant C 106 a may have initiated an Application 1 via a screen sharing session during the first meeting to explain one or more new features that need to be included in the software before the release. Further, Participant A 102 a may have spoken that “Let us discuss about hiring of developers in the next meeting. Participant C 106 a must update the File 2 containing details of eligible candidates”. Additionally, Participant B 104 a may have accessed an Application 2 within the first defined time before the first meeting. In an example, File 1 may have the below mentioned plurality of attributes.
      • File Size: 84 kb
      • Author name: XYZ
      • Last accessed date: Dec. 30, 2020
      • Last modified date: Nov. 30, 2020
      • Last accessed author name: Participant A 102 a
        Similarly, File 2 may have the below mentioned plurality of attributes.
      • File Size: 102 kb
      • Author name: ABC
      • Last accessed date: Oct. 10, 2020
      • Last modified date: Oct. 10, 2020
      • Last accessed author name: Participant A 102 a
  • Once the first meeting is in progress, each of the plurality of electronic devices, such as 102, 104 and 106 may enable each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a to perform the one or more interactions on the content being consumed. In an exemplary embodiment, the one or more interactions may include events performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a by using the electronic device or one or more device peripherals attached to the electronic device. In an exemplary embodiment, the events may include at least one of a mouse click event, a scrolling event, a typing event, or a hover event. In an exemplary embodiment, the meeting data comprises data associated with the one or more interactions performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting. In an exemplary embodiment, each of the plurality of electronic devices, such as electronic device 102, 104 and 106 may be configured to transmit in real time the consumed content and the meeting data associated with the first meeting to the central server 112. The transceiver 206 may be configured to receive in real time the meeting data and the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting. In an exemplary embodiment, the first defined time may be user configurable. For example, the user configurable time may be 15 minutes before the scheduled time of the meeting.
  • Upon receiving the consumed content and the meeting data in real time, the monitoring unit 210 may be configured to monitor the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting. In an exemplary embodiment, the monitored content is associated with each of the plurality of participants. For example, continuing with the previous example, the monitoring unit 210 may monitor and determine that File 1 was opened by Participant A 102 a and the Participant A 102 a may have performed a hover event. Further, the monitoring unit 210 may monitor and determine that the screen sharing session was initiated by the Participant C 106 a and the Application 1 was initiated during the meeting. Further, the monitoring unit 210 may monitor and determine that Participant B 104 a may have accessed an Application 2 ten minutes (the first defined time) before the first meeting. Additionally, the monitoring unit 210 may monitor and determine the audio of Participant A 102 a and also the click event performed by the Participant A 102 a on File 2.
  • Further, the monitoring unit 210 may be configured to track the one or more updates performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a to the monitored content during the first meeting and post the first meeting. For example, Participant A 102 a may make edits to File 2 during the meeting and the Participant C 106 a may have made one or more updates to the one or more libraries post the first meeting to include new software features.
  • Additionally, the monitoring unit 210 may be configured to identify the start time of the second meeting scheduled in continuation to the first meeting based on the calendar information associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. For example, calendar information for each of the Participant A 102 a, Participant B 104 a, and Participant C 106 a may show a follow up “Status Update Meeting” with an agenda similar to that of the first meeting. In an exemplary embodiment, the first meeting and the second meeting may have the same plurality of participants, and the agenda of either meeting may form a subset of that of the other.
  • For example, the second meeting may be scheduled on Jan. 1, 2021 from 11:00 am to 12:00 pm and may include the same plurality of participants, i.e., Participant A 102 a, Participant B 104 a, and Participant C 106 a, and the agenda of the second meeting may be as below:
      • Check status of the new features that were decided to be included in the new release of the software in the first meeting conducted on Dec. 30, 2020
      • Check current software development status
      • Determine software update release date
      • Check updates regarding hiring of developers in the development team
  • After identifying the start time of the second meeting scheduled in continuation to the first meeting, the summary generation unit 212 may be configured to generate the summary of the one or more action points for each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a based on the meeting data and the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting and for the first defined time before the first meeting.
  • In an exemplary embodiment, in order to generate the summary, the summary generation unit 212 may be configured to generate the meeting data associated with each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a of the first meeting. In an exemplary embodiment, the meeting data may include data associated with the one or more interactions performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting. In an exemplary embodiment, the meeting data may further include a transcript of the audio generated by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the first meeting, a transcript of the content shared by the participants during the first meeting, and/or meeting notes input by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a during the meeting.
  • Further, the summary generation unit 212 may be configured to train the machine learning (ML) model associated with the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a based on the meeting data. Additionally, the summary generation unit 212 may be configured to train the ML model based on metadata associated with the first meeting. In an exemplary embodiment, the metadata associated with the first meeting may include at least a type of the first meeting.
  • In an exemplary embodiment, the one or more interactions performed by each participant and the content consumed by each participant may be monitored and tracked to create the machine learning model associated with each participant. In an exemplary embodiment, the machine learning model may comprise one or more graphs including one or more nodes. In an exemplary embodiment, each of the one or more nodes may represent the content consumed by the participant and traversal amongst such one or more nodes may be based on the meeting data. In an exemplary embodiment, the ML model may be trained using one or more supervised, semi-supervised, or unsupervised learning techniques. In an exemplary embodiment, the ML model may be trained based on the type of the meeting. Examples of types of meetings may include Status Update Meetings, Decision-Making Meetings, Problem-Solving Meetings, Team-Building Meetings, Idea-Sharing Meetings, Innovation Meetings, and the like.
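One way to realize a graph whose nodes are consumed content and whose traversal is driven by the meeting data is a transition-count graph. The sketch below is an assumption-laden illustration of that idea, not the disclosure's actual model: edge weights count observed transitions between content items, and the heaviest outgoing edge suggests the next item to prepare.

```python
# Minimal sketch of a per-participant content graph. Edge weights count
# how often one content item followed another in the meeting data.
from collections import defaultdict
from typing import Dict, List, Optional

class ContentGraph:
    def __init__(self) -> None:
        # edges[src][dst] = number of observed src -> dst transitions
        self.edges: Dict[str, Dict[str, int]] = defaultdict(lambda: defaultdict(int))

    def train(self, consumed_sequence: List[str]) -> None:
        """Update edge counts from one meeting's ordered content accesses."""
        for src, dst in zip(consumed_sequence, consumed_sequence[1:]):
            self.edges[src][dst] += 1

    def next_content(self, current: str) -> Optional[str]:
        """Most frequently observed successor of `current`, or None if unseen."""
        out = self.edges.get(current)
        if not out:
            return None
        return max(out, key=out.get)
```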
  • Upon training the ML model associated with each participant, the summary generation unit 212 may be configured to determine the one or more action points to be performed by the participant before the start time of the second meeting based on the trained ML model. In an exemplary embodiment, the summary generation unit 212 may be configured to train the ML model based on the tracked one or more updates associated with the consumed content and the meeting data.
  • In an exemplary embodiment, the monitoring unit 210 may be configured to track one or more updates made by the Participant A 102 a to File 1 and determine changes in the plurality of attributes associated with File 1 before the second defined time of the start time of the second meeting. Further, the monitoring unit 210 may be configured to track one or more updates made by the Participant C 106 a to File 2 and determine one or more changes in the plurality of attributes associated with File 2 before the second defined time of the start time of the second meeting.
  • For example, the Participant A 102 a may have made one or more edits to the File 1 to include new software features. Further, Participant C 106 a may have made one or more edits to the File 2 to include details of one or more eligible candidates for hiring in the development team. Such one or more updates may be tracked by the monitoring unit 210 and the plurality of attributes associated with each of the File 1 and File 2 may be changed. In an example, File 1 may have the below mentioned updated plurality of attributes.
      • File Size: 94 kb
      • Author name: XYZ
      • Last accessed date: Jan. 1, 2021
      • Last modified date: Jan. 1, 2021
      • Last accessed author name: Participant A 102 a
        Similarly, File 2 may have the below mentioned updated plurality of attributes.
      • File Size: 143 kb
      • Author name: ABC
      • Last accessed date: Jan. 1, 2021
      • Last modified date: Dec. 31, 2020
      • Last accessed author name: Participant C 106 a
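The attribute-change tracking illustrated by the File 1 and File 2 examples above amounts to comparing two attribute snapshots and reporting what differs. A small sketch, with dictionaries that mirror the example attribute lists (the function itself is an illustrative assumption):

```python
# Compare a snapshot of a file's attributes taken during the first meeting
# with one taken before the second meeting, and report the changes.
from typing import Dict, Tuple

def attribute_changes(before: Dict[str, str],
                      after: Dict[str, str]) -> Dict[str, Tuple[str, str]]:
    """Map each changed attribute name to its (old, new) values."""
    return {key: (before[key], after[key])
            for key in before
            if key in after and before[key] != after[key]}
```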
  • Upon tracking the one or more updates performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a to the monitored content during and post the first meeting and after identification of the start time of the second meeting scheduled in continuation to the first meeting, the summary generation unit 212 may be configured to generate the summary of the one or more action points for each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting.
  • For example, in continuation of the previous example, the summary generation unit 212 may be configured to generate the summary of one or more action points for the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. For example, the generated summary of one or more action points may include the below points:
      • Open File 1 on electronic device 102 associated with Participant A 102 a before the second defined time (15 minutes) of the start time of the second meeting so that Participant A 102 a may make any required updates based on the discussion in the last meeting.
      • Open the updated File 1 on each of the plurality of electronic devices, such as electronic device 102, 104 and 106 and display the updates made to File 1 by Participant A 102 a.
      • Initiate Application 1 on the electronic device 106 associated with Participant C 106 a before the second defined time (15 minutes) of the start time of the second meeting, because Participant C 106 a had initiated Application 1 during the first meeting to discuss the one or more new features that need to be included in the software before the release.
      • Open File 2 on electronic device 106 associated with Participant C 106 a before the second defined time (15 minutes) of the start time of the second meeting so that Participant C 106 a may include details of eligible candidates for hiring in File 2.
      • Open the updated File 2 on each of the plurality of electronic devices, such as electronic device 102, 104 and 106 and display the updates made to File 2 by Participant C 106 a.
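Action points of the kind listed above could be derived mechanically from the tracked updates: for each content item a participant updated, emit an instruction to open it on that participant's device before the second defined time. The update records and the wording below are illustrative assumptions.

```python
# Sketch: turn tracked update records into human-readable action points.
from typing import Dict, List

def action_points(updates: List[Dict], second_defined_minutes: int = 15) -> List[str]:
    points = []
    for u in updates:
        points.append(
            f"Open {u['content']} on device {u['device']} associated with "
            f"{u['participant']} {second_defined_minutes} minutes before the "
            f"start of the second meeting."
        )
    return points
```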
  • After the summary is generated, the input/output unit 208 may be configured to display the summary to each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a before the second defined time (15 minutes) of the start time of the second meeting. In an exemplary embodiment, displaying the summary may include opening content (one or more files, or one or more applications) for which the one or more updates have been performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. In an exemplary embodiment, the second defined time may be user configurable.
  • Additionally, the summary generation unit 212 may be configured to display the one or more updates performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a and may also display the change in the plurality of attributes associated with the content that was consumed during and post the first meeting before the second defined time of the start time of the second meeting.
  • Further, the transceiver 206 may be configured to transmit the generated summary of the one or more action points to each of the plurality of electronic devices, such as electronic device 102, 104 and 106. In an exemplary embodiment, the plurality of electronic devices, such as electronic device 102, 104 and 106 may be configured to display the summary to each of the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a before the second defined time (15 minutes) of the start time of the second meeting. In an exemplary embodiment, displaying the summary may include opening content (one or more files, or one or more applications) for which the one or more updates have been performed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. In an exemplary embodiment, the second defined time may be user configurable. In an exemplary embodiment, the summary may be displayed as a pop-up to the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a. Thus, the summary generation unit 212 may automatically understand the content consumed by the plurality of participants, such as Participant A 102 a, Participant B 104 a, and Participant C 106 a in the first meeting and provide the summary before the start of the second meeting to enable the participants to work in a more collaborative and effective manner.
  • FIG. 3 is a flowchart that illustrates a method for displaying the generated summary of one or more action points for each of a plurality of participants in the meeting, in accordance with at least one exemplary embodiment of the disclosure. Referring to FIG. 3, the method 300 starts at 302 and proceeds to 304. At 304, a central server 112 may be configured to monitor in real time, content consumed by a plurality of participants during a first meeting and for a first defined time before the first meeting. At 306, the central server 112 may be configured to track one or more updates performed by the plurality of participants to the monitored content during and post the first meeting. At 308, the central server 112 may be configured to identify a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants. At 310, the central server 112 may be configured to generate a summary of one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting. At 312, the central server 112 may be configured to display the summary to each of the plurality of participants before a second defined time of the start time of the second meeting. In an exemplary embodiment, displaying the summary may comprise opening content for which the one or more updates may have been performed by the plurality of participants. Control passes to end operation 314.
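The flow of method 300 (operations 304 through 312) can be sketched as a simple pipeline. The class name, the monitor/calendar interfaces, and the stand-in action-point rule are illustrative assumptions, not the patented implementation:

```python
class SummaryPipeline:
    """Illustrative sketch of method 300; data shapes are assumptions."""

    def __init__(self, calendar, content_monitor):
        self.calendar = calendar              # per-participant calendar information
        self.content_monitor = content_monitor

    def run(self, first_meeting, participants):
        # 304: monitor, in real time, content consumed during the first
        # meeting and for a first defined time before it.
        consumed = {p: self.content_monitor.consumed(p, first_meeting)
                    for p in participants}
        # 306: track updates performed to that content during and after
        # the first meeting.
        updates = {p: self.content_monitor.updates(p, consumed[p])
                   for p in participants}
        # 308: identify the start time of the second meeting scheduled in
        # continuation to the first, from the participants' calendars.
        second_start = min(self.calendar.next_meeting_start(p, after=first_meeting)
                           for p in participants)
        # 310: generate per-participant action points from the consumed content.
        summary = {p: self._action_points(consumed[p], updates[p])
                   for p in participants}
        # 312: the caller displays `summary` (opening the updated content)
        # before the second defined time of `second_start`.
        return summary, second_start

    @staticmethod
    def _action_points(consumed, updates):
        # Deliberately simplistic stand-in for the summary generation:
        # flag consumed items that were never updated as outstanding.
        return [item for item in consumed if item not in updates]
```

In practice the two collaborators injected here would wrap the monitoring and calendar services of the central server 112.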
  • FIG. 4 is a block diagram of an exemplary computer system for generating and displaying the meeting summary, in accordance with various exemplary embodiments of the present disclosure. Referring to FIG. 4, there is shown an exemplary computer system 401, which may comprise a central processing unit (“CPU” or “processor”) 402, an I/O interface 403, an input device 404, an output device 405, a transceiver 406, a network interface 407, a communication network 408, devices, such as 409, 410 and 411, storage interface 412, one or more memory devices, such as RAM 413, ROM 414, and memory device 415.
  • Variations of computer system 401 may be used for generating and displaying the summary. The computer system 401 may comprise a central processing unit (“CPU” or “processor”) 402. Processor 402 may comprise at least one data processor for executing program components for executing user- or system-generated requests. A user may include a person, a person using a device such as those included in this disclosure, or such a device itself. The processor 402 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc. The processor 402 may include a microprocessor, such as AMD Athlon, Duron or Opteron, ARM's application, embedded or secure processors, IBM PowerPC, Intel's Core, Itanium, Xeon, Celeron or other line of processors, etc. The processor 402 may be implemented using mainframe, distributed processor, multi-core, parallel, grid, or other architectures. Some exemplary embodiments may utilize embedded technologies like application-specific integrated circuits (ASICs), digital signal processors (DSPs), Field Programmable Gate Arrays (FPGAs), etc.
  • The processor 402 may be disposed in communication with one or more input/output (I/O) devices via I/O interface 403. The I/O interface 403 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
  • Using the I/O interface 403, the computer system 401 may communicate with one or more I/O devices. For example, the input device 404 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, sensor (e.g., accelerometer, light sensor, GPS, gyroscope, proximity sensor, or the like), stylus, scanner, storage device, transceiver, video device/source, visors, etc. Output device 405 may be a printer, fax machine, video display (e.g., cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED), plasma, or the like), audio speaker, etc. In some exemplary embodiments, a transceiver 406 may be disposed in connection with the processor 402. The transceiver may facilitate various types of wireless transmission or reception. For example, the transceiver may include an antenna operatively connected to a transceiver chip (e.g., Texas Instruments WiLink WL1283, Broadcom BCM4750IUB8, Infineon Technologies X-Gold 618-PMB9800, or the like), providing IEEE 802.11a/b/g/n, Bluetooth, FM, global positioning system (GPS), 2G/3G HSDPA/HSUPA communications, etc.
  • In some exemplary embodiments, the processor 402 may be disposed in communication with a communication network 408 via a network interface 407. The network interface 407 may communicate with the communication network 408. The network interface may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The communication network 408 may include, for example, a direct interconnection, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, etc. Using the network interface 407 and the communication network 408, the computer system 401 may communicate with devices 409, 410, and 411. These devices may include, without limitation, personal computer(s), server(s), fax machines, printers, scanners, various mobile devices such as cellular telephones, smartphones (e.g., Apple iPhone®, Blackberry®, Android®-based phones, etc.), tablet computers, eBook readers (Amazon Kindle®, Nook, etc.), laptop computers, notebooks, gaming consoles (Microsoft Xbox®, Nintendo DS®, Sony PlayStation®, etc.), or the like. In some exemplary embodiments, the computer system 401 may itself embody one or more of these devices.
  • In some exemplary embodiments, the processor 402 may be disposed in communication with one or more memory devices (e.g., RAM 413, ROM 414, etc.) via a storage interface 412. The storage interface may connect to memory devices including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as serial advanced technology attachment (SATA), integrated drive electronics (IDE), IEEE-1394, universal serial bus (USB), fiber channel, small computer systems interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, redundant array of independent discs (RAID), solid-state memory devices, solid-state drives, etc.
  • The memory devices may store a collection of program or database components, including, without limitation, an operating system 416, user interface application 417, web browser 418, mail server 419, mail client 420, user/application data 421 (e.g., any data variables or data records discussed in this disclosure), etc. The operating system 416 may facilitate resource management and operation of the computer system 401. Examples of operating systems include, without limitation, Apple Macintosh OS X, UNIX, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD, NetBSD, OpenBSD, etc.), Linux distributions (e.g., Red Hat, Ubuntu, Kubuntu, etc.), IBM OS/2, Microsoft Windows (XP, Vista/7/8, etc.), Apple iOS, Google Android, Blackberry OS, or the like. User interface 417 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, user interfaces may provide computer interaction interface elements on a display system operatively connected to the computer system 401, such as cursors, icons, check boxes, menus, scrollers, windows, widgets, etc. Graphical user interfaces (GUIs) may be employed, including, without limitation, Apple Macintosh operating systems' Aqua, IBM OS/2, Microsoft Windows (e.g., Aero, Metro, etc.), Unix X-Windows, web interface libraries (e.g., ActiveX, Java, Javascript, AJAX, HTML, Adobe Flash, etc.), or the like.
  • In some exemplary embodiments, the computer system 401 may implement a web browser 418 stored program component. The web browser may be a hypertext viewing application, such as Microsoft Internet Explorer, Google Chrome, Mozilla Firefox, Apple Safari, etc. Secure web browsing may be provided using HTTPS (secure hypertext transport protocol), secure sockets layer (SSL), Transport Layer Security (TLS), etc. Web browsers may utilize facilities such as AJAX, DHTML, Adobe Flash, JavaScript, Java, application programming interfaces (APIs), etc. In some exemplary embodiments, the computer system 401 may implement a mail server 419 stored program component. The mail server may be an Internet mail server such as Microsoft Exchange, or the like. The mail server may utilize facilities such as ASP, ActiveX, ANSI C++/C#, Microsoft .NET, CGI scripts, Java, JavaScript, PERL, PHP, Python, WebObjects, etc. The mail server may utilize communication protocols such as internet message access protocol (IMAP), messaging application programming interface (MAPI), Microsoft Exchange, post office protocol (POP), simple mail transfer protocol (SMTP), or the like. In some exemplary embodiments, the computer system 401 may implement a mail client 420 stored program component. The mail client may be a mail viewing application, such as Apple Mail, Microsoft Entourage, Microsoft Outlook, Mozilla Thunderbird, etc.
  • In some exemplary embodiments, computer system 401 may store user/application data 421, such as the data, variables, records, etc. as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase. Alternatively, such databases may be implemented using standardized data structures, such as an array, hash, linked list, struct, structured text file (e.g., XML), table, or as object-oriented databases (e.g., using ObjectStore, Poet, Zope, etc.). Such databases may be consolidated or distributed, sometimes among the various computer systems discussed above in this disclosure. It is to be understood that the structure and operation of the computer or database component may be combined, consolidated, or distributed in any working combination.
  • Furthermore, one or more computer-readable storage media may be utilized to implement various exemplary embodiments consistent with the present invention. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform operations or stages consistent with the exemplary embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, nonvolatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Discs (DVDs), flash drives, disks, and any other known physical storage media.
  • Various exemplary embodiments of the disclosure encompass numerous advantages, including methods and systems for generating a summary of a first meeting that may be displayed before a second meeting. In an exemplary embodiment, the methods and systems may identify a start time of the second meeting scheduled in continuation to the first meeting and may generate a summary of one or more action points associated with each participant of the second meeting based on content consumed by the participants in the first meeting. The disclosed methods and systems effectively monitor and track, in real time, one or more updates performed by the plurality of participants to the content consumed during and after the first meeting, and thus may generate an accurate summary of action points for each of the participants, thereby providing context for the second meeting and also ensuring that the action items discussed in the first meeting have been acted upon before the second meeting is initiated. This makes the second meeting much more productive. Additionally, because a trained machine learning model is used to generate the summary, the generated summary of the action points for each of the participants is more accurate.
  • Thus, the claimed operations discussed above are not routine, conventional, or well understood in the art, as the claimed operations enable solutions to the existing problems in conventional technologies.
  • The present disclosure may be realized in hardware, or a combination of hardware and software. The present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems. A computer system or other apparatus adapted for carrying out the methods described herein may be suited. A combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein. The present disclosure may be realized in hardware that comprises a portion of an integrated circuit that also performs other functions.
  • A person with ordinary skill in the art will appreciate that the systems, modules, and sub-modules have been illustrated and explained to serve as examples and should not be considered limiting in any manner. It will be further appreciated that the variants of the above-disclosed system elements, modules, and other features and functions, or alternatives thereof, may be combined to create other different systems or applications.
  • Those skilled in the art will appreciate that any of the aforementioned operations and/or system modules may be suitably replaced, reordered, or removed, and additional operations and/or system modules may be inserted, depending on the needs of a particular application. In addition, the systems of the aforementioned exemplary embodiments may be implemented using a wide variety of suitable processes and system modules, and are not limited to any particular computer hardware, software, middleware, firmware, microcode, and the like. The claims can encompass exemplary embodiments for hardware and software, or a combination thereof.
  • While the present disclosure has been described with reference to certain exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present disclosure. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from its scope. Therefore, it is intended that the present disclosure not be limited to the particular exemplary embodiment disclosed, but that the present disclosure will include all exemplary embodiments falling within the scope of the appended claims.

Claims (19)

What is claimed is:
1. A method, comprising:
monitoring, by a processor in real time, content consumed by a plurality of participants during a first meeting and for a first defined time before the first meeting, wherein the monitored content is associated with each of the plurality of participants;
tracking, by the processor, one or more updates performed by the plurality of participants to the monitored content during and post the first meeting;
identifying, by the processor, a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants;
generating, by the processor, a summary of one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting; and
displaying, by the processor, the summary to each of the plurality of participants before a second defined time of the start time of the second meeting, wherein displaying the summary comprises opening content for which the one or more updates have been performed by the plurality of participants.
2. The method of claim 1, wherein the content comprises at least one of one or more applications, one or more files, one or more libraries, audio content, video content, meeting notes, presentation content, screen sharing content, and/or file sharing content that is accessed by the plurality of participants.
3. The method of claim 1, wherein the content is associated with a plurality of attributes comprising author name, last modified time, last modified date, last modified author, version number, title, category, total editing time, file size, creation date, last accessed date, last accessed author name, sharing status, and online status.
4. The method of claim 1, wherein generating the summary of the one or more action points comprises:
generating, by the processor, meeting data associated with a participant of the first meeting, wherein the meeting data comprises data associated with one or more interactions performed by the participant during the first meeting;
training, by the processor, a machine learning (ML) model associated with the participant based on the meeting data associated with the participant; and
determining, by the processor, the one or more action points to be performed by the participant before the start time of the second meeting based on the trained ML model.
5. The method of claim 4, wherein the meeting data further comprises a transcript of the audio generated by the participant during the first meeting, a transcript of the content shared by the participant during the first meeting, and/or meeting notes inputted by the participant.
6. The method of claim 4, wherein the one or more interactions comprise events performed by the plurality of participants using an electronic device or one or more device peripherals attached to the electronic device, wherein the events comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
7. The method of claim 4 further comprising training, by the processor, the ML model based on metadata associated with the first meeting, wherein the metadata associated with the first meeting comprises at least a type of the first meeting.
8. The method of claim 1, wherein the first meeting and the second meeting have the same plurality of participants, and the agenda of either of the first meeting and the second meeting forms a subset of the agenda of the other.
9. The method of claim 1, wherein the first defined time and the second defined time are user configurable.
10. A central server, comprising:
a hardware processor; and
a memory communicatively coupled to the hardware processor, wherein the memory stores processor instructions which, on execution, cause the hardware processor to:
monitor in real time, content consumed by a plurality of participants during a first meeting and for a first defined time before the first meeting, wherein the monitored content is associated with each of the plurality of participants;
track one or more updates performed by the plurality of participants to the monitored content during and post the first meeting;
identify a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants;
generate a summary of one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting; and
display the summary to each of the plurality of participants before a second defined time of the start time of the second meeting, wherein displaying the summary comprises opening content for which the one or more updates have been performed by the plurality of participants.
11. The central server of claim 10, wherein the content comprises at least one of one or more applications, one or more files, one or more libraries, audio content, video content, meeting notes, presentation content, screen sharing content, and/or file sharing content that is accessed by the plurality of participants.
12. The central server of claim 10, wherein the content is associated with a plurality of attributes comprising author name, last modified time, last modified date, last modified author, version number, title, category, total editing time, file size, creation date, last accessed date, last accessed author name, sharing status, and online status.
13. The central server of claim 10, wherein the hardware processor is further configured to:
generate meeting data associated with a participant of the first meeting, wherein the meeting data comprises data associated with one or more interactions performed by the participant during the first meeting;
train a machine learning (ML) model associated with the participant based on the meeting data associated with the participant; and
determine the one or more action points to be performed by the participant before the start time of the second meeting based on the trained ML model.
14. The central server of claim 13, wherein the meeting data further comprises a transcript of the audio generated by the participant during the first meeting, a transcript of the content shared by the participant during the first meeting, and/or meeting notes inputted by the participant.
15. The central server of claim 13, wherein the one or more interactions comprise events performed by the plurality of participants using an electronic device or one or more device peripherals attached to the electronic device, wherein the events comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
16. The central server of claim 13, wherein the hardware processor is further configured to train the ML model based on metadata associated with the first meeting, wherein the metadata associated with the first meeting comprises at least a type of the first meeting.
17. The central server of claim 10, wherein the first meeting and the second meeting have the same plurality of participants, and the agenda of either of the first meeting and the second meeting forms a subset of the agenda of the other.
18. The central server of claim 10, wherein the first defined time and the second defined time are user configurable.
19. A non-transitory computer readable medium having stored thereon computer-executable instructions which, when executed by at least one hardware processor in an electronic device, cause the electronic device to perform operations, the operations comprising:
monitoring in real time, content consumed by a plurality of participants during a first meeting and for a first defined time before the first meeting, wherein the monitored content is associated with each of the plurality of participants;
tracking one or more updates performed by the plurality of participants to the monitored content during and post the first meeting;
identifying a start time of a second meeting scheduled in continuation to the first meeting based on calendar information associated with each of the plurality of participants;
generating a summary of one or more action points for each of the plurality of participants based on the content consumed by the plurality of participants during the first meeting and for the first defined time before the first meeting; and
displaying the summary to each of the plurality of participants before a second defined time of the start time of the second meeting, wherein displaying the summary comprises opening content for which the one or more updates have been performed by the plurality of participants.
US17/308,640 2020-05-21 2021-05-05 Meeting summary generation Abandoned US20210367802A1 (en)


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063028123P 2020-05-21 2020-05-21
US17/308,640 US20210367802A1 (en) 2020-05-21 2021-05-05 Meeting summary generation

Publications (1)

Publication Number Publication Date
US20210367802A1 true US20210367802A1 (en) 2021-11-25

Family

ID=78607913

Family Applications (8)

Application Number Title Priority Date Filing Date
US17/308,887 Abandoned US20210367984A1 (en) 2020-05-21 2021-05-05 Meeting experience management
US17/308,623 Active US11488116B2 (en) 2020-05-21 2021-05-05 Dynamically generated news feed
US17/308,916 Abandoned US20210367986A1 (en) 2020-05-21 2021-05-05 Enabling Collaboration Between Users
US17/308,264 Active US11537998B2 (en) 2020-05-21 2021-05-05 Capturing meeting snippets
US17/308,329 Active US11416831B2 (en) 2020-05-21 2021-05-05 Dynamic video layout in video conference meeting
US17/308,586 Abandoned US20210365893A1 (en) 2020-05-21 2021-05-05 Recommendation unit for generating meeting recommendations
US17/308,640 Abandoned US20210367802A1 (en) 2020-05-21 2021-05-05 Meeting summary generation
US17/308,772 Abandoned US20210365896A1 (en) 2020-05-21 2021-05-05 Machine learning (ml) model for participants


Country Status (1)

Country Link
US (8) US20210367984A1 (en)



Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090030986A1 (en) * 2007-07-27 2009-01-29 Twinstrata, Inc. System and method for remote asynchronous data replication
US20090210933A1 (en) * 2008-02-15 2009-08-20 Shear Jeffrey A System and Method for Online Content Production
US20090222741A1 (en) * 2008-02-29 2009-09-03 Microsoft Corporation Collaborative management of activities occurring during the lifecycle of a meeting
US20100042704A1 (en) * 2008-08-12 2010-02-18 International Business Machines Corporation Automating application state of a set of computing devices responsive to scheduled events based on historical data
US20110072362A1 (en) * 2009-09-22 2011-03-24 International Business Machines Corporation Meeting Agenda Management
US20150081841A1 (en) * 2013-09-13 2015-03-19 Incontact, Inc. Systems and methods for data synchronization management
US20150142800A1 (en) * 2013-11-15 2015-05-21 Citrix Systems, Inc. Generating electronic summaries of online meetings
US20160012130A1 (en) * 2014-07-14 2016-01-14 Yahoo! Inc. Aiding composition of themed articles about popular and novel topics and offering users a navigable experience of associated content
US20160117624A1 (en) * 2014-10-23 2016-04-28 International Business Machines Corporation Intelligent meeting enhancement system
US20160179293A1 (en) * 2014-12-17 2016-06-23 Fuji Xerox Co., Ltd. Systems and methods for plan-based hypervideo playback
US20180101823A1 (en) * 2016-10-11 2018-04-12 Ricoh Company, Ltd. Managing Electronic Meetings Using Artificial Intelligence and Meeting Rules Templates
US20180101824A1 (en) * 2016-10-11 2018-04-12 Ricoh Company, Ltd. Real-Time (Intra-Meeting) Processing Using Artificial Intelligence
US20180131904A1 (en) * 2013-06-26 2018-05-10 Touchcast LLC Intelligent virtual assistant system and method
US20190219984A1 (en) * 2018-01-12 2019-07-18 Industrial Technology Research Institute Machine tool collision avoidance method and system using the same
US20190386839A1 (en) * 2018-06-13 2019-12-19 Lenovo (Singapore) Pte. Ltd. Device, method, and system for managed updating of meeting handout data
US20200092341A1 (en) * 2017-05-23 2020-03-19 Huawei Technologies Co., Ltd. Information exchange method and terminal
US20200106735A1 (en) * 2018-09-27 2020-04-02 Salvatore Guerrieri Systems and Methods for Communications & Commerce Between System Users and Non-System Users
US20200118150A1 (en) * 2018-10-16 2020-04-16 Igt Unlockable electronic incentives
US20200167371A1 (en) * 2018-11-27 2020-05-28 Slack Technologies, Inc. Dynamic and selective object update for local storage copy based on network connectivity characteristics
US20200374146A1 (en) * 2019-05-24 2020-11-26 Microsoft Technology Licensing, Llc Generation of intelligent summaries of shared content based on a contextual analysis of user engagement
US20210117050A1 (en) * 2019-10-22 2021-04-22 Microsoft Technology Licensing, Llc Structured arrangements for tracking content items on a shared user interface
US11095468B1 (en) * 2020-02-13 2021-08-17 Amazon Technologies, Inc. Meeting summary service
US20210264377A1 (en) * 2020-02-20 2021-08-26 Sap Se Shared collaborative electronic events
US20210377138A1 (en) * 2019-02-18 2021-12-02 Huawei Technologies Co., Ltd. Communication Method, Apparatus, and System

Family Cites Families (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10742433B2 (en) * 2003-06-16 2020-08-11 Meetup, Inc. Web-based interactive meeting facility, such as for progressive announcements
US6963352B2 (en) 2003-06-30 2005-11-08 Nortel Networks Limited Apparatus, method, and computer program for supporting video conferencing in a communication system
US7634540B2 (en) 2006-10-12 2009-12-15 Seiko Epson Corporation Presenter view control system and method
US8180029B2 (en) 2007-06-28 2012-05-15 Voxer Ip Llc Telecommunication and multimedia management method and apparatus
US9615056B2 (en) 2010-02-10 2017-04-04 Oovoo, Llc System and method for video communication on mobile devices
US9264659B2 (en) 2010-04-07 2016-02-16 Apple Inc. Video conference network management for a mobile device
US8514263B2 (en) * 2010-05-12 2013-08-20 Blue Jeans Network, Inc. Systems and methods for scalable distributed global infrastructure for real-time multimedia communication
US20130191299A1 (en) 2010-10-28 2013-07-25 Talentcircles, Inc. Methods and apparatus for a social recruiting network
US20120144320A1 (en) 2010-12-03 2012-06-07 Avaya Inc. System and method for enhancing video conference breaks
US20120192080A1 (en) * 2011-01-21 2012-07-26 Google Inc. Tailoring content based on available bandwidth
US9210213B2 (en) * 2011-03-03 2015-12-08 Citrix Systems, Inc. Reverse seamless integration between local and remote computing environments
US9113032B1 (en) 2011-05-31 2015-08-18 Google Inc. Selecting participants in a video conference
US8941708B2 (en) 2011-07-29 2015-01-27 Cisco Technology, Inc. Method, computer-readable storage medium, and apparatus for modifying the layout used by a video composing unit to generate a composite video signal
JP2015507246A (en) 2011-12-06 2015-03-05 アグリーヤ モビリティ インコーポレーテッド Seamless collaboration and communication
US20130282820A1 (en) 2012-04-23 2013-10-24 Onmobile Global Limited Method and System for an Optimized Multimedia Communications System
US8914452B2 (en) * 2012-05-31 2014-12-16 International Business Machines Corporation Automatically generating a personalized digest of meetings
US9141504B2 (en) 2012-06-28 2015-09-22 Apple Inc. Presenting status data received from multiple devices
US9953304B2 (en) 2012-12-30 2018-04-24 Buzd, Llc Situational and global context aware calendar, communications, and relationship management
US10484189B2 (en) * 2013-11-13 2019-11-19 Microsoft Technology Licensing, Llc Enhanced collaboration services
US20150358810A1 (en) 2014-06-10 2015-12-10 Qualcomm Incorporated Software Configurations for Mobile Devices in a Collaborative Environment
US9846528B2 (en) * 2015-03-02 2017-12-19 Dropbox, Inc. Native application collaboration
US20160307165A1 (en) 2015-04-20 2016-10-20 Cisco Technology, Inc. Authorizing Participant Access To A Meeting Resource
US20160350720A1 (en) 2015-05-29 2016-12-01 Citrix Systems, Inc. Recommending meeting times based on previous meeting acceptance history
US10255946B1 (en) 2015-06-25 2019-04-09 Amazon Technologies, Inc. Generating tags during video upload
DE112016003352T5 (en) 2015-07-24 2018-04-12 Max Andaker Smooth user interface for virtual collaboration, communication and cloud computing
US20190332994A1 (en) * 2015-10-03 2019-10-31 WeWork Companies Inc. Generating insights about meetings in an organization
US10620811B2 (en) * 2015-12-30 2020-04-14 Dropbox, Inc. Native application collaboration
US10572961B2 (en) * 2016-03-15 2020-02-25 Global Tel*Link Corporation Detection and prevention of inmate to inmate message relay
US20170308866A1 (en) * 2016-04-22 2017-10-26 Microsoft Technology Licensing, Llc Meeting Scheduling Resource Efficiency
US20180046957A1 (en) * 2016-08-09 2018-02-15 Microsoft Technology Licensing, Llc Online Meetings Optimization
US20180077092A1 (en) 2016-09-09 2018-03-15 Tariq JALIL Method and system for facilitating user collaboration
US20180101760A1 (en) * 2016-10-11 2018-04-12 Ricoh Company, Ltd. Selecting Meeting Participants for Electronic Meetings Using Artificial Intelligence
US9699410B1 (en) 2016-10-28 2017-07-04 Wipro Limited Method and system for dynamic layout generation in video conferencing system
US11568369B2 (en) * 2017-01-13 2023-01-31 Fujifilm Business Innovation Corp. Systems and methods for context aware redirection based on machine-learning
TWI644565B (en) 2017-02-17 2018-12-11 陳延祚 Video image processing method and system using the same
US20180270452A1 (en) 2017-03-15 2018-09-20 Electronics And Telecommunications Research Institute Multi-point connection control apparatus and method for video conference service
US10838396B2 (en) 2017-04-18 2020-11-17 Cisco Technology, Inc. Connecting robotic moving smart building furnishings
US20180331842A1 (en) 2017-05-15 2018-11-15 Microsoft Technology Licensing, Llc Generating a transcript to capture activity of a conference session
US9967520B1 (en) 2017-06-30 2018-05-08 Ringcentral, Inc. Method and system for enhanced conference management
US11412012B2 (en) * 2017-08-24 2022-08-09 Re Mago Holding Ltd Method, apparatus, and computer-readable medium for desktop sharing over a web socket connection in a networked collaboration workspace
US10553208B2 (en) * 2017-10-09 2020-02-04 Ricoh Company, Ltd. Speech-to-text conversion for interactive whiteboard appliances using multiple services
US20190172017A1 (en) * 2017-12-04 2019-06-06 Microsoft Technology Licensing, Llc Tagging meeting invitees to automatically create tasks
US20190205839A1 (en) * 2017-12-29 2019-07-04 Microsoft Technology Licensing, Llc Enhanced computer experience from personal activity pattern
US11120199B1 (en) * 2018-02-09 2021-09-14 Voicebase, Inc. Systems for transcribing, anonymizing and scoring audio content
US10757148B2 (en) * 2018-03-02 2020-08-25 Ricoh Company, Ltd. Conducting electronic meetings over computer networks using interactive whiteboard appliances and mobile devices
US20210004735A1 (en) * 2018-03-22 2021-01-07 Siemens Corporation System and method for collaborative decentralized planning using deep reinforcement learning agents in an asynchronous environment
US20190312917A1 (en) 2018-04-05 2019-10-10 Microsoft Technology Licensing, Llc Resource collaboration with co-presence indicators
CN108595645B (en) * 2018-04-26 2020-10-30 深圳市鹰硕技术有限公司 Conference speech management method and device
US10735211B2 (en) * 2018-05-04 2020-08-04 Microsoft Technology Licensing, Llc Meeting insight computing system
US10606576B1 (en) * 2018-10-26 2020-03-31 Salesforce.Com, Inc. Developer experience for live applications in a cloud collaboration platform
US20200341625A1 (en) 2019-04-26 2020-10-29 Microsoft Technology Licensing, Llc Automated conference modality setting application
US11689379B2 (en) 2019-06-24 2023-06-27 Dropbox, Inc. Generating customized meeting insights based on user interactions and meeting media
US11049511B1 (en) 2019-12-26 2021-06-29 Lenovo (Singapore) Pte. Ltd. Systems and methods to determine whether to unmute microphone based on camera input
US11049077B1 (en) 2019-12-31 2021-06-29 Capital One Services, Llc Computer-based systems configured for automated electronic calendar management and work task scheduling and methods of use thereof
US10999346B1 (en) 2020-01-06 2021-05-04 Dialogic Corporation Dynamically changing characteristics of simulcast video streams in selective forwarding units
US11989696B2 (en) * 2020-01-16 2024-05-21 Capital One Services, Llc Computer-based systems configured for automated electronic calendar management with meeting room locating and methods of use thereof
US10735212B1 (en) 2020-01-21 2020-08-04 Capital One Services, Llc Computer-implemented systems configured for automated electronic calendar item predictions and methods of use thereof
US11288636B2 (en) 2020-01-23 2022-03-29 Capital One Services, Llc Computer-implemented systems configured for automated electronic calendar item predictions for calendar item rescheduling and methods of use thereof
US11438841B2 (en) 2020-01-31 2022-09-06 Dell Products, Lp Energy savings system based machine learning of wireless performance activity for mobile information handling system connected to plural wireless networks
US11393176B2 (en) * 2020-02-07 2022-07-19 Krikey, Inc. Video tools for mobile rendered augmented reality game
US11080356B1 (en) * 2020-02-27 2021-08-03 International Business Machines Corporation Enhancing online remote meeting/training experience using machine learning
WO2021194372A1 (en) * 2020-03-26 2021-09-30 Ringcentral, Inc. Methods and systems for managing meeting notes
US20210319408A1 (en) 2020-04-09 2021-10-14 Science House LLC Platform for electronic management of meetings
US11470014B2 (en) 2020-04-30 2022-10-11 Dell Products, Lp System and method of managing data connections to a communication network using tiered devices and telemetry data
US11570219B2 (en) * 2020-05-07 2023-01-31 Re Mago Holding Ltd Method, apparatus, and computer readable medium for virtual conferencing with embedded collaboration tools
US11184560B1 (en) 2020-12-16 2021-11-23 Lenovo (Singapore) Pte. Ltd. Use of sensor input to determine video feed to provide as part of video conference
US11119985B1 (en) 2021-03-19 2021-09-14 Atlassian Pty Ltd. Apparatuses, methods, and computer program products for the programmatic documentation of extrinsic event based data objects in a collaborative documentation service

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12400661B2 (en) 2017-07-09 2025-08-26 Otter.ai, Inc. Systems and methods for capturing, processing, and rendering one or more context-aware moment-associating elements
US12406672B2 (en) 2018-10-17 2025-09-02 Otter.ai, Inc. Systems and methods for live broadcasting of context-aware transcription and/or other elements related to conversations and/or speeches
US12406684B2 (en) 2021-02-26 2025-09-02 Otter.ai, Inc. Systems and methods for automatic joining as a virtual meeting participant for transcription
US20240163398A1 (en) * 2021-09-28 2024-05-16 Atlassian Pty Ltd. Apparatuses, computer-implemented methods, and computer program products for generating a collaborative contextual summary interface in association with an audio-video conferencing interface service
US12182502B1 (en) 2022-03-28 2024-12-31 Otter.ai, Inc. Systems and methods for automatically generating conversation outlines and annotation summaries
US20240395254A1 (en) * 2023-05-24 2024-11-28 Otter.ai, Inc. Systems and methods for live summarization

Also Published As

Publication number Publication date
US20210367800A1 (en) 2021-11-25
US20210365896A1 (en) 2021-11-25
US20210367801A1 (en) 2021-11-25
US11488116B2 (en) 2022-11-01
US20210368134A1 (en) 2021-11-25
US11416831B2 (en) 2022-08-16
US20210365893A1 (en) 2021-11-25
US11537998B2 (en) 2022-12-27
US20210367986A1 (en) 2021-11-25
US20210367984A1 (en) 2021-11-25

Similar Documents

Publication Publication Date Title
US20210367802A1 (en) Meeting summary generation
US20160239770A1 (en) Method and system for dynamically changing process flow of a business process
US9712569B2 (en) Method and apparatus for timeline-synchronized note taking during a web conference
US8874648B2 (en) E-meeting summaries
US20180262798A1 (en) Methods and systems for rendering multimedia content on a user device
US8903902B2 (en) Framework and method for real-time embedded collaboration using business process and transaction context
US11200538B2 (en) System and method for a unified incident management interface
EP3258392A1 (en) Systems and methods for building contextual highlights for conferencing systems
US20140047330A1 (en) Collaborative decision making in contract documents
US20180005146A1 (en) Knowledge-based decision support systems and method for process lifecycle automation
US20190089663A1 (en) Using organizational rank to facilitate electronic communication
US20150356495A1 (en) Digital workspace
US11573809B2 (en) Method and system for providing virtual services
CN114240322B (en) Business processing method, device, storage medium and electronic device
US10636014B2 (en) Conversational calendar integration
WO2023278036A1 (en) Assistant for providing information on unknown topics
AU2025213617A1 (en) Electronic devices and methods for selecting and displaying multimodal content
US20240310978A1 (en) Systems and methods for a digital interface
US10043146B2 (en) Method and device for estimating efficiency of an employee of an organization
US20160380950A1 (en) System and method for detecting expertise via meeting participation
US20220309440A1 (en) Gamification of business processes
WO2017017598A1 (en) Method and system for enabling interactive infomercials
US20210295991A1 (en) System and method for identifying healthcare issues
EP3306494A1 (en) System and method for customizing a presentation
WO2025022220A1 (en) A system for providing a networking platform for user interaction and a method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: HUDDL INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YARLAGADDA, KRISHNA;RAJAMANI, HARISH;DAVULURI, NAVA;SIGNING DATES FROM 20210426 TO 20210503;REEL/FRAME:056147/0286

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION