US20210367986A1 - Enabling Collaboration Between Users - Google Patents
Enabling Collaboration Between Users
- Publication number
- US20210367986A1 (application US17/308,916)
- Authority
- US
- United States
- Prior art keywords
- users
- electronic device
- applications
- application
- interactions
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/34—Browsing; Visualisation therefor
- G06F16/345—Summarisation for human users
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/954—Navigation, e.g. using categorised browsing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/109—Time management, e.g. calendars, reminders, meetings or time accounting
- G06Q10/1093—Calendar-based scheduling for persons or groups
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/109—Time management, e.g. calendars, reminders, meetings or time accounting
- G06Q10/1097—Time management, e.g. calendars, reminders, meetings or time accounting using calendar-based scheduling for task assignment
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/08—Speech classification or search
- G10L15/18—Speech classification or search using natural language modelling
- G10L15/1815—Semantic context, e.g. disambiguation of the recognition hypotheses based on word meaning
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/57—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for processing of video signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1818—Conference organisation arrangements, e.g. handling schedules, setting up parameters needed by nodes to attend a conference, booking network resources, notifying involved parties
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1831—Tracking arrangements for later retrieval, e.g. recording contents, participants activities or behavior, network status
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/52—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/1066—Session management
- H04L65/1069—Session establishment or de-establishment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/1066—Session management
- H04L65/1096—Supplementary features, e.g. call forwarding or call holding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/401—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
- H04L65/4015—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
-
- H04L67/22—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/535—Tracking the activity of the user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
- H04N7/155—Conference systems involving storage of or access to video conference sessions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8549—Creating video summaries, e.g. movie trailer
Definitions
- the presently disclosed exemplary embodiments are related, in general, to a web-based collaborative environment shared by a plurality of users. More particularly, the presently disclosed exemplary embodiments are related to a method and a system for synchronizing in real-time each of one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- Screen sharing may be an alternative option for users for collaborative working.
- in remote screen sharing applications, only one user can share and control the edits that may be made to an application, so a plurality of users cannot edit or control content concurrently. Additionally, even if control is given to another user for editing, the content is still accessed via remote sharing and not natively.
- users do not access files and applications natively, i.e., applications or files installed and/or stored on the device itself, and thus synchronization between devices needs to be optimal.
- One problem frequently encountered in collaborative systems may be maintaining synchronization between the shared workspace at each of the client devices. Lack of synchronization may cause conflicts between users. For example, if synchronization is not maintained, a user may attempt to edit a document that has been deleted by another user. Therefore, it is necessary to maintain synchronization in real time or near real time between client devices and the shared workspace.
- a major technical problem in the aforementioned conventional approaches is that files are not accessed natively but via a storage server or cloud server, thereby making real-time synchronization amongst a plurality of users very important.
- the quality and richness of the experience during collaboration is minimal, thereby leading to a poor collaborative effort.
- thus, conventional collaboration software fails to adequately facilitate collaboration between users.
- a method and a system for synchronizing in real-time each of one or more interactions received from the plurality of users for enabling collaboration between the plurality of users is provided substantially as shown in, and/or described in connection with, at least one of the figures, as set forth more completely in the claims.
- FIG. 1 is a block diagram that illustrates a system environment for synchronizing in real-time each of one or more interactions received from the plurality of users for enabling collaboration between the plurality of users, in accordance with at least one exemplary embodiment
- FIG. 2 is a block diagram that illustrates an electronic device configured for enabling collaboration between users, in accordance with at least one exemplary embodiment
- FIGS. 3A and 3B are diagrams that collectively illustrate an exemplary scenario of implementation of the collaborative canvas application to enable collaboration between participants in a video conferencing tool, in accordance with at least one exemplary embodiment
- FIG. 4 is a flowchart that illustrates a method for enabling collaboration between users, in accordance with at least one exemplary embodiment
- FIG. 5 is a block diagram of an exemplary computer system for implementing collaboration between users, in accordance with various exemplary embodiments of the present disclosure.
- the illustrated embodiments provide a method and a system for enabling collaboration between users.
- the method may be implemented by a collaborative canvas application executing on an electronic device including one or more processors.
- the method may include creating an interactive collaboration session amongst a plurality of users.
- the method may include receiving one or more interactions associated with a plurality of applications from the plurality of users.
- the plurality of applications may be initiated within the interactive collaboration session.
- each of the plurality of applications are accessed natively by each of the plurality of users.
- the native access to applications corresponds to a user accessing applications or files installed and/or stored on the electronic device.
- the method may include synchronizing in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
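The three operations summarized above (create a session, receive interactions on applications initiated within it, synchronize them between users) might be sketched as follows. This is an illustrative Python sketch only; the class and field names are assumptions, not taken from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class Interaction:
    user: str          # user who performed the interaction
    application: str   # application the interaction targets
    event: str         # e.g. "click", "scroll", "type", "hover"

@dataclass
class CollaborationSession:
    users: list = field(default_factory=list)
    applications: list = field(default_factory=list)  # apps initiated in session
    log: list = field(default_factory=list)           # synchronized interaction log

    def receive(self, interaction: Interaction) -> None:
        # Only interactions on applications initiated within the session
        # are synchronized between the plurality of users.
        if interaction.application in self.applications:
            self.log.append(interaction)

session = CollaborationSession(users=["A", "B"], applications=["Confluence"])
session.receive(Interaction("A", "Confluence", "type"))
session.receive(Interaction("B", "Photoshop", "click"))  # not in session: ignored
print(len(session.log))
```

Note the filtering on session applications, which mirrors the later statement that only applications initiated within the interactive collaboration session are utilized for collaboration.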
- FIG. 1 is a block diagram that illustrates a system environment 100 for synchronizing in real-time each of one or more interactions received from the plurality of users for enabling collaboration between the plurality of users, in accordance with at least one exemplary embodiment.
- the system environment 100 may include a plurality of electronic devices, such as 102 , 104 and 106 associated with a plurality of users, such as User A 102 a , User B 104 a , and User C 106 a , a communication network 108 , and a database server 110 .
- Each of the electronic devices 102 , 104 , and 106 associated with the plurality of users may be communicatively coupled with each other via the communication network 108 .
- the plurality of electronic devices may refer to a computing device used by a user who may collaboratively work with the remaining users.
- the plurality of electronic devices, such as electronic device 102 , 104 and 106 may comprise one or more processors and one or more memories.
- the one or more memories may include computer readable code that may be executable by the one or more processors to perform predetermined operations.
- the plurality of electronic devices, such as electronic device 102 , 104 and 106 may present a user-interface to the user for performing one or more interactions on the electronic device.
- Examples of the plurality of electronic devices may include, but are not limited to, a personal computer, a laptop, a personal digital assistant (PDA), a mobile device, a tablet, or any other computing device.
- the plurality of users may be utilizing the electronic device 102 , the electronic device 104 and the electronic device 106 , respectively as shown in FIG. 1 .
- the plurality of users such as User A 102 a , User B 104 a , and User C 106 a may interact with the plurality of electronic devices, such as electronic device 102 , 104 and 106 by performing one or more interactions on the user-interface presented to each of the respective users of the associated electronic device.
- the communication network 108 may include a communication medium through which each of the electronic devices, such as 102 , 104 and 106 associated with the plurality of users may communicate with each other. Such a communication may be performed, in accordance with various wired and wireless communication protocols.
- wired and wireless communication protocols include, but are not limited to, Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), ZigBee, EDGE, infrared (IR), IEEE 802.11, 802.16, 2G, 3G, 4G, 5G, 6G cellular communication protocols, and/or Bluetooth (BT) communication protocols.
- the communication network 108 may include, but is not limited to, the Internet, a cloud network, a Wireless Fidelity (Wi-Fi) network, a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a telephone line (POTS), and/or a Metropolitan Area Network (MAN).
- the system environment 100 may include a database server 110 .
- the database server 110 may refer to a computing device that may be configured to store files associated with one or more applications installed on the electronic device.
- the plurality of electronic devices such as electronic device 102 , 104 and 106 may communicate with the database server 110 using one or more protocols such as, but not limited to, Open Database Connectivity (ODBC) protocol and Java Database Connectivity (JDBC) protocol.
- the database server 110 may include a special purpose operating system specifically configured to perform one or more database operations on one or more files associated with the plurality of applications initiated within the interactive collaboration session.
- Examples of database operations may include, but are not limited to, Select, Insert, Update, and Delete.
- the database server may include hardware that may be configured to perform one or more predetermined operations.
- the database server 110 may be realized through various technologies such as, but not limited to, Microsoft® SQL Server, Oracle®, IBM DB2®, Microsoft Access®, PostgreSQL®, MySQL® and SQLite®, and the like.
- the scope of the disclosure is not limited to realizing the electronic device and the database server as separate entities.
- the database server may be realized as an application program installed on and/or running on the electronic device without departing from the scope of the disclosure.
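Since the disclosure names SQLite® among the candidate technologies and allows the database server to run as an application program on the electronic device itself, the four database operations listed above might be realized with an embedded database as in this illustrative sketch (the table and column names are assumptions):

```python
import sqlite3

# Embedded database on the electronic device, standing in for database
# server 110; ":memory:" is used here purely for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE files (name TEXT PRIMARY KEY, app TEXT)")
con.execute("INSERT INTO files VALUES ('plan.doc', 'Confluence')")    # Insert
con.execute("UPDATE files SET app = 'Jira' WHERE name = 'plan.doc'")  # Update
row = con.execute("SELECT app FROM files WHERE name = 'plan.doc'").fetchone()  # Select
con.execute("DELETE FROM files WHERE name = 'plan.doc'")              # Delete
print(row[0])
```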
- the plurality of electronic devices may be configured to initiate and execute a collaborative canvas application.
- the plurality of electronic devices such as electronic device 102 , 104 and 106 may be configured to create an interactive collaboration session amongst the plurality of users.
- a User A 102 a , a User B 104 a , and a User C 106 a may be utilizing the electronic device 102 , the electronic device 104 and the electronic device 106 , respectively as shown in FIG. 1 .
- At least one of the User A 102 a , User B 104 a , and User C 106 a may click on an icon associated with the collaborative canvas application.
- Alternatively, the User A 102 a , the User B 104 a , or the User C 106 a may perform a touch operation on an icon associated with the collaborative canvas application.
- Once the collaborative canvas application is initiated, each of the electronic device 102 , electronic device 104 and electronic device 106 creates an interactive collaboration session amongst the User A 102 a , User B 104 a , and User C 106 a.
- the plurality of electronic devices such as electronic device 102 , 104 and 106 may be configured to receive one or more interactions associated with a plurality of applications from the plurality of users.
- the one or more interactions may comprise events performed by the plurality of users by using the electronic device or one or more device peripherals attached to the electronic device.
- the events comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
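The four event types named above could be normalized into a common record before being synchronized; the sketch below is hypothetical, and the function and key names are illustrative only:

```python
# Event types enumerated in the disclosure: mouse click, scrolling,
# typing, and hover events performed via the device or its peripherals.
EVENT_TYPES = {"mouse_click", "scroll", "typing", "hover"}

def normalize(raw: dict) -> dict:
    """Map a raw peripheral event into a common record for synchronization."""
    kind = raw.get("kind")
    if kind not in EVENT_TYPES:
        raise ValueError(f"unsupported event: {kind!r}")
    return {
        "kind": kind,
        "user": raw["user"],
        "app": raw["app"],
        "payload": raw.get("payload", {}),
    }

evt = normalize({"kind": "typing", "user": "A", "app": "Jira",
                 "payload": {"text": "fix"}})
print(evt["kind"])
```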
- the electronic device 102 of User A may initiate a plurality of applications (e.g. Confluence® and Jira®) within the interactive collaboration session.
- the User A may initiate Confluence® application and Jira® application concurrently and may further start making edits to the applications that have been opened within the interactive collaboration session.
- each of the plurality of applications may be accessed natively by each of the plurality of users.
- the native access to applications corresponds to user accessing applications or files installed and/or stored on the electronic device.
- the User A may initiate Confluence® application and Jira® application concurrently and the files associated with these applications may be stored on the electronic device 102 and the applications Confluence® application and Jira® application are also installed on the electronic device 102 .
- the User A is accessing local content that may be present on the electronic device 102 and may be making edits or changes to the files of the applications that are installed on the electronic device 102 .
- the User A and the User B create an interactive collaboration session. Further, the User A is interacting with the Confluence® application and the Jira® application concurrently, which are installed on the electronic device 102 , and the files associated with these applications may be stored on the electronic device 102 . The User B is interacting with the Confluence® application and the Photoshop® application concurrently, which are installed on the electronic device 104 , and the files associated with these applications may be stored on the electronic device 104 .
- the plurality of electronic devices such as electronic device 102 , 104 and 106 may be configured to synchronize in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- each of the plurality of users may perform one or more interactions associated with the plurality of applications concurrently.
- the one or more interactions may comprise events performed by the plurality of users by using the electronic device.
- the events comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
- the User A 102 a is interacting with Confluence® application and Jira® application concurrently and the files associated with these applications may be stored on the electronic device 102 and the Confluence® application and Jira® application are also installed on the electronic device 102 .
- the User B 104 a may initiate Confluence® application and Photoshop® application concurrently and the files associated with these applications may be stored on the electronic device 104 and the Confluence® application and Photoshop® application are also installed on the electronic device 104 .
- since the User A 102 a and the User B 104 a have created the interactive collaboration session and the plurality of applications have been initiated in the interactive collaboration session, the User A 102 a and the User B 104 a may interact with the plurality of applications concurrently, i.e., they may interact with and make changes or edits to any of the Confluence® application, Jira® application and Photoshop® application concurrently via the interactive collaboration session.
- the User A 102 a need not necessarily open or initiate each and every application within the interactive collaboration session. Only applications initiated within the interactive collaboration session may be utilized for collaboration between the plurality of users, such as User A 102 a , User B 104 a , and User C 106 a .
- a User B 104 a may choose to initiate an application outside the interactive collaboration session so that the interactions performed by the user on such an application are not synchronized in real-time between the plurality of users, such as User A 102 a , User B 104 a , and User C 106 a .
- a pop-up dialog box may be utilized to confirm whether the User B 104 a wants to initiate the application within the interactive collaboration session or not.
- a drag and drop feature may initiate an application within the interactive collaboration session.
- the graphical user interface of the interactive collaboration session may include a section that displays each of the plurality of applications that are being synchronized in real-time with the interactions performed by the plurality of users, such as User A 102 a , User B 104 a , and User C 106 a . If a User C 106 a drags an application icon within such a section, then the respective application may be initiated within interactive collaboration session.
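The drag-and-drop behavior described above might be modeled as a registry of session applications, where dropping an icon into the synchronized section initiates that application within the session. This is an illustrative sketch; the class and method names are assumptions:

```python
class SessionApps:
    """Tracks which applications have been initiated within the
    interactive collaboration session (hypothetical sketch)."""

    def __init__(self):
        self.shared = set()

    def on_drop(self, app_icon: str) -> None:
        # Dragging an application icon into the synchronized section of
        # the canvas initiates that application within the session.
        self.shared.add(app_icon)

    def is_synchronized(self, app: str) -> bool:
        # Only interactions on session applications are synchronized.
        return app in self.shared

apps = SessionApps()
apps.on_drop("Photoshop")
print(apps.is_synchronized("Photoshop"), apps.is_synchronized("Jira"))
```

An application initiated outside the session (here, "Jira") stays private, matching the opt-in behavior described for User B above.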
- each user from the plurality of users may access content that is installed locally or natively, make changes to or interact with the content, and the electronic device may then synchronize in real-time each of the interactions received from the plurality of users for enabling collaboration between the plurality of users, such as User A 102 a , User B 104 a , and User C 106 a .
- the collaborative canvas application may be integrated within at least one of video conferencing tools, and online meeting tools.
- FIG. 2 is a block diagram that illustrates an electronic device, such as 102 configured for enabling collaboration between users, in accordance with at least one exemplary embodiment.
- the electronic device 102 may include a processor 202 , a memory 204 , a transceiver 206 , an input/output unit 208 , an interaction monitoring unit 210 , and a synchronization unit 212 .
- the processor 202 may be communicatively coupled to the memory 204 , the transceiver 206 , the input/output unit 208 , the interaction monitoring unit 210 , and the synchronization unit 212 .
- the transceiver 206 may be communicatively coupled to the communication network 108 .
- the processor 202 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to execute a set of instructions stored in the memory 204 .
- the processor 202 may be implemented based on several processor technologies known in the art.
- the processor 202 works in coordination with the transceiver 206 , the input/output unit 208 , the interaction monitoring unit 210 , and the synchronization unit 212 for enabling collaboration between plurality of users.
- Examples of the processor 202 include, but are not limited to, an X86-based processor, a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other processors.
- the memory 204 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to store the set of instructions, which are executed by the processor 202 .
- the memory 204 may be configured to store one or more programs, routines, or scripts that are executed in coordination with the processor 202 .
- the memory 204 may be implemented based on a Random-Access Memory (RAM), a Read-Only Memory (ROM), a Hard Disk Drive (HDD), a storage server, and/or a Secure Digital (SD) card.
- the transceiver 206 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to receive one or more interactions associated with a plurality of applications from the plurality of users, via the communication network 108 .
- the transceiver 206 may be further configured to transmit and receive content during creation of the interactive collaboration session.
- the transceiver 206 may implement one or more known technologies to support wired or wireless communication with the communication network 108 .
- the transceiver 206 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a Universal Serial Bus (USB) device, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and/or a local buffer.
- the transceiver 206 may communicate via wireless communication with networks, such as the Internet, an Intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN).
- the wireless communication may use any of a plurality of communication standards, protocols and technologies, such as: Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS).
- the input/output unit 208 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to provide, to the electronic device, one or more inputs that may correspond to one or more interactions for interactive collaboration amongst the plurality of users.
- the input/output unit 208 comprises various input and output devices that are configured to communicate with the processor 202 .
- Examples of the input devices include, but are not limited to, a keyboard, a mouse, a joystick, a touch screen, a microphone, a camera, and/or a docking station.
- Examples of the output devices include, but are not limited to, a display screen and/or a speaker.
- the display screen may be configured to display synchronized real-time interactions received from the plurality of users within the collaborative canvas application for enabling collaboration between the plurality of users.
- the interaction monitoring unit 210 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to monitor one or more interactions performed by a user of the electronic device, such as electronic device 102 .
- the monitored one or more interactions may be associated with the plurality of applications that have been initiated within the interactive collaboration session. Further, such monitored one or more interactions may be transmitted to the remaining electronic devices, such as electronic device 104 and electronic device 106 , via the transceiver 206 .
- the synchronization unit 212 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to synchronize in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- any one of the plurality of users may initiate execution of a collaborative canvas application via the processor 202 .
- the processor 202 may be configured to create an interactive collaboration session amongst the plurality of users.
- the data transfer between the plurality of electronic devices during the interactive collaboration session may be performed by the transceiver 206 , via the communication network 108 .
- the transceiver 206 may be configured to receive one or more interactions associated with a plurality of applications from the plurality of users, via the communication network 108 .
- the plurality of applications may be initiated within the interactive collaboration session and each of the plurality of applications may be accessed natively by each of the plurality of users.
- the native access to applications corresponds to a user accessing applications or files installed and/or stored on the electronic device 102 .
- the interaction monitoring unit 210 may be configured to monitor one or more interactions performed by a user of the electronic device 102 .
- the monitored one or more interactions may be associated with the plurality of applications that have been initiated within the interactive collaboration session. Further, such monitored one or more interactions may be transmitted to the remaining electronic devices, such as electronic device 104 and electronic device 106 , via the transceiver 206 .
- each of the plurality of users may interact with the plurality of applications concurrently and the transceiver of each respective electronic device may be configured to receive information regarding the one or more interactions performed by the plurality of users.
- the one or more interactions may comprise events performed by the plurality of users by using the electronic device 102 or one or more device peripherals attached to the electronic device 102 .
- the events may comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
- the synchronization unit 212 may be configured to synchronize in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- FIGS. 3A and 3B are diagrams that collectively illustrate an exemplary scenario of implementation of the collaborative canvas application to enable collaboration between participants in a video conferencing tool, in accordance with at least one exemplary embodiment.
- FIGS. 3A and 3B have been explained in conjunction with the elements of FIG. 1 .
- participant A and participant B may utilize electronic device 102 and electronic device 104 , respectively, to collaborate utilizing a video conferencing tool.
- the collaborative canvas application is integrated within the video conferencing tool by the processor 202 .
- the integration of the collaborative canvas application within the video conferencing tool may be performed by state-of-the-art application integration tools, such as, but not limited to, Celigo®, Cloud Elements®, IBM Integration Designer®, and the like.
- Such integration tools ensure that the electronic device provides the user with the capability to integrate the collaborative canvas application within the video conferencing tool.
- the collaborative canvas application and the video conferencing tool may continue to work independent of each other.
- FIG. 3A shows an exemplary graphical user interface 302 of the video conferencing tool that is displayed to participant A and participant B on electronic device 102 and electronic device 104 , respectively, when the interactive collaboration session amongst participant A and participant B is created.
- the graphical user interface illustrates a collaborative canvas application icon 306 , a collaborative canvas area 308 , an application list section 310 , a participant list section 312 , a chat window 314 , and a meeting snippets section 316 .
- the collaborative canvas application icon 306 may be utilized to initiate execution of the collaborative canvas application that is integrated with the electronic device. For example, the participant A may click on the collaborative canvas application icon 306 to create the interactive collaboration session amongst the participant A and participant B.
- the collaborative canvas area 308 may be the area where the plurality of users, i.e., participant A and participant B, are able to collaboratively work and interact with a plurality of applications concurrently.
- the collaborative canvas area 308 may function as an interactive whiteboarding solution that helps participants in a video conference or audio-visual meeting to share content with each other and work concurrently across applications.
- interactions such as key typing, movement of mouse cursors, and the like of each of the plurality of participants may be viewed by each other via the collaborative canvas area 308 .
- the application list section 310 may display a plurality of applications that may be installed natively on each of the respective electronic devices.
- the participant may choose to initiate one or more applications displayed in the application list section 310 within the interactive collaboration session that may be created when the participant clicks on the collaborative canvas application icon 306 .
- the participant may choose not to initiate one or more applications displayed in the application list section 310 within the interactive collaboration session.
- the participant list section 312 displays the details of participants that are currently working collaboratively within the video conferencing tool.
- metadata about each of the participants may be displayed in the participant list section 312 .
- such metadata may comprise an image of the participant, the name of the participant, the organization name of the participant, the designation of the participant, and the like.
- the chat window 314 may enable the plurality of participants that have joined the video conferencing tool to have conversations with each other and discuss the meeting agenda and any other relevant information.
- each of the electronic devices of the participant A and the participant B, who have joined the video conference via the video conferencing tool, may generate a plurality of meeting snippets based on identification of the trigger event, and such meeting snippets may be shared within the meeting snippets section 316 .
- Each of the plurality of participants may provide feedback about the plurality of meeting snippets.
- identifying a trigger event in real-time may be initiated by at least one participant of the video conference.
- the trigger event may be indicative of at least a reference to meeting metadata associated with the meeting.
- the video conference may be recorded for a predetermined duration to generate the plurality of meeting snippets.
- the meeting metadata associated with the meeting or video conference may comprise an agenda of the meeting, one or more topics to be discussed during the meeting, a time duration of the meeting, a schedule of the meeting, meeting notes carried forwarded from previous meetings, and/or the like.
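- As a hedged illustration of the trigger-event logic described above, the sketch below detects a reference to meeting metadata, here an agenda topic, in an utterance and returns a recording window of a predetermined duration; the keyword matching, the duration value, and the function names are assumptions, not the disclosed implementation:

```python
# Assumed snippet duration in seconds; the disclosure only says
# "a predetermined duration" without giving a value.
SNIPPET_DURATION = 30.0

def references_metadata(utterance: str, metadata: dict) -> bool:
    """A trigger event: the utterance mentions one of the agenda topics."""
    text = utterance.lower()
    return any(topic.lower() in text for topic in metadata.get("topics", []))

def maybe_create_snippet(utterance: str, now: float, metadata: dict):
    """Return a (start, end) recording window when a trigger is detected,
    or None when the utterance does not reference the meeting metadata."""
    if references_metadata(utterance, metadata):
        return (now, now + SNIPPET_DURATION)
    return None

meta = {"topics": ["quarterly budget", "roadmap"]}
window = maybe_create_snippet("let's review the roadmap next", 100.0, meta)
```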
- an interactive collaboration session may be established among the participant A and the participant B. Further, the participant A may initiate the application 1 and the application 2 in the interactive collaboration session.
- the application 1 and application 2 are installed natively or locally on the electronic device 102 of the participant A.
- native access to applications corresponds to a user accessing applications or files installed and/or stored on the electronic device.
- FIG. 3A illustrates the collaborative canvas area 308 displaying the application 1 and application 2 that have been initiated by participant A.
- the participant A may perform one or more interactions associated with files of the application 1 and the application 2 concurrently.
- the one or more interactions comprise events performed by the participant A by using the electronic device 102 or one or more device peripherals attached to the electronic device 102 .
- the events comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event.
- the interaction monitoring unit 210 may be configured to monitor such interactions and such interactions are synchronized in real-time by the synchronization unit 212 to display on the collaborative canvas area 308 open on the electronic device 104 of the Participant B.
- FIG. 3A illustrates the collaborative canvas area 308 displaying the application 1 and the application 2 that have been initiated by the participant A and any changes/edits/interactions which are made by the participant A to any of the files associated with the application 1 and the application 2 may be viewed by the Participant B in real-time within the collaborative canvas area 308 .
- the Participant B may initiate the application 3 natively within the collaborative canvas area 308 of the electronic device 104 and start performing one or more interactions on one or more files associated with the application 3.
- the participant B may open a Microsoft Word® document that is stored locally on the electronic device 104 but open it within the interactive collaboration session.
- the participant B may interact with the Microsoft Word® document.
- the participant B may perform a typing event (such as, writing a sample software code) in the Microsoft Word® document.
- FIG. 3B shows an updated graphical user interface 304 including the updated collaborative canvas area 308 that may be depicted on both the electronic device 102 and the electronic device 104 for the Participant A and the Participant B, respectively.
- interactions performed by the Participant B that are associated with the application 3 may be also viewed in real-time by the Participant A.
- Participant A may concurrently perform interactions on the application 1 and the application 2 that have been opened natively from the electronic device 102 and also perform interactions on the application 3 that has been initiated via the electronic device 104 .
- the participant B may concurrently interact with the application 3 that has been initiated natively from the electronic device 104 and also interact with the application 1, and the application 2 via the collaborative canvas application opened in the electronic device 104 .
- Such collaboration may be possible via the collaborative canvas application.
- when the Participant A performs interactions associated with the application 1 and the application 2, the Participant B will be able to view the changes in real-time because of the real-time synchronization.
- the Participant A and the Participant B may concurrently interact with the application 1, the application 2, and the application 3 and may also view the interactions performed by each of the participants in real-time because of the real-time synchronization.
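- The concurrent, bidirectional collaboration described above can be sketched as a simple dispatcher in which every device applies every interaction, whether local or received from a peer, to its own copy of the application state; the text-buffer state model and all names below are illustrative assumptions:

```python
# Minimal sketch of two devices keeping natively opened applications in
# sync by applying the same interaction events; not the actual implementation.
class CanvasDevice:
    def __init__(self):
        self.apps = {}  # app_id -> text buffer of that native application

    def open_app(self, app_id: str):
        """Initiate an application within the interactive collaboration session."""
        self.apps.setdefault(app_id, "")

    def apply(self, app_id: str, event_type: str, data: str):
        """Apply one interaction, local or received from a peer."""
        if event_type == "type":
            self.apps[app_id] = self.apps.get(app_id, "") + data

def broadcast(devices, app_id, event_type, data):
    """Real-time synchronization: every device applies every interaction."""
    for d in devices:
        d.apply(app_id, event_type, data)

device_a, device_b = CanvasDevice(), CanvasDevice()
for d in (device_a, device_b):
    d.open_app("app3")
# Participant B types into application 3; both devices see the result.
broadcast([device_a, device_b], "app3", "type", "sample code")
```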
- a collaborative experience may be provided, defined by a framework that monitors interactions at each electronic device in real-time around web-based content. This provides a way for participants of an interactive collaboration session to interact with web-based content by rendering the content natively, with the ability to collaborate with the other participants over it.
- Such a framework and experience render the traditional screen-share experience, which largely depends on a one-to-many broadcasting methodology of content, obsolete.
- FIG. 4 is a flowchart that illustrates a method 400 for enabling collaboration between users, in accordance with at least one exemplary embodiment. The method starts at 402 and proceeds to 404 .
- an electronic device may be configured for executing a collaborative canvas application to create an interactive collaboration session amongst a plurality of users.
- the electronic device may be configured for receiving one or more interactions associated with a plurality of applications from the plurality of users, wherein each of the plurality of applications is accessed natively by each of the plurality of users.
- the electronic device may be configured for synchronizing in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users. Control passes to end operation 410 .
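- The three operations of the method 400 can be outlined, under simplifying assumptions about how a session and its events are represented, as:

```python
# Illustrative outline of method 400; session and event structures are
# simplified placeholders, not the actual implementation.
def create_session(users):
    """Execute the collaborative canvas application to create an
    interactive collaboration session amongst a plurality of users."""
    return {"users": list(users), "log": []}

def receive_interactions(session, events):
    """Receive interactions associated with the plurality of applications
    from the plurality of users."""
    session["log"].extend(events)
    return session

def synchronize_interactions(session):
    """Synchronize the received interactions in real time by ordering
    them into a single log that every device replays."""
    session["log"].sort(key=lambda e: e["t"])
    return session["log"]

s = create_session(["participant_a", "participant_b"])
receive_interactions(s, [{"t": 2, "event": "click"}, {"t": 1, "event": "type"}])
log = synchronize_interactions(s)
```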
- FIG. 5 is a block diagram of an exemplary computer system for implementing collaboration between users, in accordance with various exemplary embodiments of the present disclosure.
- the exemplary computer system 501 may comprise a central processing unit (“CPU” or “processor”) 502 , an I/O interface 503 , an input device 504 , an output device 505 , a transceiver 506 , a network interface 507 , a communication network 508 , devices, such as 509 , 510 and 511 , storage interface 512 , one or more memory devices, such as RAM 513 , ROM 514 , and memory device 515 .
- Computer system 501 may be used for implementing collaboration between users.
- the computer system 501 may comprise a central processing unit (“CPU” or “processor”) 502 .
- Processor 502 may comprise at least one data processor for executing program components for executing user- or system-generated requests.
- a user may include a person, a person using a device such as those included in this disclosure, or such a device itself.
- the processor 502 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
- the processor 502 may include a microprocessor, such as AMD Athlon, Duron or Opteron, ARM's application, embedded or secure processors, IBM PowerPC, Intel's Core, Itanium, Xeon, Celeron or other line of processors, etc.
- the processor 502 may be implemented using mainframe, distributed processor, multi-core, parallel, grid, or other architectures. Some exemplary embodiments may utilize embedded technologies like application-specific integrated circuits (ASICs), digital signal processors (DSPs), Field Programmable Gate Arrays (FPGAs), etc.
- the processor 502 may be disposed in communication with one or more input/output (I/O) devices via I/O interface 503 .
- the I/O interface 503 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.n/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
- the computer system 501 may communicate with one or more I/O devices.
- the input device 504 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, sensor (e.g., accelerometer, light sensor, GPS, gyroscope, proximity sensor, or the like), stylus, scanner, storage device, transceiver, video device/source, visors, etc.
- Output device 505 may be a printer, fax machine, video display (e.g., cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED), plasma, or the like), audio speaker, etc.
- a transceiver 506 may be disposed in connection with the processor 502 . The transceiver may facilitate various types of wireless transmission or reception.
- the transceiver may include an antenna operatively connected to a transceiver chip (e.g., Texas Instruments WiLink WL1283, Broadcom BCM4750IUB8, Infineon Technologies X-Gold 618-PMB9800, or the like), providing IEEE 802.11a/b/g/n, Bluetooth, FM, global positioning system (GPS), 2G/3G HSDPA/HSUPA communications, etc.
- the processor 502 may be disposed in communication with a communication network 508 via a network interface 507 .
- the network interface 507 may communicate with the communication network 508 .
- the network interface may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc.
- the communication network 508 may include, without limitation, a direct interconnection, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, etc.
- the computer system 501 may communicate with devices 509 , 510 , and 511 .
- These devices may include, without limitation, personal computer(s), server(s), fax machines, printers, scanners, various mobile devices such as cellular telephones, smartphones (e.g., Apple iPhone®, Blackberry®, Android®-based phones, etc.), tablet computers, eBook readers (Amazon Kindle®, Nook, etc.), laptop computers, notebooks, gaming consoles (Microsoft Xbox®, Nintendo DS®, Sony PlayStation®, etc.), or the like.
- the computer system 501 may itself embody one or more of these devices.
- the processor 502 may be disposed in communication with one or more memory devices (e.g., RAM 513 , ROM 514 , etc.) via a storage interface 512 .
- the storage interface may connect to memory devices including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as serial advanced technology attachment (SATA), integrated drive electronics (IDE), IEEE-1394, universal serial bus (USB), fiber channel, small computer systems interface (SCSI), etc.
- the memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, redundant array of independent discs (RAID), solid-state memory devices, solid-state drives, etc.
- the memory devices may store a collection of program or database components, including, without limitation, an operating system 516 , user interface application 517 , web browser 518 , mail server 519 , mail client 520 , user/application data 521 (e.g., any data variables or data records discussed in this disclosure), etc.
- the operating system 516 may facilitate resource management and operation of the computer system 501 .
- Operating systems include, without limitation, Apple Macintosh OS X, UNIX, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD, NetBSD, OpenBSD, etc.), Linux distributions (e.g., Red Hat, Ubuntu, Kubuntu, etc.), IBM OS/2, Microsoft Windows (XP, Vista/7/8, etc.), Apple iOS, Google Android, Blackberry OS, or the like.
- User interface 517 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities.
- user interfaces may provide computer interaction interface elements on a display system operatively connected to the computer system 501 , such as cursors, icons, check boxes, menus, scrollers, windows, widgets, etc.
- Graphical user interfaces (GUIs) may be employed, including, without limitation, Apple Macintosh operating systems' Aqua, IBM OS/2, Microsoft Windows (e.g., Aero, Metro, etc.), Unix X-Windows, web interface libraries (e.g., ActiveX, Java, Javascript, AJAX, HTML, Adobe Flash, etc.), or the like.
- the computer system 501 may implement a web browser 518 stored program component.
- the web browser may be a hypertext viewing application, such as Microsoft Internet Explorer, Google Chrome, Mozilla Firefox, Apple Safari, etc. Secure web browsing may be provided using HTTPS (secure hypertext transport protocol), secure sockets layer (SSL), Transport Layer Security (TLS), etc. Web browsers may utilize facilities such as AJAX, DHTML, Adobe Flash, JavaScript, Java, application programming interfaces (APIs), etc.
- the computer system 501 may implement a mail server 519 stored program component.
- the mail server may be an Internet mail server such as Microsoft Exchange, or the like.
- the mail server may utilize facilities such as ASP, ActiveX, ANSI C++/C#, Microsoft .NET, CGI scripts, Java, JavaScript, PERL, PHP, Python, WebObjects, etc.
- the mail server may utilize communication protocols such as internet message access protocol (IMAP), messaging application programming interface (MAPI), Microsoft Exchange, post office protocol (POP), simple mail transfer protocol (SMTP), or the like.
- the computer system 501 may implement a mail client 520 stored program component.
- the mail client may be a mail viewing application, such as Apple Mail, Microsoft Entourage, Microsoft Outlook, Mozilla Thunderbird, etc.
- computer system 501 may store user/application data 521 , such as the data, variables, records, etc. as described in this disclosure.
- databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
- databases may be implemented using standardized data structures, such as an array, hash, linked list, struct, structured text file (e.g., XML), table, or as object-oriented databases (e.g., using ObjectStore, Poet, Zope, etc.).
- Such databases may be consolidated or distributed, sometimes among the various computer systems discussed above in this disclosure. It is to be understood that the structure and operation of the any computer or database component may be combined, consolidated, or distributed in any working combination.
- a computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored.
- a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform operations or stages consistent with the exemplary embodiments described herein.
- the term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, nonvolatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Disc (DVDs), flash drives, disks, and any other known physical storage media.
- the methods and systems may render content via the interactive collaboration session natively thereby improving quality and richness of experience while maximizing the collaboration.
- content may be rendered natively, and the plurality of users may interact with the plurality of applications concurrently; hence, a better and more efficient system for collaboration is implemented.
- only the synchronization of interactions received from the plurality of users is performed, rather than synchronizing complete applications that are hosted on a third-party or cloud server. This reduces the bandwidth requirement, and the synchronization of interactions amongst the plurality of applications is maintained in real time without lag.
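- A back-of-the-envelope comparison illustrates the bandwidth claim above: a single interaction serialized as JSON occupies tens of bytes, whereas a single uncompressed 1080p screen-share frame occupies several megabytes. The event structure and frame parameters below are illustrative assumptions:

```python
import json

# One typing interaction encoded as JSON (fields are assumed for illustration).
event = {"user": "a", "app": "app1", "type": "type", "keys": "h", "t": 1.0}
event_bytes = len(json.dumps(event).encode())

# One uncompressed 24-bit RGB frame of a 1920x1080 screen share.
frame_bytes = 1920 * 1080 * 3

# How many times larger a single raw frame is than a single interaction event.
ratio = frame_bytes / event_bytes
```

- Even with video compression, this gap suggests why exchanging interaction events and rendering natively can use far less bandwidth than broadcasting rendered screen content.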
- the efficacy of the system while enabling collaboration amongst the plurality of users is improved.
- the disclosure enables the plurality of users to interact with the plurality of applications concurrently.
- the technical problems of real-time synchronization and the ability to concurrently interact with plurality of applications and across such plurality of applications is solved by implementing operations for creating, by a collaborative canvas application executing on an electronic device, an interactive collaboration session amongst a plurality of users; receiving, by the electronic device, one or more interactions associated with a plurality of applications from the plurality of users, wherein the plurality of applications are initiated within the interactive collaboration session, and wherein each of the plurality of applications are accessed natively by each of the plurality of users; and synchronizing in real-time, by the electronic device, each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- the disclosure overcomes the aforementioned shortcomings of lag in synchronization by ensuring that each of the plurality of applications are accessed natively by each of the plurality of users when the interactive collaboration session is initiated.
- the method provides for efficient and accurate ways for lag free real-time synchronization of each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- the present disclosure may be realized in hardware, or a combination of hardware and software.
- the present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems.
- a computer system or other apparatus adapted for carrying out the methods described herein may be suited.
- a combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein.
- the present disclosure may be realized in hardware that comprises a portion of an integrated circuit that also performs other functions.
- any of the aforementioned operations and/or system modules may be suitably replaced, reordered, or removed, and additional operations and/or system modules may be inserted, depending on the needs of a particular application.
- the systems of the aforementioned exemplary embodiments may be implemented using a wide variety of suitable processes and system modules, and are not limited to any particular computer hardware, software, middleware, firmware, microcode, and the like.
- the claims can encompass exemplary embodiments for hardware and software, or a combination thereof.
Description
- This application makes reference to, claims priority to, and claims benefit from U.S. Provisional Application Ser. No. 63/028,123, which was filed on May 21, 2020.
- The above referenced application is hereby incorporated herein by reference in its entirety.
- The presently disclosed exemplary embodiments are related, in general, to a web-based collaborative environment shared by a plurality of users. More particularly, the presently disclosed exemplary embodiments are related to a method and a system for synchronizing in real-time each of one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- There may be many situations where different users need to collaborate and work together in a coordinated fashion. As an example, users at different locations may need to collaborate to generate a report or presentation. The generation of the report or presentation may involve use of many different software applications concurrently. For example, a text editor may be used to generate text for the report, while a graphics program may be used to generate graphics and a photo editing program may be used to generate or manipulate images. All of these applications need to work in parallel in a shared workspace; however, the plurality of users cannot access each of the plurality of applications concurrently, and cannot make edits to one or more files associated with such applications concurrently.
- Screen sharing may be an alternative option for users for collaborative working. However, in remote screen-sharing applications, only one user can share and control the edits that may be made to an application, and a plurality of users cannot edit or control content concurrently. Additionally, even if control is given to another user for editing, the content is still accessed via remote sharing and not natively. Thus, users do not access files and applications natively, i.e., applications or files installed and/or stored on the device itself, and synchronization between devices therefore needs to be optimal. One problem frequently encountered in collaborative systems is maintaining synchronization between the shared workspace at each of the client devices. Lack of synchronization may cause conflicts between users. For example, if synchronization is not maintained, a user may attempt to edit a document that has been deleted by another user. Therefore, it is necessary to maintain synchronization in real time or near real time between client devices and the shared workspace.
- A major technical problem in the aforementioned conventional approaches is that files are not accessed natively but via a storage server or cloud server, making real-time synchronization amongst the plurality of users critical. In addition, if there are network issues, the quality and richness of the collaborative experience degrades, leading to a poor collaborative effort. Thus, conventional collaboration software fails to adequately facilitate collaboration between users.
- Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of described systems with some aspects of the present disclosure, as set forth in the remainder of the present application and with reference to the drawings.
- A method and a system for synchronizing in real-time each of one or more interactions received from the plurality of users for enabling collaboration between the plurality of users are provided substantially as shown in, and/or described in connection with, at least one of the figures, as set forth more completely in the claims.
- These and other features and advantages of the present disclosure may be appreciated from a review of the following detailed description of the present disclosure, along with the accompanying figures in which like reference numerals refer to like parts throughout.
- The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, exemplary embodiments, and features described above, further aspects, exemplary embodiments, and features will become apparent by reference to the drawings and the following detailed description. Further, it is to be understood that the following detailed description is exemplary and explanatory only and is not restrictive of the invention, as claimed.
-
FIG. 1 is a block diagram that illustrates a system environment for synchronizing in real-time each of one or more interactions received from the plurality of users for enabling collaboration between the plurality of users, in accordance with at least one exemplary embodiment; -
FIG. 2 is a block diagram that illustrates an electronic device configured for enabling collaboration between users, in accordance with at least one exemplary embodiment; -
FIGS. 3A and 3B are diagrams that collectively illustrate an exemplary scenario of implementation of the collaborative canvas application to enable collaboration between participants in a video conferencing tool, in accordance with at least one exemplary embodiment; -
FIG. 4 is a flowchart that illustrates a method for enabling collaboration between users, in accordance with at least one exemplary embodiment; and -
FIG. 5 is a block diagram of an exemplary computer system for implementing collaboration between users, in accordance with various exemplary embodiments of the present disclosure. - The illustrated embodiments provide a method and a system for enabling collaboration between users. The method may be implemented by a collaborative canvas application executing on an electronic device including one or more processors. The method may include creating an interactive collaboration session amongst a plurality of users. The method may include receiving one or more interactions associated with a plurality of applications from the plurality of users. In an exemplary embodiment, the plurality of applications may be initiated within the interactive collaboration session. In an exemplary embodiment, each of the plurality of applications is accessed natively by each of the plurality of users. In an exemplary embodiment, native access to applications corresponds to a user accessing applications or files installed and/or stored on the electronic device. Further, the method may include synchronizing in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
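The three method steps above (create a session, receive interactions with natively accessed applications, synchronize them to all participants) can be sketched as follows. All class and method names here are illustrative assumptions, not the actual implementation of the disclosure.

```python
# Minimal sketch of the disclosed method. A per-user "replica" list stands in
# for the synchronized view each participant's device maintains.

class InteractiveCollaborationSession:
    def __init__(self):
        self.users = set()
        self.interactions = []   # ordered log of all received interactions
        self.replicas = {}       # user -> that user's synchronized log

    def add_user(self, user):
        """Step 1: create/extend the session amongst a plurality of users."""
        self.users.add(user)
        self.replicas.setdefault(user, [])

    def receive_interaction(self, user, application, event):
        """Step 2: receive an interaction with a natively accessed application."""
        if user not in self.users:
            raise ValueError(f"{user} has not joined the session")
        interaction = {"user": user, "application": application, "event": event}
        self.interactions.append(interaction)
        self._synchronize(interaction)

    def _synchronize(self, interaction):
        """Step 3: propagate the interaction to every participant's replica."""
        for user in self.users:
            self.replicas[user].append(interaction)

session = InteractiveCollaborationSession()
session.add_user("User A")
session.add_user("User B")
session.receive_interaction("User A", "text editor", "typing")
session.receive_interaction("User B", "photo editor", "mouse click")
# Every participant's replica holds the same interactions in the same order.
assert session.replicas["User A"] == session.replicas["User B"]
```

In the disclosure itself the propagation happens over a network between devices; the in-process loop above only illustrates the ordering of the steps.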
-
FIG. 1 is a block diagram that illustrates a system environment 100 for synchronizing in real-time each of one or more interactions received from the plurality of users for enabling collaboration between the plurality of users, in accordance with at least one exemplary embodiment. Referring to FIG. 1, the system environment 100 may include a plurality of electronic devices, such as 102, 104 and 106, associated with a plurality of users, such as User A 102 a, User B 104 a, and User C 106 a, a communication network 108, and a database server 110. Each of the electronic devices 102, 104 and 106 may communicate with each other via the communication network 108. - The plurality of electronic devices, such as
electronic device 102, electronic device 104 and electronic device 106, may each refer to a computing device associated with one of the plurality of users. - The plurality of users, such as User A 102 a, User B 104 a, and User C 106 a may be utilizing the
electronic device 102, the electronic device 104 and the electronic device 106, respectively, as shown in FIG. 1. The plurality of users, such as User A 102 a, User B 104 a, and User C 106 a may interact with the plurality of electronic devices, such as electronic device 102, electronic device 104 and electronic device 106. - In an exemplary embodiment, the
communication network 108 may include a communication medium through which each of the electronic devices, such as 102, 104 and 106, associated with the plurality of users may communicate with each other. Such communication may be performed in accordance with various wired and wireless communication protocols. Examples of such wired and wireless communication protocols include, but are not limited to, Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), ZigBee, EDGE, infrared (IR), IEEE 802.11, 802.16, 2G, 3G, 4G, 5G, 6G cellular communication protocols, and/or Bluetooth (BT) communication protocols. The communication network 108 may include, but is not limited to, the Internet, a cloud network, a Wireless Fidelity (Wi-Fi) network, a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a telephone line (POTS), and/or a Metropolitan Area Network (MAN). - In an exemplary embodiment, the plurality of electronic devices, such as
electronic device 102, electronic device 104 and electronic device 106, may be communicatively coupled to the database server 110. In an exemplary embodiment, the database server 110 may refer to a computing device that may be configured to store files associated with one or more applications installed on the electronic device. In an exemplary embodiment, the plurality of electronic devices, such as electronic device 102, electronic device 104 and electronic device 106, may communicate with the database server 110 using one or more protocols such as, but not limited to, Open Database Connectivity (ODBC) protocol and Java Database Connectivity (JDBC) protocol. In an exemplary embodiment, the database server 110 may include a special purpose operating system specifically configured to perform one or more database operations on one or more files associated with the plurality of applications initiated within the interactive collaboration session. Examples of database operations may include, but are not limited to, Select, Insert, Update, and Delete. In an exemplary embodiment, the database server may include hardware that may be configured to perform one or more predetermined operations. In an exemplary embodiment, the database server 110 may be realized through various technologies such as, but not limited to, Microsoft® SQL Server, Oracle®, IBM DB2®, Microsoft Access®, PostgreSQL®, MySQL® and SQLite®, and the like. - A person having ordinary skill in the art will appreciate that the scope of the disclosure is not limited to realizing the electronic device and the database server as separate entities. In an exemplary embodiment, the database server may be realized as an application program installed on and/or running on the electronic device without departing from the scope of the disclosure.
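The Select, Insert, Update, and Delete operations named above can be illustrated with SQLite, one of the technologies listed. The table schema, column names, and the idea of storing interaction records this way are assumptions made purely for illustration, not details of the disclosed database server.

```python
# Illustrative only: persisting interaction records for a collaboration
# session in an in-memory SQLite database and exercising the four database
# operations named in the text.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE interactions (
           id INTEGER PRIMARY KEY,
           user TEXT NOT NULL,
           application TEXT NOT NULL,
           event TEXT NOT NULL
       )"""
)

# Insert: record an interaction received during the session.
conn.execute(
    "INSERT INTO interactions (user, application, event) VALUES (?, ?, ?)",
    ("User A", "Jira", "mouse click"),
)
# Update: revise a stored record.
conn.execute(
    "UPDATE interactions SET event = ? WHERE user = ?", ("typing", "User A")
)
# Select: read the stored records back.
rows = conn.execute(
    "SELECT user, application, event FROM interactions"
).fetchall()
assert rows == [("User A", "Jira", "typing")]
# Delete: remove records for a user.
conn.execute("DELETE FROM interactions WHERE user = ?", ("User A",))
```

Parameterized queries (the `?` placeholders) are used rather than string formatting, which is the idiomatic way to pass values to SQLite from Python.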
- In operation, the plurality of electronic devices, such as electronic device 102, electronic device 104 and electronic device 106, may need to collaborate with each other. The User A 102 a, the User B 104 a, and the User C 106 a may be utilizing the electronic device 102, the electronic device 104 and the electronic device 106, respectively, as shown in FIG. 1. At least one of the User A 102 a, User B 104 a, and User C 106 a may click on an icon associated with the collaborative canvas application. In an exemplary embodiment, the User A 102 a, the User B 104 a, and the User C 106 a may perform a touch operation on an icon associated with the collaborative canvas application. Once the collaborative canvas application is initiated, each of the electronic device 102, electronic device 104 and electronic device 106 creates an interactive collaboration session amongst the User A 102 a, User B 104 a, and User C 106 a. - After the interactive collaboration session has been created, the plurality of electronic devices, such as
electronic device 102, electronic device 104 and electronic device 106, may receive one or more interactions associated with a plurality of applications from the plurality of users. - For example, after the interactive collaboration session has been created amongst the
electronic device 102, the electronic device 104 and the electronic device 106 of the User A, the User B and the User C, respectively, then the electronic device 102 of User A may initiate a plurality of applications (e.g. Confluence® and Jira®) within the interactive collaboration session. For example, the User A may initiate the Confluence® application and the Jira® application concurrently and may further start making edits to the applications that have been opened within the interactive collaboration session. - In an exemplary embodiment, each of the plurality of applications may be accessed natively by each of the plurality of users. The native access to applications corresponds to a user accessing applications or files installed and/or stored on the electronic device. For example, continuing the same example given above, the User A may initiate the Confluence® application and the Jira® application concurrently and the files associated with these applications may be stored on the
electronic device 102, and the Confluence® application and the Jira® application are also installed on the electronic device 102. Thus, the User A is accessing local content that may be present on the electronic device 102 and may be making edits or changes to the files of the applications that are installed on the electronic device 102. - An example of the plurality of users accessing the plurality of applications natively can be explained as follows. The User A and the User B create an interactive collaboration session. Further, the User A is interacting with the Confluence® application and the Jira® application concurrently, which are installed on the
electronic device 102, and the files associated with these applications may be stored on the electronic device 102. The User B is interacting with the Confluence® application and the Photoshop® application concurrently, which are installed on the electronic device 104, and the files associated with these applications may be stored on the electronic device 104. - Further, the plurality of electronic devices, such as
electronic device 102, electronic device 104 and electronic device 106, may synchronize in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users. - Now, in an exemplary embodiment, the User A 102 a is interacting with the Confluence® application and the Jira® application concurrently, and the files associated with these applications may be stored on the
electronic device 102, and the Confluence® application and the Jira® application are also installed on the electronic device 102. Similarly, the User B 104 a may initiate the Confluence® application and the Photoshop® application concurrently; the files associated with these applications may be stored on the electronic device 104, and the Confluence® application and the Photoshop® application are also installed on the electronic device 104. Now, as the User A 102 a and the User B 104 a have created the interactive collaboration session and the plurality of applications have been initiated in the interactive collaboration session, the User A 102 a and the User B 104 a may interact with the plurality of applications concurrently, i.e., the User A 102 a and the User B 104 a may interact and make changes or edits to any of these applications (the Confluence® application, the Jira® application and the Photoshop® application) concurrently via the interactive collaboration session. - In an exemplary embodiment, the User A 102 a may not necessarily need to open/initiate each and every application within the interactive collaboration session. Only applications initiated within the interactive collaboration session may be utilized for collaboration between the plurality of users, such as User A 102 a, User B 104 a, and User C 106 a. In an exemplary embodiment, the User B 104 a may choose to initiate an application outside the interactive collaboration session so that the interactions performed by the user on such an application are not synchronized in real-time between the plurality of users, such as User A 102 a, User B 104 a, and User C 106 a. In an exemplary embodiment, when an application is initiated, a pop-up dialog box may be utilized to confirm whether the User B 104 a wants to initiate the application within the interactive collaboration session or not.
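The opt-in behaviour described above, where only applications explicitly initiated within the session (for example, by confirming a pop-up dialog or dragging an icon into the session area) have their interactions synchronized, can be sketched as a small registry. The class and method names are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of per-application opt-in to the collaboration session.

class SessionAppRegistry:
    def __init__(self):
        self.shared_apps = set()

    def initiate(self, app, confirmed):
        """Add an app to the session only if the user confirmed sharing
        (e.g. answered the pop-up dialog with yes, or dragged the icon in)."""
        if confirmed:
            self.shared_apps.add(app)

    def should_synchronize(self, app):
        """Interactions are synchronized only for apps opted into the session."""
        return app in self.shared_apps

registry = SessionAppRegistry()
registry.initiate("Confluence", confirmed=True)    # user opted in
registry.initiate("Photoshop", confirmed=False)    # user kept this app private
assert registry.should_synchronize("Confluence")
assert not registry.should_synchronize("Photoshop")
```

Interactions with "Photoshop" above would stay local, matching the case where a user initiates an application outside the interactive collaboration session.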
In another exemplary embodiment, a drag and drop feature may initiate an application within the interactive collaboration session. For example, the graphical user interface of the interactive collaboration session may include a section that displays each of the plurality of applications whose interactions are being synchronized in real-time amongst the plurality of users, such as User A 102 a, User B 104 a, and User C 106 a. If the User C 106 a drags an application icon into such a section, then the respective application may be initiated within the interactive collaboration session.
- In accordance with various embodiments, each user from the plurality of users, such as User A 102 a, User B 104 a, and User C 106 a, may access content that is installed locally or natively, and still make changes to or interact with the content, and the electronic device may then synchronize in real-time each of the interactions received from the plurality of users for enabling collaboration between the plurality of users, such as User A 102 a, User B 104 a, and User C 106 a. In an exemplary embodiment, the collaborative canvas application may be integrated within at least one of video conferencing tools and online meeting tools.
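The monitor-then-synchronize flow described above (a device observes local interactions and forwards only those belonging to applications initiated within the session) can be sketched as follows. The callback-based design and every name here are assumptions for illustration; the disclosure only specifies the monitoring and transmission roles, not this structure.

```python
# Hedged sketch of an interaction monitor: it observes local events, filters
# them to applications opened in the session, and hands them to a transmit
# callback standing in for the transceiver.

class InteractionMonitor:
    def __init__(self, session_apps, transmit):
        self.session_apps = set(session_apps)
        self.transmit = transmit   # e.g. a function that sends to other devices

    def on_event(self, application, event):
        # Forward only interactions with applications initiated in the session;
        # everything else stays private to the local device.
        if application in self.session_apps:
            self.transmit({"application": application, "event": event})

sent = []                                    # stands in for the network link
monitor = InteractionMonitor({"Jira"}, transmit=sent.append)
monitor.on_event("Jira", "scrolling")        # synchronized to other devices
monitor.on_event("Photoshop", "typing")      # private app, not forwarded
assert sent == [{"application": "Jira", "event": "scrolling"}]
```

On a receiving device, the same records would be appended to its local view of the collaborative canvas, giving each participant the identical stream of interactions.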
-
FIG. 2 is a block diagram that illustrates an electronic device, such as 102, configured for enabling collaboration between users, in accordance with at least one exemplary embodiment. - In an exemplary embodiment, the
electronic device 102 may include a processor 202, a memory 204, a transceiver 206, an input/output unit 208, an interaction monitoring unit 210, and a synchronization unit 212. The processor 202 may be communicatively coupled to the memory 204, the transceiver 206, the input/output unit 208, the interaction monitoring unit 210, and the synchronization unit 212. The transceiver 206 may be communicatively coupled to the communication network 108. - The
processor 202 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to execute a set of instructions stored in the memory 204. The processor 202 may be implemented based on several processor technologies known in the art. The processor 202 works in coordination with the transceiver 206, the input/output unit 208, the interaction monitoring unit 210, and the synchronization unit 212 for enabling collaboration between the plurality of users. Examples of the processor 202 include, but are not limited to, an X86-based processor, a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other processors. - The
memory 204 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to store the set of instructions, which are executed by the processor 202. In an exemplary embodiment, the memory 204 may be configured to store one or more programs, routines, or scripts that are executed in coordination with the processor 202. The memory 204 may be implemented based on a Random-Access Memory (RAM), a Read-Only Memory (ROM), a Hard Disk Drive (HDD), a storage server, and/or a Secure Digital (SD) card. - The
transceiver 206 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to receive one or more interactions associated with a plurality of applications from the plurality of users, via the communication network 108. The transceiver 206 may be further configured to transmit and receive content during creation of the interactive collaboration session. The transceiver 206 may implement one or more known technologies to support wired or wireless communication with the communication network 108. In an exemplary embodiment, the transceiver 206 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a Universal Serial Bus (USB) device, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and/or a local buffer. The transceiver 206 may communicate via wireless communication with networks, such as the Internet, an Intranet, and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN). The wireless communication may use any of a plurality of communication standards, protocols and technologies, such as: Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS). - The input/
output unit 208 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to provide one or more inputs that may correspond to one or more interactions to the electronic device for interactive collaboration amongst the plurality of users. The input/output unit 208 comprises various input and output devices that are configured to communicate with the processor 202. Examples of the input devices include, but are not limited to, a keyboard, a mouse, a joystick, a touch screen, a microphone, a camera, and/or a docking station. Examples of the output devices include, but are not limited to, a display screen and/or a speaker. The display screen may be configured to display synchronized real-time interactions received from the plurality of users within the collaborative canvas application for enabling collaboration between the plurality of users. - The
interaction monitoring unit 210 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to monitor one or more interactions performed by a user of the electronic device, such as electronic device 102. In an exemplary embodiment, the monitored one or more interactions may be associated with the plurality of applications that have been initiated within the interactive collaboration session. Further, such monitored one or more interactions may be transmitted to the remaining electronic devices, such as electronic device 104 and electronic device 106, via the transceiver 206. - The
synchronization unit 212 comprises suitable logic, circuitry, interfaces, and/or code that may be configured to synchronize in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users. - In operation, when a plurality of users need to collaborate, any one of the plurality of users may initiate execution of a collaborative canvas application via the
processor 202. In response to initiation of the collaborative canvas application, the processor 202 may be configured to create an interactive collaboration session amongst the plurality of users. The data transfer between the plurality of electronic devices during the interactive collaboration session may be performed by the transceiver 206, via the communication network 108. - After the interactive collaboration session has been created, the
transceiver 206 may be configured to receive one or more interactions associated with a plurality of applications from the plurality of users, via the communication network 108. In an exemplary embodiment, the plurality of applications may be initiated within the interactive collaboration session and each of the plurality of applications may be accessed natively by each of the plurality of users. In an exemplary embodiment, the native access to applications corresponds to a user accessing applications or files installed and/or stored on the electronic device 102. - Further, the
interaction monitoring unit 210 may be configured to monitor one or more interactions performed by a user of the electronic device 102. In an exemplary embodiment, the monitored one or more interactions may be associated with the plurality of applications that have been initiated within the interactive collaboration session. Further, such monitored one or more interactions may be transmitted to the remaining electronic devices, such as electronic device 104 and electronic device 106, via the transceiver 206. In an exemplary embodiment, each of the plurality of users may interact with the plurality of applications concurrently, and the transceiver of each respective electronic device may be configured to receive information regarding the one or more interactions performed by the plurality of users. - In an exemplary embodiment, the one or more interactions may comprise events performed by the plurality of users by using the
electronic device 102 or one or more device peripherals attached to the electronic device 102. In an exemplary embodiment, the events may comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event. - Further, the
synchronization unit 212 may be configured to synchronize in real-time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users. - A person skilled in the art will understand that the scope of the disclosure should not be limited to enabling collaboration between users based on the aforementioned factors and using the aforementioned techniques. Further, the examples provided are for illustrative purposes and should not be construed to limit the scope of the disclosure.
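One way the synchronization unit's role can be pictured is as merging per-device logs of events (mouse clicks, scrolls, typing, hovers) into one ordering that every replica applies identically. The timestamp-based ordering rule below is an assumption chosen for illustration; the disclosure only requires real-time synchronization and does not specify this scheme.

```python
# Hypothetical sketch: deterministic merge of interaction logs from several
# devices. Sorting by (timestamp, user) makes tie-breaking identical on every
# device, so all replicas end up with the same timeline.

def synchronize(*device_logs):
    """Merge per-device interaction logs into one deterministic order."""
    merged = [event for log in device_logs for event in log]
    merged.sort(key=lambda event: (event["ts"], event["user"]))
    return merged

log_a = [{"ts": 1, "user": "A", "event": "typing"},
         {"ts": 3, "user": "A", "event": "mouse click"}]
log_b = [{"ts": 2, "user": "B", "event": "scrolling"}]

timeline = synchronize(log_a, log_b)
assert [event["ts"] for event in timeline] == [1, 2, 3]
```

Real systems typically use logical clocks rather than wall-clock timestamps for this, since device clocks drift; the point here is only that every device must apply the same interactions in the same order.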
-
FIGS. 3A and 3B are diagrams that collectively illustrate an exemplary scenario of implementation of the collaborative canvas application to enable collaboration between participants in a video conferencing tool, in accordance with at least one exemplary embodiment. FIGS. 3A and 3B have been explained in conjunction with the elements of FIG. 1. - In an exemplary scenario, two participants, for example, participant A and participant B, may utilize
electronic device 102 and electronic device 104, respectively, to collaborate utilizing a video conferencing tool. The collaborative canvas application is integrated within the video conferencing tool by the processor 202. In an exemplary embodiment, the integration of the collaborative canvas application within the video conferencing tool may be performed by state of the art application integration tools, such as, but not limited to, Celigo®, Cloud Elements®, IBM Integration Designer®, and the like. Such integration tools ensure that the electronic device provides the user with the capability to integrate the collaborative canvas application within the video conferencing tool. In an exemplary embodiment, the collaborative canvas application and the video conferencing tool may continue to work independently of each other. -
FIG. 3A shows an exemplary graphical user interface 302 of the video conferencing tool that is displayed to participant A and participant B on electronic device 102 and electronic device 104, respectively, when the interactive collaboration session amongst the participant A and participant B is created. - As shown in
FIG. 3A, the graphical user interface illustrates a collaborative canvas application icon 306, a collaborative canvas area 308, an application list section 310, a participant list section 312, a chat window 314, and a meeting snippets section 316. The collaborative canvas application icon 306 may be utilized to initiate execution of the collaborative canvas application that is integrated with the electronic device. For example, the participant A may click on the collaborative canvas application icon 306 to create the interactive collaboration session amongst the participant A and participant B. - The
collaborative canvas area 308 may be the area where the plurality of users, i.e., participant A and participant B, are able to collaboratively work and interact with a plurality of applications concurrently. The collaborative canvas area 308 may act as an interactive whiteboarding solution that may help participants in a video conference/audio-visual meeting to share content with each other and concurrently work across applications. In an exemplary embodiment, interactions such as key typing, movement of mouse cursors, and the like of each of the plurality of participants may be viewed by each other via the collaborative canvas area 308. - The
application list section 310 may display a plurality of applications that may be installed natively on each of the respective electronic devices. In an exemplary embodiment, the participant may choose to initiate one or more applications displayed in the application list section 310 within the interactive collaboration session that may be created when the participant clicks on the collaborative canvas application icon 306. In an exemplary embodiment, the participant may choose not to initiate one or more applications displayed in the application list section 310 within the interactive collaboration session. When an application is opened within the collaborative canvas area 308, the participant A and the participant B may concurrently interact with the files associated with the application. - The
participant list section 312 displays the details of participants that are currently working collaboratively within the video conferencing tool. In an exemplary embodiment, metadata about each of the participants may be displayed in the participant list section 312. In an exemplary embodiment, such metadata may comprise an image of the participant, a name of the participant, an organization name of the participant, a designation of the participant, and the like. - Further, the
chat window 314 may enable the plurality of participants that have joined the video conferencing tool to have conversations with each other and discuss the meeting agenda and any other relevant information. - Further, each of the electronic devices of the participant A and the participant B who have joined the video conference via the video conferencing tool may generate a plurality of meeting snippets based on the identification of a trigger event, and such meeting snippets may be shared within the
meeting snippets section 316. Each of the plurality of participants may provide feedback about the plurality of meeting snippets. In an exemplary embodiment, identifying a trigger event in real-time may be initiated by at least one participant of the video conference. In an exemplary embodiment, the trigger event may be indicative of at least a reference to meeting metadata associated with the meeting. In an exemplary embodiment, the video conference may be recorded for a predetermined duration to generate the plurality of meeting snippets. In an exemplary embodiment, the meeting metadata associated with the meeting or video conference may comprise an agenda of the meeting, one or more topics to be discussed during the meeting, a time duration of the meeting, a schedule of the meeting, meeting notes carried forward from previous meetings, and/or the like. - Based on the participant A clicking on the collaborative
canvas application icon 306 and the collaborative canvas area 308 being displayed on both electronic device 102 and electronic device 104, an interactive collaboration session may be established among the participant A and the participant B. Further, the participant A may initiate the application 1 and the application 2 in the interactive collaboration session. The application 1 and the application 2 are installed natively or locally on the electronic device 102 of the participant A. In an exemplary embodiment, native access to applications corresponds to a user accessing applications or files installed and/or stored on the electronic device. FIG. 3A illustrates the collaborative canvas area 308 displaying the application 1 and the application 2 that have been initiated by participant A. - After the
application 1 and the application 2 are initiated in the interactive collaboration session, the participant A may perform one or more interactions associated with files of the application 1 and the application 2 concurrently. In an exemplary embodiment, the one or more interactions comprise events performed by the participant A by using the electronic device 102 or one or more device peripherals attached to the electronic device 102. In an exemplary embodiment, the events comprise at least one of a mouse click event, a scrolling event, a typing event, or a hover event. - The
interaction monitoring unit 210 may be configured to monitor such interactions, and such interactions are synchronized in real-time by the synchronization unit 212 for display on the collaborative canvas area 308 open on the electronic device 104 of the Participant B. FIG. 3A illustrates the collaborative canvas area 308 displaying the application 1 and the application 2 that have been initiated by the participant A, and any changes/edits/interactions which are made by the participant A to any of the files associated with the application 1 and the application 2 may be viewed by the Participant B in real-time within the collaborative canvas area 308. - Similarly, the Participant B may initiate the
application 3 natively within the collaborative canvas area 308 of the electronic device 104 and start performing one or more interactions on one or more files associated with the application 3. For example, the participant B may open a Microsoft Word® document that is stored locally on the electronic device 104, but open it within the interactive collaboration session. Further, the participant B may interact with the Microsoft Word® document. For example, the participant B may perform a typing event (such as writing a sample software code) in the Microsoft Word® document. - Once the participant B opens the Microsoft Word® document in the
collaborative canvas area 308 of electronic device 104 within the interactive collaboration session, then the collaborative canvas area 308 initiated on the participant A's electronic device 102 may also be updated to display the files of the application 3 that is being used for collaboration, i.e., the Microsoft Word® document. FIG. 3B shows the updated collaborative canvas area 308 that may be depicted on both the electronic device 102 and the electronic device 104 for each of the respective Participant A and Participant B. - Further, interactions performed by the Participant B that are associated with the
application 3 may also be viewed in real-time by the Participant A. Additionally, Participant A may concurrently perform interactions on the application 1 and the application 2 that have been opened natively from the electronic device 102 and also perform interactions on the application 3 that has been initiated via the electronic device 104. Similarly, the participant B may concurrently interact with the application 3 that has been initiated natively from the electronic device 104 and also interact with the application 1 and the application 2 via the collaborative canvas application opened in the electronic device 104. Such collaboration may be possible via the collaborative canvas application. When the Participant A is performing interactions associated with the application 1 and the application 2, the Participant B will be able to view the changes in real-time because of the real-time synchronization. - Based on the description above, the Participant A and the Participant B may concurrently interact with the
application 1, the application 2, and the application 3 and may also view the interactions performed by each of the participants in real time because of the real-time synchronization. - By integrating the collaborative canvas application into video conferencing tools, a collaborative experience may be provided that is defined by a framework monitoring interactions at each electronic device in real time around web-based content. This provides a way for participants of an interactive collaboration session to interact with web-based content that is rendered natively, with the ability to collaborate over it along with the other participants. Such a framework and experience make the traditional screen-share experience, which depends largely on a one-to-many broadcasting of content, obsolete.
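The synchronization model described above can be sketched in Python. This is a minimal illustration, not the disclosed implementation: the names Interaction, CanvasReplica, and CollaborationSession are hypothetical, and real-time transport (e.g., WebSockets) is abstracted away as an in-process broadcast.

```python
from dataclasses import dataclass, field

@dataclass
class Interaction:
    """One user interaction on a natively opened application."""
    participant: str  # who performed the interaction
    app: str          # e.g. "application 3" (a word processor)
    event: str        # e.g. "type:sample code"

@dataclass
class CanvasReplica:
    """One device's view of the collaborative canvas area."""
    open_apps: set = field(default_factory=set)
    log: list = field(default_factory=list)

class CollaborationSession:
    """Relays every interaction to every participant's replica."""
    def __init__(self):
        self.replicas = {}

    def join(self, participant):
        self.replicas[participant] = CanvasReplica()

    def open_app(self, participant, app):
        # An app opened natively on one device is surfaced on all devices.
        for replica in self.replicas.values():
            replica.open_apps.add(app)

    def interact(self, interaction):
        # Only the interaction is relayed, never the application itself.
        for replica in self.replicas.values():
            replica.log.append(interaction)

session = CollaborationSession()
session.join("A")
session.join("B")
session.open_app("B", "application 3")  # B opens a document natively
session.interact(Interaction("B", "application 3", "type:sample code"))
```

With this model, A's replica immediately reflects both the newly opened application and B's typing event, mirroring how the collaborative canvas area 308 is updated on every participant's device.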
-
FIG. 4 is a flowchart that illustrates a method 400 for enabling collaboration between users, in accordance with at least one exemplary embodiment. The method starts at 402 and proceeds to 404. - At 404, an electronic device may be configured for executing a collaborative canvas application to create an interactive collaboration session amongst a plurality of users. At
step 406, the electronic device may be configured for receiving one or more interactions associated with a plurality of applications from a plurality of users, wherein each of the plurality of applications is accessed natively by each of the plurality of users. At step 408, the electronic device may be configured for synchronizing in real time each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users. Control passes to end operation 410. -
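The flow of method 400 can be condensed into a short sketch; the function name synchronize and the event tuples are illustrative assumptions, and network delivery is omitted.

```python
def synchronize(interactions, users):
    """Apply every received interaction, in arrival order, to each user's
    local state so that all replicas converge (steps 404-408 of method 400)."""
    states = {u: [] for u in users}   # 404: session created for all users
    for event in interactions:        # 406: interactions received from users
        for u in users:               # 408: each interaction synchronized
            states[u].append(event)
    return states                     # 410: end

states = synchronize(
    [("A", "application 1", "edit"), ("B", "application 3", "type")],
    users=["A", "B"],
)
```

After the loop, states["A"] equals states["B"]: both participants observe the same ordered sequence of interactions, which is what enables lag-free collaboration.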
FIG. 5 is a block diagram of an exemplary computer system for implementing collaboration between users, in accordance with various exemplary embodiments of the present disclosure. The exemplary computer system 501 may comprise a central processing unit (“CPU” or “processor”) 502, an I/O interface 503, an input device 504, an output device 505, a transceiver 506, a network interface 507, a communication network 508, devices, such as 509, 510 and 511, a storage interface 512, and one or more memory devices, such as RAM 513, ROM 514, and memory device 515. - Variations of
computer system 501 may be used for implementing collaboration between users. The computer system 501 may comprise a central processing unit (“CPU” or “processor”) 502. Processor 502 may comprise at least one data processor for executing program components for executing user- or system-generated requests. A user may include a person, a person using a device such as those included in this disclosure, or such a device itself. The processor 502 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc. The processor 502 may include a microprocessor, such as AMD Athlon, Duron or Opteron, ARM's application, embedded or secure processors, IBM PowerPC, Intel's Core, Itanium, Xeon, Celeron or other line of processors, etc. The processor 502 may be implemented using mainframe, distributed processor, multi-core, parallel, grid, or other architectures. Some exemplary embodiments may utilize embedded technologies like application-specific integrated circuits (ASICs), digital signal processors (DSPs), Field Programmable Gate Arrays (FPGAs), etc. -
Processor 502 may be disposed in communication with one or more input/output (I/O) devices via I/O interface 503. The I/O interface 503 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc. - Using the I/
O interface 503, the computer system 501 may communicate with one or more I/O devices. For example, the input device 504 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, sensor (e.g., accelerometer, light sensor, GPS, gyroscope, proximity sensor, or the like), stylus, scanner, storage device, transceiver, video device/source, visors, etc. Output device 505 may be a printer, fax machine, video display (e.g., cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED), plasma, or the like), audio speaker, etc. In some exemplary embodiments, a transceiver 506 may be disposed in connection with the processor 502. The transceiver may facilitate various types of wireless transmission or reception. For example, the transceiver may include an antenna operatively connected to a transceiver chip (e.g., Texas Instruments WiLink WL1283, Broadcom BCM4750IUB8, Infineon Technologies X-Gold 618-PMB9800, or the like), providing IEEE 802.11a/b/g/n, Bluetooth, FM, global positioning system (GPS), 2G/3G HSDPA/HSUPA communications, etc. - In some exemplary embodiments, the
processor 502 may be disposed in communication with a communication network 508 via a network interface 507. The network interface 507 may communicate with the communication network 508. The network interface may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The communication network 508 may include, without limitation, a direct interconnection, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, etc. Using the network interface 507 and the communication network 508, the computer system 501 may communicate with the devices 509, 510, and 511. - In some exemplary embodiments, the
processor 502 may be disposed in communication with one or more memory devices (e.g., RAM 513, ROM 514, etc.) via a storage interface 512. The storage interface may connect to memory devices including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as serial advanced technology attachment (SATA), integrated drive electronics (IDE), IEEE-1394, universal serial bus (USB), fiber channel, small computer systems interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, redundant array of independent discs (RAID), solid-state memory devices, solid-state drives, etc. - The memory devices may store a collection of program or database components, including, without limitation, an operating system 516, user interface application 517,
web browser 518, mail server 519, mail client 520, user/application data 521 (e.g., any data variables or data records discussed in this disclosure), etc. The operating system 516 may facilitate resource management and operation of the computer system 501. Examples of operating systems include, without limitation, Apple Macintosh OS X, UNIX, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD, NetBSD, OpenBSD, etc.), Linux distributions (e.g., Red Hat, Ubuntu, Kubuntu, etc.), IBM OS/2, Microsoft Windows (XP, Vista/7/8, etc.), Apple iOS, Google Android, Blackberry OS, or the like. User interface 517 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, user interfaces may provide computer interaction interface elements on a display system operatively connected to the computer system 501, such as cursors, icons, check boxes, menus, scrollers, windows, widgets, etc. Graphical user interfaces (GUIs) may be employed, including, without limitation, Apple Macintosh operating systems' Aqua, IBM OS/2, Microsoft Windows (e.g., Aero, Metro, etc.), Unix X-Windows, web interface libraries (e.g., ActiveX, Java, JavaScript, AJAX, HTML, Adobe Flash, etc.), or the like. - In some exemplary embodiments, the
computer system 501 may implement a web browser 518 stored program component. The web browser may be a hypertext viewing application, such as Microsoft Internet Explorer, Google Chrome, Mozilla Firefox, Apple Safari, etc. Secure web browsing may be provided using HTTPS (secure hypertext transport protocol), secure sockets layer (SSL), Transport Layer Security (TLS), etc. Web browsers may utilize facilities such as AJAX, DHTML, Adobe Flash, JavaScript, Java, application programming interfaces (APIs), etc. In some exemplary embodiments, the computer system 501 may implement a mail server 519 stored program component. The mail server may be an Internet mail server such as Microsoft Exchange, or the like. The mail server may utilize facilities such as ASP, ActiveX, ANSI C++/C#, Microsoft .NET, CGI scripts, Java, JavaScript, PERL, PHP, Python, WebObjects, etc. The mail server may utilize communication protocols such as internet message access protocol (IMAP), messaging application programming interface (MAPI), Microsoft Exchange, post office protocol (POP), simple mail transfer protocol (SMTP), or the like. In some exemplary embodiments, the computer system 501 may implement a mail client 520 stored program component. The mail client may be a mail viewing application, such as Apple Mail, Microsoft Entourage, Microsoft Outlook, Mozilla Thunderbird, etc. -
computer system 501 may store user/application data 521, such as the data, variables, records, etc. as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase. Alternatively, such databases may be implemented using standardized data structures, such as an array, hash, linked list, struct, structured text file (e.g., XML), table, or as object-oriented databases (e.g., using ObjectStore, Poet, Zope, etc.). Such databases may be consolidated or distributed, sometimes among the various computer systems discussed above in this disclosure. It is to be understood that the structure and operation of any computer or database component may be combined, consolidated, or distributed in any working combination. - Furthermore, one or more computer-readable storage media may be utilized in implementing exemplary embodiments consistent with the present invention. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform operations or stages consistent with the exemplary embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, nonvolatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Discs (DVDs), flash drives, disks, and any other known physical storage media.
- Various exemplary embodiments of the disclosure encompass numerous advantages, including methods and systems for enabling collaboration between users. In an exemplary embodiment, the methods and systems may render content natively via the interactive collaboration session, thereby improving the quality and richness of the experience while maximizing collaboration. Secondly, because content may be rendered natively and a plurality of users may interact with a plurality of applications concurrently, a better and more efficient system for collaboration is implemented. Additionally, only the synchronization of interactions received from the plurality of users is performed, rather than synchronizing complete applications that are hosted on a third-party or cloud server. This reduces the bandwidth requirement, and the synchronization of interactions amongst the plurality of applications is maintained in real time without lag. Thus, the efficacy of the system while enabling collaboration amongst a plurality of users is improved.
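The bandwidth advantage can be made concrete with a toy comparison, assuming hypothetical JSON payloads: an interaction-level delta is a small fraction of the full application state that a screen-share-style update would carry.

```python
import json

# Full state of a hypothetical 1000-line document hosted by the application.
document = {"lines": ["line %d" % i for i in range(1000)]}

# A screen-share-style update would re-send the entire state on every change:
full_update = json.dumps(document)

# Interaction-level synchronization sends only what the user actually did:
delta = json.dumps({"op": "insert", "line": 1000, "text": "new line"})

def apply_delta(doc, payload):
    """Replay a received interaction against a local replica of the document."""
    d = json.loads(payload)
    if d["op"] == "insert":
        doc["lines"].insert(d["line"], d["text"])
    return doc

# Another participant's replica converges by replaying the tiny delta.
replica = apply_delta({"lines": ["line %d" % i for i in range(1000)]}, delta)
```

The replica ends up identical to the edited document, yet the delta payload is orders of magnitude smaller than the full-state payload, which is the basis of the reduced bandwidth requirement claimed above.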
- In contrast to the conventional approach of screen sharing, the disclosure enables a plurality of users to interact with a plurality of applications concurrently. Thus, the technical problems of real-time synchronization and of concurrent interaction with, and across, a plurality of applications are solved by implementing operations for: creating, by a collaborative canvas application executing on an electronic device, an interactive collaboration session amongst a plurality of users; receiving, by the electronic device, one or more interactions associated with a plurality of applications from the plurality of users, wherein the plurality of applications are initiated within the interactive collaboration session, and wherein each of the plurality of applications is accessed natively by each of the plurality of users; and synchronizing in real time, by the electronic device, each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- The disclosure overcomes the aforementioned shortcomings of lag in synchronization by ensuring that each of the plurality of applications is accessed natively by each of the plurality of users when the interactive collaboration session is initiated. As a result, the method provides efficient and accurate ways for lag-free, real-time synchronization of each of the one or more interactions received from the plurality of users for enabling collaboration between the plurality of users.
- Thus, the claimed operations as discussed above are not routine, conventional, or well understood in the art, as the claimed operations enable solutions to the existing problems in conventional technologies.
- The present disclosure may be realized in hardware, or a combination of hardware and software. The present disclosure may be realized in a centralized fashion, in at least one computer system, or in a distributed fashion, where different elements may be spread across several interconnected computer systems. A computer system or other apparatus adapted for carrying out the methods described herein may be suited. A combination of hardware and software may be a general-purpose computer system with a computer program that, when loaded and executed, may control the computer system such that it carries out the methods described herein. The present disclosure may be realized in hardware that comprises a portion of an integrated circuit that also performs other functions.
- A person with ordinary skill in the art will appreciate that the systems, modules, and sub-modules have been illustrated and explained to serve as examples and should not be considered limiting in any manner. It will be further appreciated that the variants of the above-disclosed system elements, modules, and other features and functions, or alternatives thereof, may be combined to create other different systems or applications.
- Those skilled in the art will appreciate that any of the aforementioned operations and/or system modules may be suitably replaced, reordered, or removed, and additional operations and/or system modules may be inserted, depending on the needs of a particular application. In addition, the systems of the aforementioned exemplary embodiments may be implemented using a wide variety of suitable processes and system modules, and are not limited to any particular computer hardware, software, middleware, firmware, microcode, and the like. The claims can encompass exemplary embodiments for hardware and software, or a combination thereof.
- While the present disclosure has been described with reference to certain exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present disclosure. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present disclosure without departing from its scope. Therefore, it is intended that the present disclosure not be limited to the particular exemplary embodiment disclosed, but that the present disclosure will include all exemplary embodiments falling within the scope of the appended claims.
Claims (11)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/308,916 US20210367986A1 (en) | 2020-05-21 | 2021-05-05 | Enabling Collaboration Between Users |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063028123P | 2020-05-21 | 2020-05-21 | |
US17/308,916 US20210367986A1 (en) | 2020-05-21 | 2021-05-05 | Enabling Collaboration Between Users |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210367986A1 true US20210367986A1 (en) | 2021-11-25 |
Family
ID=78607913
Family Applications (8)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/308,887 Abandoned US20210367984A1 (en) | 2020-05-21 | 2021-05-05 | Meeting experience management |
US17/308,623 Active US11488116B2 (en) | 2020-05-21 | 2021-05-05 | Dynamically generated news feed |
US17/308,916 Abandoned US20210367986A1 (en) | 2020-05-21 | 2021-05-05 | Enabling Collaboration Between Users |
US17/308,264 Active US11537998B2 (en) | 2020-05-21 | 2021-05-05 | Capturing meeting snippets |
US17/308,329 Active US11416831B2 (en) | 2020-05-21 | 2021-05-05 | Dynamic video layout in video conference meeting |
US17/308,586 Abandoned US20210365893A1 (en) | 2020-05-21 | 2021-05-05 | Recommendation unit for generating meeting recommendations |
US17/308,640 Abandoned US20210367802A1 (en) | 2020-05-21 | 2021-05-05 | Meeting summary generation |
US17/308,772 Abandoned US20210365896A1 (en) | 2020-05-21 | 2021-05-05 | Machine learning (ml) model for participants |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/308,887 Abandoned US20210367984A1 (en) | 2020-05-21 | 2021-05-05 | Meeting experience management |
US17/308,623 Active US11488116B2 (en) | 2020-05-21 | 2021-05-05 | Dynamically generated news feed |
Family Applications After (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/308,264 Active US11537998B2 (en) | 2020-05-21 | 2021-05-05 | Capturing meeting snippets |
US17/308,329 Active US11416831B2 (en) | 2020-05-21 | 2021-05-05 | Dynamic video layout in video conference meeting |
US17/308,586 Abandoned US20210365893A1 (en) | 2020-05-21 | 2021-05-05 | Recommendation unit for generating meeting recommendations |
US17/308,640 Abandoned US20210367802A1 (en) | 2020-05-21 | 2021-05-05 | Meeting summary generation |
US17/308,772 Abandoned US20210365896A1 (en) | 2020-05-21 | 2021-05-05 | Machine learning (ml) model for participants |
Country Status (1)
Country | Link |
---|---|
US (8) | US20210367984A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230300204A1 (en) * | 2022-03-18 | 2023-09-21 | Zoom Video Communications, Inc. | App pinning in video conferences |
WO2023177756A1 (en) * | 2022-03-16 | 2023-09-21 | Figma, Inc. | Collaborative widget state synchronization |
US12411697B2 (en) | 2022-08-15 | 2025-09-09 | Figma, Inc. | Plugin management system for an interactive system or platform |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12400661B2 (en) | 2017-07-09 | 2025-08-26 | Otter.ai, Inc. | Systems and methods for capturing, processing, and rendering one or more context-aware moment-associating elements |
US10263799B1 (en) | 2018-08-29 | 2019-04-16 | Capital One Services, Llc | Managing meeting data |
US11423911B1 (en) | 2018-10-17 | 2022-08-23 | Otter.ai, Inc. | Systems and methods for live broadcasting of context-aware transcription and/or other elements related to conversations and/or speeches |
US11765213B2 (en) * | 2019-06-11 | 2023-09-19 | Nextiva, Inc. | Mixing and transmitting multiplex audiovisual information |
US11595447B2 (en) | 2020-08-05 | 2023-02-28 | Toucan Events Inc. | Alteration of event user interfaces of an online conferencing service |
US12023116B2 (en) * | 2020-12-21 | 2024-07-02 | Cilag Gmbh International | Dynamic trocar positioning for robotic surgical system |
US11676623B1 (en) | 2021-02-26 | 2023-06-13 | Otter.ai, Inc. | Systems and methods for automatic joining as a virtual meeting participant for transcription |
US12175968B1 (en) * | 2021-03-26 | 2024-12-24 | Amazon Technologies, Inc. | Skill selection for responding to natural language inputs |
US11937016B2 (en) * | 2021-05-26 | 2024-03-19 | International Business Machines Corporation | System and method for real-time, event-driven video conference analytics |
US11894938B2 (en) | 2021-06-21 | 2024-02-06 | Toucan Events Inc. | Executing scripting for events of an online conferencing service |
US11916687B2 (en) | 2021-07-28 | 2024-02-27 | Zoom Video Communications, Inc. | Topic relevance detection using automated speech recognition |
US11330229B1 (en) * | 2021-09-28 | 2022-05-10 | Atlassian Pty Ltd. | Apparatuses, computer-implemented methods, and computer program products for generating a collaborative contextual summary interface in association with an audio-video conferencing interface service |
US20230098137A1 (en) * | 2021-09-30 | 2023-03-30 | C/o Uniphore Technologies Inc. | Method and apparatus for redacting sensitive information from audio |
US11985180B2 (en) * | 2021-11-16 | 2024-05-14 | Microsoft Technology Licensing, Llc | Meeting-video management engine for a meeting-video management system |
US11722536B2 (en) | 2021-12-27 | 2023-08-08 | Atlassian Pty Ltd. | Apparatuses, computer-implemented methods, and computer program products for managing a shared dynamic collaborative presentation progression interface in association with an audio-video conferencing interface service |
WO2023158330A1 (en) * | 2022-02-16 | 2023-08-24 | Ringcentral, Inc., | System and method for rearranging conference recordings |
JP7459890B2 (en) * | 2022-03-23 | 2024-04-02 | セイコーエプソン株式会社 | Display methods, display systems and programs |
US12182502B1 (en) | 2022-03-28 | 2024-12-31 | Otter.ai, Inc. | Systems and methods for automatically generating conversation outlines and annotation summaries |
US20230401497A1 (en) * | 2022-06-09 | 2023-12-14 | Vmware, Inc. | Event recommendations using machine learning |
CN117459673A (en) * | 2022-07-19 | 2024-01-26 | 奥图码股份有限公司 | Electronic device and method for video conferencing |
US12095580B2 (en) * | 2022-10-31 | 2024-09-17 | Docusign, Inc. | Conferencing platform integration with agenda generation |
US11838139B1 (en) | 2022-10-31 | 2023-12-05 | Docusign, Inc. | Conferencing platform integration with assent tracking |
US12373644B2 (en) * | 2022-12-13 | 2025-07-29 | Calabrio, Inc. | Evaluating transcripts through repetitive statement analysis |
US20240395254A1 (en) * | 2023-05-24 | 2024-11-28 | Otter.ai, Inc. | Systems and methods for live summarization |
US20250119507A1 (en) * | 2023-10-09 | 2025-04-10 | Dell Products, L.P. | Handling conference room boundaries and/or context |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130144950A1 (en) * | 2011-12-06 | 2013-06-06 | Manikandan Sanakaranarasimhan | Seamless collaboration and communication |
US20150358810A1 (en) * | 2014-06-10 | 2015-12-10 | Qualcomm Incorporated | Software Configurations for Mobile Devices in a Collaborative Environment |
US20160044088A1 (en) * | 2011-03-03 | 2016-02-11 | Citrix Systems, Inc. | Reverse Seamless Integration Between Local and Remote Computing Environments |
US20170024100A1 (en) * | 2015-07-24 | 2017-01-26 | Coscreen, Inc. | Frictionless Interface for Virtual Collaboration, Communication and Cloud Computing |
US20170192656A1 (en) * | 2015-12-30 | 2017-07-06 | Dropbox, Inc. | Native Application Collaboration |
US20190068661A1 (en) * | 2017-08-24 | 2019-02-28 | Re Mago Holding Ltd | Method, apparatus, and computer-readable medium for desktop sharing over a web socket connection in a networked collaboration workspace |
US20190155471A1 (en) * | 2015-03-02 | 2019-05-23 | Dropbox, Inc. | Native Application Collaboration |
US20200134002A1 (en) * | 2018-10-26 | 2020-04-30 | Salesforce.Com, Inc. | Rich text box for live applications in a cloud collaboration platform |
US20210352120A1 (en) * | 2020-05-07 | 2021-11-11 | Re Mago Holding Ltd | Method, apparatus, and computer readable medium for virtual conferencing with embedded collaboration tools |
Family Cites Families (82)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10742433B2 (en) * | 2003-06-16 | 2020-08-11 | Meetup, Inc. | Web-based interactive meeting facility, such as for progressive announcements |
US6963352B2 (en) | 2003-06-30 | 2005-11-08 | Nortel Networks Limited | Apparatus, method, and computer program for supporting video conferencing in a communication system |
US7634540B2 (en) | 2006-10-12 | 2009-12-15 | Seiko Epson Corporation | Presenter view control system and method |
US8180029B2 (en) | 2007-06-28 | 2012-05-15 | Voxer Ip Llc | Telecommunication and multimedia management method and apparatus |
US8073922B2 (en) * | 2007-07-27 | 2011-12-06 | Twinstrata, Inc | System and method for remote asynchronous data replication |
US20090210933A1 (en) * | 2008-02-15 | 2009-08-20 | Shear Jeffrey A | System and Method for Online Content Production |
US9824333B2 (en) * | 2008-02-29 | 2017-11-21 | Microsoft Technology Licensing, Llc | Collaborative management of activities occurring during the lifecycle of a meeting |
US7912901B2 (en) * | 2008-08-12 | 2011-03-22 | International Business Machines Corporation | Automating application state of a set of computing devices responsive to scheduled events based on historical data |
US8214748B2 (en) | 2009-09-22 | 2012-07-03 | International Business Machines Corporation | Meeting agenda management |
US9615056B2 (en) | 2010-02-10 | 2017-04-04 | Oovoo, Llc | System and method for video communication on mobile devices |
US9264659B2 (en) | 2010-04-07 | 2016-02-16 | Apple Inc. | Video conference network management for a mobile device |
US8514263B2 (en) * | 2010-05-12 | 2013-08-20 | Blue Jeans Network, Inc. | Systems and methods for scalable distributed global infrastructure for real-time multimedia communication |
US20130191299A1 (en) | 2010-10-28 | 2013-07-25 | Talentcircles, Inc. | Methods and apparatus for a social recruiting network |
US20120144320A1 (en) | 2010-12-03 | 2012-06-07 | Avaya Inc. | System and method for enhancing video conference breaks |
US20120192080A1 (en) * | 2011-01-21 | 2012-07-26 | Google Inc. | Tailoring content based on available bandwidth |
US9113032B1 (en) | 2011-05-31 | 2015-08-18 | Google Inc. | Selecting participants in a video conference |
US8941708B2 (en) | 2011-07-29 | 2015-01-27 | Cisco Technology, Inc. | Method, computer-readable storage medium, and apparatus for modifying the layout used by a video composing unit to generate a composite video signal |
US20130282820A1 (en) | 2012-04-23 | 2013-10-24 | Onmobile Global Limited | Method and System for an Optimized Multimedia Communications System |
US8914452B2 (en) * | 2012-05-31 | 2014-12-16 | International Business Machines Corporation | Automatically generating a personalized digest of meetings |
US9141504B2 (en) | 2012-06-28 | 2015-09-22 | Apple Inc. | Presenting status data received from multiple devices |
US9953304B2 (en) | 2012-12-30 | 2018-04-24 | Buzd, Llc | Situational and global context aware calendar, communications, and relationship management |
US10075676B2 (en) | 2013-06-26 | 2018-09-11 | Touchcast LLC | Intelligent virtual assistant system and method |
US9723075B2 (en) * | 2013-09-13 | 2017-08-01 | Incontact, Inc. | Systems and methods for data synchronization management between call centers and CRM systems |
US10484189B2 (en) * | 2013-11-13 | 2019-11-19 | Microsoft Technology Licensing, Llc | Enhanced collaboration services |
US9400833B2 (en) | 2013-11-15 | 2016-07-26 | Citrix Systems, Inc. | Generating electronic summaries of online meetings |
US10990620B2 (en) * | 2014-07-14 | 2021-04-27 | Verizon Media Inc. | Aiding composition of themed articles about popular and novel topics and offering users a navigable experience of associated content |
US20160117624A1 (en) | 2014-10-23 | 2016-04-28 | International Business Machines Incorporated | Intelligent meeting enhancement system |
US9939983B2 (en) * | 2014-12-17 | 2018-04-10 | Fuji Xerox Co., Ltd. | Systems and methods for plan-based hypervideo playback |
US20160307165A1 (en) | 2015-04-20 | 2016-10-20 | Cisco Technology, Inc. | Authorizing Participant Access To A Meeting Resource |
US20160350720A1 (en) | 2015-05-29 | 2016-12-01 | Citrix Systems, Inc. | Recommending meeting times based on previous meeting acceptance history |
US10255946B1 (en) | 2015-06-25 | 2019-04-09 | Amazon Technologies, Inc. | Generating tags during video upload |
US20190332994A1 (en) * | 2015-10-03 | 2019-10-31 | WeWork Companies Inc. | Generating insights about meetings in an organization |
US10572961B2 (en) * | 2016-03-15 | 2020-02-25 | Global Tel*Link Corporation | Detection and prevention of inmate to inmate message relay |
US20170308866A1 (en) * | 2016-04-22 | 2017-10-26 | Microsoft Technology Licensing, Llc | Meeting Scheduling Resource Efficiency |
US20180046957A1 (en) * | 2016-08-09 | 2018-02-15 | Microsoft Technology Licensing, Llc | Online Meetings Optimization |
US20180077092A1 (en) | 2016-09-09 | 2018-03-15 | Tariq JALIL | Method and system for facilitating user collaboration |
US10572858B2 (en) * | 2016-10-11 | 2020-02-25 | Ricoh Company, Ltd. | Managing electronic meetings using artificial intelligence and meeting rules templates |
US10510051B2 (en) | 2016-10-11 | 2019-12-17 | Ricoh Company, Ltd. | Real-time (intra-meeting) processing using artificial intelligence |
US20180101760A1 (en) * | 2016-10-11 | 2018-04-12 | Ricoh Company, Ltd. | Selecting Meeting Participants for Electronic Meetings Using Artificial Intelligence |
US9699410B1 (en) | 2016-10-28 | 2017-07-04 | Wipro Limited | Method and system for dynamic layout generation in video conferencing system |
US11568369B2 (en) * | 2017-01-13 | 2023-01-31 | Fujifilm Business Innovation Corp. | Systems and methods for context aware redirection based on machine-learning |
TWI644565B (en) | 2017-02-17 | 2018-12-11 | 陳延祚 | Video image processing method and system using the same |
US20180270452A1 (en) | 2017-03-15 | 2018-09-20 | Electronics And Telecommunications Research Institute | Multi-point connection control apparatus and method for video conference service |
US10838396B2 (en) | 2017-04-18 | 2020-11-17 | Cisco Technology, Inc. | Connecting robotic moving smart building furnishings |
US20180331842A1 (en) | 2017-05-15 | 2018-11-15 | Microsoft Technology Licensing, Llc | Generating a transcript to capture activity of a conference session |
CN107342932B (en) * | 2017-05-23 | 2020-12-04 | 华为技术有限公司 | An information interaction method and terminal |
US9967520B1 (en) | 2017-06-30 | 2018-05-08 | Ringcentral, Inc. | Method and system for enhanced conference management |
US20200106735A1 (en) * | 2018-09-27 | 2020-04-02 | Salvatore Guerrieri | Systems and Methods for Communications & Commerce Between System Users and Non-System Users |
US10553208B2 (en) * | 2017-10-09 | 2020-02-04 | Ricoh Company, Ltd. | Speech-to-text conversion for interactive whiteboard appliances using multiple services |
US20190172017A1 (en) * | 2017-12-04 | 2019-06-06 | Microsoft Technology Licensing, Llc | Tagging meeting invitees to automatically create tasks |
US20190205839A1 (en) * | 2017-12-29 | 2019-07-04 | Microsoft Technology Licensing, Llc | Enhanced computer experience from personal activity pattern |
TWI656942B (en) * | 2018-01-12 | 2019-04-21 | 財團法人工業技術研究院 | Machine tool collision avoidance method and system |
US11120199B1 (en) * | 2018-02-09 | 2021-09-14 | Voicebase, Inc. | Systems for transcribing, anonymizing and scoring audio content |
US10757148B2 (en) * | 2018-03-02 | 2020-08-25 | Ricoh Company, Ltd. | Conducting electronic meetings over computer networks using interactive whiteboard appliances and mobile devices |
US20210004735A1 (en) * | 2018-03-22 | 2021-01-07 | Siemens Corporation | System and method for collaborative decentralized planning using deep reinforcement learning agents in an asynchronous environment |
US20190312917A1 (en) | 2018-04-05 | 2019-10-10 | Microsoft Technology Licensing, Llc | Resource collaboration with co-presence indicators |
CN108595645B (en) * | 2018-04-26 | 2020-10-30 | 深圳市鹰硕技术有限公司 | Conference speech management method and device |
US10735211B2 (en) * | 2018-05-04 | 2020-08-04 | Microsoft Technology Licensing, Llc | Meeting insight computing system |
JP2019215727A (en) * | 2018-06-13 | 2019-12-19 | レノボ・シンガポール・プライベート・リミテッド | Conference apparatus, conference apparatus control method, program, and conference system |
US11367095B2 (en) * | 2018-10-16 | 2022-06-21 | Igt | Unlockable electronic incentives |
US11016993B2 (en) * | 2018-11-27 | 2021-05-25 | Slack Technologies, Inc. | Dynamic and selective object update for local storage copy based on network connectivity characteristics |
CN111586674B (en) * | 2019-02-18 | 2022-01-14 | 华为技术有限公司 | Communication method, device and system |
US20200341625A1 (en) | 2019-04-26 | 2020-10-29 | Microsoft Technology Licensing, Llc | Automated conference modality setting application |
US20200374146A1 (en) | 2019-05-24 | 2020-11-26 | Microsoft Technology Licensing, Llc | Generation of intelligent summaries of shared content based on a contextual analysis of user engagement |
US11689379B2 (en) | 2019-06-24 | 2023-06-27 | Dropbox, Inc. | Generating customized meeting insights based on user interactions and meeting media |
US11262886B2 (en) * | 2019-10-22 | 2022-03-01 | Microsoft Technology Licensing, Llc | Structured arrangements for tracking content items on a shared user interface |
US11049511B1 (en) | 2019-12-26 | 2021-06-29 | Lenovo (Singapore) Pte. Ltd. | Systems and methods to determine whether to unmute microphone based on camera input |
US11049077B1 (en) | 2019-12-31 | 2021-06-29 | Capital One Services, Llc | Computer-based systems configured for automated electronic calendar management and work task scheduling and methods of use thereof |
US10999346B1 (en) | 2020-01-06 | 2021-05-04 | Dialogic Corporation | Dynamically changing characteristics of simulcast video streams in selective forwarding units |
US11989696B2 (en) * | 2020-01-16 | 2024-05-21 | Capital One Services, Llc | Computer-based systems configured for automated electronic calendar management with meeting room locating and methods of use thereof |
US10735212B1 (en) | 2020-01-21 | 2020-08-04 | Capital One Services, Llc | Computer-implemented systems configured for automated electronic calendar item predictions and methods of use thereof |
US11288636B2 (en) | 2020-01-23 | 2022-03-29 | Capital One Services, Llc | Computer-implemented systems configured for automated electronic calendar item predictions for calendar item rescheduling and methods of use thereof |
US11438841B2 (en) | 2020-01-31 | 2022-09-06 | Dell Products, Lp | Energy savings system based machine learning of wireless performance activity for mobile information handling system connected to plural wireless networks |
US11393176B2 (en) * | 2020-02-07 | 2022-07-19 | Krikey, Inc. | Video tools for mobile rendered augmented reality game |
US11095468B1 (en) | 2020-02-13 | 2021-08-17 | Amazon Technologies, Inc. | Meeting summary service |
US11488114B2 (en) | 2020-02-20 | 2022-11-01 | Sap Se | Shared collaborative electronic events for calendar services |
US11080356B1 (en) * | 2020-02-27 | 2021-08-03 | International Business Machines Corporation | Enhancing online remote meeting/training experience using machine learning |
WO2021194372A1 (en) * | 2020-03-26 | 2021-09-30 | Ringcentral, Inc. | Methods and systems for managing meeting notes |
US20210319408A1 (en) | 2020-04-09 | 2021-10-14 | Science House LLC | Platform for electronic management of meetings |
US11470014B2 (en) | 2020-04-30 | 2022-10-11 | Dell Products, Lp | System and method of managing data connections to a communication network using tiered devices and telemetry data |
US11184560B1 (en) | 2020-12-16 | 2021-11-23 | Lenovo (Singapore) Pte. Ltd. | Use of sensor input to determine video feed to provide as part of video conference |
US11119985B1 (en) | 2021-03-19 | 2021-09-14 | Atlassian Pty Ltd. | Apparatuses, methods, and computer program products for the programmatic documentation of extrinsic event based data objects in a collaborative documentation service |
2021
- 2021-05-05 US US17/308,887 patent/US20210367984A1/en not_active Abandoned
- 2021-05-05 US US17/308,623 patent/US11488116B2/en active Active
- 2021-05-05 US US17/308,916 patent/US20210367986A1/en not_active Abandoned
- 2021-05-05 US US17/308,264 patent/US11537998B2/en active Active
- 2021-05-05 US US17/308,329 patent/US11416831B2/en active Active
- 2021-05-05 US US17/308,586 patent/US20210365893A1/en not_active Abandoned
- 2021-05-05 US US17/308,640 patent/US20210367802A1/en not_active Abandoned
- 2021-05-05 US US17/308,772 patent/US20210365896A1/en not_active Abandoned
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160044088A1 (en) * | 2011-03-03 | 2016-02-11 | Citrix Systems, Inc. | Reverse Seamless Integration Between Local and Remote Computing Environments |
US20130144950A1 (en) * | 2011-12-06 | 2013-06-06 | Manikandan Sanakaranarasimhan | Seamless collaboration and communication |
US20150358810A1 (en) * | 2014-06-10 | 2015-12-10 | Qualcomm Incorporated | Software Configurations for Mobile Devices in a Collaborative Environment |
US20190155471A1 (en) * | 2015-03-02 | 2019-05-23 | Dropbox, Inc. | Native Application Collaboration |
US20170024100A1 (en) * | 2015-07-24 | 2017-01-26 | Coscreen, Inc. | Frictionless Interface for Virtual Collaboration, Communication and Cloud Computing |
US20170192656A1 (en) * | 2015-12-30 | 2017-07-06 | Dropbox, Inc. | Native Application Collaboration |
US20190068661A1 (en) * | 2017-08-24 | 2019-02-28 | Re Mago Holding Ltd | Method, apparatus, and computer-readable medium for desktop sharing over a web socket connection in a networked collaboration workspace |
US20200134002A1 (en) * | 2018-10-26 | 2020-04-30 | Salesforce.Com, Inc. | Rich text box for live applications in a cloud collaboration platform |
US20210352120A1 (en) * | 2020-05-07 | 2021-11-11 | Re Mago Holding Ltd | Method, apparatus, and computer readable medium for virtual conferencing with embedded collaboration tools |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023177756A1 (en) * | 2022-03-16 | 2023-09-21 | Figma, Inc. | Collaborative widget state synchronization |
US20230300204A1 (en) * | 2022-03-18 | 2023-09-21 | Zoom Video Communications, Inc. | App pinning in video conferences |
US12155729B2 (en) * | 2022-03-18 | 2024-11-26 | Zoom Video Communications, Inc. | App pinning in video conferences |
US12411697B2 (en) | 2022-08-15 | 2025-09-09 | Figma, Inc. | Plugin management system for an interactive system or platform |
Also Published As
Publication number | Publication date |
---|---|
US20210367800A1 (en) | 2021-11-25 |
US20210365896A1 (en) | 2021-11-25 |
US20210367801A1 (en) | 2021-11-25 |
US11488116B2 (en) | 2022-11-01 |
US20210368134A1 (en) | 2021-11-25 |
US11416831B2 (en) | 2022-08-16 |
US20210365893A1 (en) | 2021-11-25 |
US11537998B2 (en) | 2022-12-27 |
US20210367984A1 (en) | 2021-11-25 |
US20210367802A1 (en) | 2021-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210367986A1 (en) | Enabling Collaboration Between Users | |
US10225603B2 (en) | Methods and systems for rendering multimedia content on a user device | |
US9946754B2 (en) | System and method for data validation | |
US20160239770A1 (en) | Method and system for dynamically changing process flow of a business process | |
EP3258392A1 (en) | Systems and methods for building contextual highlights for conferencing systems | |
EP4083865A1 (en) | Method and system for providing virtual services | |
US20200007947A1 (en) | Method and device for generating real-time interpretation of a video | |
US11573809B2 (en) | Method and system for providing virtual services | |
US20180219924A1 (en) | Method and System for Providing Interactive Control of Shared Content Over a Video Conference | |
AU2025213617A1 (en) | Electronic devices and methods for selecting and displaying multimodal content | |
US20240310978A1 (en) | Systems and methods for a digital interface | |
US20140109070A1 (en) | System and method for configurable entry points generation and aiding validation in a software application | |
US9531957B1 (en) | Systems and methods for performing real-time image vectorization | |
US10277463B2 (en) | System and method for synchronizing computing platforms | |
US10739989B2 (en) | System and method for customizing a presentation | |
US20200211247A1 (en) | Method and system for controlling an object avatar | |
EP4506832A1 (en) | Method and system for providing real-time assistance to users using generative artificial intelligence (ai) models | |
EP4343652A1 (en) | Method and system for enabling virtual interaction | |
US20160210227A1 (en) | Method and system for identifying areas of improvements in an enterprise application |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HUDDL INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DAVULURI, NAVA;RAJAMANI, HARISH;YARLAGADDA, KRISHNA;AND OTHERS;REEL/FRAME:056242/0478 Effective date: 20210426 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |