US20220256118A1 - System and method for recording online collaboration
- Publication number
- US20220256118A1 (application US 17/675,056)
- Authority: United States (US)
- Prior art keywords: modified, temporal record, temporal, online collaboration, record
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/103—Workflow collaboration or project management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1831—Tracking arrangements for later retrieval, e.g. recording contents, participants activities or behavior, network status
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
- H04N7/155—Conference systems involving storage of or access to video conference sessions
Definitions
- the present disclosure relates generally to collaborative work; and more specifically, to online collaboration recording systems for recording online collaboration sessions, and/or for enabling collaborative work on digital documents with timeline-based data. Furthermore, the present disclosure also relates to methods for recording online collaboration sessions. Moreover, the present disclosure also relates to computer program products comprising instructions to cause the aforesaid systems to carry out the aforesaid methods.
- Digital collaboration refers to two or more people collaborating or meeting remotely on a digital project, over a shared space, to share documents, messages, or multimedia data.
- each contributor needs to digitally communicate with other contributors, either directly or indirectly.
- Contributors may digitally connect with each other directly via a peer-to-peer connection model or indirectly via a client-server communication model.
- Each model has its own set of benefits and deficiencies associated with various systems and methods of digital collaboration.
- Synchronous systems involve the exchange of information between participants simultaneously and in real time.
- Another example includes web conferencing services, where data shared in real time is lost if not saved before the end of a session.
- Most common examples of asynchronous systems are forums, blogs, social media and other such digitally shared spaces. Such systems involve the exchange of shared information between an uploader and subsequent visitors.
- the present disclosure seeks to provide an online collaboration recording system for recording an online collaboration session.
- the present disclosure also seeks to provide a method for recording an online collaboration session.
- the present disclosure also seeks to provide a computer program product comprising instructions to cause the aforesaid system to carry out the aforesaid method.
- the present disclosure provides an at least partial solution to the aforementioned technical problem, or problems, associated with known art.
- An aim of the present disclosure is to provide a solution that at least partially overcomes the aforementioned technical problem or problems.
- an online collaboration recording system for recording an online collaboration session comprising:
- a computing arrangement in communication with a plurality of devices, wherein each device is accessible by a user during the online collaborating session, wherein in operation the computing arrangement executes instructions to synchronize a digital project in said online collaborating session amongst the plurality of devices, by:
- Embodiments of the present disclosure substantially eliminate, or at least partially address, the aforementioned problems in the prior art and provide an online collaboration recording system that allows for creation, execution, recording, and sharing of useful contextual information pertaining to collaborative work performed during an online collaboration session.
- the online collaboration recording system is easy to integrate with existing computing hardware.
- the computing arrangement edits the at least one modified object in the temporal record for outputting an output stream, by:
- the computing arrangement in operation, edits the temporal record by any one of:
- the at least one object or at least one modified object comprises one or more properties
- the one or more properties comprises one or more of an on-screen position, on-screen size and content of the at least one object or at least one modified object.
- the content of the at least one object or at least one modified object comprises a set of temporal changes in properties of the at least one object or properties of the at least one modified object over a recorded period of time and/or one or more of a video file or an audio file.
- the at least one object or the at least one modified object is stored at a local data storage or a remote data storage as a set of objects or modified objects and temporal changes to each of the objects and modified objects.
- the computing arrangement via an encryption module, encrypts the temporal record prior to synchronizing the temporal record with the plurality of devices.
- the computing arrangement or the plurality of devices via a decryption module, decrypts the encrypted temporal record after synchronization.
- an embodiment of the present disclosure provides a method for recording an online collaboration session, the method comprising:
- the method further comprises:
- editing the temporal record comprises any one of:
- the at least one object or at least one modified object comprises one or more properties
- the one or more properties comprises one or more of an on-screen position, on-screen size and content of the at least one object or at least one modified object.
- the content of the at least one object or at least one modified object comprises a set of temporal changes in properties of the at least one object or properties of the at least one modified object over a recorded period of time and/or one or more of a video file or an audio file.
- the at least one object or the at least one modified object is stored at a local data storage or a remote data storage as a set of objects or modified objects and temporal changes to each of the objects and modified objects.
- the method further comprises encrypting the temporal record prior to synchronizing the temporal record with the plurality of devices.
- the method further comprises decrypting the temporal record after synchronizing the temporal record with the plurality of devices.
- an embodiment of the present disclosure provides a computer program product comprising instructions to cause the aforesaid system to carry out the aforesaid method.
- FIG. 1 illustrates an exemplary sequence diagram for establishment of connection between a computing arrangement with a plurality of devices, in accordance with an embodiment of the present disclosure
- FIG. 2 illustrates an exemplary sequence diagram for suspending ongoing collaborative work using the online collaboration recording system, in accordance with an embodiment of the present disclosure
- FIG. 3 illustrates an exemplary sequence diagram for resuming collaborative work using the online collaboration recording system, in accordance with an embodiment of the present disclosure
- FIG. 4 illustrates an exemplary operation object, in accordance with an embodiment of the present disclosure
- FIGS. 5A and 5B illustrate an exemplary sequence diagram for a given operation transfer between an initiating device and a receiving device whilst performing collaborative work, from a perspective of the initiating device, in accordance with an embodiment of the present disclosure
- FIGS. 6A and 6B illustrate an exemplary sequence diagram for the given operation transfer of FIGS. 5A and 5B , from a perspective of the receiving device, in accordance with an embodiment of the present disclosure
- FIG. 7 is an exemplary sequence diagram for recording at least one modified object to compile the temporal record, in accordance with an embodiment of the present disclosure
- FIG. 8 illustrates steps of a method for recording an online collaboration session, in accordance with an embodiment of the present disclosure
- FIG. 9 illustrates a formation of a chain of changes in objects' properties, in accordance with an embodiment of the present disclosure.
- FIGS. 10A, 10B and 10C are exemplary illustrations of digital project contents and a camera viewfinder frame, the camera viewfinder frame visible from a user viewport visible to a first user (i.e., a host), and the camera viewfinder frame visible from a user viewport visible to a second user, respectively, in accordance with an embodiment of the present disclosure.
- an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent.
- a non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
- FIG. 1 is an exemplary sequence diagram for implementation of an online collaboration recording system 100 for recording an online collaboration session, in accordance with an embodiment of the present disclosure.
- the online collaboration recording system 100 comprises a computing arrangement 102 in communication with a plurality of devices 104 .
- the computing arrangement 102 is communicably coupled to the plurality of devices 104 via a communication network 106 .
- Each device (of the plurality of devices 104 ) is accessible by a user during the online collaborating session.
- the term “computing arrangement” refers to hardware, software, firmware and/or any combination thereof, suitable for controlling operation of the online collaboration recording system 100 .
- the computing arrangement 102 allows for recording the online collaboration session.
- the computing arrangement 102 includes an arrangement of one or more computational entities that are capable of performing various computational tasks for operation of the online collaboration recording system 100 .
- the term “device” refers to an electronic device associated with (or used by) a user that is capable of enabling the user to perform specific tasks associated with the online collaboration session.
- the term “device” is intended to be broadly interpreted to include any electronic device that may be used for voice and/or data communication over the communication network 106 .
- the plurality of devices 104 enable a plurality of users associated therewith to join and participate in the online collaboration session. In such a case, the plurality of devices 104 provides the plurality of users with an interactive user interface, using which the plurality of users participates in the online collaboration session.
- the plurality of devices 104 comprises an Input/Output module (or I/O module) to enable the users to provide inputs to and receive outputs from the online collaboration session.
- Examples of the plurality of devices 104 include, but are not limited to, cellular phones, personal digital assistants (PDAs), handheld devices, wireless modems, laptop computers, tablet computers, personal computers, etc.
- the term “communication network” refers to an arrangement of interconnected programmable and/or non-programmable components that are configured to facilitate data communication between the plurality of devices 104 and the computing arrangement 102 .
- the communication network 106 may include, but is not limited to, one or more peer-to-peer networks, hybrid peer-to-peer networks, local area networks (LANs), radio access networks (RANs), metropolitan area networks (MANs), wide area networks (WANs), all or a portion of a public network such as the global computer network known as the Internet, a private network, a cellular network and any other communication system or systems at one or more locations.
- the communication network 106 includes wired or wireless communication that can be carried out via any number of known protocols, including, but not limited to, Web Real-Time Communication (WebRTC) protocols, Internet Protocol (IP), Wireless Access Protocol (WAP), Frame Relay, or Asynchronous Transfer Mode (ATM). Moreover, any other suitable protocols using voice, video, data, or combinations thereof, can also be employed.
- the communication network 106 is robust and has sufficient bandwidth to allow the users of each device to access the online collaboration session.
- the communication network 106 has a star topology.
- each device (of the plurality of devices 104 ) is connected to the computing arrangement 102 and the computing arrangement 102 acts as a central hub or host for facilitating communication between the plurality of devices 104 . Therefore, in such a case, a given device is communicably coupled to another device in an indirect manner (namely, via the computing arrangement 102 ).
- the computing arrangement 102 can be implemented by way of at least one device.
- the central hub or host runs on the at least one device amongst the plurality of devices, thereby enabling the online collaboration recording system 100 to work as an ad-hoc session or a meeting solution.
- the computing arrangement 102 is implemented by way of a server, more specifically a back-end server.
- the back-end server functions as the central hub.
- the online collaboration recording system 100 works as a cloud or persistent document provider solution.
- the back-end server is coupled in communication to the plurality of devices 104 , via the communication network 106 .
- the communication network 106 employs WebRTC technology to facilitate communication between the computing arrangement 102 and the plurality of devices 104 .
- at least one signaling server establishes communication between the computing arrangement 102 and the plurality of devices 104 by way of WebRTC signaling.
- each of the plurality of devices 104 comprises a corresponding communication module to establish the online collaboration session.
- the communication modules of the plurality of devices 104 are compatible with a communication module of the computing arrangement 102 , for enabling proper communication within the online collaboration recording system 100 .
- all communication modules are compatible with the WebRTC technology.
- Such communication modules may also be referred to as “Synchronization Engine”.
- the communication network 106 employs the WebRTC PeerConnection to facilitate real time communication transport between the computing arrangement 102 and the plurality of devices 104 .
- the WebRTC PeerConnection enables the online collaboration session between the central hub and the plurality of devices 104 , via the at least one signaling server.
- the WebRTC PeerConnection employs Web Sockets as the at least one signaling server.
- the WebRTC data channels are employed for exchanging data, actions and control messages (for example, such as file-transfer) between peers.
- WebRTC MediaTracks are employed for real-time media (for example, such as audio and/or video).
- the communication network 106 is responsible for mixing, forwarding and recording media from and/or to all peers.
- the communication network 106 establishes signaling connection between the computing arrangement 102 and the plurality of devices 104 , via the at least one signaling server.
- a virtual room is created, wherein each virtual room has a unique identifier used by other peers to join it.
- the unique identifier related to a given virtual room can be employed by the plurality of users to join the online collaboration session.
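- as a purely illustrative, non-limiting sketch (the signaling endpoint, message shapes and function names below are assumptions and are not taken from the disclosure), a device could join a virtual room identified by its unique identifier over a WebSocket-based signaling server and open a WebRTC data channel for exchanging collaboration actions and control messages:

```typescript
// Hypothetical sketch of joining a virtual room and opening a WebRTC data channel.
// Assumes a browser environment; the signaling URL and message shapes are illustrative only.
type SignalMessage =
  | { kind: "join"; roomId: string }
  | { kind: "offer" | "answer"; roomId: string; sdp: string }
  | { kind: "candidate"; roomId: string; candidate: RTCIceCandidateInit };

async function joinRoom(roomId: string): Promise<RTCDataChannel> {
  const signaling = new WebSocket("wss://signaling.example.com"); // assumed endpoint
  const peer = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.l.google.com:19302" }] });

  // Data channel used for exchanging data, actions and control messages between peers.
  const channel = peer.createDataChannel("collaboration");

  peer.onicecandidate = (event) => {
    if (event.candidate) {
      signaling.send(JSON.stringify({ kind: "candidate", roomId, candidate: event.candidate.toJSON() }));
    }
  };

  signaling.onmessage = async (event) => {
    const message: SignalMessage = JSON.parse(event.data);
    if (message.kind === "answer") {
      await peer.setRemoteDescription({ type: "answer", sdp: message.sdp });
    } else if (message.kind === "candidate") {
      await peer.addIceCandidate(message.candidate);
    }
  };

  await new Promise<void>((resolve) => { signaling.onopen = () => resolve(); });
  signaling.send(JSON.stringify({ kind: "join", roomId }));

  const offer = await peer.createOffer();
  await peer.setLocalDescription(offer);
  signaling.send(JSON.stringify({ kind: "offer", roomId, sdp: offer.sdp ?? "" }));

  return channel; // real-time media would additionally be attached as WebRTC MediaTracks
}
```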
- upon successful establishment of communication between the computing arrangement 102 and the plurality of devices 104 , the computing arrangement 102 executes instructions to start collaborative work or to continue currently ongoing collaborative work. It will be appreciated that the currently ongoing collaborative work would not be suspended before establishing said online collaboration session, so as to allow users of the plurality of devices 104 to complete any currently ongoing work prior to joining the online collaboration session for collaboratively working on the digital project.
- the computing arrangement 102 executes instructions to suspend currently ongoing collaborative work. More details pertaining to such suspension of the currently ongoing collaborative work have been described herein later in conjunction with FIG. 2 .
- the computing arrangement 102 executes instructions to synchronize a digital project in said online collaborating session amongst the plurality of devices 104 , by:
- the term “online collaboration session” refers to a communication session that is temporarily established between the computing arrangement 102 and the plurality of devices 104 for facilitating interactive exchange of information between the plurality of devices 104 .
- Such interactive exchange of information between the plurality of devices 104 pertains to collaborative work that is to be performed on the digital project, by the plurality of users.
- the plurality of devices 104 communicate with each other via messages and optionally, responses to said messages.
- the computing arrangement 102 detects and manages conflicts between the plurality of devices 104 .
- the plurality of users join a virtual room (or a network-based room) for performing collaborative work on the digital project.
- the plurality of users perform said collaborative work in real time or near real time.
- the term “collaborative work” refers to simultaneous working (for example, by way of editing content, creating content, deleting content, and the like) of the plurality of users on a given part or an entirety of the digital project.
- the collaborative work relates to performing changes to the given part of the digital project over time, upon collaboration of the plurality of users.
- the changes implemented on the digital project by a given user can be recorded and are shared simultaneously with the remaining users working on the same digital project.
- the term “digital project” refers to a computer-based project upon which the plurality of users collaborate to perform meaningful work. Such a digital project could pertain to a number of domains including, but not limited to, business, education, military, medical science.
- the digital project can be a video project related to marketing of a product or a business.
- the digital project can be an audio-visual project related to demonstrating a technique for research project.
- the digital project can be a digital presentation related to historical facts.
- the digital project can be a digital document related to findings of a scientific experiment.
- the term “object” refers to a data construct of the digital project upon which collaborative work is to be performed by the plurality of users.
- the digital project comprises a single object whereas in other implementations, the digital project comprises a plurality of objects.
- multiple users can synchronously or asynchronously collaborate for working on the given digital project.
- examples of the at least one object include, but are not limited to, a time object, an audio object, an image object, a text object and a drawing object.
- the at least one object is a camera object, wherein the camera object comprises a recordable property describing camera viewfinder location.
- the term “camera viewfinder location” refers to location data (specific to the type of the digital project) allowing for determining which part of the digital project is covered by an abstract camera.
- a camera viewfinder frame is an on-screen rectangle used to determine or inform an area of the digital project that is visible while replaying recording or after conversion of said recording to at least one video file.
- the camera viewfinder frame is independent of the user viewport, but the two may be synchronized using at least one of the following: the camera viewfinder frame may follow the user viewport so as to record similarly to a local screen capture of the digital project; the user viewport may follow another user viewport, wherein the output is up to the user; or a combination of the two may be used. Furthermore, the recordable property of the camera viewfinder location describes the location of the user viewport in the digital project.
- the recordable property of the camera viewfinder location is dependent on nature of the digital project, such as for example, visible rectangle in a whiteboard or any two-dimensional (2-D) infinite space (i.e., when the camera viewfinder location becomes the camera viewfinder frame), page in a document, slide on presentation, and/or combination of position, orientation vector and field of view in three-dimensional (3-D) space.
- the camera viewfinder location and the user viewport have been described herein later in conjunction with FIG. 10 .
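- a minimal sketch of a camera object whose viewfinder location is a recordable property is given below; all type and field names are assumptions made for illustration and are not part of the disclosure:

```typescript
// Hypothetical model of a camera object in a 2-D digital project.
// The camera viewfinder frame is independent of any user viewport, but may optionally follow one.
interface ViewfinderFrame {
  x: number;      // top-left corner in project coordinates
  y: number;
  width: number;
  height: number;
}

interface CameraObject {
  id: string;
  viewfinder: ViewfinderFrame;   // recordable property: which part of the project the camera covers
  followViewportOf?: string;     // optional: identifier of a user whose viewport the camera follows
}

// When the camera follows a viewport, the recording resembles a local screen capture of that user.
function updateCamera(camera: CameraObject, viewports: Map<string, ViewfinderFrame>): CameraObject {
  if (camera.followViewportOf) {
    const viewport = viewports.get(camera.followViewportOf);
    if (viewport) {
      return { ...camera, viewfinder: { ...viewport } };
    }
  }
  return camera;
}
```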
- the plurality of users perform collaborative work at the same time. In such a case, any change to the digital project made by a user would be visible to all other users in real time.
- the plurality of users perform collaborative work at different times, wherein the plurality of users have ability to provide simultaneous input.
- the simultaneous input may be collected independently on various devices, with all manipulations (i.e., changes) being collected asynchronously and joined together to form a chain of activities. In such a case, any change to the digital project made by a user would be transmitted to all other users in real time, and would be visible to such users when they choose to work on the digital project.
- said online collaboration recording system 100 is not limited to a session with objects in two-dimensional (2-D) infinite space as a same arrangement and set of rules applies for a three-dimensional (3-D) space, wherein the 3-D space is an extension of space and object properties.
- the online collaboration recording system 100 is flexible enough to be applied to a broad range of digital projects such as, but not limited to, whiteboards, text files, spreadsheets, video projects.
- the computing arrangement 102 executes instructions to resume the collaborative work on the digital project. More details pertaining to resuming the collaborative work on the digital project have been described herein later in conjunction with FIG. 3 .
- the first user input is received via the communication module or an input interface.
- one or more properties of the at least one object are modified to form at least one modified object.
- the first user input could be in form of a touch input, a voice input, a digital command, a gesture input, and the like.
- properties of the at least one object change and such object becomes the at least one modified object. Therefore, the term “modified object” refers to an object whose properties are modified according to the first user input. It will be appreciated that at a given time, the digital project may include multiple objects but only some objects among such multiple objects may be modified in the aforesaid manner.
- Such modification of the at least one object can be understood to be a “collaboration action” pertaining to the online collaboration session.
- upon receiving the first user input, new data might be added, or existing data might be altered, for the at least one object.
- one or more properties referring to the new data may be modified to form at least one modified state in at least one modified object.
- as an example, a collaborative session adds new data, wherein the new data comprises a voice recording of a given user.
- the new data is in the form of waveform bytes which are stored in a file on a disk. Subsequently, such a file is represented by an audio object comprising a current time recordable property, denoted as “current time”, which refers to a point in time in the audio file at which the new data is inserted.
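- such an audio object might be sketched as follows (the field names are hypothetical and used only to illustrate the idea of a “current time” recordable property):

```typescript
// Hypothetical audio object: the waveform bytes live in a file asset on disk, while the
// recordable "current time" property marks the point in the audio at which new data is inserted.
interface AudioObject {
  id: string;
  assetPath: string;     // file containing the waveform bytes
  currentTime: number;   // recordable property, in seconds
}

// Appending newly captured voice data advances the recordable current time.
function appendVoiceData(audio: AudioObject, durationSeconds: number): AudioObject {
  return { ...audio, currentTime: audio.currentTime + durationSeconds };
}
```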
- a single collaboration action is performed within the online collaboration session.
- a plurality of collaboration actions are performed within the online collaboration session.
- any collaboration action pertaining to the online collaboration session uses an operation object as a proxy.
- the operation object is a state machine object that provides a generic mechanism for implementing at least one collaboration action in the digital project.
- the operation object is created as a result of end-user input, but could also be created by an automated test bot or a specifically tailored bot. In such a case, the automated test bot or specifically tailored bot is also coupled in communication with the computing arrangement 102 via the communication network 106 . More details of the operation object are elucidated herein later in conjunction with FIG. 4 .
- the first user input is received via the communication module.
- the first user input is received via the input interface of said one or more of plurality of devices 104 .
- the at least one object or at least one modified object comprises one or more properties
- the one or more properties comprises one or more of an on-screen position, on-screen size and content of the at least one object or at least one modified object.
- the term, “property” refers to an attribute associated with the at least one object or the at least one modified object of the digital project pertaining to which collaborative work is performed by the plurality of users during the online collaboration session.
- the one or more properties of the at least one object or at least one modified object are well-defined. Examples of one or more properties may include, but are not limited to on-screen position, on-screen size and content (for example, such as image, audio and the like) of the at least one object or at least one modified object.
- the at least one object or at least one modified object comprises one or more recordable properties.
- the one or more recordable properties vary with respect to time.
- the one or more recordable properties may relate to an on-screen position of the at least one object or the at least one modified object.
- the content of the at least one object or at least one modified object comprises a set of temporal changes in properties of the at least one object or properties of the at least one modified object over a recorded period of time and/or one or more of a video file or an audio file.
- the content of the at least one object or at least one modified object refers to a given data which is a part of the collaborative work.
- Such change in the content is additionally reflected in related state of the at least one object or the at least one modified object which might be recordable.
- state is designed to reflect characteristics (i.e., attributes, or condition) of the content at any given point in recording time which allows for effective playback of changes made to content across time.
- the at least one object may be a drawing object, wherein the content of the drawing object is an ordered set of lines and points. Furthermore, new content is added to the drawing object, wherein the new content comprises a new line or new points. Additionally, the drawing object further comprises a recordable state, wherein the recordable state is a pair of indices pointing to the ordered set of lines and points as well as the new line and the new point in the last visible line.
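- purely by way of illustration, such a drawing object could be modelled as follows (names and structure are assumptions, not the disclosed implementation):

```typescript
// Hypothetical drawing object: the content is an ordered set of lines made of points,
// while the recordable state is a pair of indices into that content.
interface Point { x: number; y: number; }

interface DrawingObject {
  id: string;
  lines: Point[][];              // content: ordered set of lines and their points
  visibleState: {                // recordable state
    visibleLines: number;        // number of visible lines
    visiblePointsInLastLine: number;
  };
}

// Adding a new point extends the content and advances the recordable state.
function addPoint(drawing: DrawingObject, point: Point): DrawingObject {
  const lines = drawing.lines.map((line) => [...line]);
  if (lines.length === 0) {
    lines.push([]);
  }
  lines[lines.length - 1].push(point);
  return {
    ...drawing,
    lines,
    visibleState: {
      visibleLines: lines.length,
      visiblePointsInLastLine: lines[lines.length - 1].length,
    },
  };
}
```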
- the content of the at least one object or at least one modified object comprises one or more of the video files for example, such as a video data or the audio file for example, such as a voice data.
- the content of the at least one object or at least one modified object comprises an image file.
- the content of the at least one object or at least one modified object includes images, graphics and the like.
- the content of the at least one object or at least one modified object comprises a text file.
- the content of the at least one object or at least one modified object includes textual data, spreadsheet documents and the like.
- the at least one object or the at least one modified object is stored at a local data storage or a remote data storage as a set of objects or modified objects and temporal changes to each of the objects and modified objects.
- the local data storage, as the set of objects or the modified objects and temporal changes to each of the objects and the modified objects, can be implemented by way of a memory unit associated with an operation-initiating device and/or a memory unit associated with an operation-receiving device.
- the remote data storage can be implemented by way of a memory module of the computing arrangement 102 .
- the memory module of the computing arrangement 102 may be a memory unit of the computing arrangement 102 or a database arrangement coupled in communication with the computing arrangement 102 .
- the remote data storage can be implemented by way of a cloud server arrangement communicably coupled to the online collaboration recording system 100 .
- the digital project further comprises at least one track, wherein a given track represents how the one or more recordable properties of the at least one object or at least one modified object vary with respect to time.
- a given track represents changes of a given recordable property of a given object or a given modified object. Such changes are made by the plurality of users working in the digital project during the online collaboration session.
- the digital project further comprises at least one asset, wherein a given asset represents at least one file used by a given object.
- assets include, but are not limited to, the video file, the audio file, the image file, the text file.
- a given asset is associated with a single object.
- a given asset is associated with a plurality of objects.
- the digital project further comprises a time object, wherein the time object represents a time duration of the digital project in form of a plurality of frames.
- a single frame is a discrete unit of time in which a current state of the one or more recordable properties of the at least one object or the at least one modified object is recorded.
- the time object comprises a current frame index.
- a given frame type is specific to a given recordable property.
- the current state of the one or more recordable properties of a given object can be stored in a corresponding frame.
- Such a frame can be referred to as a “record frame”.
- the current state of the one or more recordable properties of the given object or the given modified object is restored from a corresponding record frame into a replay frame. It will be appreciated that by coordinating restoring of recorded state between all objects (whether modified or not) and their recordable properties, the online collaboration recording system 100 allows for restoring a state of the digital project at any given point in time.
- the track records the plurality of frames sequentially relative to an abstract timeline.
- the time object is abstracted from absolute time.
- the time object comprises a current frame index.
- the tracks share a common time base, denoted by “Time 0”, wherein the “Time 0” defines beginning of recording of the track.
- the digital project comprises a state to represent a current timeline position of the digital project, denoted by “current time point”, wherein the “current time point” is relative to “Time 0”. The digital project further comprises a recording duration, wherein the recording duration represents the duration of the overall recording relative to “Time 0”. Consequently, later playback and conversion to video of the recorded state changes are synchronized across multiple objects. Furthermore, recording may be iterated, wherein new temporal records may be inserted in the middle or at the end of an existing digital project, mixing new temporal records with previous temporal records.
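- the relationship between the time object, tracks and record frames might be pictured with the following sketch (the structures and the restore helper are assumptions made for illustration):

```typescript
// Hypothetical timeline model: every track shares the common time base "Time 0",
// and each record frame stores the state of one recordable property at a frame index.
interface RecordFrame<T> {
  frameIndex: number;   // position relative to Time 0
  value: T;             // recorded state of the property at that frame
}

interface Track<T> {
  objectId: string;
  property: string;             // which recordable property this track follows
  frames: RecordFrame<T>[];     // recorded sequentially relative to the abstract timeline
}

interface TimeObject {
  currentFrameIndex: number;    // the "current time point", relative to Time 0
  recordingDuration: number;    // duration of the overall recording, in frames
}

// Restoring state for a replay frame: take the last recorded frame at or before the requested index.
function stateAt<T>(track: Track<T>, frameIndex: number): T | undefined {
  let result: T | undefined;
  for (const frame of track.frames) {       // frames are assumed ordered by frameIndex
    if (frame.frameIndex > frameIndex) break;
    result = frame.value;
  }
  return result;
}
```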
- the user may record changes made to a digital document, wherein the digital document is a whiteboard.
- the recorded changes made to the digital document may be represented as timeline of changes made during a collaborative session or across multiple collaborative sessions.
- the timeline comprises tracks representing changes made to specific objects in time.
- At least one unique identifier is created for at least one of: the at least one object, the at least one track, the at least one asset.
- a given digital project may pertain to creating a video to market a product.
- the digital project may comprise the at least one object, the at least one track, the at least one asset.
- each object, track, and asset has its own unique identifier that is created upon creation of said object, track, and asset.
- the at least one object may comprise an image object having a unique identifier.
- the defined sets of properties associated with the image object may include position (x,y), size (width, height) of the image object, and a corresponding image as asset.
- the position (x,y) is identified as the at least one recordable property which changes over time and the position track is identified as the asset track for the image object.
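- continuing the marketing-video example, the image object, its position track and its image asset could be represented roughly as follows (identifiers and structure are illustrative assumptions):

```typescript
// Hypothetical image object: position (x, y) is the recordable property that changes over time,
// the corresponding image file is an asset, and the object, track and asset each get unique ids.
const newId = (): string => crypto.randomUUID(); // assumes a runtime providing Web Crypto

interface ImageAsset {
  id: string;
  filePath: string;                    // the image file used by the object
}

interface ImageObject {
  id: string;
  position: { x: number; y: number };  // recordable property
  size: { width: number; height: number };
  assetId: string;                     // reference to the image asset
}

interface PositionTrack {
  id: string;
  objectId: string;                    // the image object whose position is recorded
  frames: { frameIndex: number; x: number; y: number }[];
}

// Example: creating the three related entities, each with its own unique identifier.
const asset: ImageAsset = { id: newId(), filePath: "product.png" };
const image: ImageObject = { id: newId(), position: { x: 0, y: 0 }, size: { width: 320, height: 240 }, assetId: asset.id };
const track: PositionTrack = { id: newId(), objectId: image.id, frames: [] };
```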
- At (iii), at least one state of the at least one modified object is recorded temporally, via the recorder, to compile the temporal record.
- the “recorder” is implemented by way of hardware, software, firmware, or a combination of these, suitable for compiling the temporal record of the online collaboration session.
- the term “recording” refers to storing the at least one state of one or more recordable properties of the at least one object or at least one modified object in the digital project, over the time duration of the digital project. This is done by utilizing a recording model that is owned by each object.
- the state of the at least one modified object is the recordable property of the at least one modified object.
- the change in the one or more recordable properties of the at least one object or at least one modified object with respect to time is recorded as the at least one track of the digital project.
- the “temporal record” includes evidence of the collaborative work performed on the digital project, during the online collaboration session.
- the temporal record is a compilation of the at least one track of the digital project. It will be appreciated that authentic moments of ideation and understanding during the online collaboration session are recorded by way of the temporal record.
- the process of temporally recording the at least one modified object is related to state transitions of the operation object.
- Each operation declares which recordable properties of objects it operates upon are changed due to its execution. This means that recording changes of these properties may start when operation transitions to “started” state and may finish when operation transitions to “finished” state.
- the process of temporally recording the at least one modified object is related to capturing states and changes within corresponding properties of the at least one object or the at least one modified object.
- the recordable properties of the at least one object or the at least one modified object constitute the state of the object.
- a recording track is created for every pair of the at least one object or the at least one modified object and the captured state.
- the recording track represents changes made to a given state of the at least one object or the at least one modified object.
- all changes to the state of the at least one object or the at least one modified object are saved in a frame, wherein the frame is specific to a given type of the state.
- the type of state for a two-dimensional (2-D) object may be for transformation, denoted as “transform”;
- the type of state for a resizable 2-D object may be size of the 2-D object, denoted as “size”;
- the type of state for the drawing object may be range of visible drawing of the drawing object, denoted as “range of visible drawing”;
- the type of state for animated object may be current frame of the animated object, denoted “current frame”;
- the type of state for multimedia object may be current time of the multimedia object, denoted as “current time”.
- the transformation of all the 2-D objects is recorded as six floating point numbers which are used to describe the position, scale and rotation of the 2-D objects; for resizable 2-D objects, the size is recorded as two floating point numbers representing the width and height of the 2-D objects; for the drawing object, the range of visible drawing is recorded as two integers, where a first integer may be the number of visible lines and a second integer may be the number of visible points in the last visible line; for animated objects, the current frame may be recorded as an integer describing the index of the currently visible frame of the animated object; for multimedia objects, the current time is recorded as one floating point number that describes the time in the timeline of the multimedia file which is currently playing.
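- the per-type frame payloads described above might be captured with a discriminated union along the following lines (a sketch that mirrors the described encodings; the names are assumptions):

```typescript
// Hypothetical frame payloads mirroring the described encodings: "transform" as six floats
// (position, scale, rotation), "size" as two floats, "range of visible drawing" as two integers,
// "current frame" as one integer and "current time" as one float.
type FramePayload =
  | { type: "transform"; values: [number, number, number, number, number, number] }
  | { type: "size"; width: number; height: number }
  | { type: "rangeOfVisibleDrawing"; visibleLines: number; visiblePointsInLastLine: number }
  | { type: "currentFrame"; frameIndex: number }
  | { type: "currentTime"; seconds: number };

// Each recording track pairs one object with one captured state type.
interface RecordingTrack {
  objectId: string;
  stateType: FramePayload["type"];
  frames: { frameIndex: number; payload: FramePayload }[];
}
```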
- each of the at least one object and/or the at least one modified object defines its own set of recordable properties, known as the type of the state.
- the temporal record is a stream of changes to the at least one object or the at least one modified object occurring over duration of the recording, that are initiated independently by multiple users.
- the recording is performed less rigidly, as the digital collaboration has multiple users as contributors.
- this allows independent input and individual users' contributions, along with allowing multiple viewpoints while recording.
- the at least one object or the at least one modified object and the state of the at least one object or the at least one modified object might not always be mapped one to one.
- the at least one object may be a drawing object comprising object data and recorded state.
- object data of the drawing object comprises a vector of drawn lines and their respective points.
- the state of the drawing object consists of data describing the recordable property of the drawing of the drawing object.
- the process of temporally recording the at least one modified object is performed in a desynchronized manner.
- each device records a state of the digital project independently of other devices.
- the process of temporally recording the at least one object or the at least one modified object is flexible in nature.
- past sections of the temporal record may be changed at any given moment, leading to multiple temporal records.
- a state of every frame is defined by a chain of recorded changes applied to the existing at least one object or the at least one modified object represented in said frame, thereby forming another temporal record where the at least one modified object reacts to changes performed in the past sections. Consequently, a final recorded state is a combination of multiple temporal records, and might be changed later upon playback.
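- one way to picture this chain of recorded changes is to fold them, in order, into the state shown at a given frame, so that editing a past section automatically alters every later frame; the sketch below is a simplified assumption, not the disclosed implementation:

```typescript
// Hypothetical chain of recorded changes: the state of a frame is obtained by applying every
// recorded change up to that frame, so altering a past change yields a new temporal record in
// which later frames react to the modified history.
interface RecordedChange {
  frameIndex: number;
  objectId: string;
  property: string;
  value: unknown;
}

type ProjectState = Map<string, Record<string, unknown>>; // objectId -> property values

function stateAtFrame(changes: RecordedChange[], frameIndex: number): ProjectState {
  const state: ProjectState = new Map();
  for (const change of changes) {             // changes are assumed sorted by frameIndex
    if (change.frameIndex > frameIndex) break;
    const objectState = state.get(change.objectId) ?? {};
    objectState[change.property] = change.value;
    state.set(change.objectId, objectState);
  }
  return state;
}
```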
- the process of temporally recording the at least one object or the at least one modified object comprises receiving multiple inputs from multiple users at multiple times. These multiple inputs pertain to changing of the state of the at least one object or the at least one modified object.
- the recording is multi-layered, as multiple inputs may be provided at multiple times by re-running recording sequence and adding additional modification to the objects.
- the voice input and/or the video stream of any user may be recorded as at least one object or as the at least one modified object forming additional layers of the captured states.
- the temporal record is synchronized, via the communication module, amongst the plurality of devices 104 .
- by synchronizing the temporal record, it is meant that the temporal record is communicated to all users performing the collaborative work on the digital project substantially simultaneously.
- synchronizing the temporal record pertains to sharing the temporal record between all users working in the online collaboration session at the same time.
- said users have an up-to-date record of the collaborative work that is performed on the digital project. This helps said users to be on the same page regarding the progress of work on the digital project, for collaborating in a very efficient manner.
- the online collaboration recording system 100 serves as an up-to-date whiteboard whereupon said users can collaborate efficiently for continuous development and feedback pertaining to the digital project.
- the temporal record is synchronized by way of the computing arrangement 102 .
- the plurality of devices 104 transmit their recorded changes to the at least one object (which are optionally recorded in the form of tracks) to the computing arrangement 102 , whereat such data is unified to compile the temporal record. Thereafter, the temporal record is synchronously transmitted to the plurality of devices 104 .
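- this unification step might be sketched as follows, under the assumption that each device submits its recorded tracks as a list (the names and shapes are illustrative only):

```typescript
// Hypothetical unification at the computing arrangement: recorded tracks received from each
// device are merged into a single temporal record, which is then sent back to all devices.
interface RecordingTrack {
  objectId: string;
  frames: { frameIndex: number; payload: unknown }[];
}

interface DeviceSubmission {
  deviceId: string;
  tracks: RecordingTrack[];
}

interface TemporalRecord {
  tracks: RecordingTrack[];
}

function compileTemporalRecord(submissions: DeviceSubmission[]): TemporalRecord {
  const tracks = submissions.flatMap((submission) => submission.tracks);
  // Keep the frames of each track ordered relative to the shared time base.
  for (const track of tracks) {
    track.frames.sort((a, b) => a.frameIndex - b.frameIndex);
  }
  return { tracks };
}

function synchronize(
  record: TemporalRecord,
  deviceIds: string[],
  send: (deviceId: string, record: TemporalRecord) => void,
): void {
  for (const deviceId of deviceIds) {
    send(deviceId, record); // e.g. over the data channel established with each device
  }
}
```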
- the computing arrangement 102 edits the at least one modified object in the temporal record for outputting an output stream, by:
- the second user input pertains to editing of the temporal record.
- Such editing of the at least one modified object can be understood to be another “collaboration action” pertaining to the online collaboration session.
- the output stream comprises the edited temporal record and provides the up-to-date edited temporal record to the plurality of users.
- the editing of the temporal record is performed in a non-linear manner.
- the temporal record can be compiled by assembling recordings of collaborative work performed at various time instants in a flexible manner (for example, by rearranging such recordings, overriding previously saved recordings, and the like).
- the editing need not be done in any time-specific manner, but may be performed on any portion of the temporal record.
- the editing of the temporal record provides a customizable temporal record, thereby creating well-edited temporal records. Therefore, such temporal records provide the most relevant information associated with the digital project to all the users who have access to the digital project.
- the computing arrangement 102 edits the temporal record by any one of:
- the temporal record may include three objects, wherein the content of the three objects are three video files.
- one additional object comprising a video file may be added to the temporal record.
- the temporal record may include five objects, wherein the content of the five objects is one audio file each.
- two objects may be removed from the temporal record, thereby resulting in three objects having one audio file each.
- the temporal record may include three objects, wherein the content of the two objects is one audio file each and the content of one object is a video file. In such a case, two objects having similar content may be combined in the temporal record.
- the temporal record may include two objects, wherein the content of the two objects is one video file each.
- two objects may have different properties.
- the properties may be modified in the temporal record for simplification.
- the computing arrangement 102 allows for recording collaborative edits to the digital projects in a format which allows for later playback and conversion to video on demand.
- the flexible recording format allows manipulation of objects (such as the camera viewfinder location) after edits are made, and is not dependent on the location of a given user during editing of the digital project.
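- these editing operations might be expressed as simple functions over an object-level view of the temporal record; the sketch below uses assumed names and covers only adding, removing and modifying objects:

```typescript
// Hypothetical, non-linear edits over a temporal record: objects can be added or removed and
// their properties modified without re-recording the collaboration session.
interface RecordedObject {
  id: string;
  properties: Record<string, unknown>;   // e.g. references to video or audio files
}

interface EditableTemporalRecord {
  objects: RecordedObject[];
}

const addObject = (record: EditableTemporalRecord, object: RecordedObject): EditableTemporalRecord =>
  ({ objects: [...record.objects, object] });

const removeObject = (record: EditableTemporalRecord, objectId: string): EditableTemporalRecord =>
  ({ objects: record.objects.filter((candidate) => candidate.id !== objectId) });

const modifyProperties = (
  record: EditableTemporalRecord,
  objectId: string,
  changes: Record<string, unknown>,
): EditableTemporalRecord =>
  ({
    objects: record.objects.map((candidate) =>
      candidate.id === objectId
        ? { ...candidate, properties: { ...candidate.properties, ...changes } }
        : candidate),
  });
```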
- the computing arrangement 102 via an encryption module, encrypts the temporal record prior to synchronizing the temporal record with the plurality of devices 104 .
- the term “encryption” refers to conversion of the temporal record into a specific code, thereby preventing any unauthorized access to the temporal record.
- the temporal record is encrypted, via an encryption module, thereby providing security to the temporal record. In such a case, only authorized users can access the temporal record via their associated devices.
- the encryption can be implemented by various commonly known techniques. Examples of the encryption technique include, but are not limited to, hashing, public-key cryptography, private-key cryptography.
- the computing arrangement 102 or the plurality of devices 104 via a decryption module, decrypts the encrypted temporal record after synchronization.
- the decryption module is utilized for decrypting the encrypted temporal record after synchronization.
- the encrypted temporal record is in form of the specific code which is not easily decoded by the user.
- the decryption module is used in order to convert the encrypted temporal record into a readable form.
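- as one possible realisation (the disclosure does not mandate a particular algorithm; AES-GCM via the Web Crypto API is used here purely as an assumed example), a serialized temporal record could be encrypted before synchronization and decrypted afterwards:

```typescript
// Hypothetical encryption/decryption of a serialized temporal record using AES-GCM.
// Assumes a runtime exposing the Web Crypto API (a browser or a recent Node.js version).
async function newSessionKey(): Promise<CryptoKey> {
  // The key would be shared only with authorized devices.
  return crypto.subtle.generateKey({ name: "AES-GCM", length: 256 }, true, ["encrypt", "decrypt"]);
}

async function encryptRecord(
  key: CryptoKey,
  serializedRecord: string,
): Promise<{ iv: Uint8Array; ciphertext: ArrayBuffer }> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per encryption
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(serializedRecord),
  );
  return { iv, ciphertext };
}

async function decryptRecord(key: CryptoKey, iv: Uint8Array, ciphertext: ArrayBuffer): Promise<string> {
  const plaintext = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);
  return new TextDecoder().decode(plaintext);
}
```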
- FIG. 2 is an exemplary sequence diagram for suspending ongoing collaborative work using the online collaboration recording system 100 , in accordance with an embodiment of the present disclosure.
- the computing arrangement 102 disables performing local operations and thereafter, sends requests to the plurality of devices 104 to suspend collaborative work, via the communication network 106 .
- the plurality of devices 104 disable performing of local operations, and thereafter, send responses for said request to the computing arrangement 102 , via the communication network 106 .
- the computing arrangement 102 waits to establish a new collaboration session until it receives responses from each device of the plurality of devices 104 .
- the computing arrangement 102 suspends all ongoing collaborative work.
- FIG. 2 is merely an example, which should not unduly limit the scope of the claims herein.
- a person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure.
- FIG. 3 is an exemplary sequence diagram for resuming collaborative work using the online collaboration recording system 100 , in accordance with an embodiment of the present disclosure.
- the computing arrangement 102 resets its operation list and creates “resume work” requests to be sent to the plurality of devices 104 , and enables performing local operations.
- Such “resume work” requests are sent from the computing arrangement 102 to the plurality of devices 104 via the communication network 106 .
- the plurality of devices 104 execute the “resume work” requests and reset their operation lists. Thereafter, the plurality of devices 104 enable performing local operations. Once all devices enable performing local operations, collaborative work can be performed on the digital project.
- FIG. 3 is merely an example, which should not unduly limit the scope of the claims herein.
- a person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure.
- the computing arrangement 102 aggregates updates to a plurality of digital projects that are sent to a first group of devices connected to the computing arrangement 102 until a new device sends an open project confirmation request to the computing arrangement 102 .
- a new device from the second group sends the open project confirmation request after receiving and loading the digital project into memory.
- the computing arrangement 102 sends all project updates to said new device followed by “resume work” request.
- the new device executes all project updates as received.
- upon receiving the “resume work” request, the new device is considered as fully connected with the up-to-date digital project, which gives an end-user associated with the new device the ability to perform collaborative work on the digital project.
- FIG. 4 illustrates an exemplary operation object 400 , in accordance with an embodiment of the present disclosure.
- the exemplary operation object 400 details various states of at least one operation that is to be performed on a given object of the digital project.
- the at least one operation pertains to the at least one collaboration action that is to be performed for the given object.
- the given object undergoes a state transition from one state to another state.
- such a state transition is communicated (for example, using a WebRTC data channel) from a first device (namely, an initiating device of a user who initiates the given operation) to a second device (namely, a receiving device of a user who receives an update of the given operation) by sending an operation message with information specific to state transition and operation type pair that would allow for recreation of the given operation by the second user.
- the exemplary operation object 400 depicts five states of an operation (such as preparing, readying, executing, cancelling and finishing) that is to be performed on the given object of the digital project.
- the receiving device is a device among the plurality of devices 104 , or the computing arrangement 102 .
- FIG. 4 is merely an example, which should not unduly limit the scope of the claims herein. A person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure.
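- a minimal sketch of such an operation state machine, assuming the transitions implied by the description (an operation is prepared, becomes ready once conflicts are resolved and the object is locked, executes, and then either finishes or is cancelled), is given below; the class and state names are illustrative:

```typescript
// Hypothetical state machine for an operation object acting as a proxy for a collaboration action.
type OperationState = "preparing" | "ready" | "executing" | "cancelled" | "finished";

const allowedTransitions: Record<OperationState, OperationState[]> = {
  preparing: ["ready", "cancelled"],     // preparation fails if the target object cannot be locked
  ready: ["executing", "cancelled"],
  executing: ["finished", "cancelled"],
  cancelled: [],
  finished: [],
};

class Operation {
  state: OperationState = "preparing";

  constructor(public readonly type: string, public readonly targetObjectId: string) {}

  transition(next: OperationState): void {
    if (!allowedTransitions[this.state].includes(next)) {
      throw new Error(`invalid transition from ${this.state} to ${next}`);
    }
    this.state = next;
    // Each transition would be communicated to peers (e.g. over a WebRTC data channel) as an
    // operation message carrying the state transition and operation type pair.
  }
}
```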
- FIGS. 5A and 5B illustrate an exemplary sequence diagram for a given operation transfer between the initiating device 502 and the receiving device whilst performing collaborative work, from a perspective of the initiating device, in accordance with an embodiment of the present disclosure.
- the initiating device 502 creates an operation that is to be performed upon the given object, based upon an input from the user associated with the initiating device 502 .
- An operation-specific setup is prepared and validated at the initiating device 502 . If the operation-specific setup is unable to lock the object, the operation is aborted. If the operation-specific setup is able to lock the object, the operation is said to be in ready state.
- the initiating device 502 transmits the operation-specific setup to the receiving device via the communication network 106 , whilst also starting execution of the operation locally. The operation stays in executing state until it attains completion (namely, finishing) or is cancelled.
- FIGS. 5A and 5B are merely examples, which should not unduly limit the scope of the claims herein. A person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure.
- FIGS. 6A and 6B illustrate an exemplary sequence diagram for the given operation transfer of FIGS. 5A and 5B , from a perspective of the receiving device 602 , in accordance with an embodiment of the present disclosure.
- the receiving device 602 receives the operation-specific setup transmitted by the initiating device 502 , via the communication network 106 .
- a state of a remote operation changes to ‘preparing’.
- the remote operation is checked for possible conflicts, and its status changes to ‘ready’ when all conflicts (if any) are resolved.
- the receiving device 602 begins executing the remote operation.
- the remote operation stays in executing state until it attains completion (namely, finishing) or is cancelled.
- FIGS. 6A and 6B are merely examples, which should not unduly limit the scope of the claims herein. A person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure.
- FIG. 7 is an exemplary sequence diagram for recording the at least one modified object to compile the temporal record, in accordance with an embodiment of the present disclosure.
- a recording request is sent from the computing arrangement 102 to the plurality of devices 104 , via the communication network 106 .
- work is suspended within the online collaboration session.
- a recording timer is started and recording begins at the plurality of devices 104 .
- work is resumed within the online collaboration session.
- Collaborative work is performed whilst being recorded until a request to stop recording is sent from the computing arrangement 102 to the plurality of devices 104 .
- recording is suspended and collaborative work on the digital project is resumed without recording.
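- the recording flow of FIG. 7 might be coordinated roughly as follows; the message names, transport abstraction and timer handling are assumptions made for illustration:

```typescript
// Hypothetical coordination of recording: the computing arrangement asks all devices to suspend
// work, starts a recording timer, lets work resume while recording, and later stops recording.
type ControlMessage = { kind: "suspend" | "resume" | "startRecording" | "stopRecording" };

interface DeviceChannel {
  send(message: ControlMessage): Promise<void>;
}

async function broadcast(devices: DeviceChannel[], message: ControlMessage): Promise<void> {
  await Promise.all(devices.map((device) => device.send(message)));
}

async function recordSession(devices: DeviceChannel[], stopRequested: Promise<void>): Promise<number> {
  await broadcast(devices, { kind: "suspend" });        // suspend ongoing collaborative work
  const startedAt = Date.now();                         // start the recording timer
  await broadcast(devices, { kind: "startRecording" }); // devices begin recording locally
  await broadcast(devices, { kind: "resume" });         // work resumes, now being recorded
  await stopRequested;                                  // wait until a stop is requested
  await broadcast(devices, { kind: "stopRecording" });  // recording stops, work continues unrecorded
  return Date.now() - startedAt;                        // recorded duration in milliseconds
}
```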
- FIG. 8 illustrates steps of a method 800 for recording an online collaboration session, in accordance with an embodiment of the present disclosure.
- the online collaboration session is established to allow for performing simultaneous collaborative work on a digital project, the digital project comprising at least one object, wherein the digital project is shared and simultaneously modified between a plurality of users.
- the first user input is received from one of the plurality of users and based thereon the at least one object is modified to form at least one modified object.
- the at least one modified object is recorded temporally to compile the temporal record.
- the temporal record is synchronized amongst the plurality of users.
- steps 802 to 808 are only illustrative and other alternatives can also be provided where one or more steps are added, one or more steps are removed, or one or more steps are provided in a different sequence without departing from the scope of the claims herein.
- FIG. 9 illustrates a formation of a chain of changes in objects' properties, which provides further details of the embodiments of the present disclosure.
- FIGS. 10A, 10B and 10C are exemplary illustrations of digital project contents, of a camera viewfinder frame 1002 as visible from a user viewport 1004 of a first user (i.e., a host), and of the camera viewfinder frame 1002 as visible from a user viewport 1006 of a second user, in accordance with an embodiment of the present disclosure.
- FIG. 10A is an exemplary representation of 2-D digital project contents.
- a bounding frame 1008 of the digital project contents might be artificial since the digital project may be infinite in 2-D space, but in the present exemplary illustration, no additional content lies outside the bounding frame 1008 of the digital project contents.
- the digital project comprises a bar chart object, a table object, and a list object.
- the camera viewfinder frame 1002 shows a frame which covers the table object.
- FIG. 10B represents the user viewport 1004 of the host, as viewed by the host only, wherein the user viewport 1004 comprises the table object and the list object.
- the user viewport 1004 of the host also covers the camera viewfinder frame 1002 .
- FIG. 10C represents the user viewport 1006 of the second user, as viewed by the second user only, which covers the bar chart object and the table object, wherein the table object is not entirely within the user viewport 1006.
- the shaded portion of the table object is not visible to the second user but is still covered by the camera viewfinder frame 1002.
- FIG. 10A further represents a temporal record that is recorded during an online collaborative session involving users operating in the user viewport 1004 of the host and the user viewport 1006 of the second user, wherein the bounding frame 1008 of the digital project contents of said temporal record comprises recordings of all objects and their state changes.
- the state changes are recorded regardless of any specific user viewport or camera viewfinder frame location, but the camera viewfinder frame 1002 describes a specific area of the digital project to be shown while replaying the project on user request, or when converting such a recording to a video format.
- in existing screen-capture-based approaches, visual content lying outside the bounds of the user viewport 1004 is not visible to the host and is therefore not recorded when recording collaborative sessions.
- accordingly, the bar chart object (shown in FIG. 10C) and its state changes would not have been recorded by existing collaboration systems.
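- A minimal sketch of the 2-D geometry of FIGS. 10A-10C is given below, assuming axis-aligned rectangles. It illustrates that an object is recorded regardless of viewports, while the camera viewfinder frame 1002 merely selects what is shown on replay or video conversion; the Rect type and helper names are assumptions for this example.

```typescript
// Sketch of the 2-D geometry in FIGS. 10A-10C: everything inside the bounding
// frame is recorded, while the camera viewfinder frame only selects what is
// shown on replay or video conversion. Rect and helper names are assumptions.
interface Rect { x: number; y: number; width: number; height: number; }

function intersects(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

/** Objects are always recorded; visibility in a given viewport or in the
 *  camera viewfinder frame is decided only at display or replay time. */
function visibleIn(frame: Rect, objectBounds: Rect): boolean {
  return intersects(frame, objectBounds);
}

// Example mirroring FIG. 10C: the table object is partly outside the second
// user's viewport but still inside the camera viewfinder frame.
const viewfinder: Rect = { x: 40, y: 20, width: 60, height: 40 };
const secondUserViewport: Rect = { x: 0, y: 0, width: 70, height: 50 };
const tableObject: Rect = { x: 50, y: 25, width: 40, height: 30 };

console.log(visibleIn(viewfinder, tableObject));         // true -> shown on replay
console.log(visibleIn(secondUserViewport, tableObject)); // true, but only partially visible on screen
```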
- the embodiments of the present disclosure beneficially enable:
- the method 800 further comprises:
- editing the temporal record comprises any one of:
- the at least one object or at least one modified object comprises one or more properties
- the one or more properties comprise one or more of an on-screen position, an on-screen size and content of the at least one object or at least one modified object.
- the content of the at least one object or at least one modified object comprises a set of temporal changes in properties of the at least one object or properties of the at least one modified object over a recorded period of time and/or one or more of a video file or an audio file.
- the at least one object or the at least one modified object is stored at a local data storage or a remote data storage as a set of objects or modified objects and temporal changes to each of the objects and modified objects.
- the method 800 further comprises encrypting the temporal record prior to synchronizing the temporal record with the plurality of devices.
- the method 800 further comprises decrypting the temporal record after synchronizing the temporal record with the plurality of devices.
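- As one possible (non-limiting) realization of the encryption and decryption steps above, the temporal record could be serialized and protected with AES-GCM via the standard Web Crypto API. The TemporalRecord shape is an assumption, and key generation and exchange between the plurality of devices are out of scope of this sketch.

```typescript
// A minimal sketch of encrypting a serialized temporal record before it is
// synchronized, using the standard Web Crypto API with AES-GCM. The
// TemporalRecord shape is an assumption; key exchange is not shown.
type TemporalRecord = { tracks: unknown[] };

async function encryptRecord(record: TemporalRecord, key: CryptoKey): Promise<{ iv: Uint8Array; ciphertext: ArrayBuffer }> {
  const iv = crypto.getRandomValues(new Uint8Array(12));           // fresh nonce per record
  const plaintext = new TextEncoder().encode(JSON.stringify(record));
  const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plaintext);
  return { iv, ciphertext };
}

async function decryptRecord(payload: { iv: Uint8Array; ciphertext: ArrayBuffer }, key: CryptoKey): Promise<TemporalRecord> {
  const plaintext = await crypto.subtle.decrypt({ name: "AES-GCM", iv: payload.iv }, key, payload.ciphertext);
  return JSON.parse(new TextDecoder().decode(plaintext)) as TemporalRecord;
}
```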
- an embodiment of the present disclosure provides a computer program product comprising instructions to cause the aforementioned online collaboration recording system to carry out the aforementioned method.
- the computer program product comprises a non-transitory machine-readable data storage medium having stored thereon program instructions that, when accessed by the computing arrangement, cause the computing arrangement to execute the aforementioned method.
- the present disclosure provides the aforementioned online collaboration recording system and the aforementioned method for recording an online collaboration session.
- the online collaboration recording system allows for compiling a temporal record of an entirety or a portion of collaborative work performed during the online collaboration session and not simply an end result of such collaborative work.
- a viewer of the temporal record is provided useful contextual information pertaining to the collaborative work performed during the online collaboration session.
- the temporal record is compiled as a core functionality of the online collaboration recording system.
- the online collaboration recording system optionally allows for editing the temporal record by way of object-based editing to modify content of the temporal record.
- the online collaboration recording system provides a single solution for creation, execution, recording, and sharing of the collaborative work between multiple users.
- the aforementioned method is easy to implement, and allows for capturing the online collaboration session in a non-linear manner.
- the temporal record can be compiled by assembling recordings of collaborative work performed at various time instants in a flexible manner (for example, by rearranging such recordings, overriding previously saved recordings, and the like).
- the online collaboration recording system can be easily integrated with existing networks, file storage systems, devices and the like. Therefore, the cost of implementing such a system is nominal.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Tourism & Hospitality (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Marketing (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Economics (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Television Signal Processing For Recording (AREA)
Abstract
Description
- The present disclosure relates generally to collaborative work; and more specifically, to online collaboration recording systems for recording online collaboration sessions, and/or for enabling collaborative work on digital documents with timeline-based data. Furthermore, the present disclosure also relates to methods for recording online collaboration sessions. Moreover, the present disclosure also relates to computer program products comprising instructions to cause the aforesaid systems to carry out the aforesaid methods.
- In recent years, the field of digital collaboration in a shared space has been growing exponentially. Digital collaboration refers to two or more people collaborating or meeting remotely on a digital project, over the shared space, to share documents, messages, or multimedia data. In order to collaborate digitally, each contributor needs to digitally communicate with other contributors, either directly or indirectly. Contributors may connect with each other directly via a peer-to-peer connection model or indirectly via a client-server communication model. Each model has its own set of benefits and deficiencies associated with the various systems and methods of digital collaboration.
- A common example of digital collaboration in a shared space based on a synchronous system and method is online chat. Synchronous systems involve the exchange of information between participants simultaneously and in real time. Another example is web conferencing services, where data shared in real time is lost if not saved before the end of a session. The most common examples of asynchronous systems are forums, blogs, social media and other such digitally shared spaces. Such systems involve the exchange of shared information between an uploader and subsequent visitors.
- Many current (i.e., existing) collaboration systems are beginning to record sessions and to output the recorded sessions for distribution. For example, a recording of a digital collaboration in a shared space is currently outputted as a video stream or as screen recordings. However, there are many shortcomings with such existing recordings of digital collaborations on a project. These shortcomings are compounded in group projects, where it is not possible to determine which individual contributed what content, nor is it possible to attribute content based on the actions of such contributors or users. Complexity further increases with an unlimited digital supply of content and with the use of a variety of content types (e.g., images, videos, PDF, GIF, and other formats) in the shared space during the digital collaboration and its recording with present methods (of video or screen recording).
- Therefore, in light of the foregoing discussion, there is a need to resolve the shortcomings associated with recording digital collaborations on digital projects between multiple users as contributors, and to synchronize such recordings with each individual user's contributions.
- The present disclosure seeks to provide an online collaboration recording system for recording an online collaboration session.
- The present disclosure also seeks to provide a method for recording an online collaboration session.
- The present disclosure also seeks to provide a computer program product comprising instructions to cause the aforesaid system to carry out the aforesaid method.
- The present disclosure provides an at least partial solution to the aforementioned technical problem, or problems, associated with known art. An aim of the present disclosure is to provide a solution that at least partially overcomes the aforementioned technical problem or problems.
- In one aspect, an embodiment of the present disclosure provides an online collaboration recording system for recording an online collaboration session, comprising:
- a computing arrangement in communication with a plurality of devices, wherein each device is accessible by a user during the online collaborating session, wherein in operation the computing arrangement executes instructions to synchronize a digital project in said online collaborating session amongst the plurality of devices, by:
- i) establishing, via a communication module, said online collaboration session for performing collaborative work on the digital project, the digital project comprising at least one object;
- ii) receiving, via the communication module or an input interface, a first user input and based thereon modifying one or more properties of the at least one object to form at least one modified object;
- iii) recording temporally, via a recorder, at least one state of the at least one modified object to compile a temporal record; and
- iv) synchronizing, via the communication module, the temporal record amongst the plurality of devices.
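- For illustration only, the four steps (i)-(iv) above may be composed as sketched below. The CommunicationModule and Recorder interfaces, their method names, and the one-second synchronization interval are assumptions introduced for this example, not part of the claimed system.

```typescript
// An illustrative composition of steps (i)-(iv) above. Module interfaces
// (CommunicationModule, Recorder) and method names are assumptions.
interface CommunicationModule {
  establishSession(projectId: string): Promise<void>;
  onUserInput(handler: (input: UserInput) => void): void;
  broadcast(record: TemporalRecord): Promise<void>;
}
interface Recorder {
  capture(objectId: string, state: unknown, atMs: number): void;
  compile(): TemporalRecord;
}
type UserInput = { objectId: string; property: string; value: unknown };
type TemporalRecord = { tracks: unknown[] };

async function runCollaboration(comm: CommunicationModule, recorder: Recorder, projectId: string) {
  await comm.establishSession(projectId);                        // (i) establish the session
  comm.onUserInput((input) => {
    const modifiedState = { [input.property]: input.value };     // (ii) modify the object's properties
    recorder.capture(input.objectId, modifiedState, Date.now()); // (iii) record the modified state temporally
  });
  // (iv) periodically synchronize the compiled temporal record amongst the devices
  setInterval(() => void comm.broadcast(recorder.compile()), 1000);
}
```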
- Embodiments of the present disclosure substantially eliminate, or at least partially address, the aforementioned problems in the prior art and provide an online collaboration recording system that allows for creation, execution, recording, and sharing of useful contextual information pertaining to collaborative work performed during an online collaboration session. The online collaboration recording system is easy to integrate with existing computing hardware.
- Optionally, in operation the computing arrangement edits the at least one modified object in the temporal record for outputting an output stream, by:
- v) receiving, via the communication module or the input interface, a second user input;
- vi) editing, via an editor, the temporal record based on the second user input; and
- vii) outputting, via an output interface, the output stream based on the edited temporal record.
- Optionally, the computing arrangement, in operation, edits the temporal record by any one of:
- adding an additional object to the temporal record,
- removing the at least one modified object from the temporal record,
- combining a plurality of modified objects in the temporal record, and
- modifying one or more properties of the at least one modified object in the temporal record.
- Optionally, the at least one object or at least one modified object comprises one or more properties, wherein the one or more properties comprise one or more of an on-screen position, an on-screen size and content of the at least one object or at least one modified object.
- Optionally, the content of the at least one object or at least one modified object comprises a set of temporal changes in properties of the at least one object or properties of the at least one modified object over a recorded period of time and/or one or more of a video file or an audio file.
- Optionally, the at least one object or the at least one modified object is stored at a local data storage or a remote data storage as a set of objects or modified objects and temporal changes to each of the objects and modified objects.
- Optionally, in operation the computing arrangement, via an encryption module, encrypts the temporal record prior to synchronizing the temporal record with the plurality of devices.
- Optionally, in operation the computing arrangement or the plurality of devices, via a decryption module, decrypts the encrypted temporal record after synchronization.
- In another aspect, an embodiment of the present disclosure provides a method for recording an online collaboration session, the method comprising:
- establishing the online collaboration session to allow for performing simultaneous collaborative work on a digital project, the digital project comprising at least one object, wherein the digital project is shared and simultaneously modified between a plurality of users;
- receiving a first user input from one of the plurality of users and based thereon modifying the at least one object to form at least one modified object;
- recording temporally at least one state of the at least one modified object to compile a temporal record;
- synchronizing the temporal record amongst the plurality of users.
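- One hedged way to picture the synchronization step is sketched below: each device contributes its locally recorded tracks, a central point unifies them into one temporal record, and the unified record is pushed back to every device. The DeviceLink and RecordedTrack shapes are assumptions; conflict resolution is intentionally omitted.

```typescript
// A sketch of synchronization via a central point: each device submits its
// locally recorded tracks, the hub unifies them into one temporal record, and
// the record is broadcast back to every device. Shapes are assumptions.
interface RecordedTrack { objectId: string; property: string; frames: { timeMs: number; state: unknown }[]; }
interface TemporalRecord { tracks: RecordedTrack[]; }
interface DeviceLink {
  deviceId: string;
  pull(): Promise<RecordedTrack[]>;
  push(record: TemporalRecord): Promise<void>;
}

async function synchronizeTemporalRecord(devices: DeviceLink[]): Promise<TemporalRecord> {
  // Collect the tracks recorded independently (and possibly desynchronized) on each device.
  const perDevice = await Promise.all(devices.map((d) => d.pull()));

  // Unify into a single temporal record; a fuller implementation would also
  // resolve conflicts and de-duplicate overlapping frames here.
  const record: TemporalRecord = { tracks: perDevice.flat() };

  // Broadcast the up-to-date record so all users share the same state.
  await Promise.all(devices.map((d) => d.push(record)));
  return record;
}
```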
- Optionally, the method further comprises:
- receiving a second user input;
- editing the temporal record based on the second user input; and
- outputting the output stream based on the edited temporal record.
- Optionally, in the method, editing the temporal record comprises any one of:
- adding an additional object to the temporal record,
- removing the at least one modified object from the temporal record,
- combining a plurality of modified objects in the temporal record, and
- modifying one or more properties of the at least one modified object in the temporal record.
- Optionally, in the method, the at least one object or at least one modified object comprises one or more properties, wherein the one or more properties comprise one or more of an on-screen position, an on-screen size and content of the at least one object or at least one modified object.
- Optionally, in the method, the content of the at least one object or at least one modified object comprises a set of temporal changes in properties of the at least one object or properties of the at least one modified object over a recorded period of time and/or one or more of a video file or an audio file.
- Optionally, in the method, the at least one object or the at least one modified object is stored at a local data storage or a remote data storage as a set of objects or modified objects and temporal changes to each of the objects and modified objects.
- Optionally, the method further comprises encrypting the temporal record prior to synchronizing the temporal record with the plurality of devices.
- Optionally, the method further comprises decrypting the temporal record after synchronizing the temporal record with the plurality of devices.
- In yet another aspect, an embodiment of the present disclosure provides a computer program product comprising instructions to cause the aforesaid system to carry out the aforesaid method.
- Additional aspects, advantages, features and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative embodiments construed in conjunction with the appended claims that follow.
- It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the appended claims.
- The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to specific methods and instrumentalities disclosed herein. Moreover, those in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
- Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
- FIG. 1 illustrates an exemplary sequence diagram for establishment of a connection between a computing arrangement and a plurality of devices, in accordance with an embodiment of the present disclosure;
- FIG. 2 illustrates an exemplary sequence diagram for suspending ongoing collaborative work using the online collaboration recording system, in accordance with an embodiment of the present disclosure;
- FIG. 3 illustrates an exemplary sequence diagram for resuming collaborative work using the online collaboration recording system, in accordance with an embodiment of the present disclosure;
- FIG. 4 illustrates an exemplary operation object, in accordance with an embodiment of the present disclosure;
- FIGS. 5A and 5B illustrate an exemplary sequence diagram for a given operation transfer between an initiating device and a receiving device whilst performing collaborative work, from a perspective of the initiating device, in accordance with an embodiment of the present disclosure;
- FIGS. 6A and 6B illustrate an exemplary sequence diagram for the given operation transfer of FIGS. 5A and 5B, from a perspective of the receiving device, in accordance with an embodiment of the present disclosure;
- FIG. 7 is an exemplary sequence diagram for recording at least one modified object to compile the temporal record, in accordance with an embodiment of the present disclosure;
- FIG. 8 illustrates steps of a method for recording an online collaboration session, in accordance with an embodiment of the present disclosure;
- FIG. 9 illustrates a formation of a chain of changes in objects' properties, in accordance with an embodiment of the present disclosure; and
- FIGS. 10A, 10B and 10C are exemplary illustrations of digital project contents, of a camera viewfinder frame visible from a user viewport of a first user (i.e., a host), and of the camera viewfinder frame visible from a user viewport of a second user, in accordance with an embodiment of the present disclosure.
- In the accompanying drawings, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
- The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
-
FIG. 1 is an exemplary sequence diagram for implementation of an onlinecollaboration recording system 100 for recording an online collaboration session, in accordance with an embodiment of the present disclosure. The onlinecollaboration recording system 100 comprises acomputing arrangement 102 in communication with a plurality ofdevices 104. Notably, thecomputing arrangement 102 is communicably coupled to the plurality ofdevices 104 via acommunication network 106. Each device (of the plurality of devices 104) is accessible by a user during the online collaborating session. - Throughout the present disclosure, the term “computing arrangement” refers to hardware, software, firmware and/or any combination thereof, suitable for controlling operation of the online
collaboration recording system 100. Notably, thecomputing arrangement 102 allows for recording the online collaboration session. Optionally, thecomputing arrangement 102 includes an arrangement of one or more computational entities that are capable of performing various computational tasks for operation of the onlinecollaboration recording system 100. - Throughout the present disclosure, the term “device” refers to an electronic device associated with (or used by) a user that is capable of enabling the user to perform specific tasks associated with the online collaboration session. Furthermore, the term “device” is intended to be broadly interpreted to include any electronic device that may be used for voice and/or data communication over the
communication network 106. Optionally, the plurality ofdevices 104 enable a plurality of users associated therewith to join and participate in the online collaboration session. In such a case, the plurality ofdevices 104 provides the plurality of users with an interactive user interface, using which the plurality participates in the online collaboration session. Optionally, the plurality ofdevices 104 comprises an Input/Output module (or I/O module) to enable the users to provide inputs to and receive outputs from the online collaboration session. Example of the plurality ofdevices 104 include but are not limited to, cellular phones, personal digital assistants (PDAs), handheld devices, wireless modems, laptop computers, tablet computers, personal computers, etc. - The term “communication network” refers to an arrangement of interconnected programmable and/or non-programmable components that are configured to facilitate data communication between the plurality of
devices 104 and thecomputing arrangement 102. Furthermore, thecommunication network 106 may include, but is not limited to, one or more peer-to-peer network, a hybrid peer-to-peer network, local area networks (LANs), radio access networks (RANs), metropolitan area networks (MANs), wide area networks (WANs), all or a portion of a public network such as the global computer network known as the Internet, a private network, a cellular network and any other communication system or systems at one or more locations. Additionally, thecommunication network 106 includes wired or wireless communication that can be carried out via any number of known protocols, including, but not limited to, Web Real-Time Communication (WebRTC) protocols, Internet Protocol (IP), Wireless Access Protocol (WAP), Frame Relay, or Asynchronous Transfer Mode (ATM). Moreover, any other suitable protocols using voice, video, data, or combinations thereof, can also be employed. Optionally, thecommunication network 106 is robust and have substantially sufficient bandwidth in order to allow the access of the online collaborating session to the users of each device. - Optionally, the
communication network 106 has a star topology. In the star topology, each device (of the plurality of devices 104) is connected to thecomputing arrangement 102 and thecomputing arrangement 102 acts as a central hub or host for facilitating communication between the plurality ofdevices 104. Therefore, in such a case, a given device is communicably coupled to another device in an indirect manner (namely, via the computing arrangement 102). - It will be appreciated that the aforementioned star topology can be implemented in several ways. In an example, the
computing arrangement 102 can be implemented by way of at least one device. In such a case, the central hub or host runs on the at least one device amongst the plurality of devices, thereby enabling thedigital collaboration system 100 to work as an ad-hoc session or a meeting solution. In another example, thecomputing arrangement 102 is implemented by way of a server, more specifically a back-end server. In such a case, the back-end server functions as the central hub. In such a case, the onlinecollaboration recording system 100 works as a cloud or persistent document provider solution. Furthermore, in such a case, the back-end server is coupled in communication to the plurality ofdevices 104, via thecommunication network 106. - Optionally, the
communication network 106 employs WebRTC technology to facilitate communication between thecomputing arrangement 102 and the plurality ofdevices 104. Optionally, in this regard, at least one signaling server establishes communication between thecomputing arrangement 102 and the plurality ofdevices 104 by way of WebRTC signaling. - Optionally, each of the plurality of
devices 104 comprise a corresponding communication module to establish the online collaboration session. It will be appreciated that the communication modules of the plurality ofdevices 104 are compatible with a communication module of thecomputing arrangement 102, for enabling proper communication within the onlinecollaboration recording system 100. Furthermore, all communication modules (of the plurality ofdevices 104, as well as the computing arrangement 102) are compatible with the WebRTC technology. Such communication modules may also be referred to as “Synchronization Engine”. - As an example, in the online
collaboration recording system 100 at least one signaling server room (hereinafter, referred to as “virtual room”) and the plurality of users are discovered, and thecommunication network 106 employs the WebRTC PeerConnection to facilitate real time communication transport between thecomputing arrangement 102 and the plurality ofdevices 104. In such an example, the WebRTC PeerConnection enables the online collaboration session between the central hub and the plurality ofdevices 104, via the at least one signaling server. Furthermore, the WebRTC PeerConnection employs Web Sockets as the at least one signaling server. Moreover, in the real-time the WebRTC data channels are employed for exchanging data, actions and control messages (for example, such as file-transfer) between peers. Furthermore, WebRTC MediaTracks are employed for real-time media (for example, such as audio and/or video). As shown, thecommunication network 106 is responsible for mixing, forwarding and recording media from and/or to all peers. Thecommunication network 106 establishes signaling connection between thecomputing arrangement 102 and the plurality ofdevices 104, via the at least one signaling server. As a result, the virtual room is created, wherein each virtual room has a unique identifier used by other peers to join it. Furthermore, the unique identifier related to a given virtual room can be employed by the plurality of users to join the online collaboration session. - Optionally, upon successful establishment of communication between the
computing arrangement 102 and the plurality ofdevices 104, thecomputing arrangement 102 executes instructions to start a collaborative work or to keep continuing currently ongoing collaborative work. It will be appreciated that the currently ongoing collaborative work would not be suspended before establishing said online collaboration session to allow for users of the plurality ofdevices 104 to complete any currently ongoing work, prior to joining the online collaborating session for collaboratively working on the digital project. Alternatively, optionally, upon successful establishment of communication between thecomputing arrangement 102 and the plurality ofdevices 104, thecomputing arrangement 102 executes instructions to suspend currently ongoing collaborative work. More details pertaining to such suspension of the currently ongoing collaborative work have been described herein later in conjunction withFIG. 2 . - In operation, the
computing arrangement 102 executes instructions to synchronize a digital project in said online collaborating session amongst the plurality ofdevices 104, by: - (i) establishing, via a communication module, said online collaboration session to allow for performing collaborative work on the digital project, the digital project comprising at least one object;
- (ii) receiving, via the communication module or an input interface, a first user input and based thereon modifying one or more properties of the at least one object to form at least one modified object;
- (iii) recording temporally, via a recorder, at least one state of the at least one modified object to compile a temporal record; and
- (iv) synchronizing, via the communication module, the temporal record amongst the plurality of
devices 104. - Throughout the present disclosure, the term “online collaboration session” refers to a communication session that is temporarily established between the
computing arrangement 102 and the plurality ofdevices 104 for facilitating interactive exchange of information between the plurality ofdevices 104. Such interactive exchange of information between the plurality ofdevices 104 pertains to collaborative work that is to be performed on the digital project, by the plurality of users. Notably, in the online collaboration session, the plurality ofdevices 104 communicate with each other via messages and optionally, responses to said messages. Furthermore, in the online collaboration session, thecomputing arrangement 102 detects and manages conflicts between the plurality ofdevices 104. - Optionally, when said online collaboration session is established at (i), the plurality of users join a virtual room (or a network-based room) for performing collaborative work on the digital project. Optionally, in such a case, the plurality of users perform said collaborative work in real time or near real time.
- Throughout the present disclosure, the term “collaborative work” refers to simultaneous working (for example, by way of editing content, creating content, deleting content, and the like) of the plurality of users on a given part or an entirety of the digital project. Simply put, the collaborative work relates to performing changes to the given part of the digital project over time, upon collaboration of the plurality of users. In such a case, the changes implemented on the digital project, by a given user can be recorded and are shared with the remaining users working on the same digital project, simultaneously.
- Throughout the present disclosure, the term “digital project” refers to a computer-based project upon which the plurality of users collaborate to perform meaningful work. Such a digital project could pertain to a number of domains including, but not limited to, business, education, military, medical science. In an example, the digital project can be a video project related to marketing of a product or a business. In another example, the digital project can be an audio-visual project related to demonstrating a technique for research project. In yet another example, the digital project can be a digital presentation related to historical facts. In still another example, the digital project can be a digital document related to findings of a scientific experiment.
- Throughout the present disclosure, the term “object” refers to a data construct of the digital project upon which collaborative work is to be performed by the plurality of users. By “at least one” it is meant that in some implementations, the digital project comprises a single object whereas in other implementations, the digital project comprises a plurality of objects. Moreover, by way of the online
collaboration recording system 100, multiple users can synchronously or asynchronously collaborate for working on the given digital project. Examples of at least one object include, but are not limited to, time object, audio object, image object, text object, drawing object. - Optionally, the at least one object is a camera object, wherein the camera object comprises a recordable property describing camera viewfinder location. The term “camera viewfinder location” refers to location data (specific to type of the digital project) allowing for determining which part of the digital project is covered by an abstract camera. Furthermore, in an exemplary 2-D application, a camera viewfinder frame is an on-screen rectangle used to determine or inform an area of the digital project that is visible while replaying recording or after conversion of said recording to at least one video file. The camera viewfinder frame is independent of the user viewport, but may be synchronized together using at least one of: the camera viewfinder frame may follow the user viewport so as to record similar to local screen capture of the digital project; the user viewport may follow another user viewport, wherein output is up to user; or maybe a combination of the two. Furthermore, the recordable property of the camera viewfinder location describes location of the user viewport in the digital project. Additionally, the recordable property of the camera viewfinder location is dependent on nature of the digital project, such as for example, visible rectangle in a whiteboard or any two-dimensional (2-D) infinite space (i.e., when the camera viewfinder location becomes the camera viewfinder frame), page in a document, slide on presentation, and/or combination of position, orientation vector and field of view in three-dimensional (3-D) space. Furthermore, the camera viewfinder location and the user viewport have been described herein later in conjunction with
FIG. 10 . - Optionally, for synchronously collaborating on the given project, the plurality of users perform collaborative work at the same time. In such a case, any change to the digital project made by a user would be visible to all other users in real time. Optionally, for asynchronously collaborating on the given project, the plurality of users perform collaborative work at different times, wherein the plurality of users have ability to provide simultaneous input. Herein, the simultaneous input may be collected independently on various devices, with all manipulations (i.e., changes) being collected asynchronously and joined together to form a chain of activities. In such a case, any change to the digital project made by a user would be transmitted to all other users in real time, and would be visible to such users when they choose to work on the digital project.
- It will be appreciated that said online
collaboration recording system 100 is not limited to a session with objects in two-dimensional (2-D) infinite space as a same arrangement and set of rules applies for a three-dimensional (3-D) space, wherein the 3-D space is an extension of space and object properties. In particular, the onlinecollaboration recording system 100 is flexible enough to be applied to a broad range of digital projects such as, but not limited to, whiteboards, text files, spreadsheets, video projects. - Optionally, upon successful establishment of the online collaboration session, the
computing arrangement 102 executes instructions to resume the collaborative work on the digital project. More details pertaining to resuming the collaborative work on the digital project have been described herein later in conjunction withFIG. 3 . - At (ii), the first user input is received via the communication module or an input interface. Notably, based upon the first user input, one or more properties of the at least one object are modified to form at least one modified object. The first user input could be in form of a touch input, a voice input, a digital command, a gesture input, and the like. Upon performing such modification on the at least one object, properties of the at least one change and such object becomes the at least one modified object. Therefore, the term “modified object” refers to an object whose properties are modified according to the first user input. It will be appreciated that at a given time, the digital project may include multiple objects but only some objects among such multiple objects may be modified in the aforesaid manner. Such modification of the at least one object can be understood to be a “collaboration action” pertaining to the online collaboration session.
- Optionally, upon receiving the first user input, new data might be added, or existing data might be altered for at least one object. Herein, one or more properties referring to the new data may be modified to form at least one modified state in at least one modified object. In an example, a collaborative session adds new data, wherein the new data comprises recording voice of any given user. Herein, the new data is in form of waveform bytes which are stored in a file on a disk. Subsequently, such a file is represented by an audio object comprising a current time recordable property, denoted as “current time” which refers to a point in time in the audio file when the new data is inserted.
- Optionally, at a given time, only a single collaboration action is performed within the online collaboration session. Alternatively, optionally, at a given time, a plurality of collaboration actions are performed within the online collaboration session.
- Optionally, any collaboration action pertaining to the online collaboration session uses an operation object as a proxy. In this regard, the operation object is a state machine object that provides a generic mechanism for implementing at least one collaboration action in the digital project. Optionally, the operation object is created as a result of end-user input, but could also be created by automated test bot or specifically tailored bot. In such a case, the automated test bot or specifically tailored bot is also coupled in communication with the
computing arrangement 102 via thecommunication network 106. More details of the operation object are elucidated herein later in conjunction withFIG. 4 . - In an embodiment, when the
computing arrangement 102 is implemented as a back-end server, the first user input is received via the communication module. In another embodiment, when at least one computing module of thecomputing arrangement 102 is implemented at one or more of the plurality ofdevices 104, the first user input is received via the input interface of said one or more of plurality ofdevices 104. - Optionally, the at least one object or at least one modified object comprises one or more properties, the one or more properties comprises one or more of an on-screen position, on screen size and content of the at least one object or at least one modified object. The term, “property” refers to an attribute associated with the at least one object or the at least one modified object of the digital project pertaining to which collaborative work is performed by the plurality of users during the online collaboration session. Notably, the one or more properties of the at least one object or at least one modified object are well-defined. Examples of one or more properties may include, but are not limited to on-screen position, on-screen size and content (for example, such as image, audio and the like) of the at least one object or at least one modified object.
- Optionally, the at least one object or at least one modified object comprises one or more recordable properties. Optionally, the one or more recordable properties vary with respect to time. In an example, the one or more recordable properties may relate to an on-screen position of the at least one object or the at least one modified object.
- Optionally, the content of the at least one object or at least one modified object comprises a set of temporal changes in properties of the at least one object or properties of the at least one modified object over a recorded period of time and/or one or more of a video file or an audio file. In such a case, the content of the at least one object or at least one modified object refers to a given data which is a part of the collaborative work. Such change in the content is additionally reflected in related state of the at least one object or the at least one modified object which might be recordable. Furthermore, such state is designed to reflect characteristics (i.e., attributes, or condition) of the content at any given point in recording time which allows for effective playback of changes made to content across time. In an example, the at least one object may be a drawing object, wherein the content of the drawing object is an ordered set of lines and points. Furthermore, new content is added to the drawing object, wherein the new content comprises a new line or new points. Additionally, the drawing object further comprises a recordable state, wherein the recordable state is a pair of indices pointing to the ordered set of lines and points as well as the new line and the new point in the last visible line. In an example, the content of the at least one object or at least one modified object comprises one or more of the video files for example, such as a video data or the audio file for example, such as a voice data.
- Alternatively, optionally, the content of the at least one object or at least one modified object comprises an image file. In such a case, the content of the at least one object or at least one modified object includes images, graphics and the like.
- Yet alternatively, optionally, the content of the at least one object or at least one modified object comprises a text file. In such a case, the content of the at least one object or at least one modified object includes textual data, spreadsheet documents and the like.
- Optionally, the at least one object or the at least one modified object is stored at a local data storage or a remote data storage as a set of objects or modified objects and temporal changes to each of the objects and modified objects. In an example, the local data storage as the set of objects or the modified objects and temporal changes to each of the objects and the modified objects can be implemented by a way of memory unit associated with operation-initiating device and/or memory unit associated with operation-receiving device. In another example, the remote data storage can be implemented by a way of memory module of the
computing arrangement 102. In such a case, the memory module ofcomputing arrangement 102 may be a memory unit ofcomputing arrangement 102 or a database arrangement coupled in communication withcomputing arrangement 102. In yet another example, the remote data storage can be implemented by a way of cloud server arrangement communicably coupled to the onlinecollaboration recording system 100. - Optionally, the digital project further comprises at least one track, wherein a given track represents how the one or more recordable properties of the at least one object or at least one modified object vary with respect to time. In other words, a given track represents changes of a given recordable property of a given object or a given modified object. Such changes are made by the plurality of users working in the digital project during the online collaboration session.
- Optionally, the digital project further comprises at least one asset, wherein a given asset represents at least one file used by a given object. Examples of such files include, but are not limited to, the video file, the audio file, the image file, the text file.
- Optionally, a given asset is associated with a single object. Alternatively, optionally, a given asset is associated with a plurality of objects.
- Optionally, the digital project further comprises a time object, wherein the time object represents a time duration of the digital project in form of a plurality of frames. A single frame is a discrete unit of time in which a current state of the one or more recordable properties of the at least one object or the at least one modified object is recorded. Furthermore, optionally, the time object comprises a current frame index.
- Optionally, a given frame type is specific to a given recordable property. Notably, for a given object or a given modified object, the current state of its one or more recordable properties can be stored in a corresponding frame. Such a frame can be referred to as a “record frame”. Optionally, the current state of the one or more recordable properties of the given object or the given modified object is restored from a corresponding record frame into a replay frame. It will be appreciated that by coordinating restoring of recorded state between all objects (whether modified or not) and their recordable properties, the online
collaboration recording system 100 allows for restoring a state of the digital project at any given point in time. This allows to provide “replay” functionality which can be utilized by the plurality of users to view the temporal record of the collaborative work performed on the digital project instead of a final effect of said collaborative work. This provides the plurality of users with useful contextual information pertaining to the collaborative work performed during the online collaboration session. Optionally, the track records the plurality of frames sequentially relative to an abstract timeline. Herein, the time object is abstracted from absolute time. Furthermore, optionally, the time object comprises a current frame index. Moreover, the tracks share a common time base, denoted by “Time 0”, wherein the “Time 0” defines beginning of recording of the track. Additionally, the digital project comprises a state to represent current timeline position of the digital project, denoted by “current time point”, wherein the “current time point” is relative to the “Time 0”. Subsequently, the digital project further comprises recording duration, wherein the recording duration represents the duration of the overall recording relative to “Time 0”. Consequently, later playback and conversion to video of the recorded state changes are in synchronization across multiple objects. Furthermore, consequently, recording may be iterated, wherein new temporal records may be inserted in the middle or at the end of an existing digital project and mixing new temporal records with previous temporal records. In an example, the user may record changes made to a digital document, wherein the digital document is a whiteboard. Herein, the recorded changes made to the digital document may be represented as timeline of changes made during a collaborative session or across multiple collaborative sessions. Furthermore, the time comprises tracks representing changes made to specific objects in time. - Optionally, upon creation of a given project, at least one unique identifier is created for at least one of: the at least one object, the at least one track, the at least one asset.
- In an example, a given digital project may pertain to creating a video to market a product. In such an example, the digital project may comprise the at least one object, the at least one track, the at least one asset. Moreover, each object, track, and asset has its own unique identifier that is created upon creation of said object, track, and asset. For example, the at least one object may comprise an image object having a unique identifier. In such an example, the defined sets of properties associated with the image object may include position (x,y), size (width, height) of the image object, and a corresponding image as asset. Furthermore, in such an example, the position (x,y) is identified as the at least one recordable property which changes over time and the position track is identified as the asset track for the image object.
- At (iii), at least one state of the at least one modified object is recorded temporally, via the recorder, to compile the temporal record. In an embodiment, the “recorder” is implemented by way of hardware, software, firmware, or a combination of these, suitable for compiling the temporal record of the online collaboration session. Moreover, the term “recording” refers to storing the at least one state of one or more recordable properties of the at least one object or at least one modified object in the digital project, over the time duration of the digital project. This is done by utilizing a recording model that is owned by each object. Herein, the state of the at least one modified object is the recordable property of the at least one modified object. Optionally, the change in the one or more recordable properties of the at least one object or at least one modified object with respect to time is recorded as the at least one track of the digital project. Furthermore, the “temporal record” includes evidence of the collaborative work performed on the digital project, during the online collaboration session. Optionally, the temporal record is a compilation of the at least one track of the digital project. It will be appreciated that authentic moments of ideation and understanding during the online collaboration session are recorded by way of the temporal record.
- Optionally, the process of temporally recording the at least one modified object is related to state transitions of the operation object. Each operation declares which recordable properties of objects it operates upon are changed due to its execution. This means that recording changes of these properties may start when operation transitions to “started” state and may finish when operation transitions to “finished” state.
- Optionally, the process of temporally recording the at least one modified object is related to capturing states and changes within corresponding properties of the at least one object or the at least one modified object. Herein, recordable properties of the at least one object or the at least one modified object is the state of the object. Notably, for every pair of the at least one object or the at least one modified object and the captured state, a recording track is created. Herein, the recording track represents changes made to a given state of the at least one object or the at least one modified object. Additionally, all changes to the state of the at least one object or the at least one modified object are saved in a frame, wherein the frame is specific to a given type of the state. For instance, the type of state for a two-dimensional (2-D) object may be for transformation, denoted as “transform”; the type of state for a resizable 2-D object may be size of the 2-D object, denoted as “size”; the type of state for the drawing object may be range of visible drawing of the drawing object, denoted as “range of visible drawing”; the type of state for animated object may be current frame of the animated object, denoted “current frame”; and the type of state for multimedia object may be current time of the multimedia object, denoted as “current time”. Herein, for all 2-D objects, transformation of all the 2-D objects are recorded as six floating point numbers which are used to describe position, scale and rotation of all the 2-D objects; for resizable 2-D objects, size is recorded as two floating point numbers representing width and height of the 2-D objects; for the drawing object, the range of visible drawing is recorded as two integers, where a first integer may be number of visible lines, and a second integer may be number of visible points in last visible line; for the animated objects, current frame may be recorded as a third integer, where the third integer describes index of currently visible frame of the animated object; for multimedia objects, current time is recorded as one floating point number that describes time in timeline of the multimedia files which is played currently. Herein, each of the at least one object and/or the at least one modified object define their own set of recordable properties, thereby known as type of the state. Notably, the temporal record is a stream of changes to the at least one object or the at least one modified object occurring over duration of the recording, that are initiated independently by multiple users. Thereby, the recording is performed less rigidly as there is digital collaboration with the multiple users as contributors. Furthermore, this allows independent input and individual user's contribution along with allowing multiple viewpoints while recording.
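- The per-state frame payloads listed above may be sketched as a discriminated union, as below. The field names are assumptions; the number and kind of recorded values follow the description (for example, six floating point numbers for a 2-D transform).

```typescript
// Sketch of the per-state frame payloads as a discriminated union. Field names
// are assumptions; the value counts follow the description above.
type StateFrame =
  | { kind: "transform"; values: [number, number, number, number, number, number] } // position, scale, rotation
  | { kind: "size"; width: number; height: number }
  | { kind: "range-of-visible-drawing"; visibleLines: number; visiblePointsInLastLine: number }
  | { kind: "current-frame"; frameIndex: number }        // animated objects
  | { kind: "current-time"; mediaTimeSeconds: number };  // audio/video objects

function describe(frame: StateFrame): string {
  switch (frame.kind) {
    case "transform": return `transform(${frame.values.join(", ")})`;
    case "size": return `size ${frame.width}x${frame.height}`;
    case "range-of-visible-drawing": return `${frame.visibleLines} lines, ${frame.visiblePointsInLastLine} points in last line`;
    case "current-frame": return `frame #${frame.frameIndex}`;
    case "current-time": return `${frame.mediaTimeSeconds}s into media`;
  }
}
```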
- In an embodiment, the at least one object or the at least one modified object and the state of the at least one object or the at least one modified object might not always be mapped one to one. In an exemplary scenario, the at least one object may be a drawing object comprising object data and recorded state. Herein, object data of the drawing object comprises a vector of drawn lines and their respective points. Furthermore, in order to record the process of drawing the number of lines and points must be known and a last drawn line must be visible at any given time. Thereby, the state of the drawing object consists of data which acts as the state of the object describing recordable property of the drawing of the drawing object.
- Optionally, the process of temporally recording the at least one modified object is performed in a desynchronized manner. In other words, each device records a state of the digital project independently of other devices.
- Optionally, the process of temporally recording the at least one object or the at least one modified object is flexible in nature. Herein, past sections of the temporal record may be changed at any given moment, leading to multiple temporal records. Furthermore, a state of every frame is defined by a chain of recorded changes applied to the existing at least one object or the at least one modified object represented in said frame, thereby forming another temporal record where the at least one modified object reacts to changes performed in the past sections. Consequently, a final recorded state is a combination of multiple temporal records, and might be changed later upon playback.
- Optionally, the process of temporally recording the at least one object or the at least one modified object comprises receiving multiple inputs from multiple users at multiple times. These multiple inputs pertain to changing of the state of the at least one object or the at least one modified object. Herein, the recording is multi-layered, as multiple inputs may be provided at multiple times by re-running recording sequence and adding additional modification to the objects. Furthermore, the voice input and/or the video stream of any user may be recorded as at least one object or as the at least one modified object forming additional layers of the captured states.
- At (iv), the temporal record is synchronized, via the communication module, amongst the plurality of
devices 104. By “synchronizing the temporal record” it is meant that the temporal record is communicated to all users performing the collaborative work on the digital project substantially simultaneously. In other words, “synchronizing the temporal record” pertains to sharing the temporal record between all users working in the online collaborating session at the same time. As a result, said users have an up-to date record of the collaborative work that is performed on the project. This helps said users to be on the same page regarding progress of work on the digital project for collaborating in a very efficient manner. It will be appreciated that the onlinecollaboration recording system 100 serves as an up-to date whiteboard whereupon said users can collaborate efficiently for continuous development and feedback pertaining to the digital project. - Optionally, the temporal record is synchronized by way of the
computing arrangement 102. In such a case, the plurality of devices 104 transmit their recorded changes to the at least one object (which are optionally recorded in the form of tracks) to the computing arrangement 102, whereat such data is unified to compile the temporal record. Thereafter, the temporal record is synchronously transmitted to the plurality of devices 104.
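One possible shape of this unification step is sketched below, assuming that frames are merged per object and state type and ordered by timestamp; this merge policy and the type names are illustrative assumptions rather than requirements of the disclosure.

```typescript
interface Frame { timestampMs: number; state: unknown }
interface Track { objectId: string; stateType: string; frames: Frame[] }
interface TemporalRecord { tracks: Track[] }

// Merge the tracks uploaded by all devices into one temporal record.
// Frames recorded independently for the same object and state type are
// interleaved by timestamp so every user's contribution is preserved.
function unifyTemporalRecord(uploads: Track[][]): TemporalRecord {
  const merged = new Map<string, Track>();
  for (const deviceTracks of uploads) {
    for (const track of deviceTracks) {
      const key = `${track.objectId}:${track.stateType}`;
      const target = merged.get(key) ?? { objectId: track.objectId, stateType: track.stateType, frames: [] };
      target.frames.push(...track.frames);
      merged.set(key, target);
    }
  }
  for (const track of merged.values()) {
    track.frames.sort((a, b) => a.timestampMs - b.timestampMs);
  }
  return { tracks: [...merged.values()] };
}
```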
- Optionally, in operation the computing arrangement 102 edits the at least one modified object in the temporal record for outputting an output stream, by: - (v) receiving, via the communication module or the input interface, a second user input;
- (vi) editing, via an editor, the temporal record based on the second user input; and
- (vii) outputting, via an output interface, the output stream based on the edited temporal record. Therefore, the second user input pertains to editing of the temporal record. Such editing of the at least one modified object can be understood to be another "collaboration action" pertaining to the online collaboration session. The output stream comprises the edited temporal record and provides the up-to-date edited temporal record to the plurality of users.
- Optionally, the editing of the temporal record is performed in a non-linear manner. Notably, the temporal record can be compiled by assembling recordings of collaborative work performed at various time instants in a flexible manner (for example, by rearranging such recordings, overriding previously saved recordings, and the like). As a result, the editing need not be done in any time-specific order; any portion of the temporal record may be edited at any time. Beneficially, such editing provides a customizable, well-edited temporal record. Therefore, such temporal records provide the most relevant information associated with the digital project to all the users who have access to the digital project.
- Optionally, the
computing arrangement 102, in operation, edits the temporal record by any one of: -
- adding an additional object to the temporal record,
- removing the at least one modified object from the temporal record,
- combining a plurality of modified objects in the temporal record, and
- modifying one or more properties of the at least one modified object in the temporal record. Such editing operations allow for performing object-based editing to modify the content of the temporal record. Therefore, the temporal record of the collaborative work on the digital project is customizable according to the users' preferences and/or requirements.
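For illustration, the four editing operations listed above could be expressed as object-level functions over a simplified view of the temporal record; the record shape (EditableRecord) and the function names below are assumptions made for this sketch.

```typescript
interface EditableObject { id: string; properties: Record<string, unknown> }
interface EditableRecord { objects: EditableObject[] }

// Add an additional object to the temporal record.
const addObject = (record: EditableRecord, obj: EditableObject): EditableRecord =>
  ({ objects: [...record.objects, obj] });

// Remove a modified object from the temporal record.
const removeObject = (record: EditableRecord, id: string): EditableRecord =>
  ({ objects: record.objects.filter(o => o.id !== id) });

// Combine a plurality of modified objects into a single composite object.
const combineObjects = (record: EditableRecord, ids: string[], combinedId: string): EditableRecord => {
  const merged: EditableObject = {
    id: combinedId,
    properties: Object.assign(
      {},
      ...record.objects.filter(o => ids.includes(o.id)).map(o => o.properties)
    ),
  };
  return { objects: [...record.objects.filter(o => !ids.includes(o.id)), merged] };
};

// Modify one or more properties of a modified object in the temporal record.
const modifyProperties = (record: EditableRecord, id: string, patch: Record<string, unknown>): EditableRecord =>
  ({ objects: record.objects.map(o => (o.id === id ? { ...o, properties: { ...o.properties, ...patch } } : o)) });
```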
- In an example, the temporal record may include three objects, wherein the content of the three objects are three video files. In such a case, one additional object comprising a video file may be added to the temporal record.
- In another example, the temporal record may include five objects, wherein the content of each of the five objects is one audio file. In such a case, two objects may be removed from the temporal record, thereby resulting in three objects having one audio file each.
- In yet another example, the temporal record may include three objects, wherein the content of the two objects is one audio file each and the content of one object is a video file. In such a case, two objects having similar content may be combined in the temporal record.
- In still another example, the temporal record may include two objects, wherein the content of each of the two objects is one video file. The two objects may have different properties, which may be modified in the temporal record for simplification.
- Optionally, the
computing arrangement 102 allows for recording collaborative edits to the digital projects in a format which allows for later playback and conversion to video on demand. Furthermore, the flexible recording format allows objects (such as the camera viewfinder location) to be manipulated after edits are made, and is not dependent on the location of a given user during editing of the digital project. - Optionally, in operation the
computing arrangement 102, via an encryption module, encrypts the temporal record prior to synchronizing the temporal record with the plurality of devices 104. The term "encryption" refers to conversion of the temporal record into a specific code, thereby preventing any unauthorized access to the temporal record. Notably, the temporal record is encrypted, via the encryption module, thereby providing security to the temporal record. In such a case, only authorized users can access the temporal record via their associated devices. As a result, there is provided a solution for safe and secure sharing of the digital project. The encryption can be implemented by various commonly known techniques. Examples of encryption techniques include, but are not limited to, hashing, public-key cryptography and private-key cryptography. - Optionally, in operation the
computing arrangement 102 or the plurality of devices 104, via a decryption module, decrypts the encrypted temporal record after synchronization. When the temporal record is encrypted via the encryption module for providing security to the temporal record, the decryption module is utilized for decrypting the encrypted temporal record after synchronization. Notably, the encrypted temporal record is in the form of the specific code, which is not easily decoded by the user. However, to allow the encrypted temporal record to be understood and subsequently used by the user, there is a need to convert such an encrypted temporal record into a readable format. Therefore, the decryption module is used in order to convert the encrypted temporal record into a readable form.
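As an illustrative sketch of the optional encryption and decryption modules, the example below applies AES-256-GCM from the Node.js crypto module to a serialized temporal record; the choice of cipher, the key handling and the function names are assumptions, since the disclosure only requires conversion of the temporal record into a code that unauthorized users cannot read.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Encrypt a serialized temporal record before it is synchronized.
function encryptTemporalRecord(serialized: Buffer, key: Buffer) {
  const iv = randomBytes(12); // fresh nonce per record
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(serialized), cipher.final()]);
  return { iv, ciphertext, authTag: cipher.getAuthTag() };
}

// Decrypt the synchronized record back into its readable form.
function decryptTemporalRecord(
  payload: { iv: Buffer; ciphertext: Buffer; authTag: Buffer },
  key: Buffer
): Buffer {
  const decipher = createDecipheriv("aes-256-gcm", key, payload.iv);
  decipher.setAuthTag(payload.authTag);
  return Buffer.concat([decipher.update(payload.ciphertext), decipher.final()]);
}
```

In such a sketch, only devices holding the shared 32-byte key can recover the record after synchronization, which corresponds to the access restriction described above.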
FIG. 2 is an exemplary sequence diagram for suspending ongoing collaborative work using the online collaboration recording system 100, in accordance with an embodiment of the present disclosure. In the exemplary sequence diagram, the computing arrangement 102 disables performing local operations and thereafter, sends requests to the plurality of devices 104 to suspend collaborative work, via the communication network 106. Upon receiving the requests to suspend collaborative work, the plurality of devices 104 disable performing of local operations, and thereafter, send responses for said request to the computing arrangement 102, via the communication network 106. The computing arrangement 102 waits to establish a new collaboration session until it receives responses from each device of the plurality of devices 104. Upon receiving responses from each device of the plurality of devices 104, the computing arrangement 102 suspends all ongoing collaborative work.
FIG. 2 is merely an example, which should not unduly limit the scope of the claims herein. A person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure. -
FIG. 3 is an exemplary sequence diagram for resuming collaborative work using the online collaboration recording system 100, in accordance with an embodiment of the present disclosure. In the exemplary sequence diagram, the computing arrangement 102 resets its operation list and creates "resume work" requests to be sent to the plurality of devices 104, and enables performing local operations. Such "resume work" requests are sent from the computing arrangement 102 to the plurality of devices 104 via the communication network 106. The plurality of devices 104 execute the "resume work" requests and reset their operation lists. Thereafter, the plurality of devices 104 enable performing local operations. Once all devices enable performing local operations, collaborative work can be performed on the digital project.
FIG. 3 is merely an example, which should not unduly limit the scope of the claims herein. A person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure. - Optionally, resuming collaborative work using the online
collaboration recording system 100, the computing arrangement 102 aggregates updates to a plurality of digital projects that are sent to a first group of devices connected to the computing arrangement 102 until a new device sends an open project confirmation request to the computing arrangement 102. Such a new device, from a second group of devices, sends the open project confirmation request after receiving and loading the digital project into memory. Upon receiving the open project confirmation request, the computing arrangement 102 sends all project updates to said new device, followed by a "resume work" request. The new device executes all project updates as received. Upon receiving the "resume work" request, the new device is considered fully connected with the up-to-date digital project, which gives an end-user associated with the new device the ability to perform collaborative work on the digital project.
FIG. 4 illustrates an exemplary operation object 400, in accordance with an embodiment of the present disclosure. As shown, the exemplary operation object 400 details various states of at least one operation that is to be performed on a given object of the digital project. Notably, the at least one operation pertains to the at least one collaboration action that is to be performed for the given object. Upon implementation of a given operation, the given object undergoes a state transition from one state to another state. Optionally, such a state transition is communicated (for example, using a WebRTC data channel) from a first device (namely, an initiating device of a user who initiates the given operation) to a second device (namely, a receiving device of a user who receives an update of the given operation) by sending an operation message with information specific to state transition and operation type pair that would allow for recreation of the given operation by the second user. It will be appreciated that since the operation object 400 is generic in nature, it allows for performing discrete operations (for example, such as changing colour of an object) as well as long, continuous operations (for example, such as moving an object by moving a finger on a screen of a device). - The
exemplary operation object 400 depicts five states of an operation (such as preparing, readying, executing, cancelling and finishing) that is to be performed on the given object of the digital project. - Optionally, the receiving device is: a device among the plurality of
devices 104, or the computing arrangement 102.
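A rough TypeScript sketch of the operation life cycle and of an operation message exchanged between the initiating and receiving devices (for example, over a WebRTC data channel) is given below; the field names and the transition table are illustrative assumptions rather than the exact protocol of the disclosure.

```typescript
type OperationState = "preparing" | "readying" | "executing" | "cancelling" | "finishing";

// Message sent from the initiating device so that a receiving device can
// recreate the operation locally.
interface OperationMessage {
  operationId: string;
  operationType: string;            // e.g. "change-colour", "move-object" (hypothetical names)
  objectId: string;
  transition: { from: OperationState; to: OperationState };
  payload: unknown;                 // operation-specific setup data
}

// Allowed state transitions for an operation object (illustrative).
const transitions: Record<OperationState, OperationState[]> = {
  preparing: ["readying", "cancelling"],
  readying: ["executing", "cancelling"],
  executing: ["finishing", "cancelling"],
  cancelling: [],
  finishing: [],
};

function canTransition(from: OperationState, to: OperationState): boolean {
  return transitions[from].includes(to);
}

// Sending the message over an established browser RTCDataChannel.
function sendOperation(channel: RTCDataChannel, message: OperationMessage): void {
  channel.send(JSON.stringify(message));
}
```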
FIG. 4 is merely an example, which should not unduly limit the scope of the claims herein. A person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure. -
FIGS. 5A and 5B illustrate an exemplary sequence diagram for a given operation transfer between the initiating device 502 and the receiving device whilst performing collaborative work, from a perspective of the initiating device, in accordance with an embodiment of the present disclosure. In FIGS. 5A and 5B, the initiating device 502 creates an operation that is to be performed upon the given object, based upon an input from the user associated with the initiating device 502. An operation-specific setup is prepared and validated at the initiating device 502. If the operation-specific setup is unable to lock the object, the operation is aborted. If the operation-specific setup is able to lock the object, the operation is said to be in ready state. The initiating device 502 transmits the operation-specific setup to the receiving device via the communication network 106, whilst also starting execution of the operation locally. The operation stays in executing state until it attains completion (namely, finishing) or is cancelled.
FIGS. 5A and 5B are merely examples, which should not unduly limit the scope of the claims herein. A person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure. -
FIGS. 6A and 6B illustrate an exemplary sequence diagram for the given operation transfer of FIGS. 5A and 5B, from a perspective of the receiving device 602, in accordance with an embodiment of the present disclosure. In FIGS. 6A and 6B, the receiving device 602 receives the operation-specific setup transmitted by the initiating device 502, via the communication network 106. Subsequently, at the receiving device 602, a state of a remote operation changes to 'preparing'. Thereafter, the remote operation is checked for possible conflicts, and its status changes to 'ready' when all conflicts (if any) are resolved. The receiving device 602 begins executing the remote operation. The remote operation stays in executing state until it attains completion (namely, finishing) or is cancelled.
FIGS. 6A and 6B are merely examples, which should not unduly limit the scope of the claims herein. A person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure. -
FIG. 7 is an exemplary sequence diagram for recording the at least one modified object to compile the temporal record, in accordance with an embodiment of the present disclosure. At first, a recording request is sent from the computing arrangement 102 to the plurality of devices 104, via the communication network 106. Subsequently, work is suspended within the online collaboration session. A recording timer is started and recording begins at the plurality of devices 104. Thereafter, work is resumed within the online collaboration session. Collaborative work is performed whilst being recorded until a request to stop recording is sent from the computing arrangement 102 to the plurality of devices 104. Upon this, recording is suspended and collaborative work on the digital project is resumed without recording.
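The request flow of FIG. 7 could be approximated as sketched below, in which the computing arrangement asks every device to start a recording timer and later to stop recording; the message names and the transport abstraction are assumptions made for illustration.

```typescript
type RecordingMessage =
  | { kind: "start-recording"; startedAtMs: number }
  | { kind: "stop-recording" };

// Minimal transport abstraction (e.g. a WebSocket or data-channel wrapper).
interface DeviceChannel {
  send(message: RecordingMessage): Promise<void>;
}

// Ask every connected device to begin recording, perform collaborative work,
// then ask every device to stop; this mirrors the sequence of FIG. 7.
async function runRecordingSession(devices: DeviceChannel[], work: () => Promise<void>): Promise<void> {
  const startedAtMs = Date.now(); // shared origin for the recording timer
  await Promise.all(devices.map(d => d.send({ kind: "start-recording", startedAtMs })));
  await work();                   // collaborative work is recorded on each device
  await Promise.all(devices.map(d => d.send({ kind: "stop-recording" })));
}
```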
FIG. 8 illustrates steps of a method 800 for recording an online collaboration session, in accordance with an embodiment of the present disclosure. At a step 802, the online collaboration session is established to allow for performing simultaneous collaborative work on a digital project, the digital project comprising at least one object, wherein the digital project is shared and simultaneously modified between a plurality of users. At a step 804, the first user input is received from one of the plurality of users and based thereon the at least one object is modified to form at least one modified object. At a step 806, the at least one modified object is recorded temporally to compile the temporal record. At a step 808, the temporal record is synchronized amongst the plurality of users. - The
steps 802 to 808 are only illustrative and other alternatives can also be provided where one or more steps are added, one or more steps are removed, or one or more steps are provided in a different sequence without departing from the scope of the claims herein. -
FIG. 9 illustrates a formation of a chain of changes in objects' properties, which provides further details of the embodiments of the present disclosure. -
FIGS. 10A, 10B and 10C are exemplary illustrations of digital project contents and a camera viewfinder frame 1002 visible from a user viewport 1004 visible to a first user (i.e., a host), and the camera viewfinder frame 1002 visible from a user viewport 1006 visible to a second user, in accordance with an embodiment of the present disclosure. FIG. 10A is an exemplary representation of 2-D digital project contents. Notably, a bounding frame 1008 of the digital project contents might be artificial since the digital project may be infinite in 2-D space, but in the present exemplary illustration, no additional content lies outside the bounding frame 1008 of the digital project contents. The digital project comprises a bar chart object, a table object, and a list object. Herein, the camera viewfinder frame 1002 shows a frame which covers the table object. FIG. 10B represents the user viewport 1004 of the host, as viewed by the host only, wherein the user viewport 1004 comprises the table object and the list object. The user viewport 1004 of the host also covers the camera viewfinder frame 1002. FIG. 10C represents the user viewport 1006 of the second user, as viewed by the second user only, which covers the bar chart object and the table object, wherein the table object is not entirely within the user viewport 1006. Herein, the shaded portion of the table object is not visible to the second user but is still covered by the camera viewfinder frame 1002.
FIG. 10A further represents a temporal record that is recorded during an online collaborative session involving users operating in the user viewport 1004 of the host and the user viewport 1006 of the second user, wherein the bounding frame 1008 of the digital project contents of said temporal record comprises recordings of all objects and their state changes. Herein, the state changes are recorded regardless of any specific user viewport or camera viewfinder frame locations, but the camera viewfinder frame 1002 describes a specific area of the digital project to be shown while replaying the project on user request, or when converting such a recording to a video format. Typically, visual content lying outside the bounds of the user viewport 1004 is not visible to the host and is not recorded when recording collaborative sessions. Thus, for example, the bar chart object (shown in FIG. 10C) and its state changes would not have been recorded by existing collaboration systems. In contrast, the embodiments of the present disclosure beneficially enable:
- recording of the objects and their state changes both outside the camera viewfinder frame 1002 and outside of the user viewport 1004 of the host, so the temporal record thus created is a true representation of collaborative changes made to states of all the objects during a session,
- performing changes by the host in an area outside of the area covered by the camera viewfinder frame 1002, like the list object to be moved into the area of the camera viewfinder frame 1002 during project recording, and
- host control over the area covered by the camera viewfinder, like moving the camera viewfinder frame 1002 from the table object to the list object, wherein the host control is not limited to manipulation during the recording due to a camera being a regular object (as an example, the host might decide to change the camera viewfinder frame location to one covering the bar chart object for a period of the recorded session, when the second user was making changes to it).
- Optionally, the
method 800 further comprises: -
- receiving a second user input;
- editing the temporal record based on the second user input; and
- outputting the output stream based on the edited temporal record.
- Optionally, in the
method 800, editing the temporal record comprises any one of: -
- adding an additional object to the temporal record,
- removing the at least one modified object from the temporal record,
- combining a plurality of modified objects in the temporal record, and
- modifying one or more properties of the at least one modified object in the temporal record.
- Optionally, in the
method 800, the at least one object or at least one modified object comprises one or more properties, the one or more properties comprises one or more of an on-screen position, on screen size and content of the at least one object or at least one modified object. - Optionally, in the
method 800, the content of the at least one object or at least one modified object comprises a set of temporal changes in properties of the at least one object or properties of the at least one modified object over a recorded period of time and/or one or more of a video file or an audio file. - Optionally, in the
method 800, the at least one object or the at least one modified object is stored at a local data storage or a remote data storage as a set of objects or modified objects and temporal changes to each of the objects and modified objects. - Optionally, the
method 800 further comprises encrypting the temporal record prior to synchronizing the temporal record with the plurality of devices. - Optionally, the
method 800 further comprises decrypting the temporal record after synchronizing the temporal record with the plurality of devices. - In yet another aspect, an embodiment of the present disclosure provides a computer program product comprising instructions to cause the aforementioned online collaboration recording system to carry out the aforementioned method. Specifically, the computer program product comprises a non-transitory machine-readable data storage medium having stored thereon program instructions that, when accessed by the computing arrangement, cause the computing arrangement to execute the aforementioned method.
- The present disclosure provides the aforementioned online collaboration recording system and the aforementioned method for recording an online collaboration session. The online collaboration recording system allows for compiling a temporal record of an entirety or a portion of collaborative work performed during the online collaboration session and not simply an end result of such collaborative work. As a result, a viewer of the temporal record is provided useful contextual information pertaining to the collaborative work performed during the online collaboration session. Beneficially, the temporal record is compiled as a core functionality of the online collaboration recording system. Moreover, the online collaboration recording system optionally allows for editing the temporal record by way of object-based editing to modify content of the temporal record. The online collaboration recording system provides a single solution for creation, execution, recording, and sharing of the collaborative work between multiple users. The aforementioned method is easy to implement, and allows for capturing the online collaboration session in a non-linear manner. Specifically, the temporal record can be compiled by assembling recordings of collaborative work performed at various time instants in a flexible manner (for example, by rearranging such recordings, overriding previously saved recordings, and the like). Furthermore, the online collaboration recording system can be easily integrated with existing networks, file storage systems, devices and the like. Therefore, cost of implementing such a system are very nominal.
- Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as “including”, “comprising”, “incorporating”, “have”, “is” used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural.
Claims (18)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/675,056 US20220256118A1 (en) | 2018-10-05 | 2022-02-18 | System and method for recording online collaboration |
US18/476,878 US12088961B2 (en) | 2018-10-05 | 2023-09-28 | System and method for recording online collaboration |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862741735P | 2018-10-05 | 2018-10-05 | |
US16/593,172 US11258834B2 (en) | 2018-10-05 | 2019-10-04 | System and method for recording online collaboration |
US17/675,056 US20220256118A1 (en) | 2018-10-05 | 2022-02-18 | System and method for recording online collaboration |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/593,172 Continuation-In-Part US11258834B2 (en) | 2018-10-05 | 2019-10-04 | System and method for recording online collaboration |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/476,878 Continuation US12088961B2 (en) | 2018-10-05 | 2023-09-28 | System and method for recording online collaboration |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220256118A1 true US20220256118A1 (en) | 2022-08-11 |
Family
ID=82704196
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/675,056 Abandoned US20220256118A1 (en) | 2018-10-05 | 2022-02-18 | System and method for recording online collaboration |
US18/476,878 Active US12088961B2 (en) | 2018-10-05 | 2023-09-28 | System and method for recording online collaboration |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/476,878 Active US12088961B2 (en) | 2018-10-05 | 2023-09-28 | System and method for recording online collaboration |
Country Status (1)
Country | Link |
---|---|
US (2) | US20220256118A1 (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020065912A1 (en) * | 2000-11-30 | 2002-05-30 | Catchpole Lawrence W. | Web session collaboration |
Family Cites Families (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6535909B1 (en) | 1999-11-18 | 2003-03-18 | Contigo Software, Inc. | System and method for record and playback of collaborative Web browsing session |
US7349944B2 (en) | 1999-11-18 | 2008-03-25 | Intercall, Inc. | System and method for record and playback of collaborative communications session |
US6859909B1 (en) | 2000-03-07 | 2005-02-22 | Microsoft Corporation | System and method for annotating web-based documents |
US8443035B2 (en) * | 2000-09-01 | 2013-05-14 | OP40 Holding, Inc. | System and method for collaboration using web browsers |
US7647373B2 (en) | 2001-03-13 | 2010-01-12 | Eplus Capital, Inc. | System and process for network collaboration through embedded annotation and rendering instructions |
US7636754B2 (en) | 2002-03-21 | 2009-12-22 | Cisco Technology, Inc. | Rich multi-media format for use in a collaborative computing system |
US20040107270A1 (en) | 2002-10-30 | 2004-06-03 | Jamie Stephens | Method and system for collaboration recording |
US20040143603A1 (en) | 2002-11-21 | 2004-07-22 | Roy Kaufmann | Method and system for synchronous and asynchronous note timing in a system for enhancing collaboration using computers and networking |
US20040143630A1 (en) | 2002-11-21 | 2004-07-22 | Roy Kaufmann | Method and system for sending questions, answers and files synchronously and asynchronously in a system for enhancing collaboration using computers and networking |
US20040153504A1 (en) | 2002-11-21 | 2004-08-05 | Norman Hutchinson | Method and system for enhancing collaboration using computers and networking |
US7248684B2 (en) | 2002-12-11 | 2007-07-24 | Siemens Communications, Inc. | System and method for processing conference collaboration records |
US20060031755A1 (en) | 2004-06-24 | 2006-02-09 | Avaya Technology Corp. | Sharing inking during multi-modal communication |
US7284192B2 (en) | 2004-06-24 | 2007-10-16 | Avaya Technology Corp. | Architecture for ink annotations on web documents |
US20060010197A1 (en) | 2004-07-06 | 2006-01-12 | Francis Ovenden | Multimedia collaboration and communications |
US20070118794A1 (en) | 2004-09-08 | 2007-05-24 | Josef Hollander | Shared annotation system and method |
US8233597B2 (en) | 2005-02-11 | 2012-07-31 | Cisco Technology, Inc. | System and method for the playing of key phrases in voice mail messages |
US20070005697A1 (en) | 2005-06-29 | 2007-01-04 | Eric Yuan | Methods and apparatuses for detecting content corresponding to a collaboration session |
US20070005699A1 (en) | 2005-06-29 | 2007-01-04 | Eric Yuan | Methods and apparatuses for recording a collaboration session |
US7945621B2 (en) | 2005-06-29 | 2011-05-17 | Webex Communications, Inc. | Methods and apparatuses for recording and viewing a collaboration session |
US8379821B1 (en) | 2005-11-18 | 2013-02-19 | At&T Intellectual Property Ii, L.P. | Per-conference-leg recording control for multimedia conferencing |
WO2007066918A1 (en) | 2005-12-05 | 2007-06-14 | Ja-Yong Koo | Network system for contents collaboration on a real-time community based on items of contents and method thereof |
US8209181B2 (en) | 2006-02-14 | 2012-06-26 | Microsoft Corporation | Personal audio-video recorder for live meetings |
KR100856403B1 (en) | 2006-03-03 | 2008-09-04 | 삼성전자주식회사 | Video conference recording method and video conference terminal for the same |
US8214395B2 (en) | 2006-04-21 | 2012-07-03 | Microsoft Corporation | Tracking and editing a resource in a real-time collaborative session |
US7954049B2 (en) | 2006-05-15 | 2011-05-31 | Microsoft Corporation | Annotating multimedia files along a timeline |
US20080120371A1 (en) | 2006-11-16 | 2008-05-22 | Rajat Gopal | Relational framework for non-real-time audio/video collaboration |
US8423612B2 (en) | 2007-01-08 | 2013-04-16 | Cisco Technology, Inc. | Methods and apparatuses for selectively accessing an application |
WO2008101130A2 (en) | 2007-02-14 | 2008-08-21 | Museami, Inc. | Music-based search engine |
JP4404130B2 (en) | 2007-10-22 | 2010-01-27 | ソニー株式会社 | Information processing terminal device, information processing device, information processing method, and program |
CA2659698C (en) | 2008-03-21 | 2020-06-16 | Dressbot Inc. | System and method for collaborative shopping, business and entertainment |
FR2931330B1 (en) | 2008-05-13 | 2011-04-01 | Kadrige | METHOD AND SYSTEM FOR AUTOMATICALLY RECORDING A COMMUNICATION SESSION |
US20090292618A1 (en) | 2008-05-22 | 2009-11-26 | Ginza Walk, Llc | System & method for multiple users to conduct online browsing & shopping together in real time |
US8139099B2 (en) | 2008-07-07 | 2012-03-20 | Seiko Epson Corporation | Generating representative still images from a video recording |
US20140033073A1 (en) | 2008-10-01 | 2014-01-30 | Nigel Pegg | Time-shifted collaboration playback |
US20130047095A1 (en) | 2010-02-08 | 2013-02-21 | Oscar Divorra Escoda | Cloud desktop system with multi-touch capabilities |
US9141710B2 (en) | 2010-10-27 | 2015-09-22 | International Business Machines Corporation | Persisting annotations within a cobrowsing session |
US20150019486A1 (en) | 2010-11-18 | 2015-01-15 | Zensar Technologies Ltd. | System and Method for Delta Change Synchronization of Data Changes across a Plurality of Nodes |
WO2012091723A1 (en) | 2010-12-30 | 2012-07-05 | Konica Minolta Holdings, Inc. | Method for holding a meeting, using server and terminals connected to a network |
JP2012209614A (en) | 2011-03-29 | 2012-10-25 | Brother Ind Ltd | Conference system, conference terminal, and conference program |
US9489659B1 (en) | 2012-04-02 | 2016-11-08 | Cisco Technology, Inc. | Progressive sharing during a collaboration session |
FR2996086B1 (en) | 2012-09-25 | 2014-10-24 | Kadrige | METHOD FOR REMOTELY PRESENTING BETWEEN AT LEAST TWO TERMINALS CONNECTED THROUGH A NETWORK |
JP6171319B2 (en) | 2012-12-10 | 2017-08-02 | 株式会社リコー | Information processing apparatus, information processing method, information processing system, and program |
US9953036B2 (en) * | 2013-01-09 | 2018-04-24 | Box, Inc. | File system monitoring in a system which incrementally updates clients with events that occurred in a cloud-based collaboration platform |
US10313433B2 (en) | 2013-03-14 | 2019-06-04 | Thoughtwire Holdings Corp. | Method and system for registering software systems and data-sharing sessions |
US10855731B2 (en) | 2013-04-11 | 2020-12-01 | Nec Corporation | Information processing apparatus, data processing method thereof, and program |
WO2015062631A1 (en) | 2013-10-29 | 2015-05-07 | Nec Europe Ltd. | Method and system for recording a multiuser web session and replaying a multiuser web session |
US10459985B2 (en) | 2013-12-04 | 2019-10-29 | Dell Products, L.P. | Managing behavior in a virtual collaboration session |
US9998555B2 (en) | 2014-04-08 | 2018-06-12 | Dropbox, Inc. | Displaying presence in an application accessing shared and synchronized content |
US10091287B2 (en) | 2014-04-08 | 2018-10-02 | Dropbox, Inc. | Determining presence in an application accessing shared and synchronized content |
US10171579B2 (en) | 2014-04-08 | 2019-01-01 | Dropbox, Inc. | Managing presence among devices accessing shared and synchronized content |
US20160149969A1 (en) | 2014-11-26 | 2016-05-26 | Microsoft Technology Licensing, Llc | Multi-device collaboration |
US20160173467A1 (en) | 2014-12-15 | 2016-06-16 | Microsoft Technology Licensing, Llc | Document collaboration through networking credentials |
US10558677B2 (en) | 2015-03-23 | 2020-02-11 | Dropbox, Inc. | Viewing and editing content items in shared folder backed integrated workspaces |
US10431187B2 (en) | 2015-06-29 | 2019-10-01 | Ricoh Company, Ltd. | Terminal apparatus, screen recording method, program, and information processing system |
AU2016306786B2 (en) | 2015-08-13 | 2022-03-17 | Bluebeam, Inc. | Method for archiving a collaboration session with a multimedia data stream and view parameters |
US10255023B2 (en) * | 2016-02-12 | 2019-04-09 | Haworth, Inc. | Collaborative electronic whiteboard publication process |
JP6570761B2 (en) | 2016-04-25 | 2019-09-04 | ドロップボックス, インコーポレイテッド | Synchronization engine with storage constraints |
US11409952B2 (en) * | 2016-08-16 | 2022-08-09 | Myscript | System and method for collaborative ink management |
US10229518B2 (en) | 2017-04-10 | 2019-03-12 | Prysm, Inc. | Drag to undo/redo a digital ink canvas using a visible history palette |
US10673913B2 (en) | 2018-03-14 | 2020-06-02 | 8eo, Inc. | Content management across a multi-party conference system by parsing a first and second user engagement stream and transmitting the parsed first and second user engagement stream to a conference engine and a data engine from a first and second receiver |
US10796086B2 (en) | 2018-08-25 | 2020-10-06 | Microsoft Technology Licensing, Llc | Selectively controlling modification states for user-defined subsets of objects within a digital document |
-
2022
- 2022-02-18 US US17/675,056 patent/US20220256118A1/en not_active Abandoned
-
2023
- 2023-09-28 US US18/476,878 patent/US12088961B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020065912A1 (en) * | 2000-11-30 | 2002-05-30 | Catchpole Lawrence W. | Web session collaboration |
Also Published As
Publication number | Publication date |
---|---|
US20240098217A1 (en) | 2024-03-21 |
US12088961B2 (en) | 2024-09-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102356401B (en) | By before meeting and post-meeting experience be integrated into meeting life cycle | |
US8266214B2 (en) | System and method for collaborative web-based multimedia layered platform with recording and selective playback of content | |
US11910048B2 (en) | Synchronizing video content among clients in a collaboration system | |
US20120331402A1 (en) | System and Method to Create a Collaborative Web-based Multimedia Contextual Document | |
KR20140088123A (en) | Real time document presentation data synchronization through generic service | |
US20210125192A1 (en) | Methods for monitoring communications channels and determining triggers and actions in role-based collaborative systems | |
US20240098121A1 (en) | Techniques for efficient communication during a video collaboration session | |
US20150019486A1 (en) | System and Method for Delta Change Synchronization of Data Changes across a Plurality of Nodes | |
US11700292B2 (en) | Collaboration components for sharing content from electronic documents | |
US11424945B1 (en) | Techniques for avoiding conflicting user actions during a video collaboration session | |
KR20210083690A (en) | Animation Content Production System, Method and Computer program | |
Gericke et al. | Message capturing as a paradigm for asynchronous digital whiteboard interaction | |
WO2023076649A1 (en) | Ingesting 3d objects from a virtual environment for 2d data representation | |
US11258834B2 (en) | System and method for recording online collaboration | |
US12088961B2 (en) | System and method for recording online collaboration | |
TW201626257A (en) | Networking cooperation method and machine using such method | |
US20220342524A1 (en) | Online conference tools for meeting-assisted content editing and posting content on a meeting board | |
AU2022301933A1 (en) | Techniques for efficient communication during a video collaboration session | |
Bhimani et al. | Vox populi: enabling community-based narratives through collaboration and content creation | |
JP6461146B2 (en) | Social media platform | |
CN108604359A (en) | The method and system of sharing media content between several users | |
KR101547013B1 (en) | Method and system for managing production of contents based scenario | |
KR101899085B1 (en) | Collaboration method for scenario creation and system | |
Chunwijitra | An advanced cloud-based e-learning platform for higher education for low speed internet | |
US20140032772A1 (en) | Methods and systems for using metadata to represent social context information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: EXPLAIN EVERYTHING, INC., NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUPCZAK, MACIEJ;KRYSTEK, LUKASZ;SLIWINSKI, PIOTR;AND OTHERS;SIGNING DATES FROM 20220218 TO 20220221;REEL/FRAME:059377/0467 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: PROMETHEAN, INC., WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EXPLAIN EVERYTHING, INC.;REEL/FRAME:068110/0517 Effective date: 20221117 |
|
AS | Assignment |
Owner name: PROMETHEAN LIMITED, UNITED KINGDOM Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:PROMETHEAN, INC.;REEL/FRAME:068501/0153 Effective date: 20240905 |