US20220222426A1 - Digital processing systems and methods for display pane scroll locking during collaborative document editing in collaborative work systems - Google Patents

Digital processing systems and methods for display pane scroll locking during collaborative document editing in collaborative work systems

Info

Publication number
US20220222426A1
Authority
US
United States
Prior art keywords
editor
word processing
electronic
collaborative
document
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US17/565,801
Other versions
US11397847B1 (en)
Inventor
Ron ZIONPOUR
Tal HARAMATI
Guy GREENHUT
Amir BARDUGO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Monday.com Ltd
Original Assignee
Monday.com Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/IB2021/000024 (WO2021144656A1)
Priority claimed from PCT/IB2021/000090 (WO2021161104A1)
Priority claimed from PCT/IB2021/000297 (WO2021220058A1)
Priority claimed from PCT/IB2021/062440 (WO2022153122A1)
Application filed by Monday.com Ltd
Priority to US 17/565,801
Assigned to Monday.com Ltd (assignment of assignors' interest; see document for details). Assignors: BARDUGO, AMIR; HARAMATI, TAL; ZIONPOUR, RON; GREENHUT, GUY
Publication of US20220222426A1
Application granted
Publication of US11397847B1
Legal status: Active
Anticipated expiration

Classifications

    • G06Q 10/103 - Workflow collaboration or project management
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 11/1469 - Backup restoration techniques
    • G06F 16/176 - Support for shared access to files; File sharing support
    • G06F 16/1774 - Locking methods, e.g. locking methods for file systems allowing shared and concurrent access to files
    • G06F 16/1873 - Versioning file systems, temporal file systems, e.g. file system supporting different historic versions of files
    • G06F 21/6209 - Protecting access to data via a platform, e.g. using keys or access control rules, to a single file or object, e.g. in a secure envelope, encrypted and accessed using a key, or with access control rules appended to the object itself
    • G06F 21/6218 - Protecting access to data via a platform to a system of files or objects, e.g. a local or distributed file system or database
    • G06F 21/629 - Protecting access to data via a platform to features or functions of an application
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons
    • G06F 3/04817 - GUI interaction techniques using icons
    • G06F 3/0483 - Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0485 - Scrolling or panning
    • G06F 3/0486 - Drag-and-drop
    • G06F 40/103 - Formatting, i.e. changing of presentation of documents
    • G06F 40/106 - Display of layout of documents; Previewing
    • G06F 40/109 - Font handling; Temporal or kinetic typography
    • G06F 40/114 - Pagination
    • G06F 40/117 - Tagging; Marking up; Designating a block; Setting of attributes
    • G06F 40/134 - Hyperlinking
    • G06F 40/166 - Editing, e.g. inserting or deleting
    • G06F 40/169 - Annotation, e.g. comment data or footnotes
    • G06F 40/186 - Templates
    • G06F 40/197 - Version control
    • G06F 40/30 - Semantic analysis
    • G06K 7/1443 - Methods for optical code recognition including a step of locating the code in an image
    • G06T 13/00 - Animation
    • H04L 12/1827 - Network arrangements for conference optimisation or adaptation
    • H04L 51/10 - Multimedia information in user-to-user messaging
    • H04L 63/101 - Access control lists [ACL]
    • H04L 63/102 - Entity profiles
    • H04L 63/105 - Multiple levels of security
    • H04L 65/4015 - Support for services or applications with a main real-time session and one or more additional parallel real-time or time-sensitive sessions, e.g. white board sharing, collaboration or spawning of a subconference
    • H04M 1/72436 - Mobile-telephone user interfaces with interactive means for internal management of messages, for text messaging, e.g. SMS or e-mail
    • H04N 1/448 - Rendering the image unintelligible, e.g. scrambling
    • G06F 2201/84 - Using snapshots, i.e. a logical point-in-time copy of the data
    • G06F 3/04842 - Selection of displayed objects or displayed text elements
    • H04L 51/046 - Interoperability with other network applications or services

Definitions

  • the system may include at least one processor configured to: access the electronic collaborative word processing document; present a first instance of the electronic collaborative word processing document via a first hardware device running a first editor; present a second instance of the electronic collaborative word processing document via a second hardware device running a second editor; receive from the first editor during a common editing period, first edits to the electronic collaborative word processing document, wherein the first edits occur on a first earlier page of the electronic collaborative word processing document and result in a pagination change; receive from the second editor during the common editing period, second edits to the electronic collaborative word processing document, wherein the second edits occur on a second page of the electronic collaborative word processing document later than the first page; during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits received by the second hardware device; and upon receipt of a scroll-up command via the second editor during the common editing period, cause the display associated with the second hardware device to reflect the pagination change caused by the first edits.
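  • For illustration only, pagination can be modeled as a function of rendered block heights, and first edits on an earlier page "result in a pagination change" when the recomputed page breaks differ from the previous ones. The sketch below is a simplified assumption about such a check; the block-height model and function names are not taken from the disclosure.

```typescript
// Simplified sketch: recompute page breaks from block heights and detect
// whether an edit changed the pagination of the document.
function pageBreaks(blockHeights: number[], pageHeightPx: number): number[] {
  const breaks: number[] = [];
  let used = 0;
  blockHeights.forEach((height, index) => {
    if (used + height > pageHeightPx) {
      breaks.push(index); // this block is pushed onto a new page
      used = 0;
    }
    used += height;
  });
  return breaks;
}

function causesPaginationChange(
  heightsBefore: number[], // block heights before the first edits
  heightsAfter: number[],  // block heights after the first edits
  pageHeightPx: number,
): boolean {
  const before = pageBreaks(heightsBefore, pageHeightPx);
  const after = pageBreaks(heightsAfter, pageHeightPx);
  return (
    before.length !== after.length ||
    before.some((breakIndex, i) => breakIndex !== after[i])
  );
}
```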
  • FIG. 1 is a block diagram of an exemplary computing device which may be employed in connection with embodiments of the present disclosure.
  • FIG. 2 is a block diagram of an exemplary computing architecture for collaborative work systems, consistent with embodiments of the present disclosure.
  • FIG. 3 illustrates an example of an electronic collaborative word processing document, consistent with some embodiments of the present disclosure.
  • FIG. 4 illustrates an example of an instance of a collaborative word processing document, consistent with some embodiments of the present disclosure.
  • FIG. 5 illustrates an example user interface of a collaborative word processing document with a locked display, consistent with some embodiments of the present disclosure.
  • FIG. 6 illustrates another example of a collaborative word processing document with an active work location, consistent with some embodiments of the present disclosure.
  • FIG. 7 illustrates another example of a collaborative word processing document with a locked display, consistent with some embodiments of the present disclosure.
  • FIG. 8 illustrates a block diagram of an example process for managing display interference in an electronic collaborative word processing document, consistent with some embodiments of the present disclosure.
  • FIG. 9 illustrates a block diagram of another example process for managing display interference in an electronic collaborative word processing document, consistent with some embodiments of the present disclosure.
  • FIG. 10 illustrates an exemplary editor for an electronic collaborative word processing document operating in collaborative mode, consistent with some embodiments of the present disclosure.
  • FIG. 11 illustrates an exemplary editor for an electronic collaborative word processing document with an option for enabling dual mode editing to enable private changes to be displayed, consistent with some embodiments of the present disclosure.
  • FIG. 12 illustrates a block diagram of an example process for enabling dual mode editing in collaborative documents to enable private changes.
  • This disclosure presents various mechanisms for collaborative work systems. Such systems may involve software that enables multiple users to work collaboratively.
  • For example, workflow management software may enable various members of a team to cooperate via a common online platform. It is intended that one or more aspects of any mechanism may be combined with one or more aspects of any other mechanism, and such combinations are within the scope of this disclosure.
  • Certain embodiments disclosed herein include devices, systems, and methods for collaborative work systems that may allow a user to interact with information in real time. To avoid repetition, the functionality of some embodiments is described herein solely in connection with a processor or at least one processor. It is to be understood that such exemplary descriptions of functionality apply equally to methods and computer readable media and constitute a written description of systems, methods, and computer readable media.
  • the underlying platform may allow a user to structure systems, methods, or computer readable media in many ways using common building blocks, thereby permitting flexibility in constructing a product that suits desired needs. This may be accomplished through the use of boards.
  • a board may be a table configured to contain items (e.g., individual items presented in horizontal rows) defining objects or entities that are managed in the platform (task, project, client, deal, etc.). Unless expressly noted otherwise, the terms “board” and “table” may be considered synonymous for purposes of this disclosure. In some embodiments, a board may contain information beyond that which is displayed in a table. Boards may include sub-boards that may have a separate structure from a board. Sub-boards may be tables with sub-items that may be related to the items of a board. Columns intersecting with rows of items may together define cells in which data associated with each item may be maintained. Each column may have a heading or label defining an associated data type.
  • When used herein in combination with a column, a row may be presented horizontally and a column vertically.
  • the term “row” may refer to one or more of a horizontal and/or a vertical presentation.
  • a table or tablature as used herein refers to data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns), defining cells in which data is presented.
  • Tablature may refer to any structure for presenting data in an organized manner, as previously discussed, such as cells presented in horizontal rows and vertical columns, vertical rows and horizontal columns, a tree data structure, a web chart, or any other structured representation, as explained throughout this disclosure.
  • a cell may refer to a unit of information contained in the tablature defined by the structure of the tablature.
  • a cell may be defined as an intersection between a horizontal row with a vertical column in a tablature having rows and columns.
  • a cell may also be defined as an intersection between a horizontal and a vertical row, or as an intersection between a horizontal and a vertical column.
  • a cell may be defined as a node on a web chart or a node on a tree data structure.
  • tablature may include any type of information, depending on intended use. When used in conjunction with a workflow management application, the tablature may include any information associated with one or more tasks, such as one or more status values, projects, countries, persons, teams, progress statuses, a combination thereof, or any other information related to a task.
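  • As a concrete, purely illustrative picture of the board, column, item, and cell concepts above, the following sketch models a board as a table of items intersected by typed columns. The interface and field names are assumptions for illustration, not the platform's actual schema.

```typescript
// Minimal sketch of a board as a table of items (rows) intersected by
// typed columns, where each intersection is a cell holding the item's data.
type ColumnType = "status" | "person" | "date" | "text" | "number";

interface Column {
  id: string;
  heading: string;      // label defining the associated data type
  type: ColumnType;     // single data type for the whole column
}

interface Cell {
  columnId: string;
  value: unknown;       // data maintained at the row/column intersection
}

interface Item {
  id: string;
  name: string;         // e.g. a task, project, client, or deal
  cells: Cell[];
}

interface Board {
  id: string;
  name: string;
  columns: Column[];
  items: Item[];        // individual items presented in horizontal rows
  subBoards?: Board[];  // sub-boards may have a structure separate from the board
}

// Look up the cell at the intersection of an item (row) and a column.
function cellAt(board: Board, itemId: string, columnId: string): Cell | undefined {
  const item = board.items.find((i) => i.id === itemId);
  return item?.cells.find((c) => c.columnId === columnId);
}
```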
  • dashboards may be utilized to present or summarize data derived from one or more boards.
  • a dashboard may be a non-table form of presenting data, using, for example, static or dynamic graphical representations.
  • a dashboard may also include multiple non-table forms of presenting data. As discussed later in greater detail, such representations may include various forms of graphs or graphics.
  • dashboards (which may also be referred to more generically as “widgets”) may include tablature.
  • Software links may interconnect one or more boards with one or more dashboards thereby enabling the dashboards to reflect data presented on the boards. This may allow, for example, data from multiple boards to be displayed and/or managed from a common location. These widgets may provide visualizations that allow a user to update data derived from one or more boards.
  • Boards may be stored in a local memory on a user device or may be stored in a local network repository. Boards may also be stored in a remote repository and may be accessed through a network. In some instances, permissions may be set to limit board access to the board's “owner” while in other embodiments a user's board may be accessed by other users through any of the networks described in this disclosure.
  • When a user makes a change in a board, that change may be updated to the board stored in a memory or repository and may be pushed to the other user devices that access that same board. These changes may be made to cells, items, columns, boards, dashboard views, logical rules, or any other data associated with the boards.
  • When cells are tied together or are mirrored across multiple boards, a change in one board may cause a cascading change in the tied or mirrored boards or dashboards of the same or other owners.
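  • The cascading behavior of tied or mirrored cells can be pictured with the sketch below; the link registry and the update routine are illustrative assumptions rather than the platform's actual synchronization logic.

```typescript
// Sketch: when a cell is mirrored across boards, writing to the source cell
// cascades the new value to every linked (mirrored) cell.
interface CellRef {
  boardId: string;
  itemId: string;
  columnId: string;
}

type CellStore = Map<string, unknown>; // key: `${boardId}/${itemId}/${columnId}`

const keyOf = (ref: CellRef) => `${ref.boardId}/${ref.itemId}/${ref.columnId}`;

// Registry of mirror links: source cell key -> mirrored cell refs.
const mirrors = new Map<string, CellRef[]>();

function linkCells(source: CellRef, mirror: CellRef): void {
  const list = mirrors.get(keyOf(source)) ?? [];
  list.push(mirror);
  mirrors.set(keyOf(source), list);
}

function updateCell(store: CellStore, ref: CellRef, value: unknown): void {
  const seen = new Set<string>();
  const queue: CellRef[] = [ref];
  while (queue.length > 0) {
    const current = queue.shift()!;
    const key = keyOf(current);
    if (seen.has(key)) continue;              // guard against circular mirror links
    seen.add(key);
    store.set(key, value);                    // apply the change locally
    queue.push(...(mirrors.get(key) ?? []));  // cascade to tied or mirrored cells
  }
}
```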
  • a block may include any organizational unit of information in a digital file, such as a single text character, word, sentence, paragraph, page, graphic, or any combination thereof.
  • Blocks may include static or dynamic information, and may be linked to other sources of data for dynamic updates. Blocks may be automatically organized by the system, or may be manually selected by a user according to preference. In one embodiment, a user may select a segment of any information in an electronic word processing document and assign it as a particular block for input, editing, formatting, or any other further configuration.
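  • A block, as described above, might be modeled roughly as follows; the shape and field names are illustrative assumptions only.

```typescript
// Sketch: a block as an organizational unit of information in a document.
type BlockKind = "character" | "word" | "sentence" | "paragraph" | "page" | "graphic";

interface Block {
  id: string;
  kind: BlockKind;
  content: string;           // static text, or a reference for dynamic content
  dynamicSource?: string;    // optional link to another data source for live updates
  format?: { bold?: boolean; fontSize?: number };  // per-block formatting defaults
  children?: Block[];        // a block may itself include one or more blocks
}

// Assign a user-selected segment of text as a new block (manual organization).
let nextBlockId = 0;
function toBlock(segment: string, kind: BlockKind = "paragraph"): Block {
  return { id: `block-${nextBlockId++}`, kind, content: segment };
}
```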
  • An electronic collaborative word processing document may be stored in one or more repositories connected to a network accessible by one or more users through their computing devices.
  • one or more users may simultaneously edit an electronic collaborative word processing document.
  • the one or more users may access the electronic collaborative word processing document through one or more user devices connected to a network.
  • User access to an electronic collaborative word processing document may be managed through permission settings set by an author of the electronic collaborative word processing document.
  • An electronic collaborative word processing document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document.
  • Embodiments described herein may refer to a non-transitory computer readable medium containing instructions that when executed by at least one processor, cause the at least one processor to perform a method.
  • Non-transitory computer readable mediums may be any medium capable of storing data in any memory in a way that may be read by any computing device with a processor to carry out methods or any other instructions stored in the memory.
  • the non-transitory computer readable medium may be implemented as hardware, firmware, software, or any combination thereof.
  • the software may preferably be implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices.
  • the application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
  • the machine may be implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces.
  • the computer platform may also include an operating system and microinstruction code.
  • the various processes and functions described in this disclosure may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown.
  • various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit.
  • a non-transitory computer readable medium may be any computer readable medium except for a transitory propagating signal.
  • the memory may include a Random Access Memory (RAM), a Read-Only Memory (ROM), a hard disk, an optical disk, a magnetic medium, a flash memory, other permanent, fixed, volatile or non-volatile memory, or any other mechanism capable of storing instructions.
  • the memory may include one or more separate storage devices collocated or dispersed, capable of storing data structures, instructions, or any other data.
  • the memory may further include a memory portion containing instructions for the processor to execute.
  • the memory may also be used as a working scratch pad for the processors or as a temporary storage.
  • a processor may be any physical device or group of devices having electric circuitry that performs a logic operation on input or inputs.
  • the at least one processor may include one or more integrated circuits (IC), including application-specific integrated circuit (ASIC), microchips, microcontrollers, microprocessors, all or part of a central processing unit (CPU), graphics processing unit (GPU), digital signal processor (DSP), field-programmable gate array (FPGA), server, virtual server, or other circuits suitable for executing instructions or performing logic operations.
  • the instructions executed by at least one processor may, for example, be pre-loaded into a memory integrated with or embedded into the controller or may be stored in a separate memory.
  • the at least one processor may include more than one processor.
  • Each processor may have a similar construction, or the processors may be of differing constructions that are electrically connected or disconnected from each other.
  • the processors may be separate circuits or integrated in a single circuit.
  • the processors may be configured to operate independently or collaboratively.
  • the processors may be coupled electrically, magnetically, optically, acoustically, mechanically or by other means that permit them to interact.
  • a network may constitute any type of physical or wireless computer networking arrangement used to exchange data.
  • a network may be the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN or WAN network, and/or other suitable connections that may enable information exchange among various components of the system.
  • a network may include one or more physical links used to exchange data, such as Ethernet, coaxial cables, twisted pair cables, fiber optics, or any other suitable physical medium for exchanging data.
  • a network may also include a public switched telephone network (“PSTN”) and/or a wireless cellular network.
  • a network may be a secured network or unsecured network.
  • one or more components of the system may communicate directly through a dedicated communication network.
  • Direct communications may use any suitable technologies, including, for example, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or other suitable communication methods that provide a medium for exchanging data and/or information between separate entities.
  • Certain embodiments disclosed herein may also include a computing device for generating features for collaborative work systems.
  • the computing device may include processing circuitry communicatively connected to a network interface and to a memory, wherein the memory contains instructions that, when executed by the processing circuitry, configure the computing device to receive, from a user device associated with a user account, an instruction to generate a new column of a single data type for a first data structure, wherein the first data structure may be a column-oriented data structure, and store, based on the instructions, the new column within the column-oriented data structure repository, wherein the column-oriented data structure repository may be accessible and may be displayed as a display feature to the user and at least a second user account.
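  • A rough sketch of that column-generation flow appears below; the instruction shape, the in-memory repository, and the viewer registry are all illustrative assumptions, not the claimed implementation.

```typescript
// Sketch: handling an instruction from a user device to add a new column of a
// single data type to a column-oriented data structure, then exposing it to
// at least one additional user account.
type DataType = "text" | "number" | "date" | "status";

interface NewColumnInstruction {
  userAccount: string;
  structureId: string;   // the first (column-oriented) data structure
  heading: string;
  dataType: DataType;    // single data type for the entire column
}

interface ColumnRecord {
  id: string;
  heading: string;
  dataType: DataType;
  values: unknown[];     // stored contiguously, column-oriented
}

// Column-oriented repository: columns stored as units, keyed by structure id.
const repository = new Map<string, ColumnRecord[]>();
const viewers = new Map<string, Set<string>>(); // structureId -> accounts with display access

function addColumn(instr: NewColumnInstruction): ColumnRecord {
  const column: ColumnRecord = {
    id: `${instr.structureId}:${instr.heading}`,
    heading: instr.heading,
    dataType: instr.dataType,
    values: [],
  };
  const columns = repository.get(instr.structureId) ?? [];
  columns.push(column);
  repository.set(instr.structureId, columns);
  // Make the new column displayable to the requesting account (and any other
  // accounts already registered for this structure).
  const accounts = viewers.get(instr.structureId) ?? new Set<string>();
  accounts.add(instr.userAccount);
  viewers.set(instr.structureId, accounts);
  return column;
}
```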
  • the computing devices may be devices such as mobile devices, desktops, laptops, tablets, or any other devices capable of processing data.
  • Such computing devices may include a display, such as an LED display, an augmented reality (AR) display, or a virtual reality (VR) display.
  • Certain embodiments disclosed herein may include a processor configured to perform methods that may include triggering an action in response to an input.
  • the input may be from a user action or from a change of information contained in a user's table, in another table, across multiple tables, across multiple user devices, or from third-party applications.
  • Triggering may be caused manually, such as through a user action, or may be caused automatically, such as through a logical rule, logical combination rule, or logical templates associated with a board.
  • a trigger may include an input of a data item that is recognized by at least one processor that brings about another action.
  • the methods including triggering may cause an alteration of data and may also cause an alteration of display of data contained in a board or in memory.
  • An alteration of data may include a recalculation of data, the addition of data, the subtraction of data, or a rearrangement of information.
  • triggering may also cause a communication to be sent to a user, other individuals, or groups of individuals.
  • the communication may be a notification within the system or may be a notification outside of the system through a contact address such as by email, phone call, text message, video conferencing, or any other third-party communication application.
  • Some embodiments include one or more of automations, logical rules, logical sentence structures and logical (sentence structure) templates. While these terms are described herein in differing contexts, in a broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, all automations need not implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.
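  • The relationship between a trigger, a logical rule, and a logical sentence structure can be sketched as follows; the event shape, the sentence text, and the notification stand-in are illustrative assumptions.

```typescript
// Sketch: an automation as a logical rule that responds to a trigger to
// produce an outcome, optionally described by a fill-in-the-blank
// logical sentence structure such as:
//   "When {column} changes to {value}, notify {person}".
interface BoardEvent {
  boardId: string;
  columnId: string;
  newValue: unknown;
}

interface Automation {
  sentence: string;                       // human-readable logical sentence structure
  trigger: (e: BoardEvent) => boolean;    // condition that recognizes the input
  action: (e: BoardEvent) => void;        // outcome: alter data or send a communication
}

const automations: Automation[] = [
  {
    sentence: 'When "Status" changes to "Done", notify the project owner',
    trigger: (e) => e.columnId === "status" && e.newValue === "Done",
    action: (e) => console.log(`notify owner of board ${e.boardId}`), // stand-in for email/SMS/etc.
  },
];

// Evaluate every automation whenever an event (manual or automatic) occurs.
function onBoardEvent(event: BoardEvent): void {
  for (const automation of automations) {
    if (automation.trigger(event)) automation.action(event);
  }
}
```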
  • machine learning algorithms may be trained using training examples, for example in the cases described below.
  • Some non-limiting examples of such machine learning algorithms may include classification algorithms, data regression algorithms, image segmentation algorithms, visual detection algorithms (such as object detectors, face detectors, person detectors, motion detectors, edge detectors, etc.), visual recognition algorithms (such as face recognition, person recognition, object recognition, etc.), speech recognition algorithms, mathematical embedding algorithms, natural language processing algorithms, support vector machines, random forests, nearest neighbors algorithms, deep learning algorithms, artificial neural network algorithms, convolutional neural network algorithms, recursive neural network algorithms, linear machine learning models, non-linear machine learning models, ensemble algorithms, and so forth.
  • a trained machine learning algorithm may comprise an inference model, such as a predictive model, a classification model, a regression model, a clustering model, a segmentation model, an artificial neural network (such as a deep neural network, a convolutional neural network, a recursive neural network, etc.), a random forest, a support vector machine, and so forth.
  • the training examples may include example inputs together with the desired outputs corresponding to the example inputs.
  • training machine learning algorithms using the training examples may generate a trained machine learning algorithm, and the trained machine learning algorithm may be used to estimate outputs for inputs not included in the training examples.
  • engineers, scientists, processes and machines that train machine learning algorithms may further use validation examples and/or test examples.
  • validation examples and/or test examples may include example inputs together with the desired outputs corresponding to the example inputs, a trained machine learning algorithm and/or an intermediately trained machine learning algorithm may be used to estimate outputs for the example inputs of the validation examples and/or test examples, the estimated outputs may be compared to the corresponding desired outputs, and the trained machine learning algorithm and/or the intermediately trained machine learning algorithm may be evaluated based on a result of the comparison.
  • a machine learning algorithm may have parameters and hyperparameters, where the hyperparameters are set manually by a person or automatically by a process external to the machine learning algorithm (such as a hyperparameter search algorithm), and the parameters of the machine learning algorithm are set by the machine learning algorithm according to the training examples.
  • In some cases, the hyperparameters are set according to the training examples and the validation examples, and the parameters are set according to the training examples and the selected hyperparameters.
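  • The split between parameters and hyperparameters can be illustrated with a toy one-dimensional linear model: the slope and intercept (parameters) are fit from training examples, while the learning rate (a hyperparameter) is chosen by evaluating candidates on validation examples. This is a generic sketch, not an algorithm from the disclosure.

```typescript
// Toy sketch: parameters (slope, intercept) are fit from training examples;
// the hyperparameter (learning rate) is chosen externally by evaluating
// candidate values on held-out validation examples.
interface Example { x: number; y: number; }

interface Model { slope: number; intercept: number; }

function train(examples: Example[], learningRate: number, epochs = 200): Model {
  let slope = 0;
  let intercept = 0;
  for (let epoch = 0; epoch < epochs; epoch++) {
    for (const { x, y } of examples) {
      const error = slope * x + intercept - y;
      slope -= learningRate * error * x;   // parameters updated from training data
      intercept -= learningRate * error;
    }
  }
  return { slope, intercept };
}

function meanSquaredError(model: Model, examples: Example[]): number {
  const total = examples.reduce(
    (sum, { x, y }) => sum + (model.slope * x + model.intercept - y) ** 2, 0);
  return total / examples.length;
}

// Hyperparameter search: pick the learning rate that performs best on validation data.
function selectLearningRate(training: Example[], validation: Example[]): number {
  const candidates = [0.001, 0.01, 0.1];
  let best = candidates[0];
  let bestError = Infinity;
  for (const lr of candidates) {
    const err = meanSquaredError(train(training, lr), validation);
    if (err < bestError) { bestError = err; best = lr; }
  }
  return best;
}
```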
  • FIG. 1 is a block diagram of an exemplary computing device 100 for generating a column and/or row oriented data structure repository for data consistent with some embodiments.
  • the computing device 100 may include processing circuitry 110, such as, for example, a central processing unit (CPU).
  • the processing circuitry 110 may include, or may be a component of, a larger processing unit implemented with one or more processors.
  • the one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate array (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information.
  • the processing circuitry, such as processing circuitry 110, may be coupled via a bus 105 to a memory 120.
  • the memory 120 may further include a memory portion 122 that may contain instructions that, when executed by the processing circuitry 110, may perform the method described in more detail herein.
  • the memory 120 may be further used as a working scratch pad for the processing circuitry 110, a temporary storage, and others, as the case may be.
  • the memory 120 may be a volatile memory such as, but not limited to, random access memory (RAM), or non-volatile memory (NVM), such as, but not limited to, flash memory.
  • the processing circuitry 110 may be further connected to a network device 140, such as a network interface card, for providing connectivity between the computing device 100 and a network, such as a network 210, discussed in more detail with respect to FIG. 2 below.
  • the processing circuitry 110 may be further coupled with a storage device 130.
  • the storage device 130 may be used for the purpose of storing single data type column-oriented data structures, data elements associated with the data structures, or any other data structures. While illustrated in FIG. 1 as a single device, it is to be understood that storage device 130 may include multiple devices either collocated or distributed.
  • the processing circuitry 110 and/or the memory 120 may also include machine-readable media for storing software.
  • “Software” as used herein refers broadly to any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, may cause the processing system to perform the various functions described in further detail herein.
  • FIG. 2 is a block diagram of computing architecture 200 that may be used in connection with various disclosed embodiments.
  • the computing device 100 may be coupled to network 210.
  • the network 210 may enable communication between different elements that may be communicatively coupled with the computing device 100, as further described below.
  • the network 210 may include the Internet, the world-wide-web (WWW), a local area network (LAN), a wide area network (WAN), a metro area network (MAN), and other networks capable of enabling communication between the elements of the computing architecture 200 .
  • the computing device 100 may be a server deployed in a cloud computing environment.
  • One or more user devices 220-1 through 220-m may be communicatively coupled with the computing device 100 via the network 210.
  • a user device 220 may be, for example, a smart phone, a mobile phone, a laptop, a tablet computer, a wearable computing device, a personal computer (PC), a smart television, and the like.
  • a user device 220 may be configured to send to and receive from the computing device 100 data and/or metadata associated with a variety of elements associated with single data type column-oriented data structures, such as columns, rows, cells, schemas, and the like.
  • One or more data repositories 230-1 through 230-n may be communicatively coupled with the computing device 100 via the network 210, or embedded within the computing device 100.
  • Each data repository 230 may be communicatively connected to the network 210 through one or more database management services (DBMS) 235-1 through 235-n.
  • the data repository 230 may be, for example, a storage device containing a database, a data warehouse, and the like, that may be used for storing data structures, data items, metadata, or any information, as further described below.
  • one or more of the repositories may be distributed over several physical storage devices, e.g., in a cloud-based computing environment. Any storage device may be a network accessible storage device, or a component of the computing device 100 .
  • FIG. 3 is an exemplary embodiment of a presentation of an electronic collaborative word processing document 301 via an editing interface or editor 300 .
  • the editor 300 may include any user interface components 302 through 312 to assist with input or modification of information in an electronic collaborative word processing document 301.
  • editor 300 may include an indication of an entity 312, which may include at least one individual or group of individuals associated with an account for accessing the electronic collaborative word processing document.
  • User interface components may provide the ability to format a title 302 of the electronic collaborative word processing document, select a view 304 , perform a lookup for additional features 306 , view an indication of other entities 308 accessing the electronic collaborative word processing document at a certain time (e.g., at the same time or at a recorded previous time), and configure permission access 310 to the electronic collaborative word processing document.
  • the electronic collaborative word processing document 301 may include information that may be organized into blocks as previously discussed. For example, a block 320 may itself include one or more blocks of information. Each block may have similar or different configurations or formats according to a default or according to user preferences.
  • block 322 may be a “Title Block” configured to include text identifying a title of the document, and may also contain, embed, or otherwise link to metadata associated with the title.
  • a block may be pre-configured to display information in a particular format (e.g., in bold font).
  • Other blocks in the same electronic collaborative word processing document 301 such as compound block 320 or input block 324 may be configured differently from title block 322.
  • the platform may provide an indication of the entity 318 responsible for inputting or altering the information.
  • the entity responsible for inputting or altering the information in the electronic collaborative word processing document may include any entity accessing the document, such as an author of the document or any other collaborator who has permission to access the document.
  • In a collaborative word processing document, multiple users may simultaneously edit a single document in real time or near real time. Edits by a first user in one section of a document may interfere with the display of a second editor making edits to the same document, which may hamper the second editor's ability to make simultaneous edits in the document.
  • the problem may be compounded when large groups make simultaneous edits to the same document, or when one user adds a large amount of content to the document.
  • the introduction of text, graphics, or other objects to an earlier page in a collaborative word processing document may adjust the location of text or objects in a later page of the document or may shift a user's viewport so that the user's active editing location is no longer within the user's view. This reduces efficiency in collaboration between users and may lead to unintended editing errors by the user. Therefore, there is a need for unconventional innovations for managing display interference in an electronic collaborative word processing document to enable multiple users to simultaneously edit a collaborative word processing document.
  • Such unconventional approaches may enable computer systems to implement functions to improve the efficiency of electronic collaborative word processing documents.
  • a system may provide display locking techniques to increase the efficiency of electronic collaborative word processing documents.
  • Various embodiments of the present disclosure describe unconventional systems, methods, and computer readable media for managing display interference in an electronic collaborative word processing document.
  • Various embodiments of the present disclosure may include at least one processor configured to access the electronic collaborative word processing document, present a first instance of the electronic collaborative word processing document via a first hardware device running a first editor, and present a second instance of the electronic collaborative word processing document via a second hardware device running a second editor.
  • the at least one processor may be configured to receive from the first editor during a common editing period, first edits to the electronic collaborative word processing document made on an earlier page of the electronic collaborative word processing document that result in a pagination change.
  • the at least one processor may be further configured to receive from the second editor during the common editing period, second edits to the electronic collaborative word processing document made on a second page of the electronic collaborative word processing document later than the first page.
  • the at least one processor may be configured to, during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits received by the second hardware device, and upon receipt of a scroll-up command via the second editor during the common editing period, cause the display associated with the second hardware device to reflect the pagination change caused by the first edits.
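  • One way to picture the lock-and-release behavior is the sketch below: pagination changes arriving from the first editor are suppressed while the second editor works on a later page, and the queued changes are reflected when the second editor issues a scroll-up command. The class and method names are illustrative assumptions, not the claimed implementation.

```typescript
// Sketch: a per-display lock that suppresses remote pagination changes during a
// common editing period and replays them when the user scrolls up.
interface PaginationChange {
  page: number;        // earlier page on which the remote edit occurred
  deltaPx: number;     // vertical space the edit added or removed
}

class LockedDisplayPane {
  private pendingChanges: PaginationChange[] = [];

  constructor(
    private activePage: number,                      // page the local (second) editor is working on
    private applyToViewport: (deltaPx: number) => void,
  ) {}

  // Called when first edits arrive from the other editor during the common editing period.
  onRemotePaginationChange(change: PaginationChange): void {
    if (change.page < this.activePage) {
      this.pendingChanges.push(change);              // lock: suppress the change so the local view does not jump
    } else {
      this.applyToViewport(change.deltaPx);          // changes below the active page are safe to show
    }
  }

  // Called when the local editor issues a scroll-up command.
  onScrollUp(): void {
    // Release: reflect every suppressed pagination change before scrolling.
    for (const change of this.pendingChanges) {
      this.applyToViewport(change.deltaPx);
    }
    this.pendingChanges = [];
  }
}
```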
  • the various embodiments in the present disclosure describe at least a technological solution, based on improvements to operations of computer systems and platforms, to the technical challenge of managing display interference caused by simultaneous edits to an electronic collaborative word processing document.
  • Display interference may refer to an undesirable adjustment of a viewing display, or editing location within an electronic collaborative word processing document caused by edits made by another user, or by any other alterations in the electronic collaborative word processing document.
  • Display interference may include any shift in the location of information or data displayed within an electronic collaborative word processing document. For example, a user may be editing paragraph “A” on a second page of a collaborative word processing document. Another user may add two pages of text on a first page of the same collaborative word processing document. The addition of two pages of text to the collaborative word processing document may cause paragraph “A” to move to a fourth page in the collaborative word processing document that is out of the current view of the first user.
  • Display interference is not limited to an unwanted shift of an active editing location outside of the current viewport.
  • Display interference may include unwanted shifts of an active editing location within a viewport.
  • display interference may include the addition of a single line of text to a collaborative word processing document that causes paragraph “A” to move one line of text down in the collaborative word processing document, with paragraph “A” either remaining wholly or partially within the current viewport.
  • Display interference is not limited to vertical shifts in information or data displayed within an electronic collaborative word processing document and may include horizontal shifts or a combination of vertical and horizontal shifts in the display of information or data caused by other edits within the document.
  • display interference is not limited to movement in the location of information or data in an active editing location and may include the movement in the location of any information or data within a collaborative word processing document.
  • Managing display interference may include any steps taken by the system to resolve display interference that may occur on one or more displays of one or more users accessing an electronic collaborative word processing document, which is discussed in further detail below.
  • An electronic collaborative word processing document may be a file read by a computer program that provides for the input, editing, formatting, display, and output of text, graphics, widgets, objects, tables, or other elements typically used in computer desktop publishing applications.
  • An electronic collaborative word processing document may be stored in one or more repositories connected to a network accessible by one or more users via at least one associated computing device.
  • one or more users may simultaneously edit an electronic collaborative word processing document, with all users' edits displaying in real-time or near real time within the same collaborative word processing document file.
  • the one or more users may access the electronic collaborative word processing document through one or more user devices connected to a network.
  • An electronic collaborative word processing document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document.
  • this disclosure refers to electronic collaborative word processing documents, the systems, methods, and techniques disclosed herein are not limited to word processing documents and may be adapted for use in other productivity applications such as documents, presentations, worksheets, databases, charts, graphs, digital paintings, electronic music and digital video or any other application software used for producing information.
  • FIG. 3 is an exemplary embodiment of a presentation of an electronic collaborative word processing document 301 via an editing interface or editor 300 .
  • the editor 300 may include any user interface components 302 through 312 to assist with input or modification of information in an electronic collaborative word processing document 301 .
  • editor 300 may include an indication of an entity 312 , which may include at least one individual or group of individuals associated with an account for accessing the electronic collaborative word processing document.
  • User interface components may provide the ability to format a title 302 of the electronic collaborative word processing document, select a view 304 , perform a lookup for additional features 306 , view an indication of other entities 308 accessing the electronic collaborative word processing document at a certain time (e.g., at the same time or at a recorded previous time), and configure permission access 310 to the electronic collaborative word processing document.
  • the electronic collaborative word processing document 301 may include information that may be organized into blocks as previously discussed. For example, a block 320 may itself include one or more blocks of information. Each block may have similar or different configurations or formats according to a default or according to user preferences.
  • block 322 may be a “Title Block” configured to include text identifying a title of the document, and may also contain, embed, or otherwise link to metadata associated with the title.
  • a block may be pre-configured to display information in a particular format (e.g., in bold font).
  • Other blocks in the same electronic collaborative word processing document 301 such as compound block 320 or input block 324 may be configured differently from title block 322 .
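  • By way of a non-limiting illustration, the following TypeScript sketch shows one possible way such blocks could be modeled in an implementation, with a unique identifier, a type, optional formatting defaults, and optional nested blocks; all type and field names are hypothetical and are not prescribed by this disclosure.

```typescript
// Hypothetical block model for an electronic collaborative word processing document.
// Field and type names are illustrative only; the disclosure does not mandate a schema.

type BlockType = "title" | "paragraph" | "compound" | "input" | "chart" | "widget";

interface BlockFormat {
  bold?: boolean;          // e.g., a title block pre-configured to display text in bold font
  fontSize?: number;
  alignment?: "left" | "center" | "right";
}

interface Block {
  id: string;              // unique block identification ("ID")
  type: BlockType;
  text?: string;           // textual content, if any
  format?: BlockFormat;    // per-block formatting defaults or user preferences
  children?: Block[];      // a compound block may itself include one or more blocks
  lastEditedBy?: string;   // entity responsible for inputting or altering the information
}

// Example arrangement: a title block followed by a compound block containing an input block.
const sampleDocument: Block[] = [
  { id: "b-322", type: "title", text: "Project Plan", format: { bold: true } },
  {
    id: "b-320",
    type: "compound",
    children: [{ id: "b-324", type: "input", text: "First draft paragraph." }],
  },
];
```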
  • the platform may provide an indication of the entity 318 responsible for inputting or altering the information.
  • the entity responsible for inputting or altering the information in the electronic collaborative word processing document may include any entity accessing the document, such as an author of the document or any other collaborator who has permission to access the document.
  • Some disclosed embodiments may include accessing an electronic collaborative word processing document.
  • An electronic collaborative word processing document may be stored in one or more data repositories and the document may be retrieved by one or more users for downloading, receiving, processing, editing, or viewing the electronic collaborative word processing document.
  • An electronic collaborative word processing document may be accessed by a user using a user device through a network.
  • Accessing an electronic collaborative word processing document may involve retrieving data through any electrical medium such as one or more signals, instructions, operations, functions, databases, memories, hard drives, private data networks, virtual private networks, Wi-Fi networks, LAN or WAN networks, Ethernet cables, coaxial cables, twisted pair cables, fiber optics, public switched telephone networks, wireless cellular networks, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or any other suitable communication method that provides a medium for exchanging data.
  • accessing information may include adding, editing, deleting, re-arranging, or otherwise modifying information directly or indirectly from the network.
  • a user may access the electronic collaborative word processing document using a user device, which may include a computer, laptop, smartphone, tablet, VR headset, smart watch, or any other electronic display device capable of receiving and sending data.
  • accessing the electronic word processing document may include retrieving the electronic word processing document from a web browser cache. Additionally or alternatively, accessing the electronic word processing document may include connecting with a live data stream of the electronic word processing document from a remote source. In some embodiments, accessing the electronic word processing document may include logging into an account having a permission to access the document.
  • accessing the electronic word processing document may be achieved by interacting with an indication associated with the electronic word processing document, such as an icon or file name, which may cause the system to retrieve (e.g., from a storage medium) a particular electronic word processing document associated with the indication.
  • an electronic collaborative word processing document may be stored in repository 230 - 1 as shown in FIG. 2 .
  • Repository 230 - 1 may be configured to store software, files, or code, such as electronic collaborative word processing documents developed using computing device 100 or user device 220 - 1 .
  • Repository 230 - 1 may further be accessed by computing device 100 , user device 220 - 1 , or other components of system 200 for downloading, receiving, processing, editing, or viewing the electronic collaborative word processing document.
  • Repository 230 - 1 may be any suitable combination of data storage devices, which may optionally include any type or combination of slave databases, load balancers, dummy servers, firewalls, back-up databases, and/or any other desired database components.
  • repository 230 - 1 may be employed as a cloud service, such as a Software as a Service (SaaS) system, a Platform as a Service (PaaS), or Infrastructure as a Service (IaaS) system.
  • repository 230 - 1 may be based on infrastructure of services of Amazon Web Services™ (AWS), Microsoft Azure™, Google Cloud Platform™, Cisco Metapod™, Joyent™, vmWare™, or other cloud computing providers.
  • Repository 230 - 1 may include other commercial file sharing services, such as Dropbox™, Google Docs™, or iCloud™.
  • repository 230 - 1 may be a remote storage location, such as a network drive or server in communication with network 210 .
  • repository 230 - 1 may also be a local storage device, such as local memory of one or more computing devices (e.g., computing device 100 ) in a distributed computing environment.
  • Some disclosed embodiments may include presenting a first instance of an electronic collaborative word processing document.
  • Presenting an instance of an electronic word processing document may include causing a display of the information contained in the electronic word processing document via a display device.
  • An electronic collaborative word processing document may be presented in multiple instances on multiple user devices. Presenting multiple instances of the electronic collaborative word processing document on multiple devices may facilitate collaborative editing of the same document because multiple users may access and edit the same document file at the same time from different user devices.
  • a first instance of the electronic collaborative word processing document may include the presentation of data and information contained in the electronic collaborative word processing document to a first user.
  • a user may view or edit a first instance of the electronic collaborative word processing document and the user may control the location of the user's view (e.g., an active display window) or edits in the first instance of the electronic collaborative word processing document.
  • This location may be independent or distinct from other users' views or editing locations in any other instance of the electronic collaborative word processing document.
  • edits made by a user in an instance of the electronic collaborative word processing document are synchronized in real time or near-real time to all other instances of the same electronic collaborative word processing document.
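  • As a non-limiting sketch of how such synchronization might be arranged, the following TypeScript example propagates an edit submitted from one instance to every other open instance of the document; the Edit and EditorInstance shapes are assumptions, and the disclosure does not specify a particular synchronization protocol.

```typescript
// Minimal sketch of synchronizing edits across instances of a collaborative document.
// All names and structures here are illustrative assumptions.

interface Edit {
  blockId: string;
  authorId: string;
  newText: string;
  timestamp: number;
}

interface EditorInstance {
  instanceId: string;
  apply(edit: Edit): void; // re-render the local viewport to include the edit
}

class DocumentSession {
  private instances = new Map<string, EditorInstance>();

  open(instance: EditorInstance): void {
    this.instances.set(instance.instanceId, instance);
  }

  // An edit submitted from any one instance is applied to every other instance,
  // so all users see changes in real time or near real time.
  submitEdit(sourceInstanceId: string, edit: Edit): void {
    for (const [id, instance] of this.instances) {
      if (id !== sourceInstanceId) instance.apply(edit); // the source already rendered it locally
    }
  }
}
```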
  • a first instance of an electronic collaborative word processing document may be presented via a first hardware device running a first editor.
  • a first hardware device may include a computer, laptop, smartphone, tablet, VR headset, smart watch, or any other electronic display device capable of receiving and sending data.
  • a first editor may be a user interface that provides for the input, editing, formatting, display, and output of text, graphics, widgets, objects, tables, or other elements in an electronic word processing document.
  • a first editor may receive user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data.
  • a user accesses an electronic collaborative word processing document using a computer and views the document in an editor that receives text and other input via a mouse and keyboard.
  • FIG. 4 illustrates an instance of a collaborative electronic word processing document presented within an editor 400 .
  • editor 400 may be displayed by a computing device (e.g., the computing device 100 illustrated in FIG. 1 ), software running thereon, or any other projecting device (e.g., projector, AR or VR lens, or any other display device), as previously discussed.
  • Editor 400 may include various tools for displaying information associated with the document or for editing the document. For example, editor 400 may display a title 402 indicating the title of the document.
  • Formatting bar 404 may depict various tools to adjust formatting of information or objects within the document.
  • Help bar 406 may be included which may provide hyperlinks to information about various features of the editor 400 .
  • Share button 410 may be included to invite additional users to edit another instance of the collaborative electronic word processing document.
  • Editor 400 may include tool bar 412 and interface bar 414 .
  • Some disclosed embodiments may include presenting a second instance of the electronic collaborative word processing document.
  • Presenting a second instance of the electronic collaborative word processing document may be achieved in the same or similar manner as presenting a first instance of the electronic collaborative word processing document, as discussed above.
  • Presenting a second instance may include the display of data and information contained in the electronic collaborative word processing document to a second user.
  • a second user may view or edit a second instance of the electronic collaborative word processing document and the second user may control the location of the second user's view or edits in the second instance of the electronic collaborative word processing document.
  • Views presented and edits made in the second instance of the electronic collaborative word processing document may be made independently of the views presented or edits made by other users in any other instance, such as in the first instance discussed previously above.
  • the first instance and the second instance of the electronic collaborative word processing document may display different portions of the document and may receive edits to the electronic collaborative word processing document at different locations within the document. Edits made by a user in the first or the second instance of the electronic collaborative word processing document may be incorporated into other instances of the electronic collaborative word processing document in real time. In some embodiments, the first instance and the second instance of the electronic collaborative word processing document may share a common viewport displaying some of the same data and information in both the first and second instances of the document. Edits made in the first or second instance may be demarcated by user identification indicators in the first and second instance.
  • User identification indicators may include a graphic, a user ID indicator, a color, a font, or any other differentiator that indicates the source of an edit in an instance of the electronic collaborative word processing document.
  • the second instance of the electronic collaborative word processing document may be presented via a second hardware device running a second editor, in a similar manner to the first hardware device and the first editor described herein. Any number of hardware devices may run an editor to access another instance of the electronic collaborative word processing document.
  • editor 400 may indicate that multiple users are accessing an electronic collaborative word processing document through the display of a user indicator, such as user display indicator 408 , which indicates that two users are running an instance of the electronic collaborative word processing document.
  • Editor 400 may include current user indicator 416 .
  • Current user indicator 416 may indicate the identification of the user running the displayed instance of the collaborative word processing document.
  • the objects and information displayed for editing may be controlled by the current user shown in 416 in each instance of the electronic collaborative word processing document.
  • FIG. 4 may depict an editing location that is actively edited by the current user, such as editing location 424 . Editing location 424 may be a block as described herein. Other blocks may be shown in the viewport of editor 400 but may not be the active editing location.
  • FIG. 4 includes Title block 422 and paragraph block 420 which are not actively being edited by the user.
  • the location that a different user is actively editing in another instance of the electronic collaborative word processing document may be indicated by icon 418 , which may indicate the active working location of another user, which in this example is paragraph block 420 .
  • Some embodiments may include receiving from a first editor during a common editing period, first edits to an electronic collaborative word processing document.
  • a common editing period may include a time when at least two instances of the electronic collaborative word processing document are presented in two editors.
  • a common editing period may include two users each viewing and editing the same electronic collaborative word processing document in two instances displayed on separate hardware devices associated with each of the two users.
  • a common editing period is not limited to situations when two users are editing a document and may include any number of users editing a document in real or near real time.
  • An edit to an electronic collaborative word processing document may include the addition, manipulation, or deletion of objects or data, and may include addition, manipulation, or deletion of text, graphics, tables, images, formatting, highlights, manipulation of fonts, icons, shapes, references, headers, footers, or any other addition, deletion, or manipulation of objects or any other data within the electronic word processing document.
  • Receiving the first edits may include the system receiving an edit request from a computing device associated with a user. The request may be transmitted over a network to a repository where the electronic collaborative word processing document is stored. At least one processor may then perform a lookup of permission settings to confirm whether the computing device has authorization to make the edit. In a situation where authorization is confirmed, the system may then implement and store the edit with the electronic collaborative word processing document such that any other computing devices accessing the document may retrieve the document with the implemented change.
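  • The following TypeScript sketch illustrates, in a non-limiting manner, the edit-handling flow described above: receive an edit request, confirm through a permission lookup that the requesting device is authorized, and then implement and store the change so that other devices retrieve the updated document; all interfaces and names are hypothetical.

```typescript
// Sketch of receiving and applying an edit request; interfaces are assumptions only.

interface EditRequest {
  documentId: string;
  deviceId: string;
  blockId: string;
  newText: string;
}

interface PermissionStore {
  canEdit(documentId: string, deviceId: string): Promise<boolean>;
}

interface DocumentRepository {
  applyEdit(documentId: string, blockId: string, newText: string): Promise<void>;
}

async function handleEditRequest(
  request: EditRequest,
  permissions: PermissionStore,
  repository: DocumentRepository,
): Promise<boolean> {
  // Look up permission settings to confirm the device may make the edit.
  const authorized = await permissions.canEdit(request.documentId, request.deviceId);
  if (!authorized) return false;

  // Implement and store the edit so that other devices accessing the document
  // retrieve it with the implemented change.
  await repository.applyEdit(request.documentId, request.blockId, request.newText);
  return true;
}
```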
  • edits made during a common editing period may be transmitted or received through a communications interface.
  • a communications interface may be a platform capable of sending and retrieving data through any electrical medium such as the types described herein that manage and track edits made in a collaborative electronic word processing document from one or more editors.
  • the communications interface may be integrated with the electronic collaborative word processing document editor.
  • protocols may be incorporated into the editor that manage exchanges of data between multiple editors running one or more instances of the electronic collaborative word processing document.
  • the communications interface may be separate from the editor and may run on separate hardware devices.
  • a communications interface may run on a computing device, such as computing device 100 (of FIG. 1 ), and may transmit or receive edits made by a first editor running on user device 220 - 1 and a second editor running on user device 220 - 2 through network 210 (of FIG. 2 ). More broadly, a communications interface may refer to any platform capable of transmitting or receiving edits made to an electronic collaborative word processing document through a network or other electronic medium.
  • first edits may occur on a first earlier page of an electronic collaborative word processing document and result in a pagination change.
  • a pagination change may include any alteration to a length of an electronic document, such as by a line of text, a page of text, or multiple pages of text.
  • the pagination change may be a result of an addition, deletion, rearrangement, or any other modification to the information in the electronic collaborative word processing document.
  • data and objects in the electronic collaborative word processing document may be arranged in a publication display format that depicts the display of data and objects on printed pages, such as the display found in desktop publishing applications or other editing software.
  • Objects and data may be arranged so that pages of data are displayed sequentially, for example, in a vertical or a horizontal arrangement of the display of pages.
  • a pagination change may occur when edits include the addition or arrangement of content in the document that causes certain data and content in the document to move to another page, or to move to another location on the same page.
  • a document may contain paragraph “A” located in the middle of the second page of the document.
  • First edits may occur on the first page of the document that introduce the addition of two additional pages of text. This may result in a pagination change of paragraph “A,” which may move from page two to page four in the document.
  • a pagination change is not limited to the movement of objects and data from one page to another and may include movements of objects and data within the same page either by a single line, part of a line, a paragraph, or horizontally within a single line. More broadly, a pagination change may refer to any adjustment in the location of objects or text within the location of a page in the collaborative electronic word processing document.
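  • As a non-limiting illustration of how a pagination change might be detected, the following TypeScript sketch computes the page on which each block starts from per-block heights and then checks whether an edit moved any existing block to a different page; the fixed page height and the height-per-block layout model are assumptions, not requirements of this disclosure.

```typescript
// Illustrative pagination check under an assumed layout model.

const PAGE_HEIGHT = 1000; // arbitrary layout units per page

interface LaidOutBlock {
  id: string;
  height: number;
}

// Returns a map from block id to the 1-based page on which the block starts.
function paginate(blocks: LaidOutBlock[]): Map<string, number> {
  const pages = new Map<string, number>();
  let offset = 0;
  for (const block of blocks) {
    pages.set(block.id, Math.floor(offset / PAGE_HEIGHT) + 1);
    offset += block.height;
  }
  return pages;
}

// A pagination change occurs when any block present in both layouts starts on a
// different page after the edit than it did before.
function hasPaginationChange(
  before: Map<string, number>,
  after: Map<string, number>,
): boolean {
  for (const [id, page] of after) {
    const previous = before.get(id);
    if (previous !== undefined && previous !== page) return true;
  }
  return false;
}

// Example: first edits insert roughly two pages of text ahead of paragraph "A",
// moving it from page 2 to page 4.
const beforeLayout = paginate([{ id: "intro", height: 1400 }, { id: "A", height: 400 }]);
const afterLayout = paginate([
  { id: "intro", height: 1400 },
  { id: "inserted", height: 2000 },
  { id: "A", height: 400 },
]);
hasPaginationChange(beforeLayout, afterLayout); // true
```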
  • Some disclosed embodiments may include receiving from a second editor during the common editing period, second edits to an electronic collaborative word processing document.
  • Second edits may include the addition, manipulation, or deletion of objects or data, and may include addition, manipulation, or deletion of text, graphics, tables, images, formatting, highlights, manipulation of fonts, icons, shapes, references, headers, footers, or any other addition, deletion, or manipulation of objects or data within the electronic word processing document as previously discussed.
  • second edits refer to edits made in a second instance of the collaborative electronic word processing document. Second edits may occur either earlier in time, later in time, or simultaneously with first edits and are not limited to edits occurring later in time than first edits in the document.
  • second edits may occur on a second page of an electronic collaborative word processing document later than a first page.
  • objects and data within the electronic collaborative word processing document may be displayed on sequentially arranged pages, for example, in a vertical or a horizontal arrangement of the display of pages.
  • Second edits may occur on a second page in that the second page is arranged sequentially after edits from the first page in the document.
  • a second page may be later than a first page if it occurs anywhere in the sequence of pages after the first edits.
  • second edits in the document may occur on page 4 of the document and first edits may occur on page 2.
  • the second edits on the second page occur later than the first page in that page 4 is displayed sequentially after page 2.
  • first and second edits may occur on the same page, with the second edits occurring sequentially after the first edits within the same page. For example, if a second edit occurs lower on a page than a first edit, then the second edit may be considered later than first edits.
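  • A minimal, non-limiting sketch of this ordering comparison is shown below in TypeScript: second edits are treated as "later" when they fall on a subsequent page or lower on the same page; the position structure is a hypothetical assumption.

```typescript
// Hypothetical position comparison for determining whether edits occur "later".

interface EditPosition {
  page: number;          // 1-based page index
  offsetOnPage: number;  // vertical offset from the top of the page, in layout units
}

function isLater(second: EditPosition, first: EditPosition): boolean {
  if (second.page !== first.page) return second.page > first.page;
  return second.offsetOnPage > first.offsetOnPage; // same page: lower position counts as later
}

// Example: second edits on page 4 are later than first edits on page 2.
isLater({ page: 4, offsetOnPage: 120 }, { page: 2, offsetOnPage: 600 }); // true
```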
  • second edits may be associated with a block in an electronic collaborative word processing document.
  • the electronic collaborative word processing document may organize objects and data into blocks.
  • a block may be any collection of objects or data within the electronic collaborative word processing document, as described herein.
  • the electronic collaborative word processing document may contain one or more title blocks which display formatting text information.
  • Blocks may include other text portions of the document, such as a sentence, a group of sentences, a paragraph, or a collection of paragraphs, or any grouping of text.
  • Blocks are not limited to text alone, and objects such as charts, graphics, widgets, objects, or tables, or any other component in the document may be recognized as a block.
  • Some disclosed embodiments may include recognizing an active work location of the second editor.
  • An active work location of an editor may include any portion of the electronic collaborative word processing document displayed in or receiving edits from the editor.
  • a user may be editing a portion of the electronic word processing document using an instance of an editor, and the active work location may correspond to the location of the edits.
  • a user may be viewing a portion of the document, and the active work location may correspond to the location of the viewport displayed by the editor.
  • there may be multiple active work locations, for example, when a user is editing one portion of the electronic word processing document while viewing a second portion of the electronic word processing document, such as by using multiple viewports or by scrolling away from the edit location.
  • Recognizing the active work location may be performed in various ways and may include any process of determining at least a portion of an electronic collaborative word processing document for display or alteration on a computing device.
  • the recognition of the active work location may be based on a cursor location in the second instance of the collaborative electronic word processing document.
  • a cursor location may include any indication of a location on a display that represents an intent to interact (e.g., manipulate text, select data objects, view information, activate a link, or any other interaction) with the location at which the indication is presented in the display.
  • the cursor location may be displayed visually or may be omitted from display according to preference.
  • a cursor location may be determined by an editing location or by a hovering location.
  • a user may be editing a document at the location of an editing cursor and the system may recognize the cursor as the active work location.
  • the system may recognize adjacent objects and data around the cursor location as included in the active work location. For example, adjacent letters, words, sentences, or paragraphs near the cursor may be included as part of the active work location depending on certain contexts.
  • a user may use a device (e.g., a mouse) to move a cursor location and hover over a certain portion of a collaborative electronic word processing document without selecting a specific location for editing, such as a scrolling location.
  • the recognition of the active work location may be based on a scrolling location in the second instance of the collaborative electronic word processing document.
  • a scrolling location may include any displayed portion of the collaborative electronic word processing document, which may be displayed independently of the editing cursor location.
  • the system may recognize a location within the viewport as the active work location.
  • a scrolling location may be recognized in various ways. For example, determining a scrolling location may be based on an amount of time a viewport displays a location of the document, based on a distance away from the editing cursor, or based on user preferences.
  • the recognition of the active work location may be based on a block location in the second instance of the collaborative electronic word processing document.
  • a block location may include a relative or absolute position of a block within an electronic collaborative word processing document.
  • each block within the electronic collaborative word processing document may include a unique block identification (“ID”) with associated location data.
  • the associated location data may determine a block's location within the electronic collaborative word processing document.
  • the location data may describe a block's location with respect to other blocks, describe a sequence of blocks for display, or describe a block's intended position within a document based on distances from margins or other blocks or a combination of these factors.
  • the system may recognize that a block is an active work location based on the location of edits or the viewport displayed in the second editor, or any other way based on data received by the editor.
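  • One possible, non-limiting way to combine these signals is sketched below in TypeScript: the block containing the editing cursor is preferred, with a fall-back to the visible block that has been displayed the longest; the state fields and the preference flag are assumptions for illustration only.

```typescript
// Sketch of recognizing an active work location from cursor, scrolling, and block signals.

interface EditorState {
  cursorBlockId: string | null;          // block containing the editing cursor, if any
  viewportBlockIds: string[];            // blocks currently visible in the viewport
  viewportDwellMs: Map<string, number>;  // how long each visible block has been shown
  preferCursorOverScroll: boolean;       // hypothetical user preference
}

function recognizeActiveWorkLocation(state: EditorState): string | null {
  // Cursor-based recognition: the block being edited is the active work location.
  if (state.preferCursorOverScroll && state.cursorBlockId) {
    return state.cursorBlockId;
  }
  // Scroll-based recognition: the visible block displayed for the longest time.
  let bestId: string | null = state.cursorBlockId;
  let bestDwell = -1;
  for (const id of state.viewportBlockIds) {
    const dwell = state.viewportDwellMs.get(id) ?? 0;
    if (dwell > bestDwell) {
      bestDwell = dwell;
      bestId = id;
    }
  }
  return bestId;
}
```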
  • FIG. 4 shows active work location 424 indicated by the user's cursor 425 positioned in a text block.
  • Distance 426 may indicate the positioning of the active work area 424 from the edge of the display or viewport.
  • Data associated with the active work location 424 and blocks, such as blocks 420 , 422 , and 424 may be stored in a repository, such as repository 230 - 1 , so that the system can track the positioning of the active work location 424 and block locations 420 , 422 , and 424 in relation to a first or second instance of the editor.
  • Location data of the user's cursor 425 , or of the user's scrolling location, may also be stored in a repository.
  • the user's scrolling location may be defined by a collection or grouping of blocks. In the example shown in FIG. 4 , the user's scrolling location contains the collection of blocks 422 and 424 .
  • the system may record an active work location for the first user and cause the system to display information from the stored collaborative electronic word processing document at that first active work location to computing device 220 - 1 .
  • the system may recognize a second active work location for the second user and cause the second computing device 220 - 2 to display only the second active work location independently from the display of the first computing device 220 - 1 .
  • Locking a display may refer to fixing the location of objects and information in a viewport of a hardware device. For example, the location of objects and information depicted on a screen of the second hardware device may shift during operation of an editor. When a display is locked, the location of objects and information depicted on a screen of the second hardware device may remain in a location that does not change, independent of the location of the objects and information in relation to their placement in a document. In some embodiments, locking a display may indicate that objects and information depicted on a screen are fixed at the pixel level.
  • locking a display may indicate that objects and information depicted on a screen are fixed with respect to a determined measurement taken from the boundaries of the viewport. In yet other embodiments, locking a display may indicate that objects and information depicted on a screen are fixed with respect to one direction but may not be fixed with respect to another direction. For example, a display may depict a block at a location in the document. Locking a display may fix the distance between the first line of the block and a boundary of the display but edits to the block may cause the distance from other lines of the block to the boundary of the display to change. Locking a display is not limited to fixing pixel locations or distances between the boundaries of the viewport and blocks but may include any fixing of the display with respect to any objects or information within the electronic collaborative word processing document.
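  • A minimal, non-limiting sketch of one way such locking could be implemented is shown below in TypeScript: the scroll position is compensated by any change in the height of content preceding the active (anchor) block, so the anchor keeps a constant distance from the viewport boundary; the layout model and names are assumptions.

```typescript
// Sketch of display locking by scroll compensation under an assumed layout model.

interface Viewport {
  scrollTop: number; // current vertical scroll offset of the display, in layout units
}

// Height of all content that appears before the anchor (active) block in the document.
type HeightAbove = (anchorBlockId: string) => number;

function lockDisplay(
  viewport: Viewport,
  anchorBlockId: string,
  heightAboveBefore: number, // measured before the remote (first) edits were applied
  heightAbove: HeightAbove,  // measured after the remote edits were applied
): void {
  // If earlier edits inserted or removed content above the anchor, the anchor would
  // shift on screen by that delta. Adjusting scrollTop by the same delta keeps the
  // distance from the anchor to the viewport boundary constant, suppressing the shift.
  const delta = heightAbove(anchorBlockId) - heightAboveBefore;
  viewport.scrollTop += delta;
}
```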
  • locking a display may suppress a pagination change caused by the first edits received by the second hardware device during a common editing period.
  • a pagination change may be a shift in the location of objects or data from one page in a document to another page based on changes in objects or data on an earlier page, as described previously above.
  • the pagination change may occur as a result of a single user editing a document, or as a result of multiple users editing the document at the same time during a common editing period, as previously discussed above.
  • introduction of objects or information at an earlier location in an electronic collaborative word processing document may cause the location of objects and information located on a later page in the document to shift to another page due to the first edits.
  • Pagination changes may be caused by any edits of objects and data at an earlier page in a document, and may include, as non-limiting examples, introduction, modification, or removal of text, formatting, images, objects, comments, redlines, tables, graphs, charts, references, headers, covers, shapes, icons, models, links, bookmarks, headers, footers, text boxes, or any other objects or data.
  • paragraph “A” on page three of a document may shift to page five in the document if two pages of table data are added to the document at a location before paragraph “A.”
  • locking a display to suppress a pagination change may include fixing the location of objects or information to a location within a page of an electronic collaborative word processing document as described herein.
  • a user may be editing text on a third page in an electronic collaborative word processing document using a second hardware device in a common editing period, and another user may introduce two additional pages of text and graphics at a location earlier in the document using a first hardware device.
  • the system may freeze the location of the text on the third page in a display of the second hardware device and will not adjust the location of this text to a new page based on the edits in an earlier location of the document caused by the first hardware device.
  • FIG. 5 depicts an electronic collaborative word processing document with a locked display at an active work location.
  • FIG. 5 is an example of the same interface in FIG. 4 after first and second edits have been made to the document.
  • second edits have been made by a user operating a second hardware device, such as hardware device 220 - 2 (of FIG. 2 ), at location 506 .
  • Location 506 represents the same active editing location shown in FIG. 4 at 424 .
  • a first user operating a different hardware device 220 - 1 has introduced first edits 504 to the document at a location earlier in the document than the active editing location 506 being edited by the second user on hardware device 220 - 2 .
  • the display shown on hardware device 220 - 2 is locked in that the vertical distance 508 from the active work location to the edge of the display in FIG. 5 is the same distance as vertical distance 426 in FIG. 4 measured prior to the first and second edits.
  • the system has adjusted the location of text earlier in the document shown on hardware device 220 - 2 , such as text in block 502 , while the display is locked.
  • FIG. 6 and FIG. 7 depict another example of locking a display.
  • a display may be locked with the introduction of widgets, figures, or charts at an earlier location in the document.
  • FIG. 6 shows an active work location 64 of a user running an editor.
  • FIG. 7 depicts the same editor later in time after a different user has introduced widgets 706 and 708 in the document.
  • the distance 606 from the active work location to the bottom of the editor before the addition of the widgets and the distance 706 from the active work location to the bottom of the editor after the addition of the widgets remain the same in a locked display.
  • locking the display scrolling associated with the second display may be based on the recognized active work location so as not to interrupt viewing of the active work location.
  • the system may recognize an active work location as described herein and then freeze or lock the display of the active work location at a location on the screen when edits made at an earlier location in the document would otherwise result in a shift in the location of the active work location, as discussed previously. Not interrupting viewing of the active work location may include maintaining the display of the active work location even though other users make alterations to a document. For example, if the active work location is confined to information in a block and the block includes a paragraph, the system may recognize that the paragraph is the active work location and may fix the location of the paragraph in the display.
  • blocks may include header lines, charts, graphs, or widgets or any other objects or information as described herein.
  • the system may recognize that a block that includes a chart is the active work location and may fix the location of the chart in the display.
  • the system may track the relative arrangement of blocks based on certain data associated with the blocks. For example, each block may retain location data that positions that block in relationship to the location of other blocks within the document. This data may be independent of location data associated with the display of information in the electronic collaborative word processing document.
  • the system may compute or record the relative arrangement of the display of blocks within the document by updating data describing the relative position of the blocks but may not update the location of the block associated with the active work location within the document when the display is fixed. In this way, a second editor can receive edits from a first editor that updates block information, including the relative position data associated with the introduction of new blocks at an earlier location in the document, but allows the second editor to lock the display of the active work location.
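  • The following TypeScript sketch illustrates, by way of a non-limiting assumption, how relative block arrangement data could be updated for every editor while a locked editor defers re-rendering its active work location; all class and field names are hypothetical.

```typescript
// Sketch of maintaining block arrangement data independently of a locked display.

interface PositionedBlock {
  id: string;
  order: number; // relative position of the block within the document
}

class BlockIndex {
  constructor(private blocks: PositionedBlock[]) {}

  // Insert a block received from another editor at its relative position and
  // renumber the blocks that follow it.
  insertAt(newBlock: PositionedBlock): void {
    for (const block of this.blocks) {
      if (block.order >= newBlock.order) block.order += 1;
    }
    this.blocks.push(newBlock);
    this.blocks.sort((a, b) => a.order - b.order);
  }
}

class LockedEditorView {
  private pendingRefresh = false;

  constructor(private index: BlockIndex, private activeBlockId: string) {}

  // Remote edits update the shared block arrangement, but the locked view only
  // marks itself stale; the on-screen location of the active block is not changed.
  onRemoteInsert(block: PositionedBlock): void {
    this.index.insertAt(block);
    this.pendingRefresh = true;
  }

  // The locked editor still reports which block is its active work location.
  getActiveBlockId(): string {
    return this.activeBlockId;
  }

  // The deferred refresh may be applied later, e.g., when the lock is released.
  needsRefresh(): boolean {
    return this.pendingRefresh;
  }
}
```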
  • a lock may remain in place until an active work location is changed in a second editor.
  • the active work location may be changed based on user actions, user preferences, or other determinations by the system. For example, the active work location may be changed upon a user moving the cursor to a second location, scrolling to a new location in the document, editing a different block, an amount of time since the last user input, selecting an icon or toggle associated with the display lock, or any other change in the editing location by the user.
  • the display may update to reflect a revised location of the active work location based on edits that occurred at an earlier page in the document.
  • the system may receive a scroll-up command via a second editor during the common editing period.
  • a scroll-up command may be any input from a user that indicates a user intent to change the viewport to display additional information. For example, a user may roll a mouse wheel, click a scroll bar on a document, or provide input through a keyboard, voice headset, haptic controls, or other user device that indicates a user desire to adjust a display.
  • a scroll command, more generally, may be any input indicating a direction in which the system may re-render the viewport to display additional information in relation to the electronic document being displayed in the viewport.
  • receipt of a scroll-up command may cause the display associated with the second hardware device to reflect the pagination change caused by the first edits.
  • Reflecting the pagination change caused by the first edits may include updating or re-rendering the display to reflect a revised location of objects and information currently displayed on the second editor to reflect location changes caused by edits that occurred at an earlier page in the document.
  • for example, if the first editor inserts a new page 2 at an earlier location in the document, the system may lock the second editor's viewport of the electronic collaborative document so that a second user of the second editor is not interrupted and continues to view the information previously displayed on page 2.
  • upon receipt of a scroll-up command, the system may re-render the second editor's viewport to display the newly added page 2 from the first editor in a seamless manner.
  • a scroll-up command that causes a second hardware device to reflect the pagination change may include a scroll to a page other than a page currently displayed on a second display. For example, adjustments to the viewing location of less than one page may not cause the system to reflect the pagination change caused by the first edits.
  • a user may want to view a different part of a page associated with an active work location and may scroll up to another part of the page without changing the viewing page. In this embodiment, the system may not reflect the pagination change caused by first edits on an earlier page.
  • the system may update the display to reflect a revised location of objects and information currently displayed on the second editor to reflect location changes caused by edits that occurred at an earlier page in the document.
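  • A non-limiting sketch of handling such a scroll command is shown below in TypeScript: scrolling to a page other than the currently displayed page releases the lock and re-renders the viewport to reflect the suppressed pagination change, while smaller adjustments within the same page keep the lock in place; the threshold and names are assumptions.

```typescript
// Sketch of releasing a display lock upon a scroll to another page.

interface ScrollCommand {
  targetPage: number; // page the user is scrolling toward
}

interface LockedDisplay {
  currentPage: number;
  locked: boolean;
  rerenderWithPaginationChange(): void; // apply the suppressed pagination change
}

function handleScroll(display: LockedDisplay, command: ScrollCommand): void {
  if (!display.locked) return;
  if (command.targetPage !== display.currentPage) {
    // Scrolling to another page: reflect the pagination change caused by the first edits.
    display.locked = false;
    display.rerenderWithPaginationChange();
    display.currentPage = command.targetPage;
  }
  // Scrolling within the same page: the pagination change remains suppressed.
}
```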
  • FIG. 8 illustrates a block diagram of an example process 800 for managing display interference in an electronic collaborative word processing document. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram.
  • the process 800 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1 ) of a computing device (e.g., the computing device 100 in FIGS. 1 and 2 ) to perform operations or functions described herein and may be described hereinafter with reference to FIGS. 4 to 7 by way of example.
  • some aspects of the process 800 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1 ) or a non-transitory computer-readable medium.
  • some aspects of the process 800 may be implemented as hardware (e.g., a specific-purpose circuit).
  • the process 800 may be implemented as a combination of software and hardware.
  • FIG. 8 includes process blocks 802 to 816 .
  • a processing means may access an electronic collaborative word processing document, as discussed previously in the disclosure above.
  • the processing means may present a first instance of the electronic collaborative word processing document in a first editor, as discussed previously in the disclosure above.
  • the processing means may present a second instance of the electronic collaborative word processing document in a second editor, as discussed previously in the disclosure above.
  • the processing means may receive from the first editor during a common editing period, first edits to the electronic collaborative word processing document, as discussed previously in the disclosure above.
  • the processing means may receive from the second editor during the common editing period, second edits to the electronic collaborative word processing document, as discussed previously in the disclosure above.
  • the processing means may, during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits, as discussed previously in the disclosure above.
  • the processing means may receive a scroll-up command via the second editor during the common editing period, as discussed previously in the disclosure above.
  • the processing means may update the display to reflect the pagination change caused by the first edits, as discussed previously in the disclosure above.
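  • For illustration only, the steps of process 800 may be tied together as in the following TypeScript sketch; every interface and method name here is hypothetical and merely stands in for the kinds of operations discussed above.

```typescript
// Compact, non-limiting sketch of the flow of process 800.

interface CollaborationSystem {
  accessDocument(documentId: string): Promise<void>;
  presentInstance(editorId: "first" | "second", documentId: string): void;
  receiveFirstEdits(): Promise<boolean>;           // resolves true if the first edits cause a pagination change
  receiveSecondEdits(): Promise<void>;
  lockSecondDisplay(): void;                       // suppress the pagination change on the second display
  awaitScrollUpFromSecondEditor(): Promise<void>;
  reflectPaginationChange(): void;                 // update the second display to reflect the change
}

async function runProcess800(system: CollaborationSystem, documentId: string): Promise<void> {
  await system.accessDocument(documentId);       // access the collaborative document
  system.presentInstance("first", documentId);   // first instance in the first editor
  system.presentInstance("second", documentId);  // second instance in the second editor

  const paginationChanged = await system.receiveFirstEdits(); // first edits on an earlier page
  await system.receiveSecondEdits();                          // second edits on a later page

  if (paginationChanged) {
    system.lockSecondDisplay();                   // lock during the common editing period
    await system.awaitScrollUpFromSecondEditor(); // wait for a scroll-up command
    system.reflectPaginationChange();             // then reflect the pagination change
  }
}
```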
  • FIG. 9 illustrates a block diagram of an example process 900 for managing display interference in an electronic collaborative word processing document. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram.
  • the process 900 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1 ) of a computing device (e.g., the computing device 100 in FIGS. 1 and 2 ) to perform operations or functions described herein and may be described hereinafter with reference to FIGS. 4 to 7 by way of example.
  • some aspects of the process 900 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1 ) or a non-transitory computer-readable medium.
  • some aspects of the process 900 may be implemented as hardware (e.g., a specific-purpose circuit).
  • the process 900 may be implemented as a combination of software and hardware.
  • FIG. 9 includes process blocks 902 to 908 .
  • a processing means may receive via a communications interface during a common editing period, first edits from a first editor accessing a first instance of the electronic collaborative document via a first hardware device, wherein the first edits occur on a first earlier page of the electronic collaborative word processing document and result in a pagination change, as discussed previously in the disclosure above.
  • the processing means may receive during the common editing period, second edits from a second editor accessing a second instance of the electronic collaborative document via a second hardware device, as discussed previously in the disclosure above.
  • the processing means may, during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits received via the communications interface, as discussed previously in the disclosure above.
  • the processing means may, upon receipt of a scroll-up command via the second editor during the common editing period, cause the display associated with the second hardware device to reflect the pagination change caused by the first edits, as discussed previously in the disclosure above.
  • In a collaborative word processing document, multiple users may simultaneously edit a single document in real time, near real time, or asynchronously. Problems may arise when certain edits made by a user in a collaborative word processing document are visible to or shared with all other users in the collaborative word processing document.
  • a user may input data into an electronic collaborative word processing document that the user does not intend to share with all other users of the collaborative word processing document. For example, a user may input confidential salary data in a portion of a collaborative word processing document that the user wishes to hide from some or all other users in the same document.
  • a user may wish to mask or hide the user's edits to one or more portions of a collaborative word processing document for a period of time.
  • a user may wish to make several private revisions, or drafts, to a portion of a collaborative word processing document, and then share the user's final edits with the other users in the collaborative word processing document at a later time.
  • users editing a collaborative word processing document may wish to control the timing and visibility to some or all other users of certain edits that are shared within the collaborative word processing document. Therefore, there is a need for unconventional innovations for enabling dual mode editing in collaborative documents to enable private changes.
  • Such unconventional approaches may enable computer systems to implement functions to improve the efficiency of electronic collaborative word processing documents.
  • a system may provide dual mode editing in collaborative documents to enable private changes to increase the efficiency of electronic collaborative word processing documents.
  • Various embodiments of the present disclosure describe unconventional systems, methods, and computer readable media for enabling dual mode editing in collaborative documents to enable private changes in an electronic collaborative word processing document.
  • Various embodiments of the present disclosure may include at least one processor configured to access an electronic collaborative document in which a first editor and at least one second editor are enabled to simultaneously edit and view each other's edits to the electronic collaborative document, and output first display signals for presenting an interface on a display of the first editor, the interface including a toggle enabling the first editor to switch between a collaborative mode and a private mode.
  • the at least one processor may be configured to receive from the first editor operating in the collaborative mode, first edits to the electronic collaborative document and to output second display signals to the first editor and the at least one second editor, the second display signals reflecting the first edits made by the first editor.
  • the at least one processor may be configured to receive from the first editor interacting with the interface, a private mode change signal reflecting a request to change from the collaborative mode to the private mode, and in response to the private mode change signal, initiate in connection with the electronic collaborative document the private mode for the first editor.
  • the at least one processor may be configured to, in the private mode, receive from the first editor, second edits to the electronic collaborative document, and in response to the second edits, output third display signals to the first editor while withholding the third display signals from the at least one second editor, such that the second edits are enabled to appear on a display of the first editor and are prevented from appearing on at least one display of the at least one second editor.
  • the various embodiments in the present disclosure describe at least a technological solution, based on improvements to operations of computer systems and platforms, to the technical challenge of managing display interference caused by simultaneous edits to an electronic collaborative word processing document.
  • Some disclosed embodiments may involve systems, methods, and computer readable media for enabling dual mode editing in collaborative documents to enable private changes.
  • Enabling dual mode editing may refer to presenting an interactable interface with the ability to provide two independent modes of making changes to an electronic document.
  • changes made in an electronic collaborative word processing document may be public changes and may also be known as collaborative mode.
  • a public change may include any edit to an electronic collaborative document that may be shared with or accessible to all users (or a designated group of users) in the electronic collaborative document in real-time or near-real time.
  • a user may, through dual mode editing, enable private changes.
  • Enabling a private change may include providing options to a user on an associated computing device to make any edit to an electronic collaborative document that is not shared with all other users in real-time, or not shared to at least some users who may have access to an electronic collaborative document.
  • Dual mode editing to enable private changes may operate in various ways. For example, in collaborative mode, all of a user's changes may be shared and displayed with all other users accessing an electronic collaborative document. When in private mode, a user may designate edits to a portion of an electronic collaborative document to be visible to a subset of all users who have access to the collaborative document. In another example, some or all of a user's edits may not be visible to other users with access to an electronic collaborative document until the user signals that the edits should be visible to other users. More generally, dual mode editing to enable private changes allows a user to make any edit to an electronic collaborative document while restricting the timing or audience of the user's edits.
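  • As a non-limiting illustration of this routing of display signals, the following TypeScript sketch sends display signals for an edit to every editor in collaborative mode but only to the authoring editor in private mode; all names and structures are assumptions for illustration.

```typescript
// Sketch of routing display signals according to the current editing mode.

type EditingMode = "collaborative" | "private";

interface DisplaySignal {
  editorId: string;
  content: string;
}

function routeDisplaySignals(
  authorEditorId: string,
  allEditorIds: string[],
  mode: EditingMode,
  content: string,
): DisplaySignal[] {
  const recipients =
    mode === "collaborative"
      ? allEditorIds                                        // every editor sees the edit
      : allEditorIds.filter((id) => id === authorEditorId); // only the author sees it
  return recipients.map((editorId) => ({ editorId, content }));
}

// Example: in private mode, the second edits appear only on the first editor's display.
routeDisplaySignals("editor-1", ["editor-1", "editor-2"], "private", "draft salary table");
// => [{ editorId: "editor-1", content: "draft salary table" }]
```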
  • Dual mode editing to enable private changes may be enabled in electronic collaborative documents.
  • a collaborative document may include any electronic file that may be read by a computer program that provides for the input, editing, formatting, display, and output of text, graphics, widgets, data, objects, tables, or other elements typically used in computer desktop publishing applications.
  • An electronic collaborative document may be stored in one or more repositories connected to a network accessible by one or more users via at least one associated computing device.
  • one or more users may simultaneously edit an electronic collaborative document, with all users' edits displaying in real-time or near real-time within the same collaborative document file. The one or more users may access the electronic collaborative document through one or more user devices connected to a network.
  • An electronic collaborative document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document.
  • this disclosure subsequently refers to electronic collaborative word processing documents, the systems, methods, and techniques disclosed herein are not limited to word processing documents and may be adapted for use in other productivity applications such as documents, presentations, worksheets, databases, charts, graphs, digital paintings, electronic music and digital video or any other application software used for producing information.
  • FIG. 3 is an exemplary embodiment of a presentation of an electronic collaborative word processing document 301 via an editing interface or editor 300 .
  • the editor 300 may include any user interface components 302 through 312 to assist with input or modification of information in an electronic collaborative word processing document 301 .
  • editor 300 may include an indication of an entity 312 , which may include at least one individual or group of individuals associated with an account for accessing the electronic collaborative word processing document.
  • User interface components may provide the ability to format a title 302 of the electronic collaborative word processing document, select a view 304 , perform a lookup for additional features 306 , view an indication of other entities 308 accessing the electronic collaborative word processing document at a certain time (e.g., at the same time or at a recorded previous time), and configure permission access 310 to the electronic collaborative word processing document.
  • the electronic collaborative word processing document 301 may include information that may be organized into blocks as previously discussed. For example, a block 320 may itself include one or more blocks of information. Each block may have similar or different configurations or formats according to a default or according to user preferences.
  • block 322 may be a “Title Block” configured to include text identifying a title of the document, and may also contain, embed, or otherwise link to metadata associated with the title.
  • a block may be pre-configured to display information in a particular format (e.g., in bold font).
  • Other blocks in the same electronic collaborative word processing document 301 such as compound block 320 or input block 324 may be configured differently from title block 322 .
  • the platform may provide an indication of the entity 318 responsible for inputting or altering the information.
  • the entity responsible for inputting or altering the information in the electronic collaborative word processing document may include any entity accessing the document, such as an author of the document or any other collaborator who has permission to access the document.
  • An electronic collaborative document may be stored in one or more data repositories and the document may be retrieved by one or more users for downloading, receiving, processing, editing, or viewing the electronic collaborative document.
  • An electronic collaborative document may be accessed by a user using a user device through a network.
  • Accessing an electronic collaborative document may involve retrieving data through any electrical medium such as one or more signals, instructions, operations, functions, databases, memories, hard drives, private data networks, virtual private networks, Wi-Fi networks, LAN or WAN networks, Ethernet cables, coaxial cables, twisted pair cables, fiber optics, public switched telephone networks, wireless cellular networks, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or any other suitable communication method that provides a medium for exchanging data.
  • accessing information may include adding, editing, deleting, re-arranging, or otherwise modifying information directly or indirectly from the network.
  • a user may access the electronic collaborative document using a user device, which may include a computer, laptop, smartphone, tablet, VR headset, smart watch, or any other electronic display device capable of receiving and sending data.
  • accessing the electronic document may include retrieving the electronic document from a web browser cache. Additionally or alternatively, accessing the electronic document may include connecting with a live data stream of the electronic word processing document from a remote source.
  • accessing the electronic document may include logging into an account having a permission to access the document. For example, accessing the electronic document may be achieved by interacting with an indication associated with the electronic word processing document, such as an icon or file name, which may cause the system to retrieve (e.g., from a storage medium) a particular electronic document associated with the indication.
  • an electronic collaborative document may be stored in repository 230 - 1 as shown in FIG. 2 .
  • Repository 230 - 1 may be configured to store software, files, or code, such as electronic collaborative documents developed using computing device 100 or user device 220 - 1 .
  • Repository 230 - 1 may further be accessed by computing device 100 , user device 220 - 1 , or other components of system 200 for downloading, receiving, processing, editing, or viewing the electronic collaborative document.
  • Repository 230 - 1 may be any suitable combination of data storage devices, which may optionally include any type or combination of slave databases, load balancers, dummy servers, firewalls, back-up databases, and/or any other desired database components.
  • repository 230 - 1 may be employed as a cloud service, such as a Software as a Service (SaaS) system, a Platform as a Service (PaaS) system, or an Infrastructure as a Service (IaaS) system.
  • repository 230 - 1 may be based on infrastructure of services of Amazon Web Services™ (AWS), Microsoft Azure™, Google Cloud Platform™, Cisco Metapod™, Joyent™, VMware™, or other cloud computing providers.
  • Repository 230 - 1 may include other commercial file sharing services, such as Dropbox™, Google Docs™, or iCloud™.
  • repository 230 - 1 may be a remote storage location, such as a network drive or server in communication with network 210 .
  • repository 230 - 1 may also be a local storage device, such as local memory of one or more computing devices (e.g., computing device 100 ) in a distributed computing environment.
  • a first editor and at least one second editor may be enabled to simultaneously edit and view each other's edits to the electronic collaborative document.
  • a first editor may be a user interface that provides for the input, editing, formatting, display, and output of text, graphics, widgets, objects, tables, or other elements in an electronic word processing document or any other electronic collaborative document.
  • a first editor may receive user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data.
  • a user accesses an electronic collaborative document using a computer and views the document in an editor that receives text and other input via a mouse and keyboard.
  • Another instance of the electronic collaborative document may be presented via a second hardware device running a second editor, in a similar manner to the first hardware device and the first editor described herein. Any number of hardware devices may run an editor to access another instance of the electronic collaborative word processing document.
  • Edits made in the first or second instance may be demarcated by user identification indicators in the first and second instance.
  • User identification indicators may include a graphic, a user ID indicator, a color, a font, or any other differentiator that indicates the source of an edit in an instance of the electronic collaborative document.
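  • By way of a non-limiting illustration only, the sketch below shows one hypothetical way an edit record could carry a user identification indicator; the class, field names, and color convention are assumptions made for illustration and are not part of the disclosed embodiments.

      from dataclasses import dataclass

      @dataclass
      class EditRecord:
          block_id: str          # block of the collaborative document that was edited
          content: str           # new content for the block
          editor_id: str         # account of the entity that made the edit
          indicator_color: str   # differentiator rendered next to the edit in other instances

      # Another instance of the document could demarcate this edit using the
      # color and editor identifier when rendering it.
      edit = EditRecord(block_id="block-320", content="Updated text",
                        editor_id="user-1", indicator_color="#d62728")
      print(f"Edit to {edit.block_id} by {edit.editor_id} shown in {edit.indicator_color}")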
  • FIG. 10 illustrates an electronic collaborative document (e.g., an electronic collaborative word processing document) presented within an editor 1000 operating in collaborative mode.
  • editor 1000 may be displayed by a computing device (e.g., the computing device 100 illustrated in FIG. 1 ), software running thereon, or any other projecting device (e.g., a projector, AR or VR lens, or any other display device), as previously discussed.
  • Editor 1000 may include various tools for displaying information associated with the document or for editing the document. For example, editor 1000 may display a title 1002 indicating the title of the document. Formatting bar 1004 may depict various tools to adjust formatting of information or objects within the document. Help bar 1006 may be included, which may provide hyperlinks to information about various features of the editor 1000 .
  • Share button 1010 may be included to invite additional users to edit another instance of the collaborative electronic word processing document.
  • Editor 1000 may include tool bar 1012 and interface bar 1014 .
  • Editor 1000 may indicate that multiple users are accessing an electronic collaborative document through the display of a user indicator, such as user display indicator 1008 , which indicates that two users are running an instance of the electronic collaborative document.
  • Editor 1000 may include current user indicator 1016 .
  • Current user indicator 1016 may indicate the identification of the user running the displayed instance of the collaborative document.
  • the objects and information displayed for editing may be controlled by the current user shown in 1016 in each instance of the electronic collaborative document.
  • FIG. 10 may depict an editing location that is actively edited by the current user, such as editing location 1024 indicated by cursor 1026 .
  • a second user is actively editing paragraph block 1020 in another instance of the electronic collaborative document.
  • edits made by the first user in the first editor are immediately displayed in the editor viewed by the second user, and vice versa. For instance, any information or data added at the active work location 1024 will be visible to the second user, and any information added by the second user to paragraph block 1020 will be visible in editor 1000 . Future edits to additional fields, such as title block 1022 will also be visible in both editors.
  • the first user and the second user may correspond to users operating one or more user devices shown in FIG. 2 .
  • first user may operate user device 220 - 1 (of FIG. 2 ) to view editor 1000 (of FIG. 10 ).
  • Second user may operate the second editor through user device 220 - 2 (of FIG. 2 ). Additional users may further access the electronic collaborative document using additional user devices.
  • a display signal may be an electronic instruction that transmits display information.
  • a display signal may be any phenomena capable of transmitting electronic display information and may include a time varying voltage, current, or electromagnetic wave or any other method of transmitting data through an electrical medium.
  • Outputting a display signal may include transmitting a signal containing instructions to present an interface on a display of a first editor.
  • a first display signal may represent a display signal that may be transmitted at a certain period of time before subsequent display signals or before toggling a change in the dual mode.
  • Presenting an interface on a display of a first editor may include displaying a visualization with activatable elements that a user may interact with and provide input on a user device, which may include a computer, laptop, smartphone, tablet, VR headset, smart watch, or any other electronic display device capable of receiving and sending data.
  • An interface may display data and information associated with the editor and the collaborative electronic document. The interface may receive user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data.
  • presenting an interface on a display of a first editor may include a user accessing an electronic collaborative word processing document using a computer and viewing the document in an editor that receives text and other input via a mouse and keyboard.
  • an interface may include a toggle enabling a first editor to switch between a collaborative mode and a private mode.
  • collaborative mode may be a manner of displaying an electronic collaborative document where changes made by one or more users are public changes.
  • a public change is any edit to an electronic collaborative document that is immediately shared with all users in the electronic collaborative document in real-time.
  • a private mode may be a manner of displaying an electronic collaborative document where edits made by a user to an electronic collaborative document are not shared with all other users in real-time.
  • private mode may operate in various ways. For example, a user may designate edits to a portion of an electronic collaborative document to be visible to a subset of all users who have access to the collaborative document.
  • some or all of a user's edits may not be visible to other users with access to an electronic collaborative document until the user toggles back to collaborative mode.
  • private mode allows a user to make any edit to an electronic collaborative document while restricting the visibility of the user's edits to other users viewing other instances of an electronic collaborative document for a period of time.
  • the interface may switch between a collaborative mode and a private mode via a toggle.
  • a toggle may be any activatable graphical user interface element that enables a change from one state to another state.
  • a toggle may be a button or other icon in a user interface that can be selected by a user.
  • the toggle is presented outside of the interface in an HTML hyperlink or file path.
  • the system may generate a unique hyperlink for an instance of an electronic collaborative document with a selection between collaborative mode and private mode pre-enabled.
  • an interface on a display of an editor may be displayed in collaborative mode or private mode as indicated in the instructions in the hyperlink.
  • a toggle enabling the first editor to switch between a collaborative mode and a private mode may include any activatable element on an interface that may send instructions to a processor to operate in a collaborative mode, to operate in a private mode, or to switch from an actively operating collaborative mode to private mode and vice versa.
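  • As a purely illustrative sketch (not the claimed implementation), the toggle may be modeled as flipping a per-editor mode flag between two states; the names below are hypothetical.

      from enum import Enum

      class Mode(Enum):
          COLLABORATIVE = "collaborative"
          PRIVATE = "private"

      class EditorSession:
          def __init__(self, editor_id: str):
              self.editor_id = editor_id
              self.mode = Mode.COLLABORATIVE  # editors may start in collaborative mode

          def toggle_mode(self) -> Mode:
              # Activating the toggle sends a mode change signal; here the
              # handler simply switches between the two states.
              self.mode = Mode.PRIVATE if self.mode is Mode.COLLABORATIVE else Mode.COLLABORATIVE
              return self.mode

      session = EditorSession("editor-1")
      print(session.toggle_mode())  # Mode.PRIVATE
      print(session.toggle_mode())  # Mode.COLLABORATIVE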
  • Some aspects of the present disclosure may involve receiving from a first editor operating in the collaborative mode, first edits to the electronic collaborative document.
  • An edit to an electronic collaborative document may include the addition, manipulation, or deletion of objects or data, and may include addition, manipulation, or deletion of text, graphics, tables, images, formatting, highlights, manipulation of fonts, icons, shapes, references, headers, footers, or any other addition, deletion, or manipulation of objects or any other data within the electronic collaborative document.
  • Receiving the first edits may include the system receiving an edit request from a computing device associated with a user. The request may be transmitted over a network to a repository where the electronic collaborative document is stored. At least one processor may then perform a lookup of permission settings to confirm whether the computing device has authorization to make the edit. In a situation where authorization is confirmed, the system may then implement and store the edit with the electronic collaborative document such that any other computing devices accessing the document may retrieve the document with the implemented change.
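  • A minimal sketch of this flow, assuming an in-memory permission table and repository purely for illustration (all names are hypothetical), might look as follows.

      PERMISSIONS = {"doc-1": {"user-1", "user-2"}}       # users allowed to edit doc-1
      REPOSITORY = {"doc-1": {"block-1": "Original text"}}

      def receive_edit(doc_id: str, user_id: str, block_id: str, new_text: str) -> bool:
          # Lookup of permission settings to confirm authorization.
          if user_id not in PERMISSIONS.get(doc_id, set()):
              return False
          # Authorization confirmed: implement and store the edit so that other
          # devices retrieving the document see the change.
          REPOSITORY[doc_id][block_id] = new_text
          return True

      print(receive_edit("doc-1", "user-1", "block-1", "Edited text"))    # True
      print(receive_edit("doc-1", "user-9", "block-1", "Rejected edit"))  # False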
  • Some embodiments may involve outputting second display signals to a first editor and at least one second editor, the second display signals reflecting first edits made by the first editor.
  • a second display signal may be a display signal that is made at a later time than a first display signal, which may be output and transmitted to cause a rendering of information as discussed previously above.
  • the second display signal may reflect first edits made by the first editor.
  • an edit made by a user may be immediately shared with all other users operating instances of the electronic collaborative document in additional editors in real-time.
  • second display signals may be transmitted to the first and second editor reflecting the changes, resulting in the edits being visible in both the first and second editors to each user.
  • Second display signals are not limited to transmission to a first and second editor, but may also include transmission to any number of editors accessing the electronic collaborative document.
  • a private mode change signal may be any electronic communications instruction from an editor indicating an intent to enable private mode operation from a collaborative mode operation.
  • a private mode change signal may be indicated by user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data, which may then be received by at least one processor to carry out the associated instructions.
  • the private mode change signal may be generated by a user selecting a toggle in a graphical user interface.
  • Some embodiments may include, in response to a first mode change signal, initiating in connection with an electronic collaborative document a private mode for the first editor.
  • Initiating the private mode for the first editor in connection with an electronic collaborative document may include causing some or all of the edits made in the first editor to be withheld from display in other instances of the collaborative electronic document in other editors.
  • Private mode may be initiated for all or part of an electronic collaborative document. For example, initiating private mode may cause all changes made in the first editor to be visible in the first editor only and not be visible in other instances of the collaborative electronic document in the second editor or any other editor.
  • private mode may be initiated in a portion of the collaborative electronic document.
  • collaborative electronic documents may be organized into one or more blocks of information.
  • Private mode may be enabled for one or more blocks as designated by the user through the editor.
  • changes made to blocks via the first editor that have private mode initiated will not display in other instances of the electronic collaborative word processing document, and changes made to blocks where private mode is not initiated will continue to display edits made by the first editor in real time.
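  • One hypothetical way to realize block-level private mode is to filter the blocks rendered to each viewer; the following sketch is illustrative only, and the structures and names are assumptions.

      def visible_blocks(blocks: dict, private_block_ids: set, owner_id: str,
                         viewer_id: str) -> dict:
          visible = {}
          for block_id, content in blocks.items():
              if block_id in private_block_ids and viewer_id != owner_id:
                  continue  # withhold privately edited blocks from other editors
              visible[block_id] = content
          return visible

      blocks = {"title": "Project plan", "budget": "Draft numbers (private)"}
      private_ids = {"budget"}  # the first editor initiated private mode on this block
      print(visible_blocks(blocks, private_ids, owner_id="editor-1", viewer_id="editor-1"))
      print(visible_blocks(blocks, private_ids, owner_id="editor-1", viewer_id="editor-2"))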
  • FIG. 11 depicts an exemplary editor 1100 for an electronic collaborative document with an option for enabling dual mode editing to enable private changes displayed.
  • Editor 1100 may include private mode change toggle 1106 , which may cause a private mode change signal to be transmitted as described herein. Once the private mode change toggle 1106 is activated via a user input in editor 1100 , the private mode for the first editor may be initiated.
  • Second edits may include the addition, manipulation, or deletion of objects or data, and may include addition, manipulation, or deletion of text, graphics, tables, images, formatting, highlights, manipulation of fonts, icons, shapes, references, headers, footers, or any other addition, deletion, or manipulation of objects or data within the electronic collaborative document as previously discussed.
  • second edits may refer to edits made in a second instance of the collaborative electronic word processing document while private mode is enabled. Second edits may occur either earlier in time, later in time, or simultaneously with first edits and are not limited to edits occurring later in time than first edits in the document. For example, a user may toggle between collaborative mode and private mode multiple times. In this example, all edits made while operating in private mode may be considered second edits, even if the edits were made before or after first edits made while the editor is in collaborative mode.
  • Some aspects of the present disclosure may involve, in response to second edits, outputting third display signals to a first editor while withholding third display signals from at least one second editor.
  • a third display signal may be a display signal that contains data for second edits that may be transmitted to cause a presentation of the second edits, consistent with the earlier discussion.
  • Withholding a display signal may include not transmitting the display signal so that an editor does not receive information associated with the display signal.
  • the processor may transmit the third display signal with second edits made by the first editor to a display (e.g., the first editor may be re-rendered to include the second edits in a presentation) while the processor may not transmit, or may withhold, the third display signal to the second editor (e.g., resulting in the second editor not re-rendering with the second edits).
  • the third display signal may be differentiated between the first and second display signals in that the third display signal contains second edits made by an editor while private mode is enabled. Outputting third display signals to the first editor while withholding the third display signals from the at least one second editor may enable second edits to appear on a display of the first editor and prevent second edits from appearing on at least one display of the at least one second editor.
  • Third display signals that are unique from first or second display signals may be transmitted containing instructions to display the second edits.
  • the third display signals may be selectively transmitted to some but not all editors.
  • a user operating a first editor may add text data to a document after enabling private mode.
  • the user's text will display in the editor operated by the user (e.g., second edits may appear on a display of the first editor).
  • the third display signals may be withheld from the second editor, which means the second edits may not display in the second editor (e.g., second edits are prevented from appearing on at least one display of at least one second editor).
  • the user operating the first editor designates which editors receive third display signals containing second edits and designates which editors do not receive third display signals and continue to receive second display signals instead.
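  • For illustration only, display signal routing might resemble the following sketch, in which second display signals reach every connected editor while third display signals reach only designated editors; the function and argument names are assumptions.

      def route_signal(signal_kind: str, payload: str, all_editors: set,
                       designated: set) -> dict:
          if signal_kind == "second":
              recipients = all_editors              # public edits reach everyone
          elif signal_kind == "third":
              recipients = designated & all_editors # private-mode edits reach designees only
          else:
              recipients = set()
          return {editor: payload for editor in recipients}

      editors = {"editor-1", "editor-2", "editor-3"}
      print(route_signal("second", "first edits", editors, set()))
      print(route_signal("third", "second edits", editors, {"editor-1"}))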
  • editor 1100 may include first edits made in collaborative mode that are visible to all users accessing the electronic collaborative document.
  • text block 1101 includes first edits made by editor 1100 that are displayed in editor 1100 and in other editors accessing the electronic collaborative document.
  • Text block 1102 displays edits made in editor 1100 in private mode. Text block 1102 is displayed in editor 1100 but not in any other editors viewing the same electronic collaborative document. Additional edits made at cursor 1104 while private mode is enabled may not be displayed in other editors viewing the same electronic collaborative document until collaborative mode is enabled.
  • Some aspects of the present disclosure may involve receiving from a first editor interacting with an interface, a collaborative mode change signal reflecting a request to change from a private mode to a collaborative mode.
  • a collaborative mode change signal may be any electronic communications instruction from the editor indicating an intent to enable collaborative mode operations.
  • a collaborative mode change signal may be indicated by user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data.
  • the collaborative mode change signal may be generated by a user selecting a toggle in a graphical user interface.
  • subsequent edits made by the first editor may be enabled to be viewed by the at least one second editor.
  • a subsequent edit may include an edit made by the first editor after receipt of the collaborative mode change signal.
  • When edits are made by a first editor in collaborative mode, these edits may be immediately shared in real time with all other users and rendered on associated displays of the users accessing the collaborative electronic document.
  • the collaborative mode change signal may be toggled for the entire document.
  • all subsequent edits made to the document in collaborative mode may be displayed in other editors viewing other instances of the electronic document.
  • the collaborative mode change signal may be applied to one or more portions of a document.
  • only subsequent edits to certain portions of the electronic collaborative document may be displayed to all other editors in real time, while other portions of the electronic collaborative document remain in private mode.
  • a collaborative mode change signal may be toggled with respect to one or more blocks and may operate at the block level.
  • Some aspects of the present disclosure may involve segregating second edits made in private mode, such that upon return to a collaborative mode, viewing of the second edits is withheld from at least one second editor.
  • Segregating second edits made in private mode may involve a method of saving and storing data that independently tracks and stores data associated with second edits in a manner that does not transmit the stored data until additional instructions are received to release the segregated second edits to particular editors.
  • Data indicating that the edits were made in private mode may be stored as a property of the document, and in some embodiments, may be stored as a property of each individual block in the document. For example, a first editor may be enabled in private mode and may make second edits in private mode to one or more blocks of an electronic document.
  • edits may be initially withheld from display to other instances of the electronic document.
  • the editor may close the document and reopen it at a later time and toggle collaborative mode.
  • the second edits made to the one or more blocks may be displayed in the first editor but may not be displayed in the second editor or other editors because the second edits have been segregated when they were made in private mode.
  • segregating edits made in private mode may refer to any method of data manipulation and storage that tracks the state of the dual mode of the editor at the time the second edits are made.
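  • A minimal sketch of segregation, assuming each stored edit records the dual-mode state at the time it was made (the names and storage format are hypothetical), is shown below.

      STORE = []  # persisted with the document in this toy example

      def record_edit(block_id: str, text: str, editor_id: str, mode: str) -> None:
          STORE.append({"block": block_id, "text": text,
                        "editor": editor_id, "made_in_private": mode == "private"})

      def edits_visible_to(viewer_id: str) -> list:
          # Segregated (private-mode) edits are only returned to their author
          # until a release signal clears the flag.
          return [e for e in STORE
                  if not e["made_in_private"] or e["editor"] == viewer_id]

      record_edit("b1", "public change", "editor-1", "collaborative")
      record_edit("b2", "hidden change", "editor-1", "private")
      print(edits_visible_to("editor-1"))  # both edits
      print(edits_visible_to("editor-2"))  # only the public change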
  • Some aspects of the present disclosure may involve receiving from a first editor a release signal, and in response thereto, enabling at least one second editor to view the second edits.
  • Receiving a release signal from an editor may include any electronic communications instruction from the editor that transmits a user desire to publish second edits to the second editor.
  • Enabling an editor to view edits may include transmitting a display signal to a particular computing device associated with an editor to cause information associated with particular edits to be rendered on a screen associated with the editor.
  • an editor may utilize both a collaborative mode and a private mode when editing an electronic document. Edits made in the electronic document while operating in collaborative mode may be shared and displayed in real time to all other users. Edits made in private mode may not be shared with all other users in the electronic collaborative document.
  • switching between collaborative mode and private mode may not publish the edits to the electronic collaborative document that were made in private mode.
  • a release signal may operate to publish edits made in private mode to the other users in the electronic collaborative document.
  • An editor may transmit a release signal in response to various inputs.
  • the editor may include a button, toggle, switch, or other GUI element that releases all second edits made to an electronic collaborative document.
  • release signals may be transmitted that correspond to a portion of the electronic document.
  • a release signal may be transmitted that applies to one or more blocks in the electronic document.
  • a user may indicate a desire to transmit a release signal by selecting a block and selecting a release icon.
  • the editor may allow a user to right click on a block and select an option to release second edits in the block.
  • release signals may trigger automatically in accordance with various user settings. For example, user settings may cause release signals to be transmitted based on pre-determined intervals of time, based on certain users with superior administrative privileges viewing the document in another editor, or based on a predetermined action performed by the user, such as closing the editor.
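  • The following sketch illustrates one hypothetical handling of release signals, including an automatic trigger driven by a user setting; the setting names and event strings are assumptions rather than features confirmed by this disclosure.

      def release(edits, block_ids=None):
          # A release signal clears the segregation flag, publishing the second
          # edits; it may apply to the whole document or to particular blocks.
          for edit in edits:
              if block_ids is None or edit["block"] in block_ids:
                  edit["made_in_private"] = False

      def maybe_auto_release(edits, settings, event):
          # Hypothetical automatic trigger: release when the editor closes.
          if settings.get("release_on_close") and event == "editor_closed":
              release(edits)

      edits = [{"block": "b2", "text": "hidden change",
                "editor": "editor-1", "made_in_private": True}]
      maybe_auto_release(edits, {"release_on_close": True}, "editor_closed")
      print(edits[0]["made_in_private"])  # False: the second edits are now visible to all editors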
  • enabling the at least one second editor to view the second edits may include displaying to the at least one second editor, in association with the second edits, an identity of the first editor.
  • An identity of the first editor may be associated with the user operating the editor and may include any indicator (e.g., alphanumeric, graphical, or combination thereof).
  • a user operating a first editor may have a user account with personal identifying information, such as a name, username, photo, employee ID, or any other personal information.
  • Displaying to the at least one second editor an identity of the first editor may include causing a presentation of an indication of the user account associated with the first editor.
  • the identity of the first editor may be displayed with an icon that is visible in the second editor. The icon may contain personally identifying information such as a name, initials, a photo, or other data.
  • the identity of the first editor may be displayed to the second editor in association with the second edits.
  • Displaying an identity in association with second edits may include rendering a visual indicator of the identity of the first editor in or near the second edits in a co-presentation, or in response to an interaction (e.g., a cursor hover over the second edits).
  • the visual indicator may include an icon, a font, a highlight, a graphic, a color of text, or any other data property identifying the identity of the first editor, placed adjacent to or within the edits in the display.
  • the identity of the first editor may be displayed in response to an input in the second editor. For instance, the user operating the second editor may receive display information indicating second edits displayed in a color. Upon selecting the edits or placing a cursor near the edits, a popup may be displayed that identifies the identity of the first editor using a visual indicator as described herein.
  • At least one processor may compare second edits made in private mode to original text in an electronic collaborative document, identify differences based on the comparison, and present the differences in connection with text of the electronic collaborative document to thereby indicate changes originally made during private mode.
  • Original text may include any or all text or data in an electronic collaborative document that the document contained prior to second edits made by the first editor.
  • the processor may identify the second edits made in private mode by segregating the data associated with second edits as described herein. Comparing second edits to original text in an electronic collaborative document may include a calculation of differences and/or similarities between data contained in the second edits to the original text in an electronic document.
  • changes to text are presented by displaying additional or deleted text in a particular color, font, or format.
  • additional text may be displayed in red with underlines and deleted text may be indicated by a strikethrough.
  • changes to the document may be indicated by highlighting, font changes, embedded objects or pop-up indicators, or any other method capable of visually distinguishing types of data in an electronic collaborative document.
  • the color associated with the changes to text or other objects corresponds with the identity of the user who made the second edits.
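  • As an illustration of such a comparison, the sketch below uses Python's standard difflib to identify differences and tags additions and deletions with a color tied to the editing user's identity; the markup strings and the color mapping are hypothetical conventions, not requirements of this disclosure.

      import difflib

      EDITOR_COLORS = {"editor-1": "red"}  # hypothetical identity-to-color mapping

      def present_changes(original: str, edited: str, editor_id: str) -> list:
          color = EDITOR_COLORS.get(editor_id, "gray")
          old_words, new_words = original.split(), edited.split()
          rendered = []
          matcher = difflib.SequenceMatcher(None, old_words, new_words)
          for tag, i1, i2, j1, j2 in matcher.get_opcodes():
              if tag == "equal":
                  rendered.append(" ".join(old_words[i1:i2]))
              if tag in ("replace", "delete"):   # deleted text shown with a strikethrough
                  rendered.append("[strike]" + " ".join(old_words[i1:i2]))
              if tag in ("replace", "insert"):   # added text shown underlined in the editor's color
                  rendered.append(f"[underline {color}]" + " ".join(new_words[j1:j2]))
          return rendered

      print(present_changes("total budget pending", "total budget approved", "editor-1"))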
  • Some aspects of the present disclosure may include receiving from a first editor, in association with a text block, a retroactive privatization signal, and upon receipt of the retroactive privatization signal, withholding the text block from display to at least one second editor.
  • a retroactive privatization signal may be a data signal that indicates a portion of text that should be withheld from display to a second editor or any additional editors.
  • a retroactive privatization signal may function to transfer a portion or all of a document to private mode, thereby allowing the first editor to view and manipulate objects, text, or data in the portion of the document in private mode.
  • Receiving a retroactive privatization signal associated with a text block may involve obtaining instructions to retroactively mark a particular region of text as private.
  • a user running a first editor may wish to hide certain portions of a document containing confidential financial information from view of one or all other users.
  • the user may select the block or blocks of text data containing the confidential information and transmit a privatization signal which causes the display signals being transmitted to the other users to not display the blocks containing confidential financial information.
  • Any block or blocks of data may be designated in association with a retroactive privatization signal, which may transfer the objects, text, and data inside the block or blocks to private mode (e.g., causing a re-rendering of the displays of the users to omit the data designated to be retroactively private).
  • Withholding the text block from display to a second editor may include causing a re-rendering of a display of the second editor to delete, omit, obscure, or reduce access to information marked as private.
  • a retroactive privatization signal may be disabled by an editor sending a release signal.
  • Some aspects of the present disclosure may include receiving from a first editor operating in private mode an exemption signal for at least one particular editor, to thereby enable the at least one particular editor to view the second edits.
  • Receiving an exemption signal may include obtaining an electronic transmittal of data or instructions from a computing device associated with a user interacting with an editor to enable a particular editor to receive display signals causing a display to show the changes made in private mode by the first editor.
  • a user operating the first editor may wish to make private edits in private mode and may wish to share the edits with a particular user without publishing the edits to all other users in the electronic collaborative document by sending a release signal.
  • the first editor may designate one or more other editors to receive third display signals containing the second edits made in the first editor.
  • Receiving an exemption signal for at least one particular editor to thereby enable the at least one particular editor to view the second edits may include receiving instructions in an exemption signal that may allow a user to share edits with some users and hide the edits from other users. For example, a large team of several dozen users may collaborate on a single electronic collaborative document. In this example, there may be a desire to include a section of the document that contains confidential information, such as salary information. A user may enable private mode editing to begin privately adding confidential data to the document that may be hidden from all other users.
  • an exemption signal may be applied to one or more blocks in an electronic document, thereby enabling particular editors to view the second edits associated with that block.
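  • A short, purely illustrative sketch of an exemption check is shown below; the functions and arguments are assumptions used to convey the idea that an exempted editor may view the second edits without a general release.

      def apply_exemption(exempted: set, editor_id: str) -> set:
          # The exemption signal names a particular editor that may view the second edits.
          return set(exempted) | {editor_id}

      def can_view_private(viewer_id: str, author_id: str, exempted: set) -> bool:
          # The author always sees the private edits; exempted editors also see them.
          return viewer_id == author_id or viewer_id in exempted

      exempted = apply_exemption(set(), "editor-3")  # e.g., a reviewer of the confidential section
      print(can_view_private("editor-3", "editor-1", exempted))  # True
      print(can_view_private("editor-2", "editor-1", exempted))  # False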
  • FIG. 12 illustrates a block diagram of an example process 1200 for enabling dual mode editing in collaborative documents to enable private changes. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram.
  • the process 1200 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1 ) of a computing device (e.g., the computing device 100 in FIGS. 1 and 2 ) to perform operations or functions described herein and may be described hereinafter with reference to FIGS. 10 to 11 by way of example.
  • some aspects of the process 1200 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1 ) or a non-transitory computer-readable medium. In some embodiments, some aspects of the process 1200 may be implemented as hardware (e.g., a specific-purpose circuit). In some embodiments, the process 1200 may be implemented as a combination of software and hardware.
  • FIG. 12 includes process blocks 1202 to 1216 .
  • a processing means may access an electronic collaborative document in which a first editor and at least one second editor are enabled to simultaneously edit and view each other's edits to the electronic collaborative document, as discussed previously in the disclosure above.
  • the processing means may output first display signals for presenting an interface on a display of the first editor, the interface including a toggle enabling the first editor to switch between a collaborative mode and a private mode, as discussed previously in the disclosure above.
  • the processing means may receive from the first editor operating in the collaborative mode, first edits to the electronic collaborative document, as discussed previously in the disclosure above.
  • the processing means may output second display signals to the first editor and the at least one second editor, the second display signals reflecting the first edits made by the first editor, as discussed previously in the disclosure above.
  • the processing means may receive from the first editor interacting with the interface, a private mode change signal reflecting a request to change from the collaborative mode to the private mode, as discussed previously in the disclosure above.
  • the processing means may, in response to the first mode change signal, initiate in connection with the electronic collaborative document the private mode for the first editor, as discussed previously in the disclosure above.
  • the processing means may, in the private mode, receive from the first editor, second edits to the electronic collaborative document, as discussed previously in the disclosure above.
  • the processing means may in response to the second edits, output third display signals to the first editor while withholding the third display signals from the at least one second editor, such that the second edits are enabled to appear on a display of the first editor and are prevented from appearing on at least one display of the at least one second editor, as discussed previously in the disclosure above.
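  • For illustration, the sequence described for process 1200 may be approximated by the following toy walk-through; it is a sketch under assumed data structures, not the claimed implementation.

      doc = {"b1": "shared text"}   # blocks visible to every editor
      private_edits = {}            # block -> (text, author) made in private mode

      def output_display(doc, private_edits, viewer):
          # Second display signals carry the shared blocks; third display
          # signals (private-mode edits) are rendered only for their author.
          view = dict(doc)
          for block, (text, author) in private_edits.items():
              if viewer == author:
                  view[block] = text
          return view

      # First edits in collaborative mode reach both editors (second display signals).
      doc["b1"] = "shared text, revised"
      # After a private mode change signal, second edits are withheld from the second editor.
      private_edits["b2"] = ("draft salary table", "editor-1")
      print(output_display(doc, private_edits, "editor-1"))  # sees b1 and b2
      print(output_display(doc, private_edits, "editor-2"))  # sees b1 only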
  • Implementation of the method and system of the present disclosure may involve performing or completing certain selected tasks or steps manually, automatically, or a combination thereof.
  • several selected steps may be implemented by hardware (HW) or by software (SW) on any operating system of any firmware, or by a combination thereof.
  • selected steps of the disclosure could be implemented as a chip or a circuit.
  • selected steps of the disclosure could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system.
  • selected steps of the method and system of the disclosure could be described as being performed by a data processor, such as a computing device for executing a plurality of instructions.
  • machine-readable medium refers to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal.
  • machine-readable signal refers to any signal used to provide machine instructions and/or data to a programmable processor.
  • implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof.
  • These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • any device featuring a data processor and the ability to execute one or more instructions may be described as a computing device, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, a smart watch or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a “network” or a “computer network”.
  • the systems and techniques described here can be implemented on a computer having a display device (an LED (light-emitting diode), or OLED (organic LED), or LCD (liquid crystal display) monitor/screen) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • the systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components.
  • the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • Disclosed embodiments may include any one of the following bullet-pointed features alone or in combination with one or more other bullet-pointed features, whether implemented as a method, by at least one processor, and/or stored as executable instructions on non-transitory computer-readable media:
  • the above described embodiments can be implemented by hardware, or software (program codes), or a combination of hardware and software. If implemented by software, it can be stored in the above-described computer-readable media. The software, when executed by the processor can perform the disclosed methods.
  • the computing units and other functional units described in the present disclosure can be implemented by hardware, or software, or a combination of hardware and software.
  • One of ordinary skill in the art will also understand that multiple ones of the above described modules/units can be combined as one module or unit, and each of the above described modules/units can be further divided into a plurality of sub-modules or sub-units.
  • each block in a flowchart or block diagram may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical functions.
  • functions indicated in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may be executed or implemented substantially concurrently, or two blocks may sometimes be executed in reverse order, depending upon the functionality involved. Some blocks may also be omitted.
  • each block of the block diagrams, and combination of the blocks may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.
  • Computer programs based on the written description and methods of this specification are within the skill of a software developer.
  • the various programs or program modules can be created using a variety of programming techniques.
  • One or more of such software sections or modules can be integrated into a computer system, non-transitory computer readable media, or existing software.

Abstract

Systems, methods, and computer-readable media for managing a display interference in an electronic collaborative word processing document are disclosed. The systems and methods may involve accessing an electronic collaborative word processing document; presenting a first and a second instance of the electronic collaborative word processing document via a first and a second hardware device running a first and a second editor, respectively; receiving from the first and the second editor during a common editing period, first and second edits to the electronic collaborative word processing document, respectively; locking a display associated with the second hardware device to suppress the pagination change caused by the first edits received by the second hardware device; and upon receipt of a scroll-up command via the second editor during the common editing period, causing the display associated with the second hardware device to reflect the pagination change caused by the first edits.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is based on and claims benefit of priority of International Patent Application No. PCT/IB2021/062440 filed on Dec. 29, 2021, which claims priority to U.S. Provisional Patent Application No. 63/233,925, filed Aug. 17, 2021, U.S. Provisional Patent Application No. 63/273,448, filed Oct. 29, 2021, U.S. Provisional Patent Application No. 63/273,453, filed Oct. 29, 2021, International Patent Application No. PCT/IB2021/000024, filed on Jan. 14, 2021, International Patent Application No. PCT/IB2021/000090, filed on Feb. 11, 2021, and International Patent Application No. PCT/IB2021/000297, filed on Apr. 28, 2021, the contents of all of which are incorporated herein by reference in their entireties.
  • TECHNICAL FIELD
  • Embodiments consistent with the present disclosure include systems and methods for collaborative work systems. The disclosed systems and methods may be implemented using a combination of conventional hardware and software as well as specialized hardware and software, such as a machine constructed and/or programmed specifically for performing functions associated with the disclosed method steps. Consistent with other disclosed embodiments, non-transitory computer-readable storage media may store program instructions, which may be executable by at least one processing device and perform any of the steps and/or methods described herein.
  • BACKGROUND
  • Operation of modern enterprises can be complicated and time consuming. In many cases, managing the operation of a single project requires integration of several employees, departments, and other resources of the entity. To manage the challenging operation, project management software applications may be used. Such software applications allow a user to organize, plan, and manage resources by providing project-related information in order to optimize the time and resources spent on each project. It would be useful to improve these software applications to increase operation management efficiency.
  • SUMMARY
  • One aspect of the present disclosure is directed to systems, methods, and computer readable media for managing display interference in an electronic collaborative word processing document. The system may include at least one processor configured to: access the electronic collaborative word processing document; present a first instance of the electronic collaborative word processing document via a first hardware device running a first editor; present a second instance of the electronic collaborative word processing document via a second hardware device running a second editor; receive from the first editor during a common editing period, first edits to the electronic collaborative word processing document, wherein the first edits occur on a first earlier page of the electronic collaborative word processing document and result in a pagination change; receive from the second editor during the common editing period, second edits to the electronic collaborative word processing document, wherein the second edits occur on a second page of the electronic collaborative word processing document later than the first page; during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits received by the second hardware device; and upon receipt of a scroll-up command via the second editor during the common editing period, cause the display associated with the second hardware device to reflect the pagination change caused by the first edits.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of an exemplary computing device which may be employed in connection with embodiments of the present disclosure.
  • FIG. 2 is a block diagram of an exemplary computing architecture for collaborative work systems, consistent with embodiments of the present disclosure.
  • FIG. 3 illustrates an example of an electronic collaborative word processing document, consistent with some embodiments of the present disclosure.
  • FIG. 4 illustrates an example of an instance of a collaborative word processing document, consistent with some embodiments of the present disclosure.
  • FIG. 5 illustrates an example user interface of a collaborative word processing document with a locked display, consistent with some embodiments of the present disclosure.
  • FIG. 6 illustrates another example of a collaborative word processing document with an active work location, consistent with some embodiments of the present disclosure.
  • FIG. 7 illustrates another example of a collaborative word processing document with a locked display, consistent with some embodiments of the present disclosure.
  • FIG. 8 illustrates a block diagram of an example process for managing display interference in an electronic collaborative word processing document, consistent with some embodiments of the present disclosure.
  • FIG. 9 illustrates a block diagram of another example process for managing display interference in an electronic collaborative word processing document, consistent with some embodiments of the present disclosure.
  • FIG. 10 illustrates an exemplary editor for an electronic collaborative word processing document operating in collaborative mode, consistent with some embodiments of the present disclosure.
  • FIG. 11 illustrates an exemplary editor for an electronic collaborative word processing document with an option for enabling dual mode editing to enable private changes displayed, consistent with some embodiments of the present disclosure.
  • FIG. 12 illustrates a block diagram of an example process for enabling dual mode editing in collaborative documents to enable private changes.
  • DETAILED DESCRIPTION
  • Exemplary embodiments are described with reference to the accompanying drawings. The figures are not necessarily drawn to scale. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It should also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
  • In the following description, various working examples are provided for illustrative purposes. However, it is to be understood that the present disclosure may be practiced without one or more of these details.
  • Throughout, this disclosure mentions “disclosed embodiments,” which refer to examples of inventive ideas, concepts, and/or manifestations described herein. Many related and unrelated embodiments are described throughout this disclosure. The fact that some “disclosed embodiments” are described as exhibiting a feature or characteristic does not mean that other disclosed embodiments necessarily share that feature or characteristic.
  • This disclosure presents various mechanisms for collaborative work systems. Such systems may involve software that enables multiple users to work collaboratively. By way of one example, workflow management software may enable various members of a team to cooperate via a common online platform. It is intended that one or more aspects of any mechanism may be combined with one or more aspect of any other mechanisms, and such combinations are within the scope of this disclosure.
  • This disclosure is constructed to provide a basic understanding of a few exemplary embodiments with the understanding that features of the exemplary embodiments may be combined with other disclosed features or may be incorporated into platforms or embodiments not described herein while still remaining within the scope of this disclosure. For convenience, any form of the word "embodiment" as used herein is intended to refer to a single embodiment or multiple embodiments of the disclosure.
  • Certain embodiments disclosed herein include devices, systems, and methods for collaborative work systems that may allow a user to interact with information in real time. To avoid repetition, the functionality of some embodiments is described herein solely in connection with a processor or at least one processor. It is to be understood that such exemplary descriptions of functionality apply equally to methods and computer readable media and constitute a written description of systems, methods, and computer readable media. The underlying platform may allow a user to structure systems, methods, or computer readable media in many ways using common building blocks, thereby permitting flexibility in constructing a product that suits desired needs. This may be accomplished through the use of boards. A board may be a table configured to contain items (e.g., individual items presented in horizontal rows) defining objects or entities that are managed in the platform (task, project, client, deal, etc.). Unless expressly noted otherwise, the terms "board" and "table" may be considered synonymous for purposes of this disclosure. In some embodiments, a board may contain information beyond that which is displayed in a table. Boards may include sub-boards that may have a separate structure from a board. Sub-boards may be tables with sub-items that may be related to the items of a board. Columns intersecting with rows of items may together define cells in which data associated with each item may be maintained. Each column may have a heading or label defining an associated data type. When used herein in combination with a column, a row may be presented horizontally and a column vertically. However, in the broader generic sense as used herein, the term "row" may refer to one or more of a horizontal and/or a vertical presentation. A table or tablature, as used herein, refers to data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns) defining cells in which data is presented. Tablature may refer to any structure for presenting data in an organized manner, as previously discussed, such as cells presented in horizontal rows and vertical columns, vertical rows and horizontal columns, a tree data structure, a web chart, or any other structured representation, as explained throughout this disclosure. A cell may refer to a unit of information contained in the tablature defined by the structure of the tablature. For example, a cell may be defined as an intersection between a horizontal row and a vertical column in a tablature having rows and columns. A cell may also be defined as an intersection between a horizontal and a vertical row, or as an intersection between a horizontal and a vertical column. As a further example, a cell may be defined as a node on a web chart or a node on a tree data structure. As would be appreciated by a skilled artisan, however, the disclosed embodiments are not limited to any specific structure, but rather may be practiced in conjunction with any desired organizational arrangement. In addition, tablature may include any type of information, depending on intended use. When used in conjunction with a workflow management application, the tablature may include any information associated with one or more tasks, such as one or more status values, projects, countries, persons, teams, progress statuses, a combination thereof, or any other information related to a task.
  • While a table view may be one way to present and manage the data contained on a board, a table's or board's data may be presented in different ways. For example, in some embodiments, dashboards may be utilized to present or summarize data derived from one or more boards. A dashboard may be a non-table form of presenting data, using, for example, static or dynamic graphical representations. A dashboard may also include multiple non-table forms of presenting data. As discussed later in greater detail, such representations may include various forms of graphs or graphics. In some instances, dashboards (which may also be referred to more generically as “widgets”) may include tablature. Software links may interconnect one or more boards with one or more dashboards thereby enabling the dashboards to reflect data presented on the boards. This may allow, for example, data from multiple boards to be displayed and/or managed from a common location. These widgets may provide visualizations that allow a user to update data derived from one or more boards.
  • Boards (or the data associated with boards) may be stored in a local memory on a user device or may be stored in a local network repository. Boards may also be stored in a remote repository and may be accessed through a network. In some instances, permissions may be set to limit board access to the board's “owner” while in other embodiments a user's board may be accessed by other users through any of the networks described in this disclosure. When one user makes a change in a board, that change may be updated to the board stored in a memory or repository and may be pushed to the other user devices that access that same board. These changes may be made to cells, items, columns, boards, dashboard views, logical rules, or any other data associated with the boards. Similarly, when cells are tied together or are mirrored across multiple boards, a change in one board may cause a cascading change in the tied or mirrored boards or dashboards of the same or other owners.
  • Boards and widgets may be part of a platform that may enable users to interact with information in real time in collaborative work systems involving electronic collaborative word processing documents. Electronic collaborative word processing documents (and other variations of the term) as used herein are not limited to only digital files for word processing, but may include any other processing document such as presentation slides, tables, databases, graphics, sound files, video files or any other digital document or file. Electronic collaborative word processing documents may include any digital file that may provide for input, editing, formatting, display, and/or output of text, graphics, widgets, objects, tables, links, animations, dynamically updated elements, or any other data object that may be used in conjunction with the digital file. Any information stored on or displayed from an electronic collaborative word processing document may be organized into blocks. A block may include any organizational unit of information in a digital file, such as a single text character, word, sentence, paragraph, page, graphic, or any combination thereof. Blocks may include static or dynamic information, and may be linked to other sources of data for dynamic updates. Blocks may be automatically organized by the system, or may be manually selected by a user according to preference. In one embodiment, a user may select a segment of any information in an electronic word processing document and assign it as a particular block for input, editing, formatting, or any other further configuration.
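  • As a non-limiting illustration of how such blocks might be represented (the field names below are assumptions of this description, not part of the disclosed embodiments), a block could be sketched as:

```typescript
// Illustrative block model for an electronic collaborative word processing document.
type BlockKind = "text" | "title" | "graphic" | "table" | "compound";

interface Block {
  id: string;                   // unique identifier for the organizational unit
  kind: BlockKind;
  content: string | Block[];    // a compound block may itself include one or more blocks
  dynamicSource?: string;       // optional link to another source of data for dynamic updates
  lastEditedBy?: string;        // entity responsible for inputting or altering the information
}
```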
  • An electronic collaborative word processing document may be stored in one or more repositories connected to a network accessible by one or more users through their computing devices. In one embodiment, one or more users may simultaneously edit an electronic collaborative word processing document. The one or more users may access the electronic collaborative word processing document through one or more user devices connected to a network. User access to an electronic collaborative word processing document may be managed through permission settings set by an author of the electronic collaborative word processing document. An electronic collaborative word processing document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document.
  • Various embodiments are described herein with reference to a system, method, device, or computer readable medium. It is intended that the disclosure of one is a disclosure of all. For example, it is to be understood that disclosure of a computer readable medium described herein also constitutes a disclosure of methods implemented by the computer readable medium, and systems and devices for implementing those methods, via for example, at least one processor. It is to be understood that this form of disclosure is for ease of discussion only, and one or more aspects of one embodiment herein may be combined with one or more aspects of other embodiments herein, within the intended scope of this disclosure.
  • Embodiments described herein may refer to a non-transitory computer readable medium containing instructions that when executed by at least one processor, cause the at least one processor to perform a method. Non-transitory computer readable mediums may be any medium capable of storing data in any memory in a way that may be read by any computing device with a processor to carry out methods or any other instructions stored in the memory. The non-transitory computer readable medium may be implemented as hardware, firmware, software, or any combination thereof. Moreover, the software may preferably be implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine may be implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described in this disclosure may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. Furthermore, a non-transitory computer readable medium may be any computer readable medium except for a transitory propagating signal.
  • The memory may include a Random Access Memory (RAM), a Read-Only Memory (ROM), a hard disk, an optical disk, a magnetic medium, a flash memory, other permanent, fixed, volatile or non-volatile memory, or any other mechanism capable of storing instructions. The memory may include one or more separate storage devices collocated or dispersed, capable of storing data structures, instructions, or any other data. The memory may further include a memory portion containing instructions for the processor to execute. The memory may also be used as a working scratch pad for the processors or as temporary storage.
  • Some embodiments may involve at least one processor. A processor may be any physical device or group of devices having electric circuitry that performs a logic operation on input or inputs. For example, the at least one processor may include one or more integrated circuits (ICs), including application-specific integrated circuits (ASICs), microchips, microcontrollers, microprocessors, all or part of a central processing unit (CPU), graphics processing unit (GPU), digital signal processor (DSP), field-programmable gate array (FPGA), server, virtual server, or other circuits suitable for executing instructions or performing logic operations. The instructions executed by at least one processor may, for example, be pre-loaded into a memory integrated with or embedded into the controller or may be stored in a separate memory.
  • In some embodiments, the at least one processor may include more than one processor. Each processor may have a similar construction, or the processors may be of differing constructions that are electrically connected or disconnected from each other. For example, the processors may be separate circuits or integrated in a single circuit. When more than one processor is used, the processors may be configured to operate independently or collaboratively. The processors may be coupled electrically, magnetically, optically, acoustically, mechanically or by other means that permit them to interact.
  • Consistent with the present disclosure, disclosed embodiments may involve a network. A network may constitute any type of physical or wireless computer networking arrangement used to exchange data. For example, a network may be the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN or WAN network, and/or other suitable connections that may enable information exchange among various components of the system. In some embodiments, a network may include one or more physical links used to exchange data, such as Ethernet, coaxial cables, twisted pair cables, fiber optics, or any other suitable physical medium for exchanging data. A network may also include a public switched telephone network (“PSTN”) and/or a wireless cellular network. A network may be a secured network or unsecured network. In other embodiments, one or more components of the system may communicate directly through a dedicated communication network. Direct communications may use any suitable technologies, including, for example, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or other suitable communication methods that provide a medium for exchanging data and/or information between separate entities.
  • Certain embodiments disclosed herein may also include a computing device for generating features for collaborative work systems, where the computing device may include processing circuitry communicatively connected to a network interface and to a memory, wherein the memory contains instructions that, when executed by the processing circuitry, configure the computing device to receive from a user device associated with a user account an instruction to generate a new column of a single data type for a first data structure, wherein the first data structure may be a column-oriented data structure, and store, based on the instructions, the new column within the column-oriented data structure repository, wherein the column-oriented data structure repository may be accessible and may be displayed as a display feature to the user and at least a second user account. The computing devices may be devices such as mobile devices, desktops, laptops, tablets, or any other devices capable of processing data. Such computing devices may include a display such as an LED display, an augmented reality (AR) display, or a virtual reality (VR) display.
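  • Purely by way of illustration (the repository layout and function names below are assumptions of this description), receiving such an instruction and storing the new column in a column-oriented repository might be sketched as:

```typescript
// Illustrative handling of a "generate new column" instruction in a column-oriented store.
type ColumnDataType = "status" | "person" | "date" | "text" | "number";

interface NewColumnInstruction {
  accountId: string;          // user account associated with the requesting user device
  boardId: string;            // first data structure the column is generated for
  heading: string;
  dataType: ColumnDataType;   // a single data type for the entire column
}

// In a column-oriented repository, each column's values are stored contiguously,
// keyed here by board, heading, and data type (a simplification for illustration).
type ColumnOrientedRepository = Map<string, unknown[]>;

function storeNewColumn(repo: ColumnOrientedRepository, instr: NewColumnInstruction): void {
  const key = `${instr.boardId}:${instr.heading}:${instr.dataType}`;
  if (!repo.has(key)) {
    repo.set(key, []);        // the new column becomes accessible for display to other accounts
  }
}
```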
  • Certain embodiments disclosed herein may include a processor configured to perform methods that may include triggering an action in response to an input. The input may be from a user action or from a change of information contained in a user's table, in another table, across multiple tables, across multiple user devices, or from third-party applications. Triggering may be caused manually, such as through a user action, or may be caused automatically, such as through a logical rule, logical combination rule, or logical templates associated with a board. For example, a trigger may include an input of a data item that is recognized by at least one processor that brings about another action.
  • In some embodiments, the methods including triggering may cause an alteration of data and may also cause an alteration of display of data contained in a board or in memory. An alteration of data may include a recalculation of data, the addition of data, the subtraction of data, or a rearrangement of information. Further, triggering may also cause a communication to be sent to a user, other individuals, or groups of individuals. The communication may be a notification within the system or may be a notification outside of the system through a contact address such as by email, phone call, text message, video conferencing, or any other third-party communication application.
  • Some embodiments include one or more of automations, logical rules, logical sentence structures, and logical (sentence structure) templates. While these terms are described herein in differing contexts, in a broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, all automations need not implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.
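  • As a non-limiting sketch only (the rule in this example is hypothetical and not drawn from the disclosure), an automation driven by an underlying logical rule might be expressed as a trigger condition paired with an outcome:

```typescript
// Illustrative automation: a logical rule maps a trigger/condition to an outcome.
interface BoardEvent {
  itemId: string;
  columnId: string;
  newValue: unknown;
}

interface Automation {
  condition: (e: BoardEvent) => boolean;   // logical rule underlying the automation
  outcome: (e: BoardEvent) => void;        // outcome produced when the rule fires
}

function runAutomations(automations: Automation[], event: BoardEvent): void {
  for (const automation of automations) {
    if (automation.condition(event)) automation.outcome(event);
  }
}

// A logical-sentence-structure style example: "when {status} changes to {Done},
// then {notify the item owner}" (hypothetical rule for illustration).
const notifyWhenDone: Automation = {
  condition: (e) => e.columnId === "status" && e.newValue === "Done",
  outcome: (e) => console.log(`notify owner of item ${e.itemId}`),
};

runAutomations([notifyWhenDone], { itemId: "42", columnId: "status", newValue: "Done" });
```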
  • Other terms used throughout this disclosure in differing exemplary contexts may generally share the following common definitions.
  • In some embodiments, machine learning algorithms (also referred to as machine learning models or artificial intelligence in the present disclosure) may be trained using training examples, for example in the cases described below. Some non-limiting examples of such machine learning algorithms may include classification algorithms, data regression algorithms, image segmentation algorithms, visual detection algorithms (such as object detectors, face detectors, person detectors, motion detectors, edge detectors, etc.), visual recognition algorithms (such as face recognition, person recognition, object recognition, etc.), speech recognition algorithms, mathematical embedding algorithms, natural language processing algorithms, support vector machines, random forests, nearest neighbors algorithms, deep learning algorithms, artificial neural network algorithms, convolutional neural network algorithms, recursive neural network algorithms, linear machine learning models, non-linear machine learning models, ensemble algorithms, and so forth. For example, a trained machine learning algorithm may comprise an inference model, such as a predictive model, a classification model, a regression model, a clustering model, a segmentation model, an artificial neural network (such as a deep neural network, a convolutional neural network, a recursive neural network, etc.), a random forest, a support vector machine, and so forth. In some examples, the training examples may include example inputs together with the desired outputs corresponding to the example inputs. Further, in some examples, training machine learning algorithms using the training examples may generate a trained machine learning algorithm, and the trained machine learning algorithm may be used to estimate outputs for inputs not included in the training examples. In some examples, engineers, scientists, processes, and machines that train machine learning algorithms may further use validation examples and/or test examples. For example, validation examples and/or test examples may include example inputs together with the desired outputs corresponding to the example inputs, a trained machine learning algorithm and/or an intermediately trained machine learning algorithm may be used to estimate outputs for the example inputs of the validation examples and/or test examples, the estimated outputs may be compared to the corresponding desired outputs, and the trained machine learning algorithm and/or the intermediately trained machine learning algorithm may be evaluated based on a result of the comparison. In some examples, a machine learning algorithm may have parameters and hyper-parameters, where the hyper-parameters are set manually by a person or automatically by a process external to the machine learning algorithm (such as a hyper-parameter search algorithm), and the parameters of the machine learning algorithm are set by the machine learning algorithm according to the training examples. In some implementations, the hyper-parameters are set according to the training examples and the validation examples, and the parameters are set according to the training examples and the selected hyper-parameters.
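  • The following toy example is included only to illustrate, under assumptions not found in the disclosure, how training examples, validation examples, and a hyper-parameter might interact; the one-parameter threshold "model" is deliberately trivial:

```typescript
// Toy illustration of training with examples and selecting a hyper-parameter
// using validation examples (deliberately trivial threshold classifier).
interface Example { input: number; desiredOutput: 0 | 1; }

// "Training" here simply returns a classifier parameterized by the supplied
// hyper-parameter; real algorithms would also fit parameters to the examples.
function train(_trainingExamples: Example[], threshold: number) {
  return (x: number): 0 | 1 => (x >= threshold ? 1 : 0);
}

function accuracy(model: (x: number) => 0 | 1, examples: Example[]): number {
  const correct = examples.filter((e) => model(e.input) === e.desiredOutput).length;
  return examples.length === 0 ? 0 : correct / examples.length;
}

// Hyper-parameter search: estimated outputs for the validation examples are
// compared to the desired outputs, and the best-performing setting is kept.
function selectHyperParameter(trainSet: Example[], validationSet: Example[]): number {
  let best = 0.5;
  let bestScore = -1;
  for (const threshold of [0.25, 0.5, 0.75]) {
    const candidate = train(trainSet, threshold);
    const score = accuracy(candidate, validationSet);
    if (score > bestScore) { bestScore = score; best = threshold; }
  }
  return best;
}
```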
  • FIG. 1 is a block diagram of an exemplary computing device 100 for generating a column and/or row oriented data structure repository for data consistent with some embodiments. The computing device 100 may include processing circuitry 110, such as, for example, a central processing unit (CPU). In some embodiments, the processing circuitry 110 may include, or may be a component of, a larger processing unit implemented with one or more processors. The one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information. The processing circuitry, such as processing circuitry 110, may be coupled via a bus 105 to a memory 120.
  • The memory 120 may further include a memory portion 122 that may contain instructions that when executed by the processing circuitry 110, may perform the method described in more detail herein. The memory 120 may be further used as a working scratch pad for the processing circuitry 110, a temporary storage, and others, as the case may be. The memory 120 may be a volatile memory such as, but not limited to, random access memory (RAM), or non-volatile memory (NVM), such as, but not limited to, flash memory. The processing circuitry 110 may be further connected to a network device 140, such as a network interface card, for providing connectivity between the computing device 100 and a network, such as a network 210, discussed in more detail with respect to FIG. 2 below. The processing circuitry 110 may be further coupled with a storage device 130. The storage device 130 may be used for the purpose of storing single data type column-oriented data structures, data elements associated with the data structures, or any other data structures. While illustrated in FIG. 1 as a single device, it is to be understood that storage device 130 may include multiple devices either collocated or distributed.
  • The processing circuitry 110 and/or the memory 120 may also include machine-readable media for storing software. “Software” as used herein refers broadly to any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, may cause the processing system to perform the various functions described in further detail herein.
  • FIG. 2 is a block diagram of computing architecture 200 that may be used in connection with various disclosed embodiments. The computing device 100, as described in connection with FIG. 1, may be coupled to network 210. The network 210 may enable communication between different elements that may be communicatively coupled with the computing device 100, as further described below. The network 210 may include the Internet, the world-wide-web (WWW), a local area network (LAN), a wide area network (WAN), a metro area network (MAN), and other networks capable of enabling communication between the elements of the computing architecture 200. In some disclosed embodiments, the computing device 100 may be a server deployed in a cloud computing environment.
  • One or more user devices 220-1 through user device 220-m, where ‘m’ is an integer equal to or greater than 1, referred to individually as user device 220 and collectively as user devices 220, may be communicatively coupled with the computing device 100 via the network 210. A user device 220 may be, for example, a smart phone, a mobile phone, a laptop, a tablet computer, a wearable computing device, a personal computer (PC), a smart television, and the like. A user device 220 may be configured to send to and receive from the computing device 100 data and/or metadata associated with a variety of elements associated with single data type column-oriented data structures, such as columns, rows, cells, schemas, and the like.
  • One or more data repositories 230-1 through data repository 230-n, where ‘n’ is an integer equal to or greater than 1, referred to individually as data repository 230 and collectively as data repositories 230, may be communicatively coupled with the computing device 100 via the network 210, or embedded within the computing device 100. Each data repository 230 may be communicatively connected to the network 210 through one or more database management services (DBMS) 235-1 through DBMS 235-n. The data repository 230 may be, for example, a storage device containing a database, a data warehouse, and the like, that may be used for storing data structures, data items, metadata, or any information, as further described below. In some embodiments, one or more of the repositories may be distributed over several physical storage devices, e.g., in a cloud-based computing environment. Any storage device may be a network accessible storage device, or a component of the computing device 100.
  • FIG. 3 is an exemplary embodiment of a presentation of an electronic collaborative word processing document 301 via an editing interface or editor 300. The editor 300 may include any user interface components 302 through 312 to assist with input or modification of information in an electronic collaborative word processing document 301. For example, editor 300 may include an indication of an entity 312, which may include at least one individual or group of individuals associated with an account for accessing the electronic collaborative word processing document. User interface components may provide the ability to format a title 302 of the electronic collaborative word processing document, select a view 304, perform a lookup for additional features 306, view an indication of other entities 308 accessing the electronic collaborative word processing document at a certain time (e.g., at the same time or at a recorded previous time), and configure permission access 310 to the electronic collaborative word processing document. The electronic collaborative word processing document 301 may include information that may be organized into blocks as previously discussed. For example, a block 320 may itself include one or more blocks of information. Each block may have similar or different configurations or formats according to a default or according to user preferences. For example, block 322 may be a “Title Block” configured to include text identifying a title of the document, and may also contain, embed, or otherwise link to metadata associated with the title. A block may be pre-configured to display information in a particular format (e.g., in bold font). Other blocks in the same electronic collaborative word processing document 301, such as compound block 320 or input block 324 may be configured differently from title block 322. As a user inputs information into a block, either via input block 324 or a previously entered block, the platform may provide an indication of the entity 318 responsible for inputting or altering the information. The entity responsible for inputting or altering the information in the electronic collaborative word processing document may include any entity accessing the document, such as an author of the document or any other collaborator who has permission to access the document.
  • In a collaborative word processing document, multiple users may simultaneously edit a single document in real time or near real time. Edits by a first user in one section of a document may interfere with the display of a second editor making edits to the same document, which may hamper the second editor's ability to make simultaneous edits in the document. The problem may be compounded when large groups make simultaneous edits to the same document, or when one user adds a large amount of content to the document. The introduction of text, graphics, or other objects to an earlier page in a collaborative word processing document may adjust the location of text or objects in a later page of the document or may shift a user's viewport so that the user's active editing location is no longer within the user's view. This reduces efficiency in collaboration between users and may lead to unintended editing errors by the user. Therefore, there is a need for unconventional innovations for managing display interference in an electronic collaborative word processing document to enable multiple users to simultaneously edit a collaborative word processing document.
  • Such unconventional approaches may enable computer systems to implement functions to improve the efficiency of electronic collaborative word processing documents. By using unique and unconventional methods of classifying and storing data associated with a collaborative word processing document or by grouping editable segments of the collaborative word processing document into unique and discrete segments, a system may provide display locking techniques to increase the efficiency of electronic collaborative word processing documents. Various embodiments of the present disclosure describe unconventional systems, methods, and computer readable media for managing display interference in an electronic collaborative word processing document. Various embodiments of the present disclosure may include at least one processor configured to access the electronic collaborative word processing document, present a first instance of the electronic collaborative word processing document via a first hardware device running a first editor, and present a second instance of the electronic collaborative word processing document via a second hardware device running a second editor. The at least one processor may be configured to receive from the first editor during a common editing period, first edits to the electronic collaborative word processing document made on a first earlier page of the electronic collaborative word processing document that result in a pagination change. The at least one processor may be further configured to receive from the second editor during the common editing period, second edits to the electronic collaborative word processing document made on a second page of the electronic collaborative word processing document later than the first page. The at least one processor may be configured to, during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits received by the second hardware device, and upon receipt of a scroll-up command via the second editor during the common editing period, cause the display associated with the second hardware device to reflect the pagination change caused by the first edits.
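  • One possible way to realize this flow is sketched below for illustration only; the data structures, field names, and compensation strategy are assumptions of this description and do not limit the disclosed embodiments.

```typescript
// Illustrative scroll-lock flow: first edits on an earlier page cause a pagination
// change; the second editor's display is locked so the change is suppressed until
// a scroll-up command is received.
interface RemoteEdit { page: number; heightDelta: number; }   // height change, e.g., in pixels

interface SecondEditorView {
  scrollOffset: number;    // document position of the top of the viewport
  activePage: number;      // page containing the active work location
  pendingShift: number;    // suppressed pagination change, accumulated while locked
  locked: boolean;
}

function applyFirstEdits(view: SecondEditorView, edit: RemoteEdit): void {
  const causesPaginationChange = edit.page < view.activePage && edit.heightDelta !== 0;
  if (causesPaginationChange && view.locked) {
    // Lock the display: compensate the viewport so the second editor's content
    // does not move, suppressing the pagination change caused by the first edits.
    view.scrollOffset += edit.heightDelta;
    view.pendingShift += edit.heightDelta;
  }
  // When unlocked, the view is simply re-rendered and the shift becomes visible.
}

function onScrollUpCommand(view: SecondEditorView, amount: number): void {
  // A scroll-up command releases the suppression: the viewport moves up through
  // the re-paginated document, so the display now reflects the first edits.
  view.scrollOffset = Math.max(0, view.scrollOffset - amount);
  view.pendingShift = 0;
}
```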
  • Thus, the various embodiments in the present disclosure describe at least a technological solution, based on improvements to operations of computer systems and platforms, to the technical challenge of managing display interference caused by simultaneous edits to an electronic collaborative word processing document.
  • Some disclosed embodiments may involve systems, methods, and computer readable media for managing display interference in an electronic collaborative word processing document. Display interference may refer to an undesirable adjustment of a viewing display or editing location within an electronic collaborative word processing document, caused by edits made by another user or by any other alterations in the electronic collaborative word processing document. Display interference may include any shift in the location of information or data displayed within an electronic collaborative word processing document. For example, a user may be editing paragraph “A” on a second page of a collaborative word processing document. Another user may add two pages of text on a first page of the same collaborative word processing document. The addition of two pages of text to the collaborative word processing document may cause paragraph “A” to move to a fourth page in the collaborative word processing document that is out of the current view of the first user. This movement of paragraph “A” is one example of display interference. Display interference is not limited to an unwanted shift of an active editing location outside of the current viewport. Display interference may include unwanted shifts of an active editing location within a viewport. For example, display interference may include the addition of a single line of text to a collaborative word processing document that causes paragraph “A” to move one line of text down in the collaborative word processing document, with paragraph “A” either remaining wholly or partially within the current viewport. Display interference is not limited to vertical shifts in information or data displayed within an electronic collaborative word processing document and may include horizontal shifts or a combination of vertical and horizontal shifts in the display of information or data caused by other edits within the document. Furthermore, display interference is not limited to movement in the location of information or data in an active editing location and may include the movement in the location of any information or data within a collaborative word processing document. Managing display interference may include any steps taken by the system to resolve display interference that may occur on one or more displays of one or more users accessing an electronic collaborative word processing document, which is discussed in further detail below.
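  • Continuing the paragraph “A” example above, the vertical shift may be illustrated, purely as an assumption-laden sketch, by a function that reports the page a paragraph lands on after content is inserted on an earlier page:

```typescript
// Toy illustration of the paragraph "A" example: inserting two pages of text
// before a paragraph on page 2 moves that paragraph to page 4.
function pageAfterInsertion(originalPage: number, insertionPage: number, insertedPages: number): number {
  // Content inserted on or before the paragraph's page pushes the paragraph down.
  return insertionPage <= originalPage ? originalPage + insertedPages : originalPage;
}

console.log(pageAfterInsertion(2, 1, 2));   // prints 4
```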
  • Some aspects of the present disclosure may involve display interference within an electronic collaborative word processing document. An electronic collaborative word processing document may be a file read by a computer program that provides for the input, editing, formatting, display, and output of text, graphics, widgets, objects, tables, or other elements typically used in computer desktop publishing applications. An electronic collaborative word processing document may be stored in one or more repositories connected to a network accessible by one or more users via at least one associated computing device. In one embodiment, one or more users may simultaneously edit an electronic collaborative word processing document, with all users' edits displaying in real-time or near real time within the same collaborative word processing document file. The one or more users may access the electronic collaborative word processing document through one or more user devices connected to a network. An electronic collaborative word processing document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document. Though this disclosure refers to electronic collaborative word processing documents, the systems, methods, and techniques disclosed herein are not limited to word processing documents and may be adapted for use in other productivity applications such as documents, presentations, worksheets, databases, charts, graphs, digital paintings, electronic music and digital video or any other application software used for producing information.
  • Some disclosed embodiments may include accessing an electronic collaborative word processing document. An electronic collaborative word processing document may be stored in one or more data repositories and the document may be retrieved by one or more users for downloading, receiving, processing, editing, or viewing the electronic collaborative word processing document. An electronic collaborative word processing document may be accessed by a user using a user device through a network. Accessing an electronic collaborative word processing document may involve retrieving data through any electrical medium such as one or more signals, instructions, operations, functions, databases, memories, hard drives, private data networks, virtual private networks, Wi-Fi networks, LAN or WAN networks, Ethernet cables, coaxial cables, twisted pair cables, fiber optics, public switched telephone networks, wireless cellular networks, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or any other suitable communication method that provides a medium for exchanging data. In some embodiments, accessing information may include adding, editing, deleting, re-arranging, or otherwise modifying information directly or indirectly from the network. A user may access the electronic collaborative word processing document using a user device, which may include a computer, laptop, smartphone, tablet, VR headset, smart watch, or any other electronic display device capable of receiving and sending data. In some embodiments, accessing the electronic word processing document may include retrieving the electronic word processing document from a web browser cache. Additionally or alternatively, accessing the electronic word processing document may include connecting with a live data stream of the electronic word processing document from a remote source. In some embodiments, accessing the electronic word processing document may include logging into an account having a permission to access the document. For example, accessing the electronic word processing document may be achieved by interacting with an indication associated with the electronic word processing document, such as an icon or file name, which may cause the system to retrieve (e.g., from a storage medium) a particular electronic word processing document associated with the indication.
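  • By way of a hedged, non-limiting sketch (the repository shape and permission lookup below are assumptions of this description), accessing a stored document subject to account permissions might look like:

```typescript
// Illustrative access path: retrieve a stored document by its indication (id)
// only if the requesting account has permission to access it.
interface StoredDocument {
  id: string;
  permittedAccounts: Set<string>;
  blocks: string[];               // simplified document body
}

function accessDocument(
  repository: Map<string, StoredDocument>,
  documentId: string,
  accountId: string
): StoredDocument | null {
  const doc = repository.get(documentId);                     // lookup by icon/file-name indication
  if (!doc) return null;
  return doc.permittedAccounts.has(accountId) ? doc : null;   // permission check
}
```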
  • For example, an electronic collaborative word processing document may be stored in repository 230-1 as shown in FIG. 2. Repository 230-1 may be configured to store software, files, or code, such as electronic collaborative word processing documents developed using computing device 100 or user device 220-1. Repository 230-1 may further be accessed by computing device 100, user device 220-1, or other components of system 200 for downloading, receiving, processing, editing, or viewing the electronic collaborative word processing document. Repository 230-1 may be any suitable combination of data storage devices, which may optionally include any type or combination of slave databases, load balancers, dummy servers, firewalls, back-up databases, and/or any other desired database components. In some embodiments, repository 230-1 may be employed as a cloud service, such as a Software as a Service (SaaS) system, a Platform as a Service (PaaS) system, or an Infrastructure as a Service (IaaS) system. For example, repository 230-1 may be based on infrastructure or services of Amazon Web Services™ (AWS), Microsoft Azure™, Google Cloud Platform™, Cisco Metapod™, Joyent™, vmWare™, or other cloud computing providers. Repository 230-1 may include other commercial file sharing services, such as Dropbox™, Google Docs™, or iCloud™. In some embodiments, repository 230-1 may be a remote storage location, such as a network drive or server in communication with network 210. In other embodiments, repository 230-1 may also be a local storage device, such as local memory of one or more computing devices (e.g., computing device 100) in a distributed computing environment.
  • Some disclosed embodiments may include presenting a first instance of an electronic collaborative word processing document. Presenting an instance of an electronic word processing document may include causing a display of the information contained in the electronic word processing document via a display device. An electronic collaborative word processing document may be presented in multiple instances on multiple user devices. Presenting multiple instances of the electronic collaborative word processing document on multiple devices may facilitate collaborative editing of the same document because multiple users may access and edit the same document file at the same time from different user devices. A first instance of the electronic collaborative word processing document may include the presentation of data and information contained in the electronic collaborative word processing document to a first user. For example, a user may view or edit a first instance of the electronic collaborative word processing document and the user may control the location of the user's view (e.g., an active display window) or edits in the first instance of the electronic collaborative word processing document. This location may be independent or distinct from other users' views or editing locations in any other instance of the electronic collaborative word processing document. In one embodiment, edits made by a user in an instance of the electronic collaborative word processing document are synchronized in real time or near-real time to all other instances of the same electronic collaborative word processing document.
  • A first instance of an electronic collaborative word processing document may be presented via a first hardware device running a first editor. A first hardware device may include a computer, laptop, smartphone, tablet, VR headset, smart watch, or any other electronic display device capable of receiving and sending data. A first editor may be a user interface that provides for the input, editing, formatting, display, and output of text, graphics, widgets, objects, tables, or other elements in an electronic word processing document. A first editor may receive user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data. In one embodiment, a user accesses an electronic collaborative word processing document using a computer and views the document in an editor that receives text and other input via a mouse and keyboard.
  • By way of example, FIG. 4 illustrates an instance of a collaborative electronic word processing document presented within an editor 400. In some embodiments, editor 400 may be displayed by a computing device (e.g., the computing device 100 illustrated in FIG. 1), software running thereon, or any other projecting device (e.g., a projector, AR or VR lens, or any other display device, as previously discussed). Editor 400 may include various tools for displaying information associated with the document or for editing the document. For example, editor 400 may display a title 402 indicating the title of the document. Formatting bar 404 may depict various tools to adjust formatting of information or objects within the document. Help bar 406 may be included, which may provide hyperlinks to information about various features of the editor 400. Share button 410 may be included to invite additional users to edit another instance of the collaborative electronic word processing document. Editor 400 may include tool bar 412 and interface bar 414.
  • Some disclosed embodiments may include presenting a second instance of the electronic collaborative word processing document. Presenting a second instance of the electronic collaborative word processing document may be achieved in the same or similar manner as presenting a first instance of the electronic collaborative word processing document, as discussed above. Presenting a second instance may include the display of data and information contained in the electronic collaborative word processing document to a second user. For example, a second user may view or edit a second instance of the electronic collaborative word processing document and the second user may control the location of the second user's view or edits in the second instance of the electronic collaborative word processing document. Views presented and edits made in the second instance of the electronic collaborative word processing document may be made independently of the views presented or edits made by other users in any other instance, such as in the first instance discussed previously above. For example, the first instance and the second instance of the electronic collaborative word processing document may display different portions of the document and may receive edits to the electronic collaborative word processing document at different locations within the document. Edits made by a user in the first or the second instance of the electronic collaborative word processing document may be incorporated into other instances of the electronic collaborative word processing document in real time. In some embodiments, the first instance and the second instance of the electronic collaborative word processing document may share a common viewport displaying some of the same data and information in both the first and second instances of the document. Edits made in the first or second instance may be demarcated by user identification indicators in the first and second instance. User identification indicators may include a graphic, a user ID indicator, a color, a font, or any other differentiator that indicates the source of an edit in an instance of the electronic collaborative word processing document. The second instance of the electronic collaborative word processing document may be presented via a second hardware device running a second editor, in a similar manner to the first hardware device and the first editor described herein. Any number of hardware devices may run an editor to access another instance of the electronic collaborative word processing document.
  • Returning to FIG. 4 by way of example, editor 400 may indicate that multiple users are accessing an electronic collaborative word processing document through the display of a user indicator, such as user display indicator 408, which indicates that two users are running an instance of the electronic collaborative word processing document. Editor 400 may include current user indicator 416. Current user indicator 416 may indicate the identification of the user running the displayed instance of the collaborative word processing document. In some embodiments, the objects and information displayed for editing may be controlled by the current user shown in 416 in each instance of the electronic collaborative word processing document. For example, FIG. 4 may depict an editing location that is actively edited by the current user, such as editing location 424. Editing location 424 may be a block as described herein. Other blocks may be shown in the viewport of editor 400 but may not be the active editing location. For example, FIG. 4 includes title block 422 and paragraph block 420, which are not actively being edited by the user. The location that a different user is actively editing in another instance of the electronic collaborative word processing document may be indicated by icon 418, which may indicate the active working location of another user, which in this example is paragraph block 420.
  • Some embodiments may include receiving from a first editor during a common editing period, first edits to an electronic collaborative word processing document. A common editing period may include a time when at least two instances of the electronic collaborative word processing document are presented in two editors. In one embodiment, a common editing period may include two users each viewing and editing the same electronic collaborative word processing document in two instances displayed on separate hardware devices associated with each of the two users. A common editing period is not limited to situations when two users are editing a document and may include any number of users editing a document in real or near real time. An edit to an electronic collaborative word processing document may include the addition, manipulation, or deletion of objects or data, and may include addition, manipulation, or deletion of text, graphics, tables, images, formatting, highlights, manipulation of fonts, icons, shapes, references, headers, footers, or any other addition, deletion, or manipulation of objects or any other data within the electronic word processing document. Receiving the first edits may include the system receiving an edit request from a computing device associated with a user. The request may be transmitted over a network to a repository where the electronic collaborative word processing document is stored. At least one processor may then perform a lookup of permission settings to confirm whether the computing device has authorization to make the edit. In a situation where authorization is confirmed, the system may then implement and store the edit with the electronic collaborative word processing document such that any other computing devices accessing the document may retrieve the document with the implemented change.
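  • For illustration only (the request shape and permission model below are assumptions of this description), receiving first edits during a common editing period might be handled as:

```typescript
// Illustrative receipt of an edit: confirm authorization, then implement and store
// the edit so other instances of the document retrieve the change.
interface EditRequest {
  documentId: string;
  accountId: string;     // account associated with the editing device
  blockId: string;
  newContent: string;
}

interface SharedDocument {
  blocks: Map<string, string>;
  authorizedAccounts: Set<string>;   // permission settings for the document
}

function receiveEdit(doc: SharedDocument, request: EditRequest): boolean {
  if (!doc.authorizedAccounts.has(request.accountId)) return false;  // authorization not confirmed
  doc.blocks.set(request.blockId, request.newContent);               // implement and store the edit
  return true;   // other computing devices accessing the document will see the change
}
```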
  • In some embodiments, edits made during a common editing period may be transmitted or received through a communications interface. A communications interface may be a platform capable of sending and retrieving data through any electrical medium such as the types described herein that manage and track edits made in a collaborative electronic word processing document from one or more editors. In one embodiment, the communications interface may be integrated with the electronic collaborative word processing document editor. For example, protocols may be incorporated into the editor that manage exchanges of data between multiple editors running one or more instances of the electronic collaborative word processing document. In other embodiments, the communications interface may be separate from the editor and may run on separate hardware devices.
  • For example, a communications interface may run on a computing device, such as computing device 100 (of FIG. 1), and may transmit or receive edits made by a first editor running on user device 220-1 and a second editor running on user device 220-2 through network 210 (of FIG. 2). More broadly, a communications interface may refer to any platform capable of transmitting or receiving edits made to an electronic collaborative word processing document through a network or other electronic medium.
  • In some embodiments, first edits may occur on a first earlier page of an electronic collaborative word processing document and result in a pagination change. A pagination change may include any alteration to a length of an electronic document, such as by a line of text, a page of text, or multiple pages of text. The pagination change may be a result of an addition, deletion, rearrangement, or any other modification to the information in the electronic collaborative word processing document. For example, data and objects in the electronic collaborative word processing document may be arranged in a publication display format that depicts the display of data and objects on printed pages, such as the display found in desktop publishing applications or other editing software. Objects and data may be arranged so that pages of data are displayed sequentially, for example, in a vertical or a horizontal arrangement of the display of pages. A pagination change may occur when edits include the addition or arrangement of content in the document that causes certain data and content in the document to move to another page, or to move to another location on the same page. For example, a document may contain paragraph “A” located in the middle of the second page of the document. First edits may occur on the first page of the document that introduce the addition of two additional pages of text. This may result in a pagination change of paragraph “A,” which may move from page two to page four in the document. A pagination change is not limited to the movement of objects and data from one page to another and may include movements of objects and data within the same page either by a single line, part of a line, a paragraph, or horizontally within a single line. More broadly, a pagination change may refer to any adjustment in the location of objects or text within the location of a page in the collaborative electronic word processing document.
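  • For illustration only (the continuous page-height model below is an assumption of this description, not a requirement of the disclosed embodiments), a pagination change affecting later content might be computed as follows:

```typescript
// Illustrative pagination-change calculation under a simple fixed-page-height model.
function pageOf(offsetInDocument: number, pageHeight: number): number {
  return Math.floor(offsetInDocument / pageHeight) + 1;   // pages numbered from 1
}

function paginationChange(laterContentOffset: number, insertedHeight: number, pageHeight: number): number {
  // Difference in the page number of later content before and after the first edits;
  // zero means the later content stays on its page.
  return pageOf(laterContentOffset + insertedHeight, pageHeight) - pageOf(laterContentOffset, pageHeight);
}
```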
  • Some disclosed embodiments may include receiving from a second editor during the common editing period, second edits to an electronic collaborative word processing document. Second edits may include the addition, manipulation, or deletion of objects or data, and may include addition, manipulation, or deletion of text, graphics, tables, images, formatting, highlights, manipulation of fonts, icons, shapes, references, headers, footers, or any other addition, deletion, or manipulation of objects or data within the electronic word processing document as previously discussed. As used herein, second edits refer to edits made in a second instance of the collaborative electronic word processing document. Second edits may occur either earlier in time, later in time, or simultaneously with first edits and are not limited to edits occurring later in time than first edits in the document.
  • In some embodiments, second edits may occur on a second page of an electronic collaborative word processing document later than a first page. As described herein, objects and data within the electronic collaborative word processing document may be displayed on sequentially arranged pages, for example, in a vertical or a horizontal arrangement of the display of pages. Second edits may occur on a second page in that the second page is arranged sequentially after edits from the first page in the document. A second page may be later than a first page if it occurs anywhere in the sequence of pages after the first edits. In one non-limiting example, second edits in the document may occur on page 4 of the document and first edits may occur on page 2. In this example, the second edits on the second page occur later than the first page in that page 4 is displayed sequentially after page 2. In other embodiments, first and second edits may occur on the same page, with the second edits occurring sequentially after the first edits within the same page. For example, if a second edit occurs lower on a page than a first edit, then the second edit may be considered later than first edits. In some embodiments, second edits may be associated with a block in an electronic collaborative word processing document. As described herein, the electronic collaborative word processing document may organize objects and data into blocks. A block may be any collection of objects or data within the electronic collaborative word processing document, as described herein. For example, the electronic collaborative word processing document may contain one or more title blocks which display formatting text information. Other blocks may include other text portions of the document, such as a sentence, a group of sentences, a paragraph, or a collection of paragraphs, or any grouping of text. Blocks are not limited to text alone, and objects such as charts, graphics, widgets, objects, or tables, or any other component in the document may be recognized as a block.
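  • As a brief, non-limiting sketch (the location model is an assumption of this description), whether second edits occur later than first edits, either on a later page or lower on the same page, could be determined as:

```typescript
// Illustrative ordering of edit locations: a later page, or a lower position on the same page.
interface EditLocation {
  page: number;          // sequentially arranged page number
  offsetOnPage: number;  // vertical position within the page
}

function secondEditsAreLater(second: EditLocation, first: EditLocation): boolean {
  if (second.page !== first.page) return second.page > first.page;
  return second.offsetOnPage > first.offsetOnPage;   // lower on the same page counts as later
}
```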
  • Some disclosed embodiments may include recognizing an active work location of the second editor. An active work location of an editor may include any portion of the electronic collaborative word processing document displayed in or receiving edits from the editor. For example, a user may be editing a portion of the electronic word processing document using an instance of an editor, and the active work location may correspond to the location of the edits. In another example, a user may be viewing a portion of the document, and the active work location may correspond to the location of the viewport displayed by the editor. In yet another embodiment, there may be multiple active work locations, for example, when a user may be editing one portion of the electronic word processing document while viewing a second portion of the electronic word processing document, such as using multiple viewports or by scrolling away from the edit location. Recognizing the active work location may be performed in various ways and may include any process of determining at least a portion of an electronic collaborative word processing document for display or alteration on a computing device. In some embodiments, the recognition of the active work location may be based on a cursor location in the second instance of the collaborative electronic word processing document. A cursor location may include any indication of a location on a display that represents an intent to interact (e.g., manipulate text, select data objects, view information, activate a link, or any other interaction) with the location at which the indication is presented in the display. The cursor location may be displayed visually or may be omitted from display according to preference. A cursor location may be determined by an editing location or by a hovering location. For example, a user may be editing a document at the location of an editing cursor and the system may recognize the cursor as the active work location. In other embodiments, the system may recognize adjacent objects and data around the cursor location as included in the active work location. For example, adjacent letters, words, sentences, or paragraphs near the cursor may be included as part of the active work location depending on certain contexts. In yet another example, a user may use a device (e.g., a mouse) to move a cursor location and hover over a certain portion of a collaborative electronic word processing document without selecting a specific location for editing, such as a scrolling location. In other embodiments, the recognition of the active work location may be based on a scrolling location in the second instance of the collaborative electronic word processing document. A scrolling location may include any displayed portion of the collaborative electronic word processing document, which may be displayed independently of the editing cursor location. The system may recognize a location within the viewport as the active work location. A scrolling location may be recognized in various ways. For example, determining a scrolling location may be based on an amount of time a viewport displays a location of the document, based on a distance away from the editing cursor, or based on user preferences.
  • In yet other embodiments, the recognition of the active work location may be based on a block location in the second instance of the collaborative electronic word processing document. A block location may include a relative or absolute position of a block within an electronic collaborative word processing document. For example, each block within the electronic collaborative word processing document may include a unique block identification ("ID") with associated location data. The associated location data may determine a block's location within the electronic collaborative word processing document. For example, the location data may describe a block's location with respect to other blocks, describe a sequence of blocks for display, or describe a block's intended position within a document based on distances from margins or other blocks or a combination of these factors. The system may recognize that a block is an active work location based on the location of edits or the viewport displayed in the second editor, or in any other way based on data received by the editor.
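  • For explanatory purposes only, a per-block record pairing a unique block ID with associated location data might be sketched in TypeScript as follows. The record fields (previousBlockId, sequenceIndex, topMarginPx) are hypothetical examples of location data, not a definition of the disclosed embodiments.

    // Hypothetical per-block record combining a unique block ID with location data.
    interface BlockRecord {
      blockId: string;                // unique block identification ("ID")
      previousBlockId: string | null; // position relative to other blocks
      sequenceIndex: number;          // position in the display sequence of blocks
      topMarginPx?: number;           // optional intended distance from margins or other blocks
    }

    // Resolve a block's location in the display sequence from its stored data.
    function blockLocation(blocks: Map<string, BlockRecord>, blockId: string): number {
      const record = blocks.get(blockId);
      if (record === undefined) throw new Error(`unknown block ${blockId}`);
      return record.sequenceIndex;
    }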
  • By way of example, FIG. 4 shows active work location 424 indicated by the user's cursor 425 positioned in a text block. Distance 426 may indicate the positioning of the active work area 424 from the edge of the display or viewport. Data associated with the active work location 424 and blocks, such as blocks 420, 422, and 424, may be stored in a repository, such as repository 230-1, so that the system can track the positioning of the active work location 424 and block locations 420, 422, and 424 in relation to a first or second instance of the editor. Location data of the user's cursor 425, or of the user's scrolling location, may also be stored in a repository. The user's scrolling location may be defined by a collection or grouping of blocks. In the example shown in FIG. 4, the user's scrolling location contains the collection of blocks 422 and 424. When a first user accesses the collaborative electronic word processing document from their computing device 220-1, the system may record an active work location for the first user and cause the system to display information from the stored collaborative electronic word processing document at that first active work location to computing device 220-1. Independently, when a second user accesses the collaborative electronic word processing document from a second computing device 220-2, the system may recognize a second active work location for the second user and cause the second computing device 220-2 to display only the second active work location independently from the display of the first computing device 220-1.
  • Some disclosed embodiments may include locking a display associated with the second hardware device. Locking a display may refer to fixing the location of objects and information in a viewport of a hardware device. For example, the location of objects and information depicted on a screen of the second hardware device may shift during operation of an editor. When a display is locked, the location of objects and information depicted on a screen of the second hardware device may remain in a location that does not change, independent of the location of the objects and information in relation to their placement in a document. In some embodiments, locking a display may indicate that objects and information depicted on a screen are fixed at the pixel level. In other embodiments, locking a display may indicate that objects and information depicted on a screen are fixed with respect to a determined measurement taken from the boundaries of the viewport. In yet other embodiments, locking a display may indicate that objects and information depicted on a screen are fixed with respect to one direction but may not be fixed with respect to another direction. For example, a display may depict a block at a location in the document. Locking a display may fix the distance between the first line of the block and a boundary of the display, but edits to the block may cause the distance from other lines of the block to the boundary of the display to change. Locking a display is not limited to fixing pixel locations or fixing dimensions from the boundaries of the viewport to blocks, but may include any fixing of the display with respect to any objects or information within the electronic collaborative word processing document.
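  • The following TypeScript fragment is a minimal sketch, under assumed names (DisplayLock, lockDisplay, restoreLockedOffset), of one way a lock could record and restore the on-screen distance between an anchor block and a viewport boundary; it is offered only as an illustration of the concept described above.

    // Hypothetical lock state: the anchor block and its fixed offset from the
    // top boundary of the viewport, captured at the moment the lock is taken.
    interface DisplayLock {
      anchorBlockId: string;
      offsetFromViewportTopPx: number;
    }

    function lockDisplay(anchorBlockId: string, anchorTopPx: number, scrollTopPx: number): DisplayLock {
      // Record the on-screen distance between the block's first line and the viewport edge.
      return { anchorBlockId, offsetFromViewportTopPx: anchorTopPx - scrollTopPx };
    }

    // After any relayout, compute the scroll position that keeps the anchor block
    // at the same on-screen offset it had when the lock was taken.
    function restoreLockedOffset(lock: DisplayLock, newAnchorTopPx: number): number {
      return newAnchorTopPx - lock.offsetFromViewportTopPx;
    }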
  • In some embodiments, locking a display may suppress a pagination change caused by the first edits received by the second hardware device during a common editing period. A pagination change may be a shift in the location of objects or data from one page in a document to another page based on changes in objects or data on an earlier page, as described previously above. The pagination change may occur as a result of a single user editing a document, or as a result of multiple users editing the document at the same time during a common editing period, as previously discussed above. In an unlocked display, introduction of objects or information at an earlier location in an electronic collaborative word processing document may cause the location of objects and information located on a later page in the document to shift to another page due to the first edits. Pagination changes may be caused by any edits of objects and data at an earlier page in a document, and may include, as non-limiting examples, introduction, modification, or removal of text, formatting, images, objects, comments, redlines, tables, graphs, charts, references, headers, covers, shapes, icons, models, links, bookmarks, headers, footers, text boxes, or any other objects or data. For example, paragraph “A” on page three of a document may shift to page five in the document if two pages of table data are added to the document at a location before paragraph “A.” In some embodiments, locking a display to suppress a pagination change may include fixing the location of objects or information to a location within a page of an electronic collaborative word processing document as described herein. For example, a user may be editing text on a third page in an electronic collaborative word processing document using a second hardware device in a common editing period, and another user may introduce two additional pages of text and graphics at a location earlier in the document using a first hardware device. In this example, the system may freeze the location of the text on the third page in a display of the second hardware device and will not adjust the location of this text to a new page based on the edits in an earlier location of the document caused by the first hardware device.
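  • By way of a non-limiting illustration of suppressing a pagination change, the sketch below assumes that each received first edit carries the page on which it occurred and the amount of content it added or removed; when the display is locked, the scroll position is compensated so that content at the active work location does not visibly shift. All names (RemoteEdit, applyEditsWithLock) are hypothetical.

    interface RemoteEdit { pageIndex: number; heightDeltaPx: number; }

    // Apply first edits received during the common editing period. If the display
    // is locked and the edits land earlier in the document than the anchor block,
    // compensate the scroll position so no pagination shift is visible.
    function applyEditsWithLock(
      currentScrollTopPx: number,
      anchorPageIndex: number,
      edits: RemoteEdit[],
      locked: boolean,
    ): number {
      const insertedAbove = edits
        .filter(e => e.pageIndex < anchorPageIndex)
        .reduce((sum, e) => sum + e.heightDeltaPx, 0);
      // Unlocked displays simply reflow; locked displays absorb the shift.
      return locked ? currentScrollTopPx + insertedAbove : currentScrollTopPx;
    }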
  • By way of example, FIG. 5 depicts an electronic collaborative word processing document with a locked display at an active work location. FIG. 5 is an example of the same interface in FIG. 4 after first and second edits have been made to the document. For example, second edits have been made by a user operating a second hardware device, such as hardware device 220-2 (of FIG. 2), at location 506. Location 506 represents the same active editing location shown in FIG. 4 at 424. A first user operating a different hardware device 220-1 (of FIG. 2) has introduced first edits 504 to the document at a location earlier in the document than the active editing location 506 being edited by the second user on hardware device 220-2. The display shown on hardware device 220-2 is locked in that the vertical distance 508 from the active work location to the edge of the display in FIG. 5 is the same distance as vertical distance 426 in FIG. 4, measured prior to the first and second edits. In response to the first user's edits in 504 made on hardware device 220-1, the system has adjusted the location of text earlier in the document shown on hardware device 220-2, such as text in block 502, while the display is locked. While the display is locked, additional edits made by the first user to location 504 on hardware device 220-1 will continue to adjust the location of text earlier in the document shown on hardware device 220-2, such as text block 502, but the location of the active editing area by the second user at 506 will remain fixed with respect to the viewport shown in hardware device 220-2.
  • FIG. 6 and FIG. 7 depict another example of locking a display. A display may be locked with the introduction of widgets, figures, or charts at an earlier location in the document. For example, FIG. 6 shows an active work location 64 of a user running an editor. FIG. 7 depicts the same editor later in time after a different user has introduced widgets 706 and 708 in the document. As can be seen in FIG. 6 and FIG. 7, the distance 606 from the active work location to the bottom of the editor before the addition of the widgets and the distance 706 from the active work location to the bottom of the editor after the addition of the widgets remains constant in a locked display.
  • In some embodiments, locking the display scrolling associated with the second display may be based on the recognized active work location so as not to interrupt viewing of the active work location. The system may recognize an active work location as described herein and then freeze or lock the display of the active work location at a location on the screen when edits made at an earlier location in the document would otherwise result in a shift in the location of the active work location, as discussed previously. Not interrupting viewing of the active work location may include maintaining the display of the active work location even though other users make alterations to a document. For example, if the active work location is confined to information in a block and the block includes a paragraph, the system may recognize that the paragraph is the active work location and may fix the location of the paragraph in the display. Alternatively, blocks may include header lines, charts, graphs, widgets, or any other objects or information as described herein. As another example, the system may recognize that a block that includes a chart is the active work location and may fix the location of the chart in the display. The system may track the relative arrangement of blocks based on certain data associated with the blocks. For example, each block may retain location data that positions that block in relationship to the location of other blocks within the document. This data may be independent of location data associated with the display of information in the electronic collaborative word processing document. The system may compute or record the relative arrangement of the display of blocks within the document by updating data describing the relative position of the blocks but may not update the location of the block associated with the active work location within the document when the display is fixed. In this way, a second editor can receive edits from a first editor that update block information, including the relative position data associated with the introduction of new blocks at an earlier location in the document, while still allowing the second editor to lock the display of the active work location.
  • In some other embodiments, a lock may remain in place until an active work location is changed in a second editor. The active work location may be changed based on user actions, user preferences, or other determinations by the system. For example, the active work location may be changed upon a user moving the cursor to a second location, scrolling to a new location in the document, editing a different block, an amount of time since the last user input, selecting an icon or toggle associated with the display lock, or any other change in the editing location by the user. When the lock is released, the display may update to reflect a revised location of the active work location based on edits that occurred at an earlier page in the document.
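  • Purely as an illustrative sketch of the release conditions described above, the TypeScript below models several hypothetical triggers and decides whether the lock should be released. The trigger names and the idle threshold are assumptions chosen for explanation, not features of any particular embodiment.

    type LockReleaseTrigger =
      | { kind: 'cursorMoved'; newBlockId: string }
      | { kind: 'scrolledToPage'; pageIndex: number }
      | { kind: 'editedDifferentBlock'; blockId: string }
      | { kind: 'idleTimeout'; msIdle: number }
      | { kind: 'toggleSelected' };

    // Decide whether the display lock should be released for a given trigger.
    function shouldReleaseLock(trigger: LockReleaseTrigger, idleLimitMs = 60_000): boolean {
      switch (trigger.kind) {
        case 'idleTimeout':
          return trigger.msIdle >= idleLimitMs;
        default:
          // Any explicit change of the active work location releases the lock.
          return true;
      }
    }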
  • In some embodiments, the system may receive a scroll-up command via a second editor during the common editing period. A scroll-up command may be any input from a user that indicates a user's intent to change the viewport to display additional information. For example, a user may roll a mouse wheel, click a scroll bar on a document, or provide input through a keyboard, voice headset, haptic controls, or other user device that indicates a user's desire to adjust a display. A scroll command in general may be any input to indicate a direction in which the system may re-render the viewport to display additional information in any direction in relation to the electronic document being displayed in the viewport. In some embodiments, receipt of a scroll-up command may cause the display associated with the second hardware device to reflect the pagination change caused by the first edits. Reflecting the pagination change caused by the first edits may include updating or re-rendering the display so that objects and information currently displayed on the second editor appear at their revised locations resulting from edits that occurred at an earlier page in the document. For example, when a second editor is altering information on page 2 of an electronic collaborative document while a first editor is altering information on page 1 of the same document, if the first editor's alterations add an additional page of text, creating a new page 2 and pushing the previous page 2 onto a new page 3, the second editor's viewport of the electronic collaborative document may lock the second editor's display so that a second user of the second editor is not interrupted and continues to view information on the previous page 2. However, as soon as the second user inputs a scroll-up command in the second editor, the system may render the second editor's viewport to view the newly added page 2 from the first editor in a seamless manner.
  • In yet other embodiments, a scroll-up command that causes a second hardware device to reflect the pagination change may include a scroll to a page other than a page currently displayed on a second display. For example, adjustments to the viewing location of less than one page may not cause the system to reflect the pagination change caused by the first edits. In one embodiment, a user may want to view a different part of a page associated with an active work location and may scroll up to another part of the page without changing the viewing page. In this embodiment, the system may not reflect the pagination change caused by first edits on an earlier page. If the user issues a scroll-up command that adjusts the viewing location to a different page than the currently displayed page on the second display, the system may update the display to reflect a revised location of objects and information currently displayed on the second editor to reflect location changes caused by edits that occurred at an earlier page in the document.
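  • As a short, non-limiting illustration of the page-change condition described above, the following TypeScript sketch reflects the pagination change only when the scroll target is a page other than the one currently displayed. The names (ScrollCommand, shouldReflectPaginationChange) are hypothetical.

    interface ScrollCommand { targetPageIndex: number; }

    // Returns true when the pagination change caused by the first edits should be
    // reflected: the user has scrolled to a page other than the one displayed.
    function shouldReflectPaginationChange(
      command: ScrollCommand,
      currentlyDisplayedPageIndex: number,
    ): boolean {
      return command.targetPageIndex !== currentlyDisplayedPageIndex;
    }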
  • FIG. 8 illustrates a block diagram of an example process 800 for managing display interference in an electronic collaborative word processing document. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram. In some embodiments, the process 800 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1) of a computing device (e.g., the computing device 100 in FIGS. 1 and 2) to perform operations or functions described herein and may be described hereinafter with reference to FIGS. 4 to 7 by way of example. In some embodiments, some aspects of the process 800 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1) or a non-transitory computer-readable medium. In some embodiments, some aspects of the process 800 may be implemented as hardware (e.g., a specific-purpose circuit). In some embodiments, the process 800 may be implemented as a combination of software and hardware.
  • FIG. 8 includes process blocks 802 to 816. At block 802, a processing means may access an electronic collaborative word processing document, as discussed previously in the disclosure above.
  • At block 804, the processing means may present a first instance of the electronic collaborative word processing document in a first editor, as discussed previously in the disclosure above.
  • At block 806, the processing means may present a second instance of the electronic collaborative word processing document in a second editor, as discussed previously in the disclosure above.
  • At block 808, the processing means may receive from the first editor during a common editing period, first edits to the electronic collaborative word processing document, as discussed previously in the disclosure above.
  • At block 810, the processing means may receive from the second editor during the common editing period, second edits to the electronic collaborative word processing document, as discussed previously in the disclosure above.
  • At block 812, the processing means may, during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits, as discussed previously in the disclosure above.
  • At block 814, the processing means may receive a scroll-up command via the second editor during the common editing period, as discussed previously in the disclosure above.
  • At block 816, the processing means may update the display to reflect the pagination change caused by the first edits, as discussed previously in the disclosure above.
  • FIG. 9 illustrates a block diagram of an example process 900 for managing display interference in an electronic collaborative word processing document. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram. In some embodiments, the process 900 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1) of a computing device (e.g., the computing device 100 in FIGS. 1 and 2) to perform operations or functions described herein and may be described hereinafter with reference to FIGS. 4 to 7 by way of example. In some embodiments, some aspects of the process 900 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1) or a non-transitory computer-readable medium. In some embodiments, some aspects of the process 900 may be implemented as hardware (e.g., a specific-purpose circuit). In some embodiments, the process 900 may be implemented as a combination of software and hardware.
  • FIG. 9 includes process blocks 902 to 908. At block 902, a processing means may receive via a communications interface during a common editing period, first edits from a first editor accessing a first instance of the electronic collaborative document via a first hardware device, wherein the first edits occur on a first earlier page of the electronic collaborative word processing document and result in a pagination change, as discussed previously in the disclosure above.
  • At block 904, the processing means may receive during the common editing period, second edits from a second editor accessing a second instance of the electronic collaborative document via a second hardware device, as discussed previously in the disclosure above.
  • At block 906, the processing means may, during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits received via the communications interface, as discussed previously in the disclosure above.
  • At block 908, the processing means may, upon receipt of a scroll-up command via the second editor during the common editing period, cause the display associated with the second hardware device to reflect the pagination change caused by the first edits, as discussed previously in the disclosure above.
  • In a collaborative word processing document, multiple users may simultaneously edit a single document in real time, near real time, or asynchronously. Problems may arise when certain edits made by a user in a collaborative word processing document are visible to or shared with all other users in the collaborative word processing document. In some instances, a user may input data into an electronic collaborative word processing document that the user does not intend to share with all other users of the collaborative word processing document. For example, a user may input confidential salary data in a portion of a collaborative word processing document that the user wishes to hide from some or all other users in the same document. In other instances, a user may wish to mask or hide the user's edits to one or more portions of a collaborative word processing document for a period of time. For example, a user may wish to make several private revisions, or drafts, to a portion of a collaborative word processing document, and then share the user's final edits with the other users in the collaborative word processing document at a later time. More generally, users editing a collaborative word processing document may wish to control the timing and visibility to some or all other users of certain edits that are shared within the collaborative word processing document. Therefore, there is a need for unconventional innovations for enabling dual mode editing in collaborative documents to enable private changes.
  • Such unconventional approaches may enable computer systems to implement functions to improve the efficiency of electronic collaborative word processing documents. By using unique and unconventional methods of classifying and storing data associated with a collaborative word processing document or by grouping, storing, and displaying histories and iterations of editable segments of the collaborative word processing document into unique and discrete elements with access restriction controls, a system may provide dual mode editing in collaborative documents to enable private changes to increase the efficiency of electronic collaborative word processing documents. Various embodiments of the present disclosure describe unconventional systems, methods, and computer readable media for enabling dual mode editing in collaborative documents to enable private changes in an electronic collaborative word processing document. Various embodiments of the present disclosure may include at least one processor configured to access an electronic collaborative document in which a first editor and at least one second editor are enabled to simultaneously edit and view each other's edits to the electronic collaborative document, and output first display signals for presenting an interface on a display of the first editor, the interface including a toggle enabling the first editor to switch between a collaborative mode and a private mode. The at least one processor may be configured to receive from the first editor operating in the collaborative mode, first edits to the electronic collaborative document and to output second display signals to the first editor and the at least one second editor, the second display signals reflecting the first edits made by the first editor. The at least one processor may be configured to receive from the first editor interacting with the interface, a private mode change signal reflecting a request to change from the collaborative mode to the private mode, and in response to the first mode change signal, initiate in connection with the electronic collaborative document the private mode for the first editor. The at least one processor may be configured to, in the private mode, receive from the first editor, second edits to the electronic collaborative document, and in response to the second edits, output third display signals to the first editor while withholding the third display signals from the at least one second editor, such that the second edits are enabled to appear on a display of the first editor and are prevented from appearing on at least one display of the at least one second editor.
  • Thus, the various embodiments in the present disclosure describe at least a technological solution, based on improvements to operations of computer systems and platforms, to the technical challenge of enabling private changes while multiple users simultaneously edit an electronic collaborative word processing document.
  • Some disclosed embodiments may involve systems, methods, and computer readable media for enabling dual mode editing in collaborative documents to enable private changes. Enabling dual mode editing may refer to presenting an interactable interface with the ability to provide two independent modes of making changes to an electronic document. In one mode, which may also be known as collaborative mode, changes made in an electronic collaborative word processing document may be public changes. A public change may include any edit to an electronic collaborative document that may be shared with or accessible to all users (or a designated group of users) in the electronic collaborative document in real-time or near-real time. Alternatively, a user may, through dual mode editing, enable private changes. Enabling a private change may include providing options to a user on an associated computing device to make any edit to an electronic collaborative document that is not shared with all other users in real-time, or not shared with at least some users who may have access to an electronic collaborative document. Dual mode editing to enable private changes may operate in various ways. For example, in collaborative mode, all of a user's changes may be shared and displayed with all other users accessing an electronic collaborative document. When in private mode, a user may designate edits to a portion of an electronic collaborative document to be visible to a subset of all users who have access to the collaborative document. In another example, some or all of a user's edits may not be visible to other users with access to an electronic collaborative document until the user signals that the edits should be visible to other users. More generally, dual mode editing to enable private changes allows a user to make any edit to an electronic collaborative document while restricting the timing or audience of the user's edits.
  • Dual mode editing to enable private changes may be enabled in electronic collaborative documents. A collaborative document may include any electronic file that may be read by a computer program that provides for the input, editing, formatting, display, and output of text, graphics, widgets, data, objects, tables, or other elements typically used in computer desktop publishing applications. An electronic collaborative document may be stored in one or more repositories connected to a network accessible by one or more users via at least one associated computing device. In one embodiment, one or more users may simultaneously edit an electronic collaborative document, with all users' edits displaying in real-time or near real-time within the same collaborative document file. The one or more users may access the electronic collaborative document through one or more user devices connected to a network. An electronic collaborative document may include graphical user interface elements enabled to support the input, display, and management of multiple edits made by multiple users operating simultaneously within the same document. Though this disclosure subsequently refers to electronic collaborative word processing documents, the systems, methods, and techniques disclosed herein are not limited to word processing documents and may be adapted for use in other productivity applications such as documents, presentations, worksheets, databases, charts, graphs, digital paintings, electronic music and digital video or any other application software used for producing information.
  • FIG. 3 is an exemplary embodiment of a presentation of an electronic collaborative word processing document 301 via an editing interface or editor 300. Though an electronic collaborative word processing document is depicted in this example, solutions and techniques are not limited to electronic collaborative word processing documents and may be included in any other types of electronic collaborative documents described herein. The editor 300 may include any user interface components 302 through 312 to assist with input or modification of information in an electronic collaborative word processing document 301. For example, editor 300 may include an indication of an entity 312, which may include at least one individual or group of individuals associated with an account for accessing the electronic collaborative word processing document. User interface components may provide the ability to format a title 302 of the electronic collaborative word processing document, select a view 304, perform a lookup for additional features 306, view an indication of other entities 308 accessing the electronic collaborative word processing document at a certain time (e.g., at the same time or at a recorded previous time), and configure permission access 310 to the electronic collaborative word processing document. The electronic collaborative word processing document 301 may include information that may be organized into blocks as previously discussed. For example, a block 320 may itself include one or more blocks of information. Each block may have similar or different configurations or formats according to a default or according to user preferences. For example, block 322 may be a “Title Block” configured to include text identifying a title of the document, and may also contain, embed, or otherwise link to metadata associated with the title. A block may be pre-configured to display information in a particular format (e.g., in bold font). Other blocks in the same electronic collaborative word processing document 301, such as compound block 320 or input block 324 may be configured differently from title block 322. As a user inputs information into a block, either via input block 324 or a previously entered block, the platform may provide an indication of the entity 318 responsible for inputting or altering the information. The entity responsible for inputting or altering the information in the electronic collaborative word processing document may include any entity accessing the document, such as an author of the document or any other collaborator who has permission to access the document.
  • Some aspects of the present disclosure may involve accessing an electronic collaborative document. An electronic collaborative document may be stored in one or more data repositories and the document may be retrieved by one or more users for downloading, receiving, processing, editing, or viewing the electronic collaborative document. An electronic collaborative document may be accessed by a user using a user device through a network. Accessing an electronic collaborative document may involve retrieving data through any electrical medium such as one or more signals, instructions, operations, functions, databases, memories, hard drives, private data networks, virtual private networks, Wi-Fi networks, LAN or WAN networks, Ethernet cables, coaxial cables, twisted pair cables, fiber optics, public switched telephone networks, wireless cellular networks, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or any other suitable communication method that provides a medium for exchanging data. In some embodiments, accessing information may include adding, editing, deleting, re-arranging, or otherwise modifying information directly or indirectly from the network. A user may access the electronic collaborative document using a user device, which may include a computer, laptop, smartphone, tablet, VR headset, smart watch, or any other electronic display device capable of receiving and sending data. In some embodiments, accessing the electronic document may include retrieving the electronic document from a web browser cache. Additionally or alternatively, accessing the electronic document may include connecting with a live data stream of the electronic word processing document from a remote source. In some embodiments, accessing the electronic document may include logging into an account having a permission to access the document. For example, accessing the electronic document may be achieved by interacting with an indication associated with the electronic word processing document, such as an icon or file name, which may cause the system to retrieve (e.g., from a storage medium) a particular electronic document associated with the indication.
  • For example, an electronic collaborative document may be stored in repository 230-1 as shown in FIG. 2. Repository 230-1 may be configured to store software, files, or code, such as electronic collaborative documents developed using computing device 100 or user device 220-1. Repository 230-1 may further be accessed by computing device 100, user device 220-1, or other components of system 200 for downloading, receiving, processing, editing, or viewing the electronic collaborative document. Repository 230-1 may be any suitable combination of data storage devices, which may optionally include any type or combination of slave databases, load balancers, dummy servers, firewalls, back-up databases, and/or any other desired database components. In some embodiments, repository 230-1 may be employed as a cloud service, such as a Software as a Service (SaaS) system, a Platform as a Service (PaaS), or Infrastructure as a Service (IaaS) system. For example, repository 230-1 may be based on infrastructure of services of Amazon Web Services™ (AWS), Microsoft Azure™, Google Cloud Platform™, Cisco Metapod™, Joyent™, vmWare™, or other cloud computing providers. Repository 230-1 may include other commercial file sharing services, such as Dropbox™, Google Docs™, or iCloud™. In some embodiments, repository 230-1 may be a remote storage location, such as a network drive or server in communication with network 210. In other embodiments, repository 230-1 may also be a local storage device, such as local memory of one or more computing devices (e.g., computing device 100) in a distributed computing environment.
  • In some embodiments, a first editor and at least one second editor may be enabled to simultaneously edit and view each other's edits to the electronic collaborative document. A first editor may be a user interface that provides for the input, editing, formatting, display, and output of text, graphics, widgets, objects, tables, or other elements in an electronic word processing document or any other electronic collaborative document. A first editor may receive user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data. In one embodiment, a user accesses an electronic collaborative document using a computer and views the document in an editor that receives text and other input via a mouse and keyboard. Another instance of the electronic collaborative document may be presented via a second hardware device running a second editor, in a similar manner to the first hardware device and the first editor described herein. Any number of hardware devices may run an editor to access another instance of the electronic collaborative word processing document.
  • A first editor and at least one second editor may be enabled to simultaneously edit and view each other's edits to an electronic collaborative document. Enabling simultaneous editing and viewing of each other's edits to an electronic collaborative document may include providing the ability to access an electronic collaborative document to multiple users at the same time such that information in the electronic collaborative document may be presented to the multiple users and the multiple users may be authorized to alter the information presented to them. For example, edits made by a user in the first or the second instance of the electronic collaborative document may be incorporated into other instances of the electronic collaborative document in real time. In some embodiments, the first instance and the second instance of the electronic collaborative document may share a common viewport displaying some of the same data and information in both the first and second instances of the document. Edits made in the first or second instance may be demarcated by user identification indicators in the first and second instance. User identification indicators may include a graphic, a user ID indicator, a color, a font, or any other differentiator that indicates the source of an edit in an instance of the electronic collaborative document.
  • By way of example, FIG. 10 illustrates an electronic collaborative document (e.g., an electronic collaborative word processing document) presented within an editor 1000 operating in collaborative mode. In some embodiments, editor 1000 may be displayed by a computing device (e.g., the computing device 100 illustrated in FIG. 1), software running thereon, or any other projecting device (e.g., a projector, AR or VR lens, or any other display device, as previously discussed). Editor 1000 may include various tools for displaying information associated with the document or for editing the document. For example, editor 1000 may display a title 1002 indicating the title of the document. Formatting bar 1004 may depict various tools to adjust formatting of information or objects within the document. Help bar 1006 may be included, which may provide hyperlinks to information about various features of the editor 1000. Share button 1010 may be included to invite additional users to edit another instance of the collaborative electronic word processing document. Editor 1000 may include tool bar 1012 and interface bar 1014. Editor 1000 may indicate that multiple users are accessing an electronic collaborative document through the display of a user indicator, such as user display indicator 1008, which indicates that two users are running an instance of the electronic collaborative document. Editor 1000 may include current user indicator 1016. Current user indicator 1016 may indicate the identification of the user running the displayed instance of the collaborative document. In some embodiments, the objects and information displayed for editing may be controlled by the current user shown in 1016 in each instance of the electronic collaborative document. For example, FIG. 10 may depict an editing location that is actively edited by the current user, such as editing location 1024 indicated by cursor 1026. A second user, indicated by icon 1018, is actively editing paragraph block 1020 in another instance of the electronic collaborative document. While operating in collaborative mode, edits made by the first user in the first editor are immediately displayed in the editor viewed by the second user, and vice versa. For instance, any information or data added at the active work location 1024 will be visible to the second user, and any information added by the second user to paragraph block 1020 will be visible in editor 1000. Future edits to additional fields, such as title block 1022, will also be visible in both editors. The first user and the second user may correspond to users operating one or more user devices shown in FIG. 2. For example, the first user may operate user device 220-1 (of FIG. 2) to view editor 1000 (of FIG. 10). The second user may operate the second editor through user device 220-2 (of FIG. 2). Additional users may further access the electronic collaborative document using additional user devices.
  • Some aspects of the present disclosure may involve outputting first display signals for presenting an interface on a display of the first editor. A display signal may be an electronic instruction that transmits display information. A display signal may be any phenomenon capable of transmitting electronic display information and may include a time varying voltage, current, or electromagnetic wave or any other method of transmitting data through an electrical medium. Outputting a display signal may include transmitting a signal containing instructions to present an interface on a display of a first editor. A first display signal may represent a display signal that may be transmitted a certain period of time before subsequent display signals or before toggling a change in the dual mode. Presenting an interface on a display of a first editor may include displaying a visualization with activatable elements that a user may interact with and provide input on a user device, which may include a computer, laptop, smartphone, tablet, VR headset, smart watch, or any other electronic display device capable of receiving and sending data. An interface may display data and information associated with the editor and the collaborative electronic document. The interface may receive user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data. In one embodiment, presenting an interface on a display of a first editor may include a user accessing an electronic collaborative word processing document using a computer and viewing the document in an editor that receives text and other input via a mouse and keyboard.
  • In some embodiments, an interface may include a toggle enabling a first editor to switch between a collaborative mode and a private mode. As described above, collaborative mode may be a manner of displaying an electronic collaborative document where changes made by one or more users are public changes. A public change is any edit to an electronic collaborative document that is immediately shared with all users in the electronic collaborative document in real-time. A private mode may be a manner of displaying an electronic collaborative document where edits made by a user to an electronic collaborative document are not shared with all other users in real-time. As described herein, private mode may operate in various ways. For example, a user may designate edits to a portion of an electronic collaborative document to be visible to a subset of all users who have access to the collaborative document. In another example, some or all of a user's edits may not be visible to other users with access to an electronic collaborative document until the user toggles back to collaborative mode. More generally, private mode allows a user to make any edit to an electronic collaborative document while restricting the visibility of the user's edits to other users viewing other instances of an electronic collaborative document for a period of time.
  • In some embodiments, the interface may switch between a collaborative mode and a private mode via a toggle. A toggle may be any activatable graphical user interface element that enables a change from one state to another state. For example, a toggle may be a button or other icon in a user interface that can be selected by a user. In other embodiments, the toggle may be presented outside of the interface in an HTML hyperlink or file path. For example, the system may generate a unique hyperlink for an instance of an electronic collaborative document with a selection between collaborative mode and private mode pre-enabled. When a user selects the hyperlink, an interface on a display of an editor may be displayed in collaborative mode or private mode as indicated in the instructions in the hyperlink. A toggle enabling the first editor to switch between a collaborative mode and a private mode may include any activatable element on an interface that may send instructions to a processor to operate in a collaborative mode, to operate in a private mode, or to switch from an actively operating collaborative mode to private mode and vice versa.
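  • As a minimal, non-limiting TypeScript sketch of the toggle concept above, the code below flips between the two modes and also reads a pre-enabled mode from a hypothetical hyperlink query parameter. The parameter name "mode", the example URL, and the function names are assumptions introduced solely for illustration.

    type EditingMode = 'collaborative' | 'private';

    // Toggle the editing mode in response to an interface interaction.
    function toggleMode(current: EditingMode): EditingMode {
      return current === 'collaborative' ? 'private' : 'collaborative';
    }

    // Read a pre-enabled mode from an assumed hyperlink query parameter,
    // falling back to collaborative mode when none is present.
    function modeFromLink(url: string): EditingMode {
      const value = new URL(url).searchParams.get('mode'); // assumed parameter name
      return value === 'private' ? 'private' : 'collaborative';
    }

    // Example: modeFromLink('https://example.com/doc/42?mode=private') returns 'private'.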
  • Some aspects of the present disclosure may involve receiving from a first editor operating in the collaborative mode, first edits to the electronic collaborative document. An edit to an electronic collaborative document may include the addition, manipulation, or deletion of objects or data, and may include addition, manipulation, or deletion of text, graphics, tables, images, formatting, highlights, manipulation of fonts, icons, shapes, references, headers, footers, or any other addition, deletion, or manipulation of objects or any other data within the electronic collaborative document. Receiving the first edits may include the system receiving an edit request from a computing device associated with a user. The request may be transmitted over a network to a repository where the electronic collaborative document is stored. At least one processor may then perform a lookup of permission settings to confirm whether the computing device has authorization to make the edit. In a situation where authorization is confirmed, the system may then implement and store the edit with the electronic collaborative document such that any other computing devices accessing the document may retrieve the document with the implemented change.
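  • The following TypeScript sketch illustrates, under assumed interfaces (EditRequest, PermissionStore, DocumentStore), the general flow described above: receive an edit request, confirm authorization via a permission lookup, then implement and store the edit. It is an explanatory sketch only, not a description of any particular disclosed implementation.

    interface EditRequest { documentId: string; userId: string; blockId: string; newText: string; }
    interface PermissionStore { canEdit(documentId: string, userId: string): Promise<boolean>; }
    interface DocumentStore { applyEdit(request: EditRequest): Promise<void>; }

    // Receive an edit request, confirm authorization, then implement and store the
    // edit so that other devices accessing the document retrieve the change.
    async function receiveEdit(
      request: EditRequest,
      permissions: PermissionStore,
      repository: DocumentStore,
    ): Promise<boolean> {
      const authorized = await permissions.canEdit(request.documentId, request.userId);
      if (!authorized) return false;       // edit rejected
      await repository.applyEdit(request); // edit implemented and stored
      return true;
    }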
  • Some embodiments may involve outputting second display signals to a first editor and at least one second editor, the second display signals reflecting first edits made by the first editor. A second display signal may be a display signal that is made at a later time than a first display signal, which may be output and transmitted to cause a rendering of information as discussed previously above. The second display signal may reflect first edits made by the first editor. As described herein, while in collaborative mode, an edit made by a user may be immediately shared with all other users operating instances of the electronic collaborative document in additional editors in real-time. Once a user makes first edits in an interface of a first editor, second display signals may be transmitted to the first and second editor reflecting the changes, resulting in the edits being visible in both the first and second editors to each user. Second display signals are not limited to transmission to a first and second editor, but may also include transmission to any number of editors accessing the electronic collaborative document.
  • Some aspects of the present disclosure may involve receiving from a first editor interacting with an interface, a private mode change signal reflecting a request to change from a collaborative mode to a private mode. A private mode change signal may be any electronic communications instruction from an editor indicating an intent to enable private mode operation from a collaborative mode operation. A private mode change signal may be indicated by user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data, which may then be received by at least one processor to carry out the associated instructions. In some embodiments, the private mode change signal may be generated by a user selecting a toggle in a graphical user interface.
  • Some embodiments may include, in response to a first mode change signal, initiating in connection with an electronic collaborative document a private mode for the first editor. Initiating the private mode for the first editor in connection with an electronic collaborative document may include causing some or all of the edits made via the first editor to be withheld from display in other instances of the collaborative electronic document in other editors. Private mode may be initiated for all or part of an electronic collaborative document. For example, initiating private mode may cause all changes made in the first editor to be visible in the first editor only and not be visible in other instances of the collaborative electronic document in the second editor or any other editor. In some embodiments, private mode may be initiated in a portion of the collaborative electronic document. As described herein, collaborative electronic documents may be organized into one or more blocks of information. Private mode may be enabled for one or more blocks as designated by the user through the editor. In this scenario, changes made via the first editor to blocks that have private mode initiated will not display in other instances of the electronic collaborative word processing document, while changes made to blocks where private mode is not initiated will continue to display edits made by the first editor in real time.
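  • As a short illustrative sketch of block-level private mode, assuming a hypothetical per-block privacy marker (privateTo), the TypeScript below marks selected blocks as private to a particular editor and tests whether a block's edits should be visible to a given viewer.

    interface BlockPrivacy { blockId: string; privateTo?: string; } // privateTo: editor/user identifier

    // Initiate private mode for selected blocks on behalf of the first editor.
    function initiatePrivateMode(blocks: BlockPrivacy[], blockIds: string[], editorId: string): void {
      for (const block of blocks) {
        if (blockIds.includes(block.blockId)) block.privateTo = editorId;
      }
    }

    // A block's edits are withheld from any editor other than the one it is private to.
    function isVisibleTo(block: BlockPrivacy, viewerId: string): boolean {
      return block.privateTo === undefined || block.privateTo === viewerId;
    }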
  • By way of example, FIG. 11 depicts an exemplary editor 1100 for an electronic collaborative document with an option for enabling dual mode editing to enable private changes displayed. Editor 1100 may include private mode change toggle 1106, which may cause a private mode change signal to be transmitted as described herein. Once a user of editor 1100 activates the private mode change toggle 1106 via a user input, the private mode for the first editor may be initiated.
  • Some aspects of the present disclosure may involve, in a private mode, receiving from a first editor, second edits to an electronic collaborative document. Second edits may include the addition, manipulation, or deletion of objects or data, and may include addition, manipulation, or deletion of text, graphics, tables, images, formatting, highlights, manipulation of fonts, icons, shapes, references, headers, footers, or any other addition, deletion, or manipulation of objects or data within the electronic collaborative document as previously discussed. As used herein, second edits may refer to edits made via the first editor in an instance of the collaborative electronic word processing document while private mode is enabled. Second edits may occur either earlier in time, later in time, or simultaneously with first edits and are not limited to edits occurring later in time than first edits in the document. For example, a user may toggle between collaborative mode and private mode multiple times. In this example, all edits made while operating in private mode may be considered second edits, even if the edits were made before or after first edits made while the editor is in collaborative mode.
  • Some aspects of the present disclosure may involve, in response to second edits, outputting third display signals to a first editor while withholding third display signals from at least one second editor. A third display signal may be a display signal that contains data for second edits that may be transmitted to cause a presentation of the second edits, consistent with the earlier discussion. Withholding a display signal may include not transmitting the display signal so that an editor does not receive information associated with the display signal. The processor may transmit the third display signal with second edits made by the first editor to a display (e.g., the first editor may be re-rendered to include the second edits in a presentation) while the processor may withhold or not transmit the third display signal to the second editor (e.g., resulting in the second editor not re-rendering with the second edits). The third display signal may be differentiated from the first and second display signals in that the third display signal contains second edits made by an editor while private mode is enabled. Outputting third display signals to the first editor while withholding the third display signals from the at least one second editor may enable second edits to appear on a display of the first editor and prevent second edits from appearing on at least one display of the at least one second editor. Third display signals that are unique from first or second display signals may be transmitted containing instructions to display the second edits. The third display signals may be selectively transmitted to some but not all editors. For example, a user operating a first editor may add text data to a document after enabling private mode. Upon receipt of third display signals, the user's text will display in the editor operated by the user (e.g., second edits may appear on a display of the first editor). When private mode editing is enabled, the third display signals may be withheld from the second editor, which means the second edits may not display in the second editor (e.g., second edits are prevented from appearing on at least one display of at least one second editor). By enabling private mode editing in part or all of a document, the user operating the first editor designates which editors receive third display signals containing second edits and which editors do not receive third display signals and continue to receive second display signals instead.
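  • By way of a non-limiting illustration only, selective routing of display signals could be sketched in TypeScript as follows: signals carrying private-mode edits are delivered only to their author and withheld from every other connected editor. The shape of DisplaySignal and the send callback are assumptions made for explanation.

    interface DisplaySignal { edits: string[]; madeInPrivateMode: boolean; authorId: string; }

    // Output a display signal to each connected editor, withholding signals that
    // carry private-mode edits from every editor other than their author.
    function routeDisplaySignal(
      signal: DisplaySignal,
      connectedEditorIds: string[],
      send: (editorId: string, signal: DisplaySignal) => void,
    ): void {
      for (const editorId of connectedEditorIds) {
        if (signal.madeInPrivateMode && editorId !== signal.authorId) continue; // withheld
        send(editorId, signal);
      }
    }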
  • By way of example and returning to FIG. 11, editor 1100 may include first edits made in collaborative mode that are visible to all users accessing the electronic collaborative document. For example, text block 1101 includes first edits made by editor 1100 that are displayed in editor 1100 and in other editors accessing the electronic collaborative document. Text block 1102 displays edits made in editor 1100 in private mode. Text block 1102 is displayed in editor 1100 but not in any other editors viewing the same electronic collaborative document. Additional edits made at cursor 1104 while private mode is enabled may not be displayed in other editors viewing the same electronic collaborative document until collaborative mode is enabled.
  • Some aspects of the present disclosure may involve receiving from a first editor interacting with an interface, a collaborative mode change signal reflecting a request to change from a private mode to a collaborative mode. A collaborative mode change signal may be any electronic communications instruction from the editor indicating an intent to enable collaborative mode operations. A collaborative mode change signal may be indicated by user input via a keyboard, mouse, microphone, digital camera, scanner, voice sensing, webcam, biometric device, stylus, haptic devices, or any other input device capable of transmitting input data. In some embodiments, the collaborative mode change signal may be generated by a user selecting a toggle in a graphical user interface. In response to receipt of the collaborative mode change signal, subsequent edits made by the first editor may be enabled to be viewed by the at least one second editor. A subsequent edit may include an edit made by the first editor after receipt of the collaborative mode change signal. When edits are made by a first editor in collaborative mode, these edits may be immediately shared in real time with all other users and rendered on associated displays of the users accessing the collaborative electronic document. In some embodiments, the collaborative mode change signal may be toggled for the entire document. In this embodiment, all subsequent edits made to the document in collaborative mode may be displayed in other editors viewing other instances of the electronic document. In other embodiments, the collaborative mode change signal may be applied to one or more portions of a document. In this embodiment, only subsequent edits to certain portions of the electronic collaborative document may be displayed to all other editors in real time, while other portions of the electronic collaborative document remain in private mode. In some embodiments, a collaborative mode change signal may be toggled with respect to one or more blocks and may operate at the block level.
  • Some aspects of the present disclosure may involve segregating second edits made in private mode, such that upon return to a collaborative mode, viewing of the second edits is withheld from at least one second editor. Segregating second edits made in private mode may involve a method of saving and storing data that independently tracks and stores data associated with second edits in a manner that does not transmit the stored data until additional instructions are received to release the segregated second edits to particular editors. Data indicating that the edits were made in private mode may be stored as a property of the document, and in some embodiments, may be stored as a property of each individual block in the document. For example, a first editor may be enabled in private mode and may make second edits in private mode to one or more blocks of an electronic document. These edits may be initially withheld from display to other instances of the electronic document. Continuing with the example, the editor may close the document, reopen it at a later time, and toggle collaborative mode. The second edits made to the one or more blocks may be displayed in the first editor but may not be displayed in the second editor or other editors because the second edits were segregated when they were made in private mode. More generally, segregating edits made in private mode may refer to any method of data manipulation and storage that tracks the state of the dual mode of the editor at the time the second edits are made.
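  • The TypeScript below is a minimal sketch, under assumed field names (publicText, privateDraft), of storing segregated second edits separately from the public contents of a block so that they persist across closing and reopening the document without being released; it is illustrative only.

    interface StoredBlock {
      blockId: string;
      publicText: string;                                 // contents visible to all editors
      privateDraft?: { authorId: string; text: string };  // segregated second edits
    }

    // Store second edits separately from the public contents of the block so that
    // they survive closing and reopening the document without being released.
    function recordPrivateEdit(block: StoredBlock, authorId: string, text: string): void {
      block.privateDraft = { authorId, text };
    }

    // Rendering for a given editor: the author sees the draft; others see the public text.
    function renderFor(block: StoredBlock, viewerId: string): string {
      return block.privateDraft && block.privateDraft.authorId === viewerId
        ? block.privateDraft.text
        : block.publicText;
    }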
  • Some aspects of the present disclosure may involve receiving from a first editor a release signal, and in response thereto, enabling at least one second editor to view the second edits. Receiving a release signal from an editor may include any electronic communications instruction from the editor that transmits a user desire to publish second edits to the second editor. Enabling an editor to view edits may include transmitting a display signal to a particular computing device associated with an editor to cause information associated with particular edits to be rendered on a screen associated with the editor. As previously discussed, an editor may utilize both a collaborative mode and a private mode when editing an electronic document. Edits made in the electronic document while operating in collaborative mode may be shared and displayed in real time to all other users. Edits made in private mode may not be shared with all other users in the electronic collaborative document. In some embodiments, switching between collaborative mode and private mode may not, by itself, publish the edits that were made to the electronic collaborative document in private mode. Instead, a release signal may operate to publish edits made in private mode to the other users in the electronic collaborative document. An editor may transmit a release signal in response to various inputs. For example, the editor may include a button, toggle, switch, or other GUI element that releases all second edits made to an electronic collaborative document. In another embodiment, release signals may be transmitted that correspond to a portion of the electronic document. For example, a release signal may be transmitted that applies to one or more blocks in the electronic document. A user may indicate a desire to transmit a release signal by selecting a block and selecting a release icon. As an illustrative example, the editor may allow a user to right-click on a block and select an option to release second edits in the block. In other embodiments, release signals may trigger automatically in accordance with various user settings. For example, user settings may cause release signals to be transmitted based on predetermined intervals of time, based on certain users with superior administrative privileges viewing the document in another editor, or based on a predetermined action performed by the user, such as closing the editor. In some embodiments, enabling the at least one second editor to view the second edits may include displaying to the at least one second editor, in association with the second edits, an identity of the first editor. An identity of the first editor may be associated with the user operating the editor and may include any indicator (e.g., alphanumeric, graphical, or a combination thereof). For example, a user operating a first editor may have a user account with personal identifying information, such as a name, username, photo, employee ID, or any other personal information. Displaying to the at least one second editor an identity of the first editor may include causing a presentation of an indication of the user account associated with the first editor. In some embodiments, the identity of the first editor may be displayed with an icon that is visible in the second editor. The icon may contain personally identifying information such as a name, initials, a photo, or other data. In some embodiments, the identity of the first editor may be displayed to the second editor in association with the second edits.
Displaying an identity in association with second edits may include rendering a visual indicator of the identity of the first editor in or near the second edits in a co-presentation, or in response to an interaction (e.g., a cursor hover over the second edits). For example, the visual indicator may include an icon, a font, a highlight, a graphic, a color of text, or any other data property indicating the identity of the first editor, placed adjacent to or within the edits in the display. In another example, the identity of the first editor may be displayed in response to an input in the second editor. For instance, the user operating the second editor may receive display information indicating second edits displayed in a distinct color. Upon selecting the edits or placing a cursor near the edits, a popup may be displayed that indicates the identity of the first editor using a visual indicator as described herein.
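  • The following sketch, using hypothetical structures (PrivateEdit, releaseEdits, renderForSecondEditor), illustrates one way a release signal might publish segregated edits and attach the first editor's identity to them for display in the second editor.

```typescript
// Minimal sketch with hypothetical names: a release signal flips the
// segregated private edits to released, and the rendering side attaches
// the first editor's identity (e.g., a name or initials) to those edits.
interface PrivateEdit {
  editorId: string;
  editorName: string;
  blockId: string;
  newText: string;
  released: boolean;
}

// Release everything, or only the edits belonging to particular blocks.
function releaseEdits(edits: PrivateEdit[], blockIds?: string[]): void {
  for (const e of edits) {
    if (!blockIds || blockIds.includes(e.blockId)) e.released = true;
  }
}

// What a second editor would render: released edits annotated with the
// identity of the editor who made them in private mode.
function renderForSecondEditor(edits: PrivateEdit[]): string[] {
  return edits
    .filter((e) => e.released)
    .map((e) => `[${e.editorName}] block ${e.blockId}: ${e.newText}`);
}
```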
  • In some embodiments, in response to receiving a release signal, at least one processor may compare second edits made in private mode to original text in an electronic collaborative document, identify differences based on the comparison, and present the differences in connection with text of the electronic collaborative document to thereby indicate changes originally made during private mode. Original text may include any or all text or data in an electronic collaborative document that the document contained prior to the second edits made by the first editor. The processor may identify the second edits made in private mode by segregating the data associated with second edits as described herein. Comparing second edits to original text in an electronic collaborative document may include a calculation of differences and/or similarities between data contained in the second edits and the original text in an electronic document. Identifying differences between the original text and data in the electronic collaborative document and the second edits may include analyzing, after the comparison, the differences in objects, text, and other data between the original version and the second edits, and associating a tag with the differing data in the repository so that the processor may later locate the data that is different. Differences may include the addition, deletion, or modification of text, objects, tables, pictures, fonts, colors, object properties, graphics, visual or audio data, or any other manipulation of data in the electronic document made in private mode. Presenting the differences in connection with text of the electronic collaborative document to thereby indicate changes originally made during private mode may include rendering an indication that certain text, objects, or data have been changed as compared to the original version. In one embodiment, changes to text may be presented by displaying additional or deleted text in a particular color, font, or format. For example, additional text may be displayed in red with underlines and deleted text may be indicated by a strikethrough. In other embodiments, changes to the document may be indicated by highlighting, font changes, embedded objects or pop-up indicators, or any other method capable of visually distinguishing types of data in an electronic collaborative document. In some embodiments, the color associated with the changes to text or other objects may correspond to the identity of the user who made the second edits.
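  • As one illustrative possibility only (the disclosure does not prescribe a particular comparison algorithm), the sketch below uses a simple word-level longest-common-subsequence diff to identify additions and deletions made in private mode, with textual markers standing in for the visual cues described above (e.g., underlined additions, strikethrough deletions). The function name diffWords is hypothetical.

```typescript
// Minimal, hypothetical sketch: word-level LCS diff between the original
// text and the privately edited text. "+word+" marks an addition made in
// private mode; "~word~" marks a deletion (e.g., rendered as strikethrough).
function diffWords(original: string, edited: string): string {
  const a = original.split(/\s+/).filter(Boolean);
  const b = edited.split(/\s+/).filter(Boolean);

  // Longest-common-subsequence length table.
  const lcs: number[][] = Array.from({ length: a.length + 1 }, () =>
    new Array<number>(b.length + 1).fill(0)
  );
  for (let i = a.length - 1; i >= 0; i--) {
    for (let j = b.length - 1; j >= 0; j--) {
      lcs[i][j] =
        a[i] === b[j]
          ? lcs[i + 1][j + 1] + 1
          : Math.max(lcs[i + 1][j], lcs[i][j + 1]);
    }
  }

  // Walk the table to emit unchanged, deleted, and added words.
  const out: string[] = [];
  let i = 0;
  let j = 0;
  while (i < a.length && j < b.length) {
    if (a[i] === b[j]) {
      out.push(a[i]); i++; j++;
    } else if (lcs[i + 1][j] >= lcs[i][j + 1]) {
      out.push(`~${a[i]}~`); i++; // word removed from the original
    } else {
      out.push(`+${b[j]}+`); j++; // word added in private mode
    }
  }
  while (i < a.length) out.push(`~${a[i++]}~`);
  while (j < b.length) out.push(`+${b[j++]}+`);
  return out.join(" ");
}

// Example:
// diffWords("quarterly budget draft", "final quarterly budget")
// -> "+final+ quarterly budget ~draft~"
```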
  • Some aspects of the present disclosure may include receiving from a first editor, in association with a text block, a retroactive privatization signal, and upon receipt of the retroactive privatization signal, withholding the text block from display to at least one second editor. A retroactive privatization signal may be a data signal that indicates a portion of text that should be withheld from display to a second editor or any additional editors. A retroactive privatization signal may function to transfer a portion or all of a document to private mode, thereby allowing the first editor to view and manipulate objects, text, or data in the portion of the document in private mode. Receiving a retroactive privatization signal associated with a text block may involve obtaining instructions to retroactively mark a particular region of text as private. For example, a user running a first editor may wish to hide certain portions of a document containing confidential financial information from view of one or all other users. The user may select the block or blocks of text data containing the confidential information and transmit a privatization signal, which causes the display signals transmitted to the other users to omit the blocks containing the confidential financial information. Any block or blocks of data may be designated as associated with a retroactive privatization signal, which may transfer the objects, text, and data inside the block or blocks to private mode (e.g., causing a re-rendering of the displays of the users to omit the data designated to be retroactively private). Withholding the text block from display to a second editor may include causing a re-rendering of a display of the second editor to delete, omit, obscure, or reduce access to information marked as private. A retroactive privatization signal may be disabled by an editor sending a release signal.
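  • A minimal sketch follows, assuming hypothetical names (DocBlock, retroactivelyPrivatize, renderFor), of how a retroactive privatization signal might mark a block private after the fact and cause the second editor's re-rendered view to omit it.

```typescript
// Minimal, hypothetical sketch: a retroactive privatization signal marks a
// block private after the fact; the render path for other editors simply
// omits any block so marked.
interface DocBlock {
  id: string;
  text: string;
  privateTo?: string; // editor id that retroactively privatized the block
}

function retroactivelyPrivatize(
  blocks: DocBlock[],
  blockId: string,
  ownerId: string
): void {
  const block = blocks.find((b) => b.id === blockId);
  if (block) block.privateTo = ownerId;
}

// Re-render for a second editor: privatized blocks are withheld.
function renderFor(blocks: DocBlock[], viewerId: string): string[] {
  return blocks
    .filter((b) => !b.privateTo || b.privateTo === viewerId)
    .map((b) => b.text);
}
```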
  • Some aspects of the present disclosure may include receiving from a first editor operating in private mode an exemption signal for at least one particular editor, to thereby enable the at least one particular editor to view the second edits. Receiving an exemption signal may include obtaining an electronic transmittal of data or instructions from a computing device associated with a user interacting with an editor to enable a particular editor to receive display signals causing a display to show the changes made in private mode by the first editor. For example, a user operating the first editor may wish to make edits in private mode and may wish to share the edits with a particular user without publishing the edits to all other users in the electronic collaborative document by sending a release signal. By sending an exemption signal, the first editor may designate one or more other editors to receive third display signals containing the second edits made in the first editor. Receiving an exemption signal for at least one particular editor to thereby enable the at least one particular editor to view the second edits may include receiving instructions that allow a user to share edits with some users while hiding the edits from other users. For example, a large team of several dozen users may collaborate on a single electronic collaborative document. In this example, there may be a desire to include a section of the document that contains confidential information, such as salary information. A user may enable private mode editing to begin privately adding confidential data to the document that may be hidden from all other users. The user's editor may then transmit an exemption signal to another user's editor, enabling that user to view the confidential information while it remains hidden from the other users working in the document. In some embodiments, an exemption signal may be applied to one or more blocks in an electronic document, thereby enabling particular editors to view the second edits associated with that block.
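  • The sketch below, again using hypothetical names (PrivateBlock, applyExemptionSignal, canView), shows one way an exemption signal could add a particular editor to a per-block exemption list so that only the owner and the exempted editors can view the private edits.

```typescript
// Minimal, hypothetical sketch: each private block carries an exemption
// list; an exemption signal adds a particular editor to that list so that
// editor, and only that editor, also receives the private-mode edits.
interface PrivateBlock {
  id: string;
  text: string;
  ownerId: string;
  exemptEditorIds: Set<string>;
}

function applyExemptionSignal(block: PrivateBlock, editorId: string): void {
  block.exemptEditorIds.add(editorId);
}

function canView(block: PrivateBlock, viewerId: string): boolean {
  return viewerId === block.ownerId || block.exemptEditorIds.has(viewerId);
}
```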
  • FIG. 12 illustrates a block diagram of an example process 1200 for enabling dual mode editing in collaborative documents to enable private changes. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram. In some embodiments, the process 1200 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1) of a computing device (e.g., the computing device 100 in FIGS. 1 and 2) to perform operations or functions described herein and may be described hereinafter with reference to FIGS. 10 to 11 by way of example. In some embodiments, some aspects of the process 1200 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1) or a non-transitory computer-readable medium. In some embodiments, some aspects of the process 1200 may be implemented as hardware (e.g., a specific-purpose circuit). In some embodiments, the process 1200 may be implemented as a combination of software and hardware. A simplified, non-limiting code sketch of this flow is provided after the description of the process blocks below.
  • FIG. 12 includes process blocks 1202 to 1216. At block 1202, a processing means may access an electronic collaborative document in which a first editor and at least one second editor are enabled to simultaneously edit and view each other's edits to the electronic collaborative document, as discussed previously in the disclosure above.
  • At block 1204, the processing means may output first display signals for presenting an interface on a display of the first editor, the interface including a toggle enabling the first editor to switch between a collaborative mode and a private mode, as discussed previously in the disclosure above.
  • At block 1206, the processing means may receive from the first editor operating in the collaborative mode, first edits to the electronic collaborative document, as discussed previously in the disclosure above.
  • At block 1208, the processing means may output second display signals to the first editor and the at least one second editor, the second display signals reflecting the first edits made by the first editor, as discussed previously in the disclosure above.
  • At block 1210, the processing means may receive from the first editor interacting with the interface, a private mode change signal reflecting a request to change from the collaborative mode to the private mode, as discussed previously in the disclosure above.
  • At block 1212, the processing means may, in response to the first mode change signal, initiate in connection with the electronic collaborative document the private mode for the first editor, as discussed previously in the disclosure above.
  • At block 1214, the processing means may, in the private mode, receive from the first editor, second edits to the electronic collaborative document, as discussed previously in the disclosure above.
  • At block 1216, the processing means may in response to the second edits, output third display signals to the first editor while withholding the third display signals from the at least one second editor, such that the second edits are enabled to appear on a display of the first editor and are prevented from appearing on at least one display of the at least one second editor, as discussed previously in the disclosure above.
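  • Purely for illustration, and not as a description of the claimed implementation, the following sketch strings blocks 1202 through 1216 together: collaborative edits are broadcast to every editor, a mode change signal switches the first editor to private mode, and subsequent (second) edits produce display signals routed only to the first editor. All class, method, and variable names here are hypothetical.

```typescript
// Minimal end-to-end sketch of the flow in blocks 1202-1216, with display
// signals simplified to strings routed to editor ids.
type Mode = "collaborative" | "private";

interface Editor {
  id: string;
  mode: Mode;
}

class DualModeDocument {
  private content: string[] = [];
  private signals: Record<string, string[]> = {};

  constructor(private editors: Editor[]) {
    editors.forEach((e) => (this.signals[e.id] = []));
  }

  // Blocks 1206/1208: collaborative edits go to every editor.
  // Blocks 1214/1216: private edits go only to their author.
  edit(author: Editor, text: string): void {
    this.content.push(text);
    const recipients =
      author.mode === "collaborative" ? this.editors : [author];
    recipients.forEach((e) => this.signals[e.id].push(`display: ${text}`));
  }

  // Blocks 1210/1212: a mode change signal toggles the author's mode.
  changeMode(editor: Editor, mode: Mode): void {
    editor.mode = mode;
  }

  signalsFor(editorId: string): string[] {
    return this.signals[editorId];
  }
}

// Usage: the first editor edits collaboratively, switches to private mode,
// then edits again; the second editor never receives the private edit.
const first: Editor = { id: "first", mode: "collaborative" };
const second: Editor = { id: "second", mode: "collaborative" };
const doc = new DualModeDocument([first, second]);
doc.edit(first, "shared paragraph");
doc.changeMode(first, "private");
doc.edit(first, "confidential paragraph");
// doc.signalsFor("second") contains only the shared paragraph.
```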
  • Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.
  • Implementation of the method and system of the present disclosure may involve performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present disclosure, several selected steps may be implemented by hardware (HW) or by software (SW) on any operating system of any firmware, or by a combination thereof. For example, as hardware, selected steps of the disclosure could be implemented as a chip or a circuit. As software or algorithm, selected steps of the disclosure could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the disclosure could be described as being performed by a data processor, such as a computing device for executing a plurality of instructions.
  • As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
  • Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • Although the present disclosure is described with regard to a “computing device”, a “computer”, or “mobile device”, it should be noted that optionally any device featuring a data processor and the ability to execute one or more instructions may be described as a computing device, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, a smart watch or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a “network” or a “computer network”.
  • To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., an LED (light-emitting diode), OLED (organic LED), or LCD (liquid crystal display) monitor or screen) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
  • The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • It should be appreciated that the above described methods and apparatus may be varied in many ways, including omitting or adding steps, changing the order of steps and the type of devices used. It should be appreciated that different features may be combined in different ways. In particular, not all the features shown above in a particular embodiment or implementation are necessary in every embodiment or implementation of the invention. Further combinations of the above features and implementations are also considered to be within the scope of some embodiments or implementations of the invention.
  • While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components and/or features of the different implementations described.
  • Disclosed embodiments may include any one of the following bullet-pointed features alone or in combination with one or more other bullet-pointed features, whether implemented as a method, by at least one processor, and/or stored as executable instructions on non-transitory computer-readable media (a simplified, non-limiting code sketch of the scroll-locking features appears after this list):
      • accessing the electronic collaborative word processing document;
      • presenting a first instance of the electronic collaborative word processing document via a first hardware device running a first editor;
      • presenting a second instance of the electronic collaborative word processing document via a second hardware device running a second editor;
      • receiving from the first editor during a common editing period, first edits to the electronic collaborative word processing document;
      • wherein the first edits occur on a first earlier page of the electronic collaborative word processing document and result in a pagination change;
      • receiving from the second editor during the common editing period, second edits to the electronic collaborative word processing document;
      • wherein the second edits occur on a second page of the electronic collaborative word processing document later than the first page;
      • during the common editing period, locking a display associated with the second hardware device to suppress the pagination change caused by the first edits received by the second hardware device;
      • upon receipt of a scroll-up command via the second editor during the common editing period, causing the display associated with the second hardware device to reflect the pagination change caused by the first edits;
      • recognizing an active work location of the second editor and locking display scrolling associated with the second display based on the recognized active work location so as not to interrupt viewing of the active work location;
      • wherein the recognition of the active work location is based on a cursor location in the second instance of the collaborative electronic word processing document;
      • wherein the recognition of the active work location is based on a scrolling location in the second instance of the collaborative electronic word processing document;
      • wherein the lock remains in place until the active work location is changed in the second editor;
      • wherein the scroll-up command that causes the second hardware device to reflect the pagination change includes a scroll to a page other than a page currently displayed on the second display;
      • wherein the second edits are associated with a block in the electronic collaborative word processing document;
      • wherein the recognition of the active work location is based on a block location in the second instance of the collaborative electronic word processing document;
      • accessing an electronic collaborative document in which a first editor and at least one second editor are enabled to simultaneously edit and view each other's edits to the electronic collaborative document;
      • outputting first display signals for presenting an interface on a display of the first editor, the interface including a toggle enabling the first editor to switch between a collaborative mode and a private mode;
      • receiving from the first editor operating in the collaborative mode, first edits to the electronic collaborative document;
      • outputting second display signals to the first editor and the at least one second editor, the second display signals reflecting the first edits made by the first editor;
      • receiving from the first editor interacting with the interface, a private mode change signal reflecting a request to change from the collaborative mode to the private mode;
      • in response to the first mode change signal, initiating in connection with the electronic collaborative document the private mode for the first editor;
      • in the private mode, receiving from the first editor, second edits to the electronic collaborative document;
      • in response to the second edits, outputting third display signals to the first editor while withholding the third display signals from the at least one second editor, such that the second edits are enabled to appear on a display of the first editor and are prevented from appearing on at least one display of the at least one second editor;
      • receiving from the first editor interacting with the interface, a collaborative mode change signal reflecting a request to change from the private mode to the collaborative mode;
      • in response to receipt of the collaborative mode change signal, enabling subsequent edits made by the first editor to be viewed by the at least one second editor;
      • segregating the second edits made in private mode, such that upon return to the collaborative mode, viewing of the second edits are withheld from the at least one second editor;
      • receiving from the first editor a release signal, and in response thereto, enabling the at least one second editor to view the second edits;
      • wherein enabling the at least one second editor to view the second edits includes displaying to the at least one second editor, in association with the second edits, an identity of the first editor;
      • in response to receiving the release signal, comparing the second edits made in private mode to original text in the electronic collaborative document;
      • identifying differences based on the comparison;
      • presenting the differences in connection with text of the electronic collaborative document to thereby indicate changes originally made during private mode;
      • receiving from the first editor, in association with a text block, a retroactive privatization signal;
      • upon receipt of the retroactive privatization signal, withholding the text block from display to the at least one second editor; and
      • receiving from the first editor operating in private mode an exemption signal for at least one particular editor, to thereby enable the at least one particular editor to view the second edits.
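  • For the scroll-locking features listed above, the following sketch (with hypothetical names such as ViewState, onRemotePaginationChange, and onScrollUp) illustrates one way a pagination change caused by first edits on an earlier page could be suppressed on the second display until a scroll-up command is received.

```typescript
// Minimal, hypothetical sketch: the second editor's view pins its active
// work location and defers pagination changes caused by earlier-page edits
// until the second editor scrolls up.
interface ViewState {
  activeBlockId: string;      // recognized active work location
  pendingPagination: boolean; // suppressed change caused by first edits
  scrollOffset: number;       // current scroll position (in pages)
}

function onRemotePaginationChange(view: ViewState): void {
  // Lock: do not move the second display; just remember that a
  // repagination is pending.
  view.pendingPagination = true;
}

function onScrollUp(view: ViewState, repaginate: () => number): void {
  // The scroll-up command releases the lock and applies the new layout.
  if (view.pendingPagination) {
    view.scrollOffset = repaginate();
    view.pendingPagination = false;
  } else {
    view.scrollOffset = Math.max(0, view.scrollOffset - 1);
  }
}
```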
  • Systems and methods disclosed herein involve unconventional improvements over conventional approaches. Descriptions of the disclosed embodiments are not exhaustive and are not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. Additionally, the disclosed embodiments are not limited to the examples discussed herein.
  • The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. For example, the described implementations include hardware and software, but systems and methods consistent with the present disclosure may be implemented as hardware alone.
  • It is appreciated that the above described embodiments can be implemented by hardware, or software (program codes), or a combination of hardware and software. If implemented by software, it can be stored in the above-described computer-readable media. The software, when executed by the processor can perform the disclosed methods. The computing units and other functional units described in the present disclosure can be implemented by hardware, or software, or a combination of hardware and software. One of ordinary skill in the art will also understand that multiple ones of the above described modules/units can be combined as one module or unit, and each of the above described modules/units can be further divided into a plurality of sub-modules or sub-units.
  • The block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer hardware or software products according to various example embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical functions. It should be understood that in some alternative implementations, functions indicated in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may be executed or implemented substantially concurrently, or two blocks may sometimes be executed in reverse order, depending upon the functionality involved. Some blocks may also be omitted. It should also be understood that each block of the block diagrams, and combination of the blocks, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.
  • In the foregoing specification, embodiments have been described with reference to numerous specific details that can vary from implementation to implementation. Certain adaptations and modifications of the described embodiments can be made. Other embodiments can be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims. It is also intended that the sequences of steps shown in the figures are only for illustrative purposes and are not intended to be limited to any particular sequence of steps. As such, those skilled in the art can appreciate that these steps can be performed in a different order while implementing the same method.
  • It will be appreciated that the embodiments of the present disclosure are not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from the scope thereof.
  • Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosed embodiments being indicated by the following claims.
  • Computer programs based on the written description and methods of this specification are within the skill of a software developer. The various programs or program modules can be created using a variety of programming techniques. One or more of such software sections or modules can be integrated into a computer system, non-transitory computer readable media, or existing software.
  • Moreover, while illustrative embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations or alterations based on the present disclosure. The elements in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. These examples are to be construed as non-exclusive. Further, the steps of the disclosed methods can be modified in any manner, including by reordering steps or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.

Claims (20)

What is claimed is:
1. A system for managing display interference in an electronic collaborative word processing document, the system comprising:
at least one processor configured to:
access the electronic collaborative word processing document;
present a first instance of the electronic collaborative word processing document via a first hardware device running a first editor;
present a second instance of the electronic collaborative word processing document via a second hardware device running a second editor;
receive from the first editor during a common editing period, first edits to the electronic collaborative word processing document, wherein the first edits occur on a first earlier page of the electronic collaborative word processing document and result in a pagination change;
receive from the second editor during the common editing period, second edits to the electronic collaborative word processing document, wherein the second edits occur on a second page of the electronic collaborative word processing document later than the first page;
during the common editing period, lock a display associated with the second hardware device to suppress the pagination change caused by the first edits received by the second hardware device; and
upon receipt of a scroll-up command via the second editor during the common editing period, cause the display associated with the second hardware device to reflect the pagination change caused by the first edits.
2. The system of claim 1, wherein the at least one processor is further configured to recognize an active work location of the second editor and to lock display scrolling associated with the second display based on the recognized active work location so as not to interrupt viewing of the active work location.
3. The system of claim 2, wherein the recognition of the active work location is based on a cursor location in the second instance of the collaborative electronic word processing document.
4. The system of claim 2, wherein the recognition of the active work location is based on a scrolling location in the second instance of the collaborative electronic word processing document.
5. The system of claim 2, wherein the lock remains in place until the active work location is changed in the second editor.
6. The system of claim 1, wherein the scroll-up command that causes the second hardware device to reflect the pagination change includes a scroll to a page other than a page currently displayed on the second display.
7. A non-transitory computer readable medium containing instructions configured to cause at least one processor to perform operations for managing display interference in an electronic collaborative word processing document, the operations comprising:
receiving via a communications interface during a common editing period, first edits from a first editor accessing a first instance of the electronic collaborative document via a first hardware device, wherein the first edits occur on a first earlier page of the electronic collaborative word processing document and result in a pagination change;
receiving during the common editing period, second edits from a second editor accessing a second instance of the electronic collaborative document via a second hardware device;
during the common editing period, locking a display associated with the second hardware device to suppress the pagination change caused by the first edits received via the communications interface; and
upon receipt of a scroll-up command via the second editor during the common editing period, causing the display associated with the second hardware device to reflect the pagination change caused by the first edits.
8. The non-transitory computer readable medium of claim 7, wherein the operations further include recognizing an active work location of the second editor and to lock display scrolling associated with the second display based on the recognized active work location so as not to interrupt viewing of the active work location.
9. The non-transitory computer readable medium of claim 8, wherein the recognition of the active work location is based on a cursor location in the second instance of the collaborative electronic word processing document.
10. The non-transitory computer readable medium of claim 8, wherein the recognition of the active work location is based on a scrolling location in the second instance of the collaborative electronic word processing document.
11. The non-transitory computer readable medium of claim 8, wherein the lock remains in place until the active work location is changed in the second editor.
12. The non-transitory computer readable medium of claim 7, wherein the scroll-up command that causes the second hardware device to reflect the pagination change includes a scroll to a page other than a page currently displayed on the second display.
13. A method for managing a display interference in an electronic collaborative word processing document, the method comprising:
accessing the electronic collaborative word processing document;
presenting a first instance of the electronic collaborative word processing document via a first hardware device running a first editor;
presenting a second instance of the electronic collaborative word processing document via a second hardware device running a second editor;
receiving from the first editor during a common editing period, first edits to the electronic collaborative word processing document, wherein the first edits occur on a first earlier page of the electronic collaborative word processing document and result in a pagination change;
receiving from the second editor during the common editing period, second edits to the electronic collaborative word processing document, wherein the second edits occur on a second page of the electronic collaborative word processing document later than the first page;
during the common editing period, locking a display associated with the second hardware device to suppress the pagination change caused by the first edits received by the second hardware device; and
upon receipt of a scroll-up command via the second editor during the common editing period, causing the display associated with the second hardware device to reflect the pagination change caused by the first edits.
14. The method of claim 13, further comprising recognizing an active work location of the second editor and locking display scrolling associated with the second display based on the recognized active work location so as not to interrupt viewing of the active work location.
15. The method of claim 14, wherein the recognition of the active work location is based on a cursor location in the second instance of the collaborative electronic word processing document.
16. The method of claim 14, wherein the recognition of the active work location is based on a scrolling location in the second instance of the collaborative electronic word processing document.
17. The method of claim 14, wherein the lock remains in place until the active work location is changed in the second editor.
18. The method of claim 13, wherein the scroll-up command that causes the second hardware device to reflect the pagination change includes a scroll to a page other than a page currently displayed on the second display.
19. The method of claim 13, wherein the second edits are associated with a block in the electronic collaborative word processing document.
20. The method of claim 14, wherein the recognition of the active work location is based on a block location in the second instance of the collaborative electronic word processing document.
US17/565,801 2021-01-14 2021-12-30 Digital processing systems and methods for display pane scroll locking during collaborative document editing in collaborative work systems Active US11397847B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/565,801 US11397847B1 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for display pane scroll locking during collaborative document editing in collaborative work systems

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
PCT/IB2021/000024 WO2021144656A1 (en) 2020-01-15 2021-01-14 Digital processing systems and methods for graphical dynamic table gauges in collaborative work systems
PCT/IB2021/000090 WO2021161104A1 (en) 2020-02-12 2021-02-11 Enhanced display features in collaborative network systems, methods, and devices
PCT/IB2021/000297 WO2021220058A1 (en) 2020-05-01 2021-04-28 Digital processing systems and methods for enhanced collaborative workflow and networking systems, methods, and devices
US202163233925P 2021-08-17 2021-08-17
US202163273448P 2021-10-29 2021-10-29
US202163273453P 2021-10-29 2021-10-29
PCT/IB2021/062440 WO2022153122A1 (en) 2021-01-14 2021-12-29 Systems, methods, and devices for enhanced collaborative work documents
US17/565,801 US11397847B1 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for display pane scroll locking during collaborative document editing in collaborative work systems

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2021/062440 Continuation WO2022153122A1 (en) 2021-01-14 2021-12-29 Systems, methods, and devices for enhanced collaborative work documents

Publications (2)

Publication Number Publication Date
US20220222426A1 true US20220222426A1 (en) 2022-07-14
US11397847B1 US11397847B1 (en) 2022-07-26

Family

ID=82322776

Family Applications (12)

Application Number Title Priority Date Filing Date
US17/565,821 Active US11392556B1 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for draft and time slider for presentations in collaborative work systems
US17/565,801 Active US11397847B1 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for display pane scroll locking during collaborative document editing in collaborative work systems
US17/565,780 Pending US20220221966A1 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for dual mode editing in collaborative documents enabling private changes in collaborative work systems
US17/565,699 Active US11481288B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for historical review of specific document edits in collaborative work systems
US17/565,614 Active US11782582B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for detectable codes in presentation enabling targeted feedback in collaborative work systems
US17/565,526 Active US11475215B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for dynamic work document updates using embedded in-line links in collaborative work systems
US17/565,652 Active US11531452B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for group-based document edit tracking in collaborative work systems
US17/565,718 Active US11928315B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for tagging extraction engine for generating new documents in collaborative work systems
US17/565,534 Active US11687216B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for dynamically updating documents with data from linked files in collaborative work systems
US17/565,843 Active US11893213B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for embedded live application in-line in a word processing document in collaborative work systems
US17/565,853 Active US11726640B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for granular permission system for electronic documents in collaborative work systems
US17/565,880 Active US11449668B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for embedding a functioning application in a word processing document in collaborative work systems

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/565,821 Active US11392556B1 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for draft and time slider for presentations in collaborative work systems

Family Applications After (10)

Application Number Title Priority Date Filing Date
US17/565,780 Pending US20220221966A1 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for dual mode editing in collaborative documents enabling private changes in collaborative work systems
US17/565,699 Active US11481288B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for historical review of specific document edits in collaborative work systems
US17/565,614 Active US11782582B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for detectable codes in presentation enabling targeted feedback in collaborative work systems
US17/565,526 Active US11475215B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for dynamic work document updates using embedded in-line links in collaborative work systems
US17/565,652 Active US11531452B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for group-based document edit tracking in collaborative work systems
US17/565,718 Active US11928315B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for tagging extraction engine for generating new documents in collaborative work systems
US17/565,534 Active US11687216B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for dynamically updating documents with data from linked files in collaborative work systems
US17/565,843 Active US11893213B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for embedded live application in-line in a word processing document in collaborative work systems
US17/565,853 Active US11726640B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for granular permission system for electronic documents in collaborative work systems
US17/565,880 Active US11449668B2 (en) 2021-01-14 2021-12-30 Digital processing systems and methods for embedding a functioning application in a word processing document in collaborative work systems

Country Status (1)

Country Link
US (12) US11392556B1 (en)




Family Cites Families (869)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4972314A (en) 1985-05-20 1990-11-20 Hughes Aircraft Company Data flow signal processor method and apparatus
GB2241629A (en) 1990-02-27 1991-09-04 Apple Computer Content-based depictions of computer icons
US5517663A (en) 1993-03-22 1996-05-14 Kahn; Kenneth M. Animated user interface for computer program creation, control and execution
US5632009A (en) 1993-09-17 1997-05-20 Xerox Corporation Method and system for producing a table image showing indirect data representations
US6034681A (en) 1993-12-17 2000-03-07 International Business Machines Corp. Dynamic data link interface in a graphic user interface
US5682469A (en) 1994-07-08 1997-10-28 Microsoft Corporation Software platform having a real world interface with animated characters
US5696702A (en) 1995-04-17 1997-12-09 Skinner; Gary R. Time and work tracker
US5726701A (en) 1995-04-20 1998-03-10 Intel Corporation Method and apparatus for stimulating the responses of a physically-distributed audience
US5845257A (en) 1996-02-29 1998-12-01 Starfish Software, Inc. System and methods for scheduling and tracking events across multiple time zones
US5787411A (en) 1996-03-20 1998-07-28 Microsoft Corporation Method and apparatus for database filter generation by display selection
US6275809B1 (en) 1996-05-15 2001-08-14 Hitachi, Ltd. Business processing system employing a notice board business system database and method of processing the same
JPH10124649A (en) 1996-10-21 1998-05-15 Toshiba Iyou Syst Eng Kk Mpr image preparing device
US6049622A (en) 1996-12-05 2000-04-11 Mayo Foundation For Medical Education And Research Graphic navigational guides for accurate image orientation and navigation
US6182127B1 (en) 1997-02-12 2001-01-30 Digital Paper, Llc Network image view server using efficent client-server tilting and caching architecture
US6111573A (en) 1997-02-14 2000-08-29 Velocity.Com, Inc. Device independent window and view system
US5933145A (en) 1997-04-17 1999-08-03 Microsoft Corporation Method and system for visually indicating a selection query
US6169534B1 (en) 1997-06-26 2001-01-02 Upshot.Com Graphical user interface for customer information management
US6988248B1 (en) * 1997-06-30 2006-01-17 Sun Microsystems, Inc. Animated indicators that reflect function activity or state of objects data or processes
JPH1125076A (en) 1997-06-30 1999-01-29 Fujitsu Ltd Document managing device and document management program storage medium
US6195794B1 (en) 1997-08-12 2001-02-27 International Business Machines Corporation Method and apparatus for distributing templates in a component system
US6016553A (en) 1997-09-05 2000-01-18 Wild File, Inc. Method, software and apparatus for saving, using and recovering data
US6088707A (en) 1997-10-06 2000-07-11 International Business Machines Corporation Computer system and method of displaying update status of linked hypertext documents
US6023695A (en) 1997-10-31 2000-02-08 Oracle Corporation Summary table management in a computer system
US6377965B1 (en) 1997-11-07 2002-04-23 Microsoft Corporation Automatic word completion system for partially entered data
US6527556B1 (en) 1997-11-12 2003-03-04 Intellishare, Llc Method and system for creating an integrated learning environment with a pattern-generator and course-outlining tool for content authoring, an interactive learning tool, and related administrative tools
US6157381A (en) 1997-11-18 2000-12-05 International Business Machines Corporation Computer system, user interface component and method utilizing non-linear scroll bar
US6509912B1 (en) 1998-01-12 2003-01-21 Xerox Corporation Domain objects for use in a freeform graphics system
US6460043B1 (en) 1998-02-04 2002-10-01 Microsoft Corporation Method and apparatus for operating on data with a conceptual data manipulation language
US6167405A (en) 1998-04-27 2000-12-26 Bull Hn Information Systems Inc. Method and apparatus for automatically populating a data warehouse system
US6185582B1 (en) 1998-06-17 2001-02-06 Xerox Corporation Spreadsheet view enhancement system
US6266067B1 (en) 1998-07-28 2001-07-24 International Business Machines Corporation System and method for dynamically displaying data relationships between static charts
AU5791899A (en) 1998-08-27 2000-03-21 Upshot Corporation A method and apparatus for network-based sales force management
US6606740B1 (en) 1998-10-05 2003-08-12 American Management Systems, Inc. Development framework for case and workflow systems
US6496832B2 (en) 1998-10-20 2002-12-17 University Of Minnesota Visualization spreadsheet
US6330022B1 (en) 1998-11-05 2001-12-11 Lucent Technologies Inc. Digital processing apparatus and method to support video conferencing in variable contexts
US7043529B1 (en) 1999-04-23 2006-05-09 The United States Of America As Represented By The Secretary Of The Navy Collaborative development network for widely dispersed users and methods therefor
US6108573A (en) 1998-11-25 2000-08-22 General Electric Co. Real-time MR section cross-reference on replaceable MR localizer images
US6252594B1 (en) 1998-12-11 2001-06-26 International Business Machines Corporation Method and system for aiding a user in scrolling through a document using animation, voice cues and a dockable scroll bar
US6567830B1 (en) 1999-02-12 2003-05-20 International Business Machines Corporation Method, system, and program for displaying added text to an electronic media file
US6611802B2 (en) 1999-06-11 2003-08-26 International Business Machines Corporation Method and system for proofreading and correcting dictated text
US7228492B1 (en) 1999-07-06 2007-06-05 Ricoh Company, Ltd. 2D graph displaying document locations of user-specified concept of interest
ATE258700T1 (en) 1999-07-15 2004-02-15 Richard B Himmelstein COMMUNICATION DEVICE FOR EFFICIENT ACCESS TO INTERNET DATA
US7272637B1 (en) 1999-07-15 2007-09-18 Himmelstein Richard B Communication system and method for efficiently accessing internet resources
US6636242B2 (en) 1999-08-31 2003-10-21 Accenture Llp View configurer in a presentation services patterns environment
US7237188B1 (en) 2004-02-06 2007-06-26 Microsoft Corporation Method and system for managing dynamic tables
US6385617B1 (en) 1999-10-07 2002-05-07 International Business Machines Corporation Method and apparatus for creating and manipulating a compressed binary decision diagram in a data processing system
US7383320B1 (en) 1999-11-05 2008-06-03 Idom Technologies, Incorporated Method and apparatus for automatically updating website content
US6522347B1 (en) 2000-01-18 2003-02-18 Seiko Epson Corporation Display apparatus, portable information processing apparatus, information recording medium, and electronic apparatus
US20010032248A1 (en) 2000-03-29 2001-10-18 Krafchin Richard H. Systems and methods for generating computer-displayed presentations
GB2367660B (en) 2000-04-13 2004-01-14 Ibm Methods and apparatus for automatic page break detection
US6456234B1 (en) 2000-06-07 2002-09-24 William J. Johnson System and method for proactive content delivery by situation location
US7155667B1 (en) 2000-06-21 2006-12-26 Microsoft Corporation User interface for integrated spreadsheets and word processing tables
WO2002005065A2 (en) 2000-07-11 2002-01-17 Juice Software, Inc. A method and system for integrating network-based functionality into productivity applications and documents
CA2424713C (en) 2000-08-21 2007-12-04 Thoughtslinger Corporation Simultaneous multi-user document editing system
US20060074727A1 (en) 2000-09-07 2006-04-06 Briere Daniel D Method and apparatus for collection and dissemination of information over a computer network
US6661431B1 (en) 2000-10-10 2003-12-09 Stone Analytica, Inc. Method of representing high-dimensional information
US7249042B1 (en) 2000-11-01 2007-07-24 Microsoft Corporation Method and system for visually indicating project task durations are estimated using a character
US7027997B1 (en) 2000-11-02 2006-04-11 Verizon Laboratories Inc. Flexible web-based interface for workflow management systems
JP4162181B2 (en) 2000-11-27 2008-10-08 Yamaha Corporation Program creation/playback apparatus, program creation/playback method, and storage medium
US20020069207A1 (en) 2000-12-06 2002-06-06 Alexander Amy E. System and method for conducting surveys
US7607083B2 (en) 2000-12-12 2009-10-20 Nec Corporation Text summarization using relevance measures and latent semantic analysis
US6907580B2 (en) 2000-12-14 2005-06-14 Microsoft Corporation Selection paradigm for displayed user interface
US7222156B2 (en) * 2001-01-25 2007-05-22 Microsoft Corporation Integrating collaborative messaging into an electronic mail program
US7788598B2 (en) 2001-03-16 2010-08-31 Siebel Systems, Inc. System and method for assigning and scheduling activities
US20030033196A1 (en) 2001-05-18 2003-02-13 Tomlin John Anthony Unintrusive targeted advertising on the world wide web using an entropy model
CA2403300A1 (en) 2002-09-12 2004-03-12 Pranil Ram A method of buying or selling items and a user interface to facilitate the same
GB0116771D0 (en) 2001-07-10 2001-08-29 IBM System and method for tailoring of electronic messages
US8108241B2 (en) 2001-07-11 2012-01-31 Shabina Shukoor System and method for promoting action on visualized changes to information
US6901277B2 (en) 2001-07-17 2005-05-31 Accuimage Diagnostics Corp. Methods for generating a lung report
US20040215443A1 (en) 2001-07-27 2004-10-28 Hatton Charles Malcolm Computers that communicate in the English language and complete work assignments by reading English language sentences
US7461077B1 (en) 2001-07-31 2008-12-02 Nicholas Greenwood Representation of data records
US7415664B2 (en) 2001-08-09 2008-08-19 International Business Machines Corporation System and method in a spreadsheet for exporting-importing the content of input cells from a scalable template instance to another
US7117225B2 (en) 2001-08-13 2006-10-03 Jasmin Cosic Universal data management interface
US7398201B2 (en) 2001-08-14 2008-07-08 Evri Inc. Method and system for enhanced data searching
US8933949B2 (en) 2010-10-01 2015-01-13 Z124 User interaction across cross-environment applications through an extended graphics context
US9047102B2 (en) 2010-10-01 2015-06-02 Z124 Instant remote rendering
US8819705B2 (en) 2010-10-01 2014-08-26 Z124 User interaction support across cross-environment applications
US6550165B2 (en) 2001-09-14 2003-04-22 Charles Chirafesi, Jr. Perpetual calendar wall display device having rotatable calendar days
US7499907B2 (en) 2001-10-12 2009-03-03 Teradata Us, Inc. Index selection in a database system
AU2002359444A1 (en) 2001-11-21 2003-06-10 Viatronix Incorporated Imaging system and method for cardiac analysis
GB2383662B (en) 2001-11-26 2005-05-11 Evolution Consulting Group Plc Creating XML documents
US20030137536A1 (en) 2001-11-30 2003-07-24 Hugh Harlan M. Method and apparatus for communicating changes from and to a shared associative database using one-way communications techniques
US7139800B2 (en) 2002-01-16 2006-11-21 Xerox Corporation User interface for a message-based system having embedded information management capabilities
US7039596B1 (en) 2002-01-18 2006-05-02 America Online, Inc. Calendar overlays
US7054891B2 (en) 2002-03-18 2006-05-30 Bmc Software, Inc. System and method for comparing database data
US7062478B1 (en) 2002-03-20 2006-06-13 Resolutionebs, Inc. Method and apparatus using automated rule processing to configure a product or service
US7263512B2 (en) 2002-04-02 2007-08-28 Mcgoveran David O Accessing and updating views and relations in a relational database
US7533026B2 (en) 2002-04-12 2009-05-12 International Business Machines Corporation Facilitating management of service elements usable in providing information technology service offerings
US6976023B2 (en) 2002-04-23 2005-12-13 International Business Machines Corporation System and method for managing application specific privileges in a content management system
US20030204490A1 (en) 2002-04-24 2003-10-30 Stephane Kasriel Web-page collaboration system
US7523394B2 (en) 2002-06-28 2009-04-21 Microsoft Corporation Word-processing document stored in a single XML file that may be manipulated by applications that understand XML
CA2398103A1 (en) 2002-08-14 2004-02-14 March Networks Corporation Multi-dimensional table filtering system
US20040133441A1 (en) 2002-09-04 2004-07-08 Jeffrey Brady Method and program for transferring information from an application
US9811805B2 (en) 2002-09-18 2017-11-07 eSys Technologies, Inc. Automated work-flow management system with dynamic interface
WO2004053624A2 (en) 2002-10-17 2004-06-24 The Knowledge It Corporation Virtual knowledge management system
AU2003301602A1 (en) 2002-10-23 2004-05-13 David Theiler Method and apparatus for managing workflow
US20040139400A1 (en) 2002-10-23 2004-07-15 Allam Scott Gerald Method and apparatus for displaying and viewing information
US9172738B1 (en) 2003-05-08 2015-10-27 Dynamic Mesh Networks, Inc. Collaborative logistics ecosystem: an extensible framework for collaborative logistics
US7274375B1 (en) 2002-11-19 2007-09-25 Peter David Timekeeping system and method for graphically tracking and representing activities
US7954043B2 (en) * 2002-12-02 2011-05-31 International Business Machines Corporation Concurrent editing of a file by multiple authors
US7783614B2 (en) 2003-02-13 2010-08-24 Microsoft Corporation Linking elements of a document to corresponding fields, queries and/or procedures in a database
US7017112B2 (en) 2003-02-28 2006-03-21 Microsoft Corporation Importing and exporting markup language data in a spreadsheet application document
US7769794B2 (en) 2003-03-24 2010-08-03 Microsoft Corporation User interface for a file system shell
US7605813B2 (en) 2003-04-22 2009-10-20 International Business Machines Corporation Displaying arbitrary relationships in a tree-map visualization
US20060156220A1 (en) 2003-05-05 2006-07-13 Dreystadt John N System and method for managing dynamic content assembly
US7417644B2 (en) 2003-05-12 2008-08-26 Microsoft Corporation Dynamic pluggable user interface layout
US7034860B2 (en) 2003-06-20 2006-04-25 Tandberg Telecom As Method and apparatus for video conferencing having dynamic picture layout
US7143340B2 (en) 2003-06-27 2006-11-28 Microsoft Corporation Row sharing techniques for grid controls
US7814093B2 (en) 2003-07-25 2010-10-12 Microsoft Corporation Method and system for building a report for execution against a data store
US20050034064A1 (en) 2003-07-25 2005-02-10 Activeviews, Inc. Method and system for creating and following drill links
US20050039001A1 (en) 2003-07-30 2005-02-17 Microsoft Corporation Zoned based security administration for data items
US7895595B2 (en) 2003-07-30 2011-02-22 Northwestern University Automatic method and system for formulating and transforming representations of context used by information services
US7617443B2 (en) 2003-08-04 2009-11-10 At&T Intellectual Property I, L.P. Flexible multiple spreadsheet data consolidation system
WO2005022417A2 (en) 2003-08-27 2005-03-10 Ascential Software Corporation Methods and systems for real time integration services
US8543566B2 (en) 2003-09-23 2013-09-24 Salesforce.Com, Inc. System and methods of improving a multi-tenant database query using contextual knowledge about non-homogeneously distributed tenant data
US7779039B2 (en) 2004-04-02 2010-08-17 Salesforce.Com, Inc. Custom entities and fields in a multi-tenant database system
US7149353B2 (en) 2003-09-23 2006-12-12 Amazon.Com, Inc. Method and system for suppression of features in digital images of content
US7318216B2 (en) 2003-09-24 2008-01-08 Tablecode Software Corporation Software application development environment facilitating development of a software application
US7433920B2 (en) 2003-10-10 2008-10-07 Microsoft Corporation Contact sidebar tile
US7921360B1 (en) 2003-10-21 2011-04-05 Adobe Systems Incorporated Content-restricted editing
US6990637B2 (en) 2003-10-23 2006-01-24 Microsoft Corporation Graphical user interface for 3-dimensional view of a data collection based on an attribute of the data
US20050096973A1 (en) 2003-11-04 2005-05-05 Heyse Neil W. Automated life and career management services
US8091044B2 (en) 2003-11-20 2012-01-03 International Business Machines Corporation Filtering the display of files in graphical interfaces
US7509306B2 (en) 2003-12-08 2009-03-24 International Business Machines Corporation Index for data retrieval and data structuring
US20080163075A1 (en) 2004-01-26 2008-07-03 Beck Christopher Clemmett Macl Server-Client Interaction and Information Management System
US8868405B2 (en) 2004-01-27 2014-10-21 Hewlett-Packard Development Company, L. P. System and method for comparative analysis of textual documents
GB2410575A (en) 2004-01-30 2005-08-03 Nomura International Plc Analysing and displaying associated financial data
US20050216830A1 (en) * 2004-03-29 2005-09-29 Turner Jeffrey S Access tool to facilitate exchange of data to and from an end-user application software package
US9811728B2 (en) 2004-04-12 2017-11-07 Google Inc. Adding value to a rendered document
US7836408B1 (en) 2004-04-14 2010-11-16 Apple Inc. Methods and apparatus for displaying relative emphasis in a file
EP1596311A1 (en) 2004-05-10 2005-11-16 France Telecom System and method for managing data tables
CN1981301B (en) 2004-05-17 2012-01-18 因文西斯系统公司 System and method for developing animated visualization interfaces
US7774378B2 (en) 2004-06-04 2010-08-10 Icentera Corporation System and method for providing intelligence centers
US7827476B1 (en) 2004-06-18 2010-11-02 Emc Corporation System and methods for a task management user interface
US7788301B2 (en) 2004-06-21 2010-08-31 Canon Kabushiki Kaisha Metadata driven user interface
US20050289453A1 (en) 2004-06-21 2005-12-29 Tsakhi Segal Apparatus and method for off-line synchronized capturing and reviewing notes and presentations
US8566732B2 (en) 2004-06-25 2013-10-22 Apple Inc. Synchronization of widgets and dashboards
US20050289342A1 (en) 2004-06-28 2005-12-29 Oracle International Corporation Column relevant data security label
US8190497B2 (en) 2004-07-02 2012-05-29 Hallmark Cards, Incorporated Handheld scanner device with display location database
US7379934B1 (en) 2004-07-09 2008-05-27 Ernest Forman Data mapping
US20060015499A1 (en) 2004-07-13 2006-01-19 International Business Machines Corporation Method, data processing system, and computer program product for sectional access privileges of plain text files
US20060013462A1 (en) 2004-07-15 2006-01-19 Navid Sadikali Image display system and method
US7779431B2 (en) 2004-07-16 2010-08-17 Wallace Robert G Networked spreadsheet template designer
US8578399B2 (en) 2004-07-30 2013-11-05 Microsoft Corporation Method, system, and apparatus for providing access to workbook models through remote function cells
US20060047811A1 (en) 2004-09-01 2006-03-02 Microsoft Corporation Method and system of providing access to various data associated with a project
US7702730B2 (en) * 2004-09-03 2010-04-20 Open Text Corporation Systems and methods for collaboration
US20060053194A1 (en) 2004-09-03 2006-03-09 Schneider Ronald E Systems and methods for collaboration
US7720867B2 (en) 2004-09-08 2010-05-18 Oracle International Corporation Natural language query construction using purpose-driven template
US20060090169A1 (en) 2004-09-29 2006-04-27 International Business Machines Corporation Process to not disturb a user when performing critical activities
US7747966B2 (en) 2004-09-30 2010-06-29 Microsoft Corporation User interface for providing task management and calendar information
US8745483B2 (en) 2004-10-07 2014-06-03 International Business Machines Corporation Methods, systems and computer program products for facilitating visualization of interrelationships in a spreadsheet
US7787672B2 (en) 2004-11-04 2010-08-31 Dr Systems, Inc. Systems and methods for matching, naming, and displaying medical images
US8402361B2 (en) * 2004-11-09 2013-03-19 Oracle International Corporation Methods and systems for implementing a dynamic hierarchical data viewer
US20060107196A1 (en) 2004-11-12 2006-05-18 Microsoft Corporation Method for expanding and collapsing data cells in a spreadsheet report
WO2006068731A2 (en) 2004-11-12 2006-06-29 Haley Systems, Inc. A system for enterprise knowledge management and automation
US8001476B2 (en) 2004-11-16 2011-08-16 Open Text Inc. Cellular user interface
US20080104091A1 (en) 2004-11-26 2008-05-01 Chin Philip K Method of displaying data in a table
US11461077B2 (en) 2004-11-26 2022-10-04 Philip K. Chin Method of displaying data in a table with fixed header
US20060129415A1 (en) 2004-12-13 2006-06-15 Rohit Thukral System for linking financial asset records with networked assets
JP4738805B2 (en) 2004-12-16 2011-08-03 Ricoh Co., Ltd. Screen sharing system, screen sharing method, and screen sharing program
US7770180B2 (en) 2004-12-21 2010-08-03 Microsoft Corporation Exposing embedded data in a computer-generated document
JP3734491B1 (en) 2004-12-21 2006-01-11 公靖 中野 How to display in-cell graph of spreadsheet
US8312368B2 (en) 2005-01-06 2012-11-13 Oracle International Corporation Dynamic documentation
US20060173908A1 (en) 2005-01-10 2006-08-03 Browning Michelle M System and method for automated customization of a workflow management system
EP1844403A4 (en) 2005-01-16 2010-06-23 Zlango Ltd Iconic communication
US20110208732A1 (en) 2010-02-24 2011-08-25 Apple Inc. Systems and methods for organizing data items
US20070106754A1 (en) 2005-09-10 2007-05-10 Moore James F Security facility for maintaining health care data pools
US8660852B2 (en) 2005-02-28 2014-02-25 Microsoft Corporation CRM office document integration
US7567975B2 (en) 2005-03-16 2009-07-28 Oracle International Corporation Incremental evaluation of complex event-condition-action rules in a database system
US20060236246A1 (en) 2005-03-23 2006-10-19 Bono Charles A On-line slide kit creation and collaboration system
US8151213B2 (en) 2005-03-25 2012-04-03 International Business Machines Corporation System, method and program product for tabular data with dynamic visual cells
US20060224946A1 (en) 2005-03-31 2006-10-05 International Business Machines Corporation Spreadsheet programming
US20060224568A1 (en) 2005-04-02 2006-10-05 Debrito Daniel N Automatically displaying fields that were non-displayed when the fields are filter fields
WO2006116580A2 (en) 2005-04-27 2006-11-02 Yost James T Pop-up software application
US20060253205A1 (en) 2005-05-09 2006-11-09 Michael Gardiner Method and apparatus for tabular process control
US20060250369A1 (en) 2005-05-09 2006-11-09 Keim Oliver G Keyboard controls for customizing table layouts
US7831539B2 (en) 2005-06-21 2010-11-09 Microsoft Corporation Dynamically filtering aggregate reports based on values resulting from one or more previously applied filters
US7543228B2 (en) 2005-06-27 2009-06-02 Microsoft Corporation Template for rendering an electronic form
US20070027932A1 (en) 2005-07-29 2007-02-01 Q2 Labs, Llc System and method of creating a single source rss document from multiple content sources
US9268867B2 (en) 2005-08-03 2016-02-23 Aol Inc. Enhanced favorites service for web browsers and web applications
US9286388B2 (en) 2005-08-04 2016-03-15 Time Warner Cable Enterprises Llc Method and apparatus for context-specific content delivery
US7916157B1 (en) 2005-08-16 2011-03-29 Adobe Systems Incorporated System and methods for selective zoom response behavior
US20070050379A1 (en) 2005-08-25 2007-03-01 International Business Machines Corporation Highlighting entities in a display representation of a database query, results of a database query, and debug message of a database query to indicate associations
US7779000B2 (en) 2005-08-29 2010-08-17 Microsoft Corporation Associating conditions to summary table data
US7779347B2 (en) * 2005-09-02 2010-08-17 Fourteen40, Inc. Systems and methods for collaboratively annotating electronic documents
US8601383B2 (en) 2005-09-09 2013-12-03 Microsoft Corporation User interface for creating a spreadsheet data summary table
US7489976B2 (en) 2005-09-12 2009-02-10 Hosni I Adra System and method for dynamically simulating process and value stream maps
US7721205B2 (en) 2005-09-15 2010-05-18 Microsoft Corporation Integration of composite objects in host applications
US20070073899A1 (en) 2005-09-15 2007-03-29 Judge Francis P Techniques to synchronize heterogeneous data sources
US20070092048A1 (en) 2005-10-20 2007-04-26 Chelstrom Nathan P RUNN counter phase control
US7954064B2 (en) 2005-10-27 2011-05-31 Apple Inc. Multiple dashboards
US7627812B2 (en) 2005-10-27 2009-12-01 Microsoft Corporation Variable formatting of cells
US9104294B2 (en) 2005-10-27 2015-08-11 Apple Inc. Linked widgets
US8219457B2 (en) 2005-10-28 2012-07-10 Adobe Systems Incorporated Custom user definable keyword bidding system and method
US7707514B2 (en) 2005-11-18 2010-04-27 Apple Inc. Management of user interface elements in a display environment
US20070118527A1 (en) 2005-11-22 2007-05-24 Microsoft Corporation Security and data filtering
US8185819B2 (en) 2005-12-12 2012-05-22 Google Inc. Module specification for a module to be incorporated into a container document
US8560942B2 (en) 2005-12-15 2013-10-15 Microsoft Corporation Determining document layout between different views
US20070143169A1 (en) 2005-12-21 2007-06-21 Grant Chad W Real-time workload information scheduling and tracking system and related methods
US7685152B2 (en) 2006-01-10 2010-03-23 International Business Machines Corporation Method and apparatus for loading data from a spreadsheet to a relational database table
US20070168861A1 (en) 2006-01-17 2007-07-19 Bell Denise A Method for indicating completion status of user initiated and system created tasks
US20070174228A1 (en) 2006-01-17 2007-07-26 Microsoft Corporation Graphical representation of key performance indicators
US7634717B2 (en) 2006-01-23 2009-12-15 Microsoft Corporation Multiple conditional formatting
US8005873B2 (en) 2006-01-25 2011-08-23 Microsoft Corporation Filtering and sorting information
US20070186173A1 (en) 2006-02-03 2007-08-09 Yahoo! Inc. Instant messenger alerts and organization systems
US9083663B2 (en) 2006-02-04 2015-07-14 Docsof, Llc Reminder system
US8930812B2 (en) 2006-02-17 2015-01-06 Vmware, Inc. System and method for embedding, editing, saving, and restoring objects within a browser window
US7770100B2 (en) 2006-02-27 2010-08-03 Microsoft Corporation Dynamic thresholds for conditional formats
US8046703B2 (en) 2006-02-28 2011-10-25 Sap Ag Monitoring and integration of an organization's planning processes
US8266152B2 (en) 2006-03-03 2012-09-11 Perfect Search Corporation Hashed indexing
US20070266177A1 (en) 2006-03-08 2007-11-15 David Vismans Communication device with indirect command distribution
US20070239746A1 (en) 2006-03-29 2007-10-11 International Business Machines Corporation Visual merge of portlets
US20070233647A1 (en) 2006-03-30 2007-10-04 Microsoft Corporation Sharing Items In An Operating System
US20070256043A1 (en) 2006-05-01 2007-11-01 Peters Johan C Method and system for implementing a mass data change tool in a graphical user interface
US8078955B1 (en) 2006-05-02 2011-12-13 Adobe Systems Incorporated Method and apparatus for defining table styles
US7467354B2 (en) 2006-05-30 2008-12-16 International Business Machines Corporation Method to search data
US8364514B2 (en) 2006-06-27 2013-01-29 Microsoft Corporation Monitoring group activities
US7761393B2 (en) 2006-06-27 2010-07-20 Microsoft Corporation Creating and managing activity-centric workflow
US20070300185A1 (en) 2006-06-27 2007-12-27 Microsoft Corporation Activity-centric adaptive user interface
US20080005235A1 (en) 2006-06-30 2008-01-03 Microsoft Corporation Collaborative integrated development environment using presence information
US8166415B2 (en) 2006-08-04 2012-04-24 Apple Inc. User interface for backup management
US8869027B2 (en) 2006-08-04 2014-10-21 Apple Inc. Management and generation of dashboards
US20080059539A1 (en) * 2006-08-08 2008-03-06 Richard Chin Document Collaboration System and Method
US8676845B2 (en) 2006-08-22 2014-03-18 International Business Machines Corporation Database entitlement
US20080065460A1 (en) 2006-08-23 2008-03-13 Renegade Swish, Llc Apparatus, system, method, and computer program for task and process management
US8688522B2 (en) 2006-09-06 2014-04-01 Mediamath, Inc. System and method for dynamic online advertisement creation and management
US10637724B2 (en) 2006-09-25 2020-04-28 Remot3.It, Inc. Managing network connected devices
US20080077530A1 (en) 2006-09-25 2008-03-27 John Banas System and method for project process and workflow optimization
US9201854B1 (en) 2006-10-25 2015-12-01 Hewlett-Packard Development Company, L.P. Methods and systems for creating, interacting with, and utilizing a superactive document
US8775563B2 (en) 2006-11-20 2014-07-08 Yapta, Inc. Dynamic overlaying of content on web pages for tracking data
US8078643B2 (en) 2006-11-27 2011-12-13 Sap Ag Schema modeler for generating an efficient database schema
US20080133736A1 (en) 2006-11-30 2008-06-05 Ava Mobile, Inc. System, method, and computer program product for tracking digital media in collaborative environments
US20080222192A1 (en) 2006-12-21 2008-09-11 Ec-Enabler, Ltd Method and system for transferring information using metabase
US20080155547A1 (en) 2006-12-22 2008-06-26 Yahoo! Inc. Transactional calendar
US10318624B1 (en) 2006-12-28 2019-06-11 Apple Inc. Infinite canvas
US9390059B1 (en) 2006-12-28 2016-07-12 Apple Inc. Multiple object types on a canvas
US7827615B1 (en) 2007-01-23 2010-11-02 Sprint Communications Company L.P. Hybrid role-based discretionary access control
EP2115677A4 (en) 2007-01-29 2013-11-06 Google Inc On-line payment transactions
US20100287163A1 (en) 2007-02-01 2010-11-11 Sridhar G S Collaborative online content editing and approval
US8413064B2 (en) 2007-02-12 2013-04-02 Jds Uniphase Corporation Method and apparatus for graphically indicating the progress of multiple parts of a task
US7992078B2 (en) 2007-02-28 2011-08-02 Business Objects Software Ltd Apparatus and method for creating publications from static and dynamic content
EP2132641A4 (en) 2007-03-02 2012-07-04 Telarix Inc System and method for user-definable document exchange
US8069129B2 (en) 2007-04-10 2011-11-29 Ab Initio Technology Llc Editing and compiling business rules
US20090019383A1 (en) 2007-04-13 2009-01-15 Workstone Llc User interface for a personal information manager
EP1986369B1 (en) 2007-04-27 2012-03-07 Accenture Global Services Limited End user control configuration system with dynamic user interface
US20090031401A1 (en) 2007-04-27 2009-01-29 Bea Systems, Inc. Annotations for enterprise web application constructor
US8866815B2 (en) 2007-05-23 2014-10-21 Oracle International Corporation Automated treemap configuration
US10546272B2 (en) 2007-05-08 2020-01-28 Metropolitan Life Insurance Co. System and method for workflow management
US7925989B2 (en) 2007-05-09 2011-04-12 Sap Ag System and method for simultaneous display of multiple tables
US8233624B2 (en) 2007-05-25 2012-07-31 Splitstreem Oy Method and apparatus for securing data in a memory device
US20080301237A1 (en) 2007-05-31 2008-12-04 Allan Peter Parsons Method and apparatus for improved referral to resources and a related social network
US9411798B1 (en) 2007-06-04 2016-08-09 Open Text Corporation Methods and apparatus for reusing report design components and templates
US10783463B2 (en) 2007-06-27 2020-09-22 International Business Machines Corporation System, method and program for tracking labor costs
US8166000B2 (en) 2007-06-27 2012-04-24 International Business Machines Corporation Using a data mining algorithm to generate format rules used to validate data sets
US8082274B2 (en) 2007-06-28 2011-12-20 Microsoft Corporation Scheduling application allowing freeform data entry
US7933952B2 (en) 2007-06-29 2011-04-26 Microsoft Corporation Collaborative document authoring
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
US8954871B2 (en) 2007-07-18 2015-02-10 Apple Inc. User-centric widgets and dashboards
US20090044090A1 (en) 2007-08-06 2009-02-12 Apple Inc. Referring to cells using header cell values
US8972458B2 (en) * 2007-08-09 2015-03-03 Yahoo! Inc. Systems and methods for comments aggregation and carryover in word pages
US20090048896A1 (en) 2007-08-14 2009-02-19 Vignesh Anandan Work management using integrated project and workflow methodology
US10235429B2 (en) 2007-08-20 2019-03-19 Stephen W. Meehan System and method for organizing data in a dynamic user-customizable interface for search and display
US8713144B2 (en) 2007-09-14 2014-04-29 Ricoh Co., Ltd. Workflow-enabled client
US8621652B2 (en) 2007-09-17 2013-12-31 Metabyte Inc. Copying a web element with reassigned permissions
CN102749997B (en) 2007-09-18 2016-06-22 Microsoft Technology Licensing, LLC Method of controlling the operation of a mobile terminal, and the mobile terminal
US20090083140A1 (en) 2007-09-25 2009-03-26 Yahoo! Inc. Non-intrusive, context-sensitive integration of advertisements within network-delivered media content
IL186505A0 (en) 2007-10-08 2008-01-20 Excelang Ltd Grammar checker
US8185827B2 (en) 2007-10-26 2012-05-22 International Business Machines Corporation Role tailored portal solution integrating near real-time metrics, business logic, online collaboration, and web 2.0 content
US7950064B2 (en) 2007-11-16 2011-05-24 International Business Machines Corporation System and method for controlling comments in a collaborative document
US8204880B2 (en) 2007-11-20 2012-06-19 Sap Aktiengesellschaft Generic table grouper
AU2007237356A1 (en) 2007-12-05 2009-06-25 Canon Kabushiki Kaisha Animated user interface control elements
US8825758B2 (en) 2007-12-14 2014-09-02 Microsoft Corporation Collaborative authoring modes
US20090172565A1 (en) 2007-12-26 2009-07-02 John Clarke Jackson Systems, Devices, and Methods for Sharing Content
US8327272B2 (en) 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US8862979B2 (en) 2008-01-15 2014-10-14 Microsoft Corporation Multi-client collaboration to access and update structured data elements
US7908299B2 (en) 2008-01-31 2011-03-15 Computer Associates Think, Inc. Method and apparatus for pseudo-conversion of table objects
US10255609B2 (en) 2008-02-21 2019-04-09 Micronotes, Inc. Interactive marketing system
US20090222760A1 (en) 2008-02-29 2009-09-03 Halverson Steven G Method, System and Computer Program Product for Automating the Selection and Ordering of Column Data in a Table for a User
US9495386B2 (en) 2008-03-05 2016-11-15 Ebay Inc. Identification of items depicted in images
US9558172B2 (en) 2008-03-12 2017-01-31 Microsoft Technology Licensing, Llc Linking visual properties of charts to cells within tables
US7895174B2 (en) 2008-03-27 2011-02-22 Microsoft Corporation Database part table junctioning
US8805689B2 (en) * 2008-04-11 2014-08-12 The Nielsen Company (Us), Llc Methods and apparatus to generate and use content-aware watermarks
US8347204B2 (en) 2008-05-05 2013-01-01 Norm Rosner Method and system for data analysis
US20090292690A1 (en) 2008-05-23 2009-11-26 Daniel Jason Culbert Method and System for Automatic Event Administration and Viewing
US20090299808A1 (en) 2008-05-30 2009-12-03 Gilmour Tom S Method and system for project management
US8413261B2 (en) 2008-05-30 2013-04-02 Red Hat, Inc. Sharing private data publicly and anonymously
US9165044B2 (en) 2008-05-30 2015-10-20 Ethority, Llc Enhanced user interface and data handling in business intelligence software
US20090313570A1 (en) 2008-06-13 2009-12-17 Po Ronald T System and method for integrating locational awareness into a subject oriented workflow
US20090313537A1 (en) 2008-06-17 2009-12-17 Microsoft Corporation Micro browser spreadsheet viewer
US8166387B2 (en) 2008-06-20 2012-04-24 Microsoft Corporation DataGrid user interface control with row details
US20090319623A1 (en) 2008-06-24 2009-12-24 Oracle International Corporation Recipient-dependent presentation of electronic messages
JP2010033551A (en) 2008-06-26 2010-02-12 Canon Inc Design editing apparatus, design editing method, and design editing program
US20090327301A1 (en) 2008-06-26 2009-12-31 Microsoft Corporation Distributed Configuration Management Using Constitutional Documents
US20090327851A1 (en) 2008-06-27 2009-12-31 Steven Raposo Data analysis method
US20150363478A1 (en) 2008-07-11 2015-12-17 Michael N. Haynes Systems, Devices, and/or Methods for Managing Data
US9449311B2 (en) 2008-07-18 2016-09-20 Ebay Inc. Methods and systems for facilitating transactions using badges
US20100017699A1 (en) 2008-07-20 2010-01-21 Farrell Glenn H Multi-choice controls for selecting data groups to be displayed
US8381124B2 (en) 2008-07-30 2013-02-19 The Regents Of The University Of California Single select clinical informatics
US20100031135A1 (en) 2008-08-01 2010-02-04 Oracle International Corporation Annotation management in enterprise applications
US8386960B1 (en) 2008-08-29 2013-02-26 Adobe Systems Incorporated Building object interactions
US8938465B2 (en) 2008-09-10 2015-01-20 Samsung Electronics Co., Ltd. Method and system for utilizing packaged content sources to identify and provide information based on contextual information
US8726179B2 (en) 2008-09-12 2014-05-13 Salesforce.Com, Inc. Method and system for providing in-line scheduling in an on-demand service
US20100070845A1 (en) 2008-09-17 2010-03-18 International Business Machines Corporation Shared web 2.0 annotations linked to content segments of web documents
US8745052B2 (en) 2008-09-18 2014-06-03 Accenture Global Services Limited System and method for adding context to the creation and revision of artifacts
US7945622B1 (en) * 2008-10-01 2011-05-17 Adobe Systems Incorporated User-aware collaboration playback and recording
US8280822B2 (en) 2008-10-15 2012-10-02 Adp Workscape, Inc. Performance driven compensation for enterprise-level human capital management
US20100095219A1 (en) 2008-10-15 2010-04-15 Maciej Stachowiak Selective history data structures
US8135635B2 (en) 2008-10-16 2012-03-13 Intuit Inc. System and method for time tracking on a mobile computing device
US8326864B2 (en) 2008-10-21 2012-12-04 International Business Machines Corporation Method, system, and computer program product for implementing automated worklists
US9092636B2 (en) 2008-11-18 2015-07-28 Workshare Technology, Inc. Methods and systems for exact data match filtering
KR101118089B1 (en) 2008-12-10 2012-03-09 Seoul National University R&DB Foundation Apparatus and system for variable length decoding
US9424287B2 (en) 2008-12-16 2016-08-23 Hewlett Packard Enterprise Development Lp Continuous, automated database-table partitioning and database-schema evolution
US10685177B2 (en) 2009-01-07 2020-06-16 Litera Corporation System and method for comparing digital data in spreadsheets or database tables
US8312366B2 (en) 2009-02-11 2012-11-13 Microsoft Corporation Displaying multiple row and column header areas in a summary table
US20100228752A1 (en) 2009-02-25 2010-09-09 Microsoft Corporation Multi-condition filtering of an interactive summary table
US8136031B2 (en) 2009-03-17 2012-03-13 Litera Technologies, LLC Comparing the content of tables containing merged or split cells
US8181106B2 (en) 2009-03-18 2012-05-15 Microsoft Corporation Use of overriding templates associated with customizable elements when editing a web page
US20100241477A1 (en) 2009-03-19 2010-09-23 Scenario Design, Llc Dimensioned modeling system
US9159074B2 (en) * 2009-03-23 2015-10-13 Yahoo! Inc. Tool for embedding comments for objects in an article
US20100241990A1 (en) 2009-03-23 2010-09-23 Microsoft Corporation Re-usable declarative workflow templates
US8973153B2 (en) * 2009-03-30 2015-03-03 International Business Machines Corporation Creating audio-based annotations for audiobooks
US20100257015A1 (en) 2009-04-01 2010-10-07 National Information Solutions Cooperative, Inc. Graphical client interface resource and work management scheduler
GB0905953D0 (en) 2009-04-06 2009-05-20 Bowling Anthony Document editing method
US8254890B2 (en) 2009-04-08 2012-08-28 Research In Motion Limited System and method for managing items in a list shared by a group of mobile devices
US20100262625A1 (en) 2009-04-08 2010-10-14 Glenn Robert Pittenger Method and system for fine-granularity access control for database entities
US8548997B1 (en) 2009-04-08 2013-10-01 Jianqing Wu Discovery information management system
US8180812B2 (en) 2009-05-08 2012-05-15 Microsoft Corporation Templates for configuring file shares
US9268761B2 (en) 2009-06-05 2016-02-23 Microsoft Technology Licensing, Llc In-line dynamic text with variable formatting
US20100324964A1 (en) 2009-06-19 2010-12-23 International Business Machines Corporation Automatically monitoring working hours for projects using instant messenger
CN102292713A (en) 2009-06-30 2011-12-21 Tangqiao Technology Co., Ltd. A multimedia collaboration system
WO2011000165A1 (en) 2009-07-03 2011-01-06 Hewlett-Packard Development Company,L.P. Apparatus and method for text extraction
US9396241B2 (en) 2009-07-15 2016-07-19 Oracle International Corporation User interface controls for specifying data hierarchies
US9223770B1 (en) 2009-07-29 2015-12-29 Open Invention Network, Llc Method and apparatus of creating electronic forms to include internet list data
US8626141B2 (en) 2009-07-30 2014-01-07 Qualcomm Incorporated Method and apparatus for customizing a user interface menu
US20110047484A1 (en) 2009-08-19 2011-02-24 Onehub Inc. User manageable collaboration
US20110055177A1 (en) 2009-08-26 2011-03-03 International Business Machines Corporation Collaborative content retrieval using calendar task lists
US9779386B2 (en) 2009-08-31 2017-10-03 Thomson Reuters Global Resources Method and system for implementing workflows and managing staff and engagements
US20110066933A1 (en) 2009-09-02 2011-03-17 Ludwig Lester F Value-driven visualization primitives for spreadsheets, tabular data, and advanced spreadsheet visualization
US8296170B2 (en) 2009-09-24 2012-10-23 Bp Logix Process management system and method
US20110106636A1 (en) 2009-11-02 2011-05-05 Undercurrent Inc. Method and system for managing online presence
US20110119352A1 (en) 2009-11-16 2011-05-19 Parrotview, Inc. Method of mutual browsing and computer program therefor
US9015580B2 (en) 2009-12-15 2015-04-21 Shutterfly, Inc. System and method for online and mobile memories and greeting service
US20120215574A1 (en) 2010-01-16 2012-08-23 Management Consulting & Research, LLC System, method and computer program product for enhanced performance management
US8645854B2 (en) 2010-01-19 2014-02-04 Verizon Patent And Licensing Inc. Provisioning workflow management methods and systems
US8407217B1 (en) 2010-01-29 2013-03-26 Guangsheng Zhang Automated topic discovery in documents
US20110205231A1 (en) 2010-02-24 2011-08-25 Oracle International Corporation Mapping data in enterprise applications for operational visibility
US20110208324A1 (en) 2010-02-25 2011-08-25 Mitsubishi Electric Corporation System, method, and apparatus for maintenance of sensor and control systems
US20110219321A1 (en) 2010-03-02 2011-09-08 Microsoft Corporation Web-based control using integrated control interface having dynamic hit zones
US8656291B2 (en) 2010-03-12 2014-02-18 Salesforce.Com, Inc. System, method and computer program product for displaying data utilizing a selected source and visualization
US8359246B2 (en) 2010-03-19 2013-01-22 Buchheit Brian K Secondary marketplace for digital media content
US9430155B2 (en) 2010-03-25 2016-08-30 International Business Machines Corporation File index, metadata storage, and file system management for magnetic tape
US20110258040A1 (en) 2010-04-16 2011-10-20 Xerox Corporation System and method for providing feedback for targeted communications
US8819042B2 (en) 2010-04-23 2014-08-26 Bank Of America Corporation Enhanced data comparison tool
US20120089914A1 (en) 2010-04-27 2012-04-12 Surfwax Inc. User interfaces for navigating structured content
CA2738428A1 (en) 2010-04-30 2011-10-30 Iliv Technologies Inc. Collaboration tool
WO2021161104A1 (en) * 2020-02-12 2021-08-19 Monday.Com Enhanced display features in collaborative network systems, methods, and devices
WO2021099839A1 (en) 2019-11-18 2021-05-27 Roy Mann Collaborative networking systems, methods, and devices
WO2021024040A1 (en) 2019-08-08 2021-02-11 Mann, Roy Digital processing systems and methods for automatic relationship recognition in tables of collaborative work systems
WO2021144656A1 (en) 2020-01-15 2021-07-22 Monday.Com Digital processing systems and methods for graphical dynamic table gauges in collaborative work systems
WO2021220058A1 (en) * 2020-05-01 2021-11-04 Monday.com Ltd. Digital processing systems and methods for enhanced collaborative workflow and networking systems, methods, and devices
US20160335731A1 (en) 2010-05-05 2016-11-17 Site 10.01, Inc. System and method for monitoring and managing information
US8683359B2 (en) 2010-05-18 2014-03-25 Sap Ag In-place user interface and dataflow modeling
US20110289397A1 (en) 2010-05-19 2011-11-24 Mauricio Eastmond Displaying Table Data in a Limited Display Area
AU2010212367A1 (en) 2010-05-24 2011-12-08 Ehiive Holdings Pty Ltd Task management method, system and tool
US10289959B2 (en) 2010-05-26 2019-05-14 Automation Anywhere, Inc. Artificial intelligence and knowledge based automation enhancement
US9800705B2 (en) 2010-06-02 2017-10-24 Apple Inc. Remote user status indicators
US20140058801A1 (en) 2010-06-04 2014-02-27 Sapience Analytics Private Limited System And Method To Measure, Aggregate And Analyze Exact Effort And Time Productivity
US20170116552A1 (en) 2010-06-04 2017-04-27 Sapience Analytics Private Limited System and Method to Measure, Aggregate and Analyze Exact Effort and Time Productivity
US20110302003A1 (en) 2010-06-04 2011-12-08 Deodhar Swati Shirish System And Method To Measure, Aggregate And Analyze Exact Effort And Time Productivity
US20110320230A1 (en) 2010-06-23 2011-12-29 Canadian National Railway Company User interface for providing a user with the ability to view job assignment information
JP5498579B2 (en) 2010-06-30 2014-05-21 Hitachi, Ltd. Medical support system and medical support method
US8706535B2 (en) 2010-07-13 2014-04-22 Liquidplanner, Inc. Transforming a prioritized project hierarchy with work packages
US9292587B2 (en) 2010-07-21 2016-03-22 Citrix System, Inc. Systems and methods for database notification interface to efficiently identify events and changed data
US8423909B2 (en) 2010-07-26 2013-04-16 International Business Machines Corporation System and method for an interactive filter
US9063958B2 (en) 2010-07-29 2015-06-23 Sap Se Advance enhancement of secondary persistency for extension field search
WO2012018358A1 (en) * 2010-08-04 2012-02-09 Copia Interactive, Llc Method of and system for browsing and displaying items from a collection
US9047576B2 (en) 2010-08-09 2015-06-02 Oracle International Corporation Mechanism to communicate and visualize dependencies between a large number of flows in software
US9553878B2 (en) 2010-08-16 2017-01-24 Facebook, Inc. People directory with social privacy and contact association features
JP5906594B2 (en) * 2010-08-31 2016-04-20 Ricoh Co., Ltd. Cooperation system, image processing apparatus, cooperation control method, cooperation control program, and recording medium
US9286246B2 (en) 2010-09-10 2016-03-15 Hitachi, Ltd. System for managing task that is for processing to computer system and that is based on user operation and method for displaying information related to task of that type
US20120079408A1 (en) 2010-09-24 2012-03-29 Visibility, Biz. Inc. Systems and methods for generating a swimlane timeline for task data visualization
JP5257433B2 (en) 2010-09-30 2013-08-07 Brother Industries, Ltd. Image reading device
WO2012044557A2 (en) 2010-10-01 2012-04-05 Imerj, Llc Auto-configuration of a docked system in a multi-os environment
US9031957B2 (en) 2010-10-08 2015-05-12 Salesforce.Com, Inc. Structured data in a business networking feed
WO2012051224A2 (en) 2010-10-11 2012-04-19 Teachscape Inc. Methods and systems for capturing, processing, managing and/or evaluating multimedia content of observed persons performing a task
US20120096389A1 (en) 2010-10-19 2012-04-19 Ran J Flam Integrated web-based workspace with curated tree-structure database schema
US10740117B2 (en) 2010-10-19 2020-08-11 Apple Inc. Grouping windows into clusters in one or more workspaces in a user interface
CA2718360A1 (en) 2010-10-25 2011-01-05 IBM Canada Limited - IBM Canada Limitée Communicating secondary selection feedback
US20120102543A1 (en) 2010-10-26 2012-04-26 360 GRC, Inc. Audit Management System
US8548992B2 (en) 2010-10-28 2013-10-01 Cary Scott Abramoff User interface for a digital content management system
US20120116834A1 (en) 2010-11-08 2012-05-10 Microsoft Corporation Hybrid task board and critical path method based project application
US20120116835A1 (en) 2010-11-10 2012-05-10 Microsoft Corporation Hybrid task board and critical path method based project management application interface
US20120130907A1 (en) 2010-11-22 2012-05-24 Execution Software, LLC Project management system and method
US20120131445A1 (en) 2010-11-23 2012-05-24 International Business Machines Corporation Template-based content creation
JP5663599B2 (en) 2010-11-26 2015-02-04 Hitachi, Ltd. Medical support system and medical support method
US9094291B1 (en) 2010-12-14 2015-07-28 Symantec Corporation Partial risk score calculation for a data object
US9135158B2 (en) 2010-12-14 2015-09-15 Microsoft Technology Licensing, Llc Inheritance of growth patterns for derived tables
US8566328B2 (en) 2010-12-21 2013-10-22 Facebook, Inc. Prioritization and updating of contact information from multiple sources
WO2012086097A1 (en) 2010-12-21 2012-06-28 IPS Co., Ltd. Database, data-management server, and data-management program
US8738414B1 (en) 2010-12-31 2014-05-27 Ajay R. Nagar Method and system for handling program, project and asset scheduling management
US9361395B2 (en) * 2011-01-13 2016-06-07 Google Inc. System and method for providing offline access in a hosted document service
US9129234B2 (en) 2011-01-24 2015-09-08 Microsoft Technology Licensing, Llc Representation of people in a spreadsheet
US8484550B2 (en) 2011-01-27 2013-07-09 Microsoft Corporation Automated table transformations from examples
US8990048B2 (en) * 2011-02-09 2015-03-24 Ipcomm Adaptive ski bindings system
US8479089B2 (en) 2011-03-08 2013-07-02 Certusoft, Inc. Constructing and applying a constraint-choice-action matrix for decision making
JP2012191508A (en) * 2011-03-11 2012-10-04 Canon Inc System capable of handling code image and control method of the same
US9626348B2 (en) 2011-03-11 2017-04-18 Microsoft Technology Licensing, Llc Aggregating document annotations
US20130262574A1 (en) 2011-03-15 2013-10-03 Gabriel Cohen Inline User Addressing in Chat Sessions
JP5699010B2 (en) * 2011-03-18 2015-04-08 Toshiba Tec Corporation Image processing device
US20120234907A1 (en) * 2011-03-18 2012-09-20 Donald Jordan Clark System and process for managing hosting and redirecting the data output of a 2-D QR barcode
US20120244891A1 (en) * 2011-03-21 2012-09-27 Appleton Andrew B System and method for enabling a mobile chat session
US20120246170A1 (en) 2011-03-22 2012-09-27 Momentum Consulting Managing compliance of data integration implementations
US9007405B1 (en) 2011-03-28 2015-04-14 Amazon Technologies, Inc. Column zoom
CN102737033B (en) 2011-03-31 2015-02-04 International Business Machines Corporation Data processing equipment and data processing method thereof
US20120254770A1 (en) 2011-03-31 2012-10-04 Eyal Ophir Messaging interface
US20130059598A1 (en) * 2011-04-27 2013-03-07 F-Matic, Inc. Interactive computer software processes and apparatus for managing, tracking, reporting, providing feedback and tasking
US8645178B2 (en) 2011-04-28 2014-02-04 Accenture Global Services Limited Task management for a plurality of team members
EP2521066A1 (en) 2011-05-05 2012-11-07 Axiomatics AB Fine-grained relational database access-control policy enforcement using reverse queries
US9195965B2 (en) 2011-05-06 2015-11-24 David H. Sitrick Systems and methods providing collaborating among a plurality of users each at a respective computing appliance, and providing storage in respective data layers of respective user data, provided responsive to a respective user input, and utilizing event processing of event content stored in the data layers
US9330366B2 (en) 2011-05-06 2016-05-03 David H. Sitrick System and method for collaboration via team and role designation and control and management of annotations
US9384116B2 (en) 2011-05-16 2016-07-05 Vmware, Inc. Graphically representing load balance in a computing cluster
US8838533B2 (en) * 2011-05-20 2014-09-16 Microsoft Corporation Optimistic application of data edits
US9261373B2 (en) 2011-05-23 2016-02-16 Microsoft Technology Licensing, Llc Start-of-route map navigation with suppression of off-route feedback
US20120304098A1 (en) 2011-05-27 2012-11-29 Nokia Corporation Method and apparatus for providing detailed progress indicators
US9342579B2 (en) 2011-05-31 2016-05-17 International Business Machines Corporation Visual analysis of multidimensional clusters
US8689298B2 (en) 2011-05-31 2014-04-01 Red Hat, Inc. Resource-centric authorization schemes
US9323723B2 (en) 2011-06-17 2016-04-26 Microsoft Technology Licensing, Llc Reading ease of text on a device
US9071658B2 (en) 2011-07-12 2015-06-30 Salesforce.Com, Inc. Method and system for presenting a meeting in a cloud computing environment
US9195971B2 (en) 2011-07-12 2015-11-24 Salesforce.Com, Inc. Method and system for planning a meeting in a cloud computing environment
US9311288B2 (en) 2011-07-12 2016-04-12 Sony Corporation Electronic book reader
WO2013010177A2 (en) 2011-07-14 2013-01-17 Surfari Inc. Online groups interacting around common content
US8620703B1 (en) 2011-07-19 2013-12-31 Realization Technologies, Inc. Full-kit management in projects: checking full-kit compliance
US20130211866A1 (en) 2011-07-20 2013-08-15 Bank Of America Corporation Project checklist and table of changes for project management
US8713446B2 (en) 2011-07-21 2014-04-29 Sap Ag Personalized dashboard architecture for displaying data display applications
US20130036369A1 (en) 2011-08-02 2013-02-07 SquaredOut, Inc. Systems and methods for managing event-related information
US20120124749A1 (en) 2011-08-04 2012-05-24 Lewman Clyde Mcclain Meditation seating cushion
US8856246B2 (en) 2011-08-10 2014-10-07 Clarizen Ltd. System and method for project management system operation using electronic messaging
US9197427B2 (en) 2011-08-26 2015-11-24 Salesforce.Com, Inc. Methods and systems for screensharing
US8863022B2 (en) 2011-09-07 2014-10-14 Microsoft Corporation Process management views
US20130065216A1 (en) 2011-09-08 2013-03-14 Claudia Marcela Mendoza Tascon Real-Time Interactive Collaboration Board
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US8223172B1 (en) 2011-09-26 2012-07-17 Google Inc. Regional map zoom tables
US9244917B1 (en) 2011-09-30 2016-01-26 Google Inc. Generating a layout
US8990675B2 (en) 2011-10-04 2015-03-24 Microsoft Technology Licensing, Llc Automatic relationship detection for spreadsheet data items
US9123005B2 (en) 2011-10-11 2015-09-01 Mobiwork, Llc Method and system to define implement and enforce workflow of a mobile workforce
US9176933B2 (en) 2011-10-13 2015-11-03 Microsoft Technology Licensing, Llc Application of multiple content items and functionality to an electronic content item
CN103064833B (en) 2011-10-18 2016-03-16 Alibaba Group Holding Limited Method and system for cleaning up database historical data
US20130104035A1 (en) 2011-10-25 2013-04-25 Robert Wagner Gps tracking system and method employing public portal publishing location data
US9411797B2 (en) 2011-10-31 2016-08-09 Microsoft Technology Licensing, Llc Slicer elements for filtering tabular data
US9430458B2 (en) 2011-11-03 2016-08-30 Microsoft Technology Licensing, Llc List-based interactivity features as part of modifying list data and structure
US8990202B2 (en) 2011-11-03 2015-03-24 Corefiling S.A.R.L. Identifying and suggesting classifications for financial data according to a taxonomy
US20130159832A1 (en) 2011-12-12 2013-06-20 Black Point Technologies Llc Systems and methods for trading using an embedded spreadsheet engine and user interface
US9064220B2 (en) 2011-12-14 2015-06-23 Sap Se Linear visualization for overview, status display, and navigation along business scenario instances
US9159246B2 (en) 2012-01-06 2015-10-13 Raytheon Cyber Products, Llc Science, technology, engineering and mathematics based cyber security education system
US20130179209A1 (en) 2012-01-10 2013-07-11 Steven J. Milosevich Information management services
US11762684B2 (en) 2012-01-30 2023-09-19 Workfusion, Inc. Distributed task execution
US8856291B2 (en) 2012-02-14 2014-10-07 Amazon Technologies, Inc. Providing configurable workflow capabilities
JP2013168858A (en) 2012-02-16 2013-08-29 Fuji Xerox Co., Ltd. Image processing apparatus and program
US9286475B2 (en) 2012-02-21 2016-03-15 Xerox Corporation Systems and methods for enforcement of security profiles in multi-tenant database
US8892990B2 (en) 2012-03-07 2014-11-18 Ricoh Co., Ltd. Automatic creation of a table and query tools
US9280794B2 (en) 2012-03-19 2016-03-08 David W. Victor Providing access to documents in an online document sharing community
US8937627B1 (en) 2012-03-28 2015-01-20 Google Inc. Seamless vector map tiles across multiple zoom levels
US8738665B2 (en) 2012-04-02 2014-05-27 Apple Inc. Smart progress indicator
US20130268331A1 (en) 2012-04-10 2013-10-10 Sears Brands, Llc Methods and systems for providing online group shopping services
US20130297468A1 (en) 2012-04-13 2013-11-07 CreativeWork Corporation Systems and methods for tracking time
US9247306B2 (en) 2012-05-21 2016-01-26 Intellectual Ventures Fund 83 Llc Forming a multimedia product using video chat
CN103428073B (en) 2012-05-24 2015-06-17 Tencent Technology (Shenzhen) Company Limited User interface-based instant messaging method and apparatus
US20130317988A1 (en) 2012-05-28 2013-11-28 Ian A. R. Boyd Payment and account management system using pictooverlay technology
US9449312B1 (en) * 2012-06-01 2016-09-20 Dadesystems, Llp Systems and devices controlled responsive to data bearing records
US20130339051A1 (en) * 2012-06-18 2013-12-19 George M. Dobrean System and method for generating textual report content
US8924327B2 (en) 2012-06-28 2014-12-30 Nokia Corporation Method and apparatus for providing rapport management
US10235441B1 (en) 2012-06-29 2019-03-19 Open Text Corporation Methods and systems for multi-dimensional aggregation using composition
JP5942640B2 (en) * 2012-07-01 2016-06-29 Brother Industries, Ltd. Image processing apparatus and computer program
JP5983099B2 (en) * 2012-07-01 2016-08-31 Brother Industries, Ltd. Image processing apparatus and program
US20140012616A1 (en) 2012-07-04 2014-01-09 EHSolution.com Systems and methods for new location task completion and enterprise-wide project initiative tracking
US20140019842A1 (en) 2012-07-11 2014-01-16 Bank Of America Corporation Dynamic Pivot Table Creation and Modification
WO2014018630A1 (en) 2012-07-24 2014-01-30 Webroot Inc. System and method to provide automatic classification of phishing sites
US9794256B2 (en) 2012-07-30 2017-10-17 Box, Inc. System and method for advanced control tools for administrators in a cloud-based service
US8807434B1 (en) 2012-08-08 2014-08-19 Google Inc. Techniques for generating customized two-dimensional barcodes
US8988431B2 (en) 2012-08-08 2015-03-24 Umbra Software Ltd. Conservative cell and portal graph generation
US8631034B1 (en) 2012-08-13 2014-01-14 Aria Solutions Inc. High performance real-time relational database system and methods for using same
WO2014031618A2 (en) 2012-08-22 2014-02-27 Bitvore Corp. Data relationships storage platform
GB201215193D0 (en) 2012-08-25 2012-10-10 Dalp Daniel Order delivery system
US9152618B2 (en) 2012-08-31 2015-10-06 Microsoft Technology Licensing, Llc Cell view mode for outsized cells
US20140074545A1 (en) 2012-09-07 2014-03-13 Magnet Systems Inc. Human workflow aware recommendation engine
JP2014056319A (en) 2012-09-11 2014-03-27 Canon Inc Information processor, program, and control method
US9560091B2 (en) 2012-09-17 2017-01-31 Accenture Global Services Limited Action oriented social collaboration system
US11144854B1 (en) 2012-09-18 2021-10-12 Taskworld Holdings Pte. Ltd. Digital pinboard system
US20140095237A1 (en) 2012-10-02 2014-04-03 Stefan Ehrler Organizing and Managing Employee Information for a Manager
DE112013004915T8 (en) 2012-10-08 2015-07-23 Fisher-Rosemount Systems, Inc. Configurable user displays in a process control system
US20140101527A1 (en) 2012-10-10 2014-04-10 Dominic Dan Suciu Electronic Media Reader with a Conceptual Information Tagging and Retrieval System
US20140109012A1 (en) 2012-10-16 2014-04-17 Microsoft Corporation Thumbnail and document map based navigation in a document
US9576020B1 (en) 2012-10-18 2017-02-21 Proofpoint, Inc. Methods, systems, and computer program products for storing graph-oriented data on a column-oriented database
US8972883B2 (en) 2012-10-19 2015-03-03 Sap Se Method and device for display time and timescale reset
US9710944B2 (en) * 2012-10-22 2017-07-18 Apple Inc. Electronic document thinning
WO2014066635A1 (en) 2012-10-24 2014-05-01 Complete Genomics, Inc. Genome explorer system to process and present nucleotide variations in genome sequence data
US9400777B2 (en) 2012-11-02 2016-07-26 CRM Excel Template, LLC Management data processing system and method
US9875220B2 (en) 2012-11-09 2018-01-23 The Boeing Company Panoptic visualization document printing
US20140137144A1 (en) 2012-11-12 2014-05-15 Mikko Henrik Järvenpää System and method for measuring and analyzing audience reactions to video
US9117199B2 (en) 2012-11-13 2015-08-25 Sap Se Conversation graphical user interface (GUI)
CA2798022A1 (en) 2012-12-04 2014-06-04 Hugh Hull Worker self-management system and method
MY167769A (en) * 2012-12-07 2018-09-24 Malaysian Agricultural Research and Development Institute (MARDI) Method and System for Food Tracking and Food Ordering
EP2929430A1 (en) 2012-12-10 2015-10-14 Viditeck AG Rules based data processing system and method
US9935910B2 (en) 2012-12-21 2018-04-03 Google Llc Recipient location aware notifications in response to related posts
US20140181155A1 (en) 2012-12-21 2014-06-26 Dropbox, Inc. Systems and methods for directing imaged documents to specified storage locations
EP2750087A1 (en) 2012-12-28 2014-07-02 Exapaq Sas Methods and systems for determining estimated package delivery/pick-up times
US10554594B2 (en) 2013-01-10 2020-02-04 Vmware, Inc. Method and system for automatic switching between chat windows
US9239719B1 (en) 2013-01-23 2016-01-19 Amazon Technologies, Inc. Task management system
US9170993B2 (en) 2013-01-29 2015-10-27 Hewlett-Packard Development Company, L.P. Identifying tasks and commitments using natural language processing and machine learning
US9946691B2 (en) 2013-01-30 2018-04-17 Microsoft Technology Licensing, Llc Modifying a document with separately addressable content blocks
US20140229816A1 (en) 2013-02-08 2014-08-14 Syed Imtiaz Yakub Methods and devices for tagging a document
US20140306837A1 (en) 2013-02-13 2014-10-16 Veedims, Llc System and method for qualitative indication of cumulative wear status
US20140240735A1 (en) 2013-02-22 2014-08-28 Xerox Corporation Systems and methods for using a printer driver to create and apply barcodes
US9449031B2 (en) 2013-02-28 2016-09-20 Ricoh Company, Ltd. Sorting and filtering a table with image data and symbolic data in a single cell
US20140324895A1 (en) * 2013-03-01 2014-10-30 GoPop.TV, Inc. System and method for creating and maintaining a database of annotations corresponding to portions of a content item
JP5472504B1 (en) 2013-03-12 2014-04-16 Fuji Xerox Co., Ltd. Work flow creation support apparatus and method, and program
US20140278638A1 (en) 2013-03-12 2014-09-18 Springshot, Inc. Workforce productivity tool
US10372292B2 (en) 2013-03-13 2019-08-06 Microsoft Technology Licensing, Llc Semantic zoom-based navigation of displayed content
US9305170B1 (en) 2013-03-13 2016-04-05 Symantec Corporation Systems and methods for securely providing information external to documents
US20140280377A1 (en) 2013-03-14 2014-09-18 Scribestar Ltd. Systems and methods for collaborative document review
US9063631B2 (en) 2013-03-15 2015-06-23 Chad Dustin TILLMAN System and method for cooperative sharing of resources of an environment
US10803512B2 (en) 2013-03-15 2020-10-13 Commerce Signals, Inc. Graphical user interface for object discovery and mapping in open systems
US11727357B2 (en) 2019-07-31 2023-08-15 True Client Pro Data structures, graphical user interfaces, and computer-implemented processes for automation of project management
US20140281869A1 (en) 2013-03-15 2014-09-18 Susan Yob Variable size table templates, interactive size tables, distributable size tables, and related systems and methods
US8935272B2 (en) 2013-03-17 2015-01-13 Alation, Inc. Curated answers community automatically populated through user query monitoring
US9659058B2 (en) 2013-03-22 2017-05-23 X1 Discovery, Inc. Methods and systems for federation of results from search indexing
US10997556B2 (en) 2013-04-08 2021-05-04 Oracle International Corporation Summarizing tabular data across multiple projects using user-defined attributes
US9715476B2 (en) 2013-04-10 2017-07-25 Microsoft Technology Licensing, Llc Collaborative authoring with scratchpad functionality
US20140324501A1 (en) 2013-04-30 2014-10-30 The Glassbox Incorporated Method and system for automated template creation and rollup
US9015716B2 (en) 2013-04-30 2015-04-21 Splunk Inc. Proactive monitoring tree with node pinning for concurrent node comparisons
US9336502B2 (en) 2013-04-30 2016-05-10 Oracle International Corporation Showing relationships between tasks in a Gantt chart
US20140324497A1 (en) 2013-04-30 2014-10-30 Nitin Kumar Verma Tracking business processes and instances
US9361287B1 (en) 2013-05-22 2016-06-07 Google Inc. Non-collaborative filters in a collaborative document
US10346621B2 (en) 2013-05-23 2019-07-09 yTrre, Inc. End-to-end situation aware operations solution for customer experience centric businesses
US9251487B2 (en) 2013-06-06 2016-02-02 Safford T Black System and method for computing and overwriting the appearance of project tasks and milestones
US9253130B2 (en) 2013-06-12 2016-02-02 Cloudon Ltd Systems and methods for supporting social productivity using a dashboard
US20140372856A1 (en) 2013-06-14 2014-12-18 Microsoft Corporation Natural Quick Functions Gestures
US20140372932A1 (en) 2013-06-15 2014-12-18 Microsoft Corporation Filtering Data with Slicer-Style Filtering User Interface
US9026897B2 (en) 2013-07-12 2015-05-05 Logic9S, Llc Integrated, configurable, sensitivity, analytical, temporal, visual electronic plan system
US20150378542A1 (en) 2013-07-22 2015-12-31 Hitachi, Ltd. Management system for computer system
US20150032686A1 (en) 2013-07-23 2015-01-29 Salesforce.Com, Inc. Application sharing functionality in an information networking environment
US9360992B2 (en) 2013-07-29 2016-06-07 Microsoft Technology Licensing, Llc Three dimensional conditional formatting
JP6592877B2 (en) 2013-07-31 2019-10-23 Ricoh Company, Ltd. Printing apparatus, printing system, and printed matter manufacturing method
US20150046209A1 (en) 2013-08-09 2015-02-12 slipcal, PBC System and method for providing calendar services to users
WO2015025386A1 (en) 2013-08-21 2015-02-26 Hitachi, Ltd. Data processing system, data processing method, and data processing device
US9152695B2 (en) 2013-08-28 2015-10-06 Intelati, Inc. Generation of metadata and computational model for visual exploration system
US9658757B2 (en) 2013-09-04 2017-05-23 Tencent Technology (Shenzhen) Company Limited Method and device for managing progress indicator display
US9679456B2 (en) 2013-09-06 2017-06-13 Tracfind, Inc. System and method for tracking assets
US9635091B1 (en) 2013-09-09 2017-04-25 Chad Dustin TILLMAN User interaction with desktop environment
US10080060B2 (en) 2013-09-10 2018-09-18 Opentv, Inc. Systems and methods of displaying content
US20150074728A1 (en) 2013-09-10 2015-03-12 Opentv, Inc. Systems and methods of displaying content
AU2014319964B2 (en) 2013-09-12 2019-01-17 Wix.Com Ltd. System and method for automated conversion of interactive sites and applications to support mobile and other display environments
US9128972B2 (en) 2013-09-21 2015-09-08 Oracle International Corporation Multi-version concurrency control on in-memory snapshot store of oracle in-memory database
CA2924826A1 (en) 2013-09-27 2015-04-02 Ab Initio Technology Llc Evaluating rules applied to data
US20150106736A1 (en) 2013-10-15 2015-04-16 Salesforce.Com, Inc. Role-based presentation of user interface
US9798829B1 (en) 2013-10-22 2017-10-24 Google Inc. Data graph interface
US10282406B2 (en) 2013-10-31 2019-05-07 Nicolas Bissantz System for modifying a table
US10067928B1 (en) 2013-11-06 2018-09-04 Apttex Corporation. Creating a spreadsheet template for generating an end user spreadsheet with dynamic cell dimensions retrieved from a remote application
US20150142676A1 (en) 2013-11-13 2015-05-21 Tweddle Group Systems and methods for managing authored content generation, approval, and distribution
US10327712B2 (en) 2013-11-16 2019-06-25 International Business Machines Corporation Prediction of diseases based on analysis of medical exam and/or test workflow
EP2874073A1 (en) 2013-11-18 2015-05-20 Fujitsu Limited System, apparatus, program and method for data aggregation
US9674042B2 (en) 2013-11-25 2017-06-06 Amazon Technologies, Inc. Centralized resource usage visualization service for large-scale network topologies
US10380239B2 (en) 2013-12-03 2019-08-13 Sharethrough Inc. Dynamic native advertisment insertion
JP6298079B2 (en) 2013-12-16 2018-03-20 Rakuten, Inc. Visit management system, program, and visit management method
US20150169531A1 (en) 2013-12-17 2015-06-18 Microsoft Corporation Touch/Gesture-Enabled Interaction with Electronic Spreadsheets
US9742827B2 (en) * 2014-01-02 2017-08-22 Alcatel Lucent Rendering rated media content on client devices using packet-level ratings
US9633074B1 (en) 2014-01-03 2017-04-25 Amazon Technologies, Inc. Querying data set tables in a non-transactional database
US20170200122A1 (en) 2014-01-10 2017-07-13 Kuhoo G. Edson Information organization, management, and processing system and methods
US10747950B2 (en) 2014-01-30 2020-08-18 Microsoft Technology Licensing, Llc Automatic insights for spreadsheets
US20150212717A1 (en) 2014-01-30 2015-07-30 Honeywell International Inc. Dashboard and control point configurators
US10534844B2 (en) 2014-02-03 2020-01-14 Oracle International Corporation Systems and methods for viewing and editing composite documents
US10831356B2 (en) 2014-02-10 2020-11-10 International Business Machines Corporation Controlling visualization of data by a dashboard widget
US10360642B2 (en) * 2014-02-18 2019-07-23 Google Llc Global comments for a media item
WO2015127404A1 (en) 2014-02-24 2015-08-27 Microsoft Technology Licensing, Llc Unified presentation of contextually connected information to improve user efficiency and interaction performance
US9380342B2 (en) 2014-02-28 2016-06-28 Rovi Guides, Inc. Systems and methods for control of media access based on crowd-sourced access control data and user-attributes
US9727376B1 (en) 2014-03-04 2017-08-08 Palantir Technologies, Inc. Mobile tasks
US10587714B1 (en) * 2014-03-12 2020-03-10 Amazon Technologies, Inc. Method for aggregating distributed data
US9519699B1 (en) * 2014-03-12 2016-12-13 Amazon Technologies, Inc. Consistency of query results in a distributed system
US10769122B2 (en) 2014-03-13 2020-09-08 Ab Initio Technology Llc Specifying and applying logical validation rules to data
US10573407B2 (en) 2014-03-21 2020-02-25 Leonard Ginsburg Medical services tracking server system and method
US20150281292A1 (en) 2014-03-25 2015-10-01 PlusAmp, Inc. Data File Discovery, Visualization, and Actioning
US9413707B2 (en) 2014-04-11 2016-08-09 ACR Development, Inc. Automated user task management
US9576070B2 (en) 2014-04-23 2017-02-21 Akamai Technologies, Inc. Creation and delivery of pre-rendered web pages for accelerated browsing
US10025804B2 (en) 2014-05-04 2018-07-17 Veritas Technologies Llc Systems and methods for aggregating information-asset metadata from multiple disparate data-management systems
US9710430B2 (en) 2014-05-09 2017-07-18 Sap Se Representation of datasets using view-specific visual bundlers
US10318625B2 (en) 2014-05-13 2019-06-11 International Business Machines Corporation Table narration using narration templates
AU2015258733B2 (en) * 2014-05-14 2020-03-12 Pagecloud Inc. Methods and systems for web content generation
US9977654B2 (en) 2014-06-20 2018-05-22 Asset, S.r.L. Method of developing an application for execution in a workflow management system and apparatus to assist with generation of an application for execution in a workflow management system
US20150370462A1 (en) 2014-06-20 2015-12-24 Microsoft Corporation Creating calendar event from timeline
US9874995B2 (en) 2014-06-25 2018-01-23 Oracle International Corporation Maintaining context for maximize interactions on grid-based visualizations
US9569418B2 (en) 2014-06-27 2017-02-14 International Business Machines Corporation Stream-enabled spreadsheet as a circuit
US9442714B2 (en) 2014-06-28 2016-09-13 Vmware, Inc. Unified visualization of a plan of operations in a datacenter
WO2016004138A2 (en) 2014-06-30 2016-01-07 Shaaban Ahmed Farouk Improved system and method for budgeting and cash flow forecasting
US10606855B2 (en) 2014-07-10 2020-03-31 Oracle International Corporation Embedding analytics within transaction search
US10585892B2 (en) 2014-07-10 2020-03-10 Oracle International Corporation Hierarchical dimension analysis in multi-dimensional pivot grids
US10928970B2 (en) 2014-07-18 2021-02-23 Apple Inc. User-interface for developing applications that apply machine learning
US9760271B2 (en) 2014-07-28 2017-09-12 International Business Machines Corporation Client-side dynamic control of visualization of frozen region in a data table
US9846687B2 (en) 2014-07-28 2017-12-19 Adp, Llc Word cloud candidate management system
US10151025B2 (en) 2014-07-31 2018-12-11 Seagate Technology Llc Helmholtz coil assisted PECVD carbon source
US9613086B1 (en) 2014-08-15 2017-04-04 Tableau Software, Inc. Graphical user interface for generating and displaying data visualizations that use relationships
US9779147B1 (en) 2014-08-15 2017-10-03 Tableau Software, Inc. Systems and methods to query and visualize data and relationships
US9524429B2 (en) 2014-08-21 2016-12-20 Microsoft Technology Licensing, Llc Enhanced interpretation of character arrangements
EP2988231A1 (en) 2014-08-21 2016-02-24 Samsung Electronics Co., Ltd. Method and apparatus for providing summarized content to users
US20160063435A1 (en) 2014-08-27 2016-03-03 Inam Shah Systems and methods for facilitating secure ordering, payment and delivery of goods or services
US9424333B1 (en) 2014-09-05 2016-08-23 Addepar, Inc. Systems and user interfaces for dynamic and interactive report generation and editing based on automatic traversal of complex data structures
KR20160029985A (en) 2014-09-05 2016-03-16 Sungkyunkwan University Research & Business Foundation A method for generating plasma uniformly on dielectric material
US9872174B2 (en) 2014-09-19 2018-01-16 Google Inc. Transferring application data between devices
US10210246B2 (en) 2014-09-26 2019-02-19 Oracle International Corporation Techniques for similarity analysis and data enrichment using knowledge sources
US10261673B2 (en) 2014-10-05 2019-04-16 Splunk Inc. Statistics value chart interface cell mode drill down
US20160098574A1 (en) 2014-10-07 2016-04-07 Cynny Spa Systems and methods to manage file access
US10505825B1 (en) 2014-10-09 2019-12-10 Splunk Inc. Automatic creation of related event groups for IT service monitoring
US10467337B2 (en) 2014-10-27 2019-11-05 Kinaxis Inc. Responsive data exploration on small screen devices
US10410297B2 (en) 2014-11-03 2019-09-10 PJS of Texas Inc. Devices, systems, and methods of activity-based monitoring and incentivization
ES2949399T3 (en) 2014-11-12 2023-09-28 Calctopia Ltd Secure multiparty computing in spreadsheets
WO2016115130A1 (en) 2015-01-15 2016-07-21 Servicenow, Inc. Related table notifications
US9424545B1 (en) 2015-01-15 2016-08-23 Hito Management Company Geospatial construction task management system and method
US10061824B2 (en) 2015-01-30 2018-08-28 Splunk Inc. Cell-based table manipulation of event data
US9183303B1 (en) 2015-01-30 2015-11-10 Dropbox, Inc. Personal content item searching system and method
CN105991398A (en) 2015-02-04 2016-10-05 Alibaba Group Holding Limited Instant message IM chatting records storage method and apparatus
US20160224939A1 (en) 2015-02-04 2016-08-04 Broadvision, Inc. Systems and methods for managing tasks
US11238397B2 (en) 2015-02-09 2022-02-01 Fedex Corporate Services, Inc. Methods, apparatus, and systems for generating a corrective pickup notification for a shipped item using a mobile master node
US10733256B2 (en) 2015-02-10 2020-08-04 Researchgate Gmbh Online publication system and method
US20160231915A1 (en) 2015-02-10 2016-08-11 Microsoft Technology Licensing, Llc. Real-time presentation of customizable drill-down views of data at specific data points
US20160246490A1 (en) 2015-02-25 2016-08-25 Bank Of America Corporation Customizable Dashboard
US10229655B2 (en) 2015-02-28 2019-03-12 Microsoft Technology Licensing, Llc Contextual zoom
US20170061820A1 (en) 2015-03-01 2017-03-02 Babak Firoozbakhsh Goal based monetary reward system
US20160259856A1 (en) 2015-03-03 2016-09-08 International Business Machines Corporation Consolidating and formatting search results
US9928281B2 (en) 2015-03-20 2018-03-27 International Business Machines Corporation Lightweight table comparison
EP3276570A4 (en) 2015-03-27 2018-11-07 Hitachi, Ltd. Computer system and information processing method
US10719220B2 (en) 2015-03-31 2020-07-21 Autodesk, Inc. Dynamic scrolling
US10691323B2 (en) 2015-04-10 2020-06-23 Apple Inc. Column fit document traversal for reader application
US10503836B2 (en) 2015-04-13 2019-12-10 Equivalentor Oy Method for generating natural language communication
US10546001B1 (en) 2015-04-15 2020-01-28 Arimo, LLC Natural language queries based on user defined attributes
US10277672B2 (en) 2015-04-17 2019-04-30 Zuora, Inc. System and method for real-time cloud data synchronization using a database binary log
US20160323224A1 (en) 2015-04-28 2016-11-03 SmartSheet.com, Inc. Systems and methods for providing an email client interface with an integrated tabular data management interface
US10831449B2 (en) 2015-04-28 2020-11-10 Lexica S.A.S. Process and system for automatic generation of functional architecture documents and software design and analysis specification documents from natural language
US10867269B2 (en) 2015-04-29 2020-12-15 NetSuite Inc. System and methods for processing information regarding relationships and interactions to assist in making organizational decisions
WO2016179434A1 (en) 2015-05-05 2016-11-10 Dart Neuroscience, Llc Systems and methods for cognitive testing
US20160335604A1 (en) 2015-05-13 2016-11-17 SJ MedConnect, Inc. Multi-program scheduling platform with sharing
WO2016183550A1 (en) 2015-05-14 2016-11-17 Walleye Software, LLC Dynamic table index mapping
US10282424B2 (en) 2015-05-19 2019-05-07 Researchgate Gmbh Linking documents using citations
US10354419B2 (en) 2015-05-25 2019-07-16 Colin Frederick Ritchie Methods and systems for dynamic graph generating
US10051020B2 (en) * 2015-06-26 2018-08-14 Microsoft Technology Licensing, Llc Real-time distributed coauthoring via vector clock translations
US10169552B2 (en) 2015-07-17 2019-01-01 Box, Inc. Event-driven generation of watermarked previews of an object in a collaboration environment
US10366083B2 (en) 2015-07-29 2019-07-30 Oracle International Corporation Materializing internal computations in-memory to improve query performance
US10033702B2 (en) * 2015-08-05 2018-07-24 Intralinks, Inc. Systems and methods of secure data exchange
US10489391B1 (en) 2015-08-17 2019-11-26 Palantir Technologies Inc. Systems and methods for grouping and enriching data items accessed from one or more databases for presentation in a user interface
US10140314B2 (en) 2015-08-21 2018-11-27 Adobe Systems Incorporated Previews for contextual searches
US10380528B2 (en) 2015-08-27 2019-08-13 Jpmorgan Chase Bank, N.A. Interactive approach for managing risk and transparency
US20170060609A1 (en) 2015-08-28 2017-03-02 International Business Machines Corporation Managing a shared pool of configurable computing resources which has a set of containers
US20170061360A1 (en) 2015-09-01 2017-03-02 SmartSheet.com, Inc. Interactive charts with dynamic progress monitoring, notification, and resource allocation
US10558349B2 (en) 2015-09-15 2020-02-11 Medidata Solutions, Inc. Functional scrollbar and system
US10120552B2 (en) * 2015-09-25 2018-11-06 International Business Machines Corporation Annotating collaborative content to facilitate mining key content as a runbook
US10205730B2 (en) 2015-09-29 2019-02-12 International Business Machines Corporation Access control for database
US20170109499A1 (en) 2015-10-19 2017-04-20 Rajiv Doshi Disease management systems comprising dietary supplements
JP6398944B2 (en) 2015-10-28 2018-10-03 Omron Corporation Data distribution management system
US11157689B2 (en) 2015-11-02 2021-10-26 Microsoft Technology Licensing, Llc Operations on dynamic data associated with cells in spreadsheets
US10599764B2 (en) 2015-11-02 2020-03-24 Microsoft Technology Licensing, Llc Operations on images associated with cells in spreadsheets
US10540435B2 (en) 2015-11-02 2020-01-21 Microsoft Technology Licensing, Llc Decks, cards, and mobile UI
US10255335B2 (en) 2015-11-06 2019-04-09 Cloudera, Inc. Database workload analysis and optimization visualizations
US11443390B1 (en) 2015-11-06 2022-09-13 Addepar, Inc. Systems and user interfaces for dynamic and interactive table generation and editing based on automatic traversal of complex data structures and incorporation of metadata mapped to the complex data structures
CA3004610A1 (en) 2015-11-09 2017-05-18 Nexwriter Limited Collaborative document creation by a plurality of distinct teams
US20170132652A1 (en) 2015-11-11 2017-05-11 Mastercard International Incorporated Systems and Methods for Processing Loyalty Rewards
US20170139891A1 (en) 2015-11-13 2017-05-18 Sap Se Shared elements for business information documents
US10366114B2 (en) 2015-11-15 2019-07-30 Microsoft Technology Licensing, Llc Providing data presentation functionality associated with collaboration database
AU2016356737A1 (en) 2015-11-20 2018-02-15 Wisetech Global Limited Systems and methods of a production environment tool
US10380140B2 (en) 2015-11-30 2019-08-13 Tableau Software, Inc. Systems and methods for implementing a virtual machine for interactive visual analysis
US10503360B2 (en) 2015-11-30 2019-12-10 Unisys Corporation System and method for adaptive control and annotation interface
US10089288B2 (en) 2015-12-04 2018-10-02 Ca, Inc. Annotations management for electronic documents handling
US10055444B2 (en) 2015-12-16 2018-08-21 American Express Travel Related Services Company, Inc. Systems and methods for access control over changing big data structures
US10185707B2 (en) 2015-12-16 2019-01-22 Microsoft Technology Licensing, Llc Aggregate visualizations of activities performed with respect to portions of electronic documents
WO2017112713A1 (en) 2015-12-21 2017-06-29 University Of Utah Research Foundation Method for approximate processing of complex join queries
US10977435B2 (en) 2015-12-28 2021-04-13 Informatica Llc Method, apparatus, and computer-readable medium for visualizing relationships between pairs of columns
US10089289B2 (en) * 2015-12-29 2018-10-02 Palantir Technologies Inc. Real-time document annotation
WO2017124024A1 (en) 2016-01-14 2017-07-20 Sumo Logic Single click delta analysis
US10068100B2 (en) 2016-01-20 2018-09-04 Microsoft Technology Licensing, Llc Painting content classifications onto document portions
US20170212924A1 (en) * 2016-01-21 2017-07-27 Salesforce.Com, Inc. Configurable database platform for updating objects
US10068104B2 (en) * 2016-01-29 2018-09-04 Microsoft Technology Licensing, Llc Conditional redaction of portions of electronic documents
US10558679B2 (en) 2016-02-10 2020-02-11 Fuji Xerox Co., Ltd. Systems and methods for presenting a topic-centric visualization of collaboration data
US10068617B2 (en) 2016-02-10 2018-09-04 Microsoft Technology Licensing, Llc Adding content to a media timeline
US10347017B2 (en) 2016-02-12 2019-07-09 Microsoft Technology Licensing, Llc Interactive controls that are collapsible and expandable and sequences for chart visualization optimizations
US10748312B2 (en) 2016-02-12 2020-08-18 Microsoft Technology Licensing, Llc Tagging utilizations for selectively preserving chart elements during visualization optimizations
US10430451B2 (en) 2016-02-22 2019-10-01 Arie Rota System and method for aggregating and sharing accumulated information
US10540434B2 (en) 2016-03-01 2020-01-21 Business Objects Software Limited Dynamic disaggregation and aggregation of spreadsheet data
US10148849B2 (en) * 2016-03-07 2018-12-04 Kyocera Document Solutions Inc. Systems and methods for printing a document using a graphical code image
US9792567B2 (en) 2016-03-11 2017-10-17 Route4Me, Inc. Methods and systems for managing large asset fleets through a virtual reality interface
US11748709B2 (en) 2016-03-14 2023-09-05 Project Map Ltd. Systems and programs for project portfolio management
US10127945B2 (en) 2016-03-15 2018-11-13 Google Llc Visualization of image themes based on image content
US10229099B2 (en) 2016-03-22 2019-03-12 Business Objects Software Limited Shared and private annotation of content from a collaboration session
CN109074619B (en) 2016-03-23 2022-05-24 福特全球技术公司 Enhanced cargo transport system
US10146665B2 (en) 2016-03-24 2018-12-04 Oracle International Corporation Systems and methods for providing dynamic and real time simulations of matching resources to requests
CN107241622A (en) 2016-03-29 2017-10-10 Beijing Samsung Telecommunications Technology Research Co., Ltd. Video location processing method, terminal device and cloud server
US20170285890A1 (en) 2016-03-30 2017-10-05 Microsoft Technology Licensing, Llc Contextual actions from collaboration features
US10733546B2 (en) 2016-03-30 2020-08-04 Experian Health, Inc. Automated user interface generation for process tracking
US11030259B2 (en) 2016-04-13 2021-06-08 Microsoft Technology Licensing, Llc Document searching visualized within a document
FI20165327A (en) 2016-04-15 2017-10-16 Copla Oy Document automation
WO2017189933A1 (en) 2016-04-27 2017-11-02 Krypton Project, Inc. System, method, and apparatus for operating a unified document surface workspace
US10635746B2 (en) * 2016-04-29 2020-04-28 Microsoft Technology Licensing, Llc Web-based embeddable collaborative workspace
US9532004B1 (en) 2016-05-12 2016-12-27 Google Inc. Animated user identifiers
US10353534B2 (en) 2016-05-13 2019-07-16 Sap Se Overview page in multi application user interface
EP3246771B1 (en) 2016-05-17 2021-06-30 Siemens Aktiengesellschaft Method for operating a redundant automation system
CN105871466B (en) 2016-05-25 2021-10-29 Global Energy Interconnection Research Institute Wide-area stable communication device and method with intelligent identification function
US9720602B1 (en) 2016-06-01 2017-08-01 International Business Machines Corporation Data transfers in columnar data systems
US10095747B1 (en) * 2016-06-06 2018-10-09 @Legal Discovery LLC Similar document identification using artificial intelligence
US10747774B2 (en) 2016-06-19 2020-08-18 Data.World, Inc. Interactive interfaces to present data arrangement overviews and summarized dataset attributes for collaborative datasets
US11036716B2 (en) 2016-06-19 2021-06-15 Data World, Inc. Layered data generation and data remediation to facilitate formation of interrelated data in a system of networked collaborative datasets
CA2971784A1 (en) 2016-06-23 2017-12-23 Radicalogic Technologies, Inc. Healthcare workflow system
US9942418B2 (en) * 2016-06-28 2018-04-10 Kyocera Document Solutions Inc. Methods for configuring settings for an image forming apparatus with template sheet
US9817806B1 (en) 2016-06-28 2017-11-14 International Business Machines Corporation Entity-based content change management within a document content management system
US10445702B1 (en) 2016-06-30 2019-10-15 John E. Hunt Personal adaptive scheduling system and associated methods
US20180025084A1 (en) 2016-07-19 2018-01-25 Microsoft Technology Licensing, Llc Automatic recommendations for content collaboration
US10554644B2 (en) 2016-07-20 2020-02-04 Fisher-Rosemount Systems, Inc. Two-factor authentication for user interface devices in a process plant
US10324609B2 (en) 2016-07-21 2019-06-18 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US10558651B2 (en) 2016-07-27 2020-02-11 Splunk Inc. Search point management
US10776569B2 (en) 2016-07-29 2020-09-15 International Business Machines Corporation Generation of annotated computerized visualizations with explanations for areas of interest
US10564622B1 (en) 2016-07-31 2020-02-18 Splunk Inc. Control interface for metric definition specification for assets and asset groups driven by search-derived asset tree hierarchy
US10459938B1 (en) 2016-07-31 2019-10-29 Splunk Inc. Punchcard chart visualization for machine data search and analysis system
US9753935B1 (en) 2016-08-02 2017-09-05 Palantir Technologies Inc. Time-series data storage and processing database system
WO2018023798A1 (en) * 2016-08-05 2018-02-08 Wang Zhiqiang Method for collecting dish praises on basis of QR code, and comment system
US10261747B2 (en) 2016-09-09 2019-04-16 The Boeing Company Synchronized side-by-side display of live video and corresponding virtual environment images
US10565222B2 (en) 2016-09-15 2020-02-18 Oracle International Corporation Techniques for facilitating the joining of datasets
US10650000B2 (en) 2016-09-15 2020-05-12 Oracle International Corporation Techniques for relationship discovery between datasets
US10831983B2 (en) 2016-09-16 2020-11-10 Oracle International Corporation Techniques for managing display of headers in an electronic document
US10496741B2 (en) 2016-09-21 2019-12-03 FinancialForce.com, Inc. Dynamic intermediate templates for richly formatted output
US10318348B2 (en) 2016-09-23 2019-06-11 Imagination Technologies Limited Task scheduling in a GPU
US10489424B2 (en) 2016-09-26 2019-11-26 Amazon Technologies, Inc. Different hierarchies of resource data objects for managing system resources
US10540152B1 (en) 2016-09-23 2020-01-21 Massachusetts Mutual Life Insurance Company Systems, devices, and methods for software coding
US10747764B1 (en) 2016-09-28 2020-08-18 Amazon Technologies, Inc. Index-based replica scale-out
US11093703B2 (en) 2016-09-29 2021-08-17 Google Llc Generating charts from data in a data table
US20180095938A1 (en) 2016-09-30 2018-04-05 Sap Se Synchronized calendar and timeline adaptive user interface
US11068125B2 (en) 2016-10-27 2021-07-20 Google Llc Multi-spatial overview mode
US10043296B2 (en) 2016-10-27 2018-08-07 Sap Se Visual relationship between table values
US10991033B2 (en) 2016-10-28 2021-04-27 International Business Machines Corporation Optimization of delivery to a recipient in a moving vehicle
US10242079B2 (en) 2016-11-07 2019-03-26 Tableau Software, Inc. Optimizing execution of data transformation flows
US10107641B2 (en) 2016-11-08 2018-10-23 Google Llc Linear visualization of a driving route
US10540153B2 (en) 2016-12-03 2020-01-21 Thomas STACHURA Spreadsheet-based software application development
US10216494B2 (en) 2016-12-03 2019-02-26 Thomas STACHURA Spreadsheet-based software application development
US10650050B2 (en) 2016-12-06 2020-05-12 Microsoft Technology Licensing, Llc Synthesizing mapping relationships using table corpus
US10528599B1 (en) 2016-12-16 2020-01-07 Amazon Technologies, Inc. Tiered data processing for distributed data
CN110663040B (en) 2016-12-21 2023-08-22 Aon Global Operations Ltd., Singapore Branch Method and system for securely embedding dashboard into content management system
JP6764779B2 (en) 2016-12-26 2020-10-07 Hitachi, Ltd. Synonymous column candidate selection device, synonymous column candidate selection method, and synonymous column candidate selection program
US20180181716A1 (en) 2016-12-27 2018-06-28 General Electric Company Role-based navigation interface systems and methods
CN106646641A (en) 2016-12-29 2017-05-10 Shanghai Ruishi Electronic Technology Co., Ltd. Detection method and detection system based on multiple detectors
US10719807B2 (en) 2016-12-29 2020-07-21 Dropbox, Inc. Managing projects using references
US10496737B1 (en) 2017-01-05 2019-12-03 Massachusetts Mutual Life Insurance Company Systems, devices, and methods for software coding
US20180225270A1 (en) 2017-02-06 2018-08-09 International Business Machines Corporation Processing user action in data integration tools
US20200005295A1 (en) 2017-02-10 2020-01-02 Jean Louis Murphy Secure location based electronic financial transaction methods and systems
EP3605363A4 (en) 2017-03-30 2020-02-26 Nec Corporation Information processing system, feature value explanation method and feature value explanation program
US20180285918A1 (en) * 2017-03-31 2018-10-04 Tyler Staggs Advertising incentives
US10372810B2 (en) 2017-04-05 2019-08-06 Microsoft Technology Licensing, Llc Smarter copy/paste
WO2018187815A1 (en) 2017-04-07 2018-10-11 Relola, Inc. System and method of collecting and providing service provider records
CN107123424B (en) 2017-04-27 2022-03-11 Tencent Technology (Shenzhen) Co., Ltd. Audio file processing method and device
US10437795B2 (en) 2017-05-12 2019-10-08 Sap Se Upgrading systems with changing constraints
US20180330320A1 (en) 2017-05-12 2018-11-15 Mastercard International Incorporated Method and system for real-time update, tracking, and notification of package delivery
US10846285B2 (en) 2017-06-02 2020-11-24 Chaossearch, Inc. Materialization for data edge platform
US10650033B2 (en) 2017-06-08 2020-05-12 Microsoft Technology Licensing, Llc Calendar user interface search and interactivity features
US10348658B2 (en) 2017-06-15 2019-07-09 Google Llc Suggested items for use with embedded applications in chat conversations
US10534917B2 (en) 2017-06-20 2020-01-14 Xm Cyber Ltd. Testing for risk of macro vulnerability
US11635908B2 (en) 2017-06-22 2023-04-25 Adobe Inc. Managing digital assets stored as components and packaged files
US10713246B2 (en) 2017-06-22 2020-07-14 Sap Se Column based data access controls
US20190012342A1 (en) 2017-07-10 2019-01-10 Kaspect Labs Llc Method and apparatus for continuously producing analytical reports
US10628002B1 (en) 2017-07-10 2020-04-21 Palantir Technologies Inc. Integrated data authentication system with an interactive user interface
US11106862B2 (en) * 2017-07-28 2021-08-31 Cisco Technology, Inc. Combining modalities for collaborating while editing and annotating files
US11122094B2 (en) 2017-07-28 2021-09-14 Snap Inc. Software application manager for messaging applications
US20190050812A1 (en) 2017-08-09 2019-02-14 Mario Boileau Project management and activity tracking methods and systems
US10845976B2 (en) 2017-08-21 2020-11-24 Immersive Systems Inc. Systems and methods for representing data, media, and time using spatial levels of detail in 2D and 3D digital applications
US10609140B2 (en) 2017-08-28 2020-03-31 Salesforce.Com, Inc. Dynamic resource management systems and methods
JP6939285B2 (en) 2017-09-05 2021-09-22 Brother Industries, Ltd. Data processing programs and data processing equipment
CN107885656B (en) 2017-09-13 2021-02-09 Ping An Technology (Shenzhen) Co., Ltd. Automatic product algorithm testing method and application server
CN107623596A (en) 2017-09-15 2018-01-23 Zhengzhou Yunhai Information Technology Co., Ltd. Start the method for testing network element positioning investigation failure in a kind of NFV platforms
US10693758B2 (en) 2017-09-25 2020-06-23 Splunk Inc. Collaborative incident management for networked computing systems
US11138371B2 (en) 2017-09-28 2021-10-05 Oracle International Corporation Editable table in a spreadsheet integrated with a web service
GB201716305D0 (en) 2017-10-05 2017-11-22 Palantir Technologies Inc Dashboard creation and management
US20190114589A1 (en) 2017-10-16 2019-04-18 RightSource Compliance Housing assistance application audit management system and method
US10409895B2 (en) 2017-10-17 2019-09-10 Qualtrics, Llc Optimizing a document based on dynamically updating content
US11341321B2 (en) 2017-10-20 2022-05-24 Uxstorm, Llc UI enabling mapping engine system and process interconnecting spreadsheets and database-driven applications
US10979235B2 (en) 2017-10-20 2021-04-13 Dropbox, Inc. Content management system supporting third-party code
US10380772B2 (en) 2017-10-30 2019-08-13 Safford T Black System and method for non-linear and discontinuous project timelines
US11741300B2 (en) 2017-11-03 2023-08-29 Dropbox, Inc. Embedded spreadsheet data implementation and synchronization
US11645321B2 (en) 2017-11-03 2023-05-09 Salesforce, Inc. Calculating relationship strength using an activity-based distributed graph
US10282405B1 (en) 2017-11-03 2019-05-07 Dropbox, Inc. Task management in a collaborative spreadsheet environment
US11157149B2 (en) 2017-12-08 2021-10-26 Google Llc Managing comments in a cloud-based environment
US10705805B1 (en) 2017-12-12 2020-07-07 Amazon Technologies, Inc. Application authoring using web-of-sheets data model
US10397403B2 (en) 2017-12-28 2019-08-27 Ringcentral, Inc. System and method for managing events at contact center
US11263592B2 (en) 2018-01-07 2022-03-01 Microsoft Technology Licensing, Llc Multi-calendar harmonization
US11004165B2 (en) 2018-01-12 2021-05-11 ClearstoneIP, Inc. Management systems and methods for claim-based patent analysis
US10534527B2 (en) 2018-01-12 2020-01-14 Wacom Co., Ltd. Relative pen scroll
US20190236188A1 (en) 2018-01-31 2019-08-01 Salesforce.Com, Inc. Query optimizer constraints
US11003832B2 (en) 2018-02-07 2021-05-11 Microsoft Technology Licensing, Llc Embedded action card in editable electronic document
US20190251884A1 (en) 2018-02-14 2019-08-15 Microsoft Technology Licensing, Llc Shared content display with concurrent views
US10664650B2 (en) 2018-02-21 2020-05-26 Microsoft Technology Licensing, Llc Slide tagging and filtering
US10496382B2 (en) 2018-02-22 2019-12-03 Midea Group Co., Ltd. Machine generation of context-free grammar for intent deduction
US10789387B2 (en) 2018-03-13 2020-09-29 Commvault Systems, Inc. Graphical representation of an information management system
US10819560B2 (en) 2018-03-29 2020-10-27 Servicenow, Inc. Alert management system and method of using alert context-based alert rules
US10810075B2 (en) 2018-04-23 2020-10-20 EMC IP Holding Company Generating a social graph from file metadata
US10970471B2 (en) 2018-04-23 2021-04-06 International Business Machines Corporation Phased collaborative editing
US20190340550A1 (en) 2018-05-07 2019-11-07 Walmart Apollo, Llc Customized labor demand allocation system
CN108717428A (en) * 2018-05-09 2018-10-30 Cen Zhijin A kind of Commentary Systems based on Quick Response Code
US10565229B2 (en) 2018-05-24 2020-02-18 People.ai, Inc. Systems and methods for matching electronic activities directly to record objects of systems of record
US11132501B2 (en) 2018-05-25 2021-09-28 Salesforce.Com, Inc. Smart column selection for table operations in data preparation
US20190371442A1 (en) 2018-05-31 2019-12-05 Allscripts Software, Llc Apparatus, system and method for secure processing and transmission of data
US10650100B2 (en) 2018-06-08 2020-05-12 International Business Machines Corporation Natural language generation pattern enhancement
US11226721B2 (en) 2018-06-25 2022-01-18 Lineage Logistics, LLC Measuring and visualizing facility performance
US20200005248A1 (en) 2018-06-29 2020-01-02 Microsoft Technology Licensing, Llc Meeting preparation manager
US11698890B2 (en) 2018-07-04 2023-07-11 Monday.com Ltd. System and method for generating a column-oriented data structure repository for columns of single data types
US20200019595A1 (en) 2018-07-12 2020-01-16 Giovanni Azua Garcia System and method for graphical vector representation of a resume
US11810071B2 (en) 2018-07-12 2023-11-07 Lindy Property Management Company Property management system and related methods
WO2020018592A1 (en) 2018-07-17 2020-01-23 Methodical Mind, Llc. Graphical user interface system
US11360558B2 (en) 2018-07-17 2022-06-14 Apple Inc. Computer systems with finger devices
US11281732B2 (en) 2018-08-02 2022-03-22 Microsoft Technology Licensing, Llc Recommending development tool extensions based on media type
US11115486B2 (en) 2018-08-08 2021-09-07 Microsoft Technology Licensing, Llc Data re-use across documents
US11386112B2 (en) 2018-08-08 2022-07-12 Microsoft Technology Licensing, Llc Visualization platform for reusable data chunks
US11163777B2 (en) 2018-10-18 2021-11-02 Oracle International Corporation Smart content recommendations for content authors
US11966406B2 (en) 2018-10-22 2024-04-23 Tableau Software, Inc. Utilizing appropriate measure aggregation for generating data visualizations of multi-fact datasets
US10809991B2 (en) 2018-10-26 2020-10-20 Salesforce.Com, Inc. Security model for live applications in a cloud collaboration platform
US10936156B2 (en) 2018-11-05 2021-03-02 International Business Machines Corporation Interactive access to ascendants while navigating hierarchical dimensions
US11449815B2 (en) 2018-11-08 2022-09-20 Airslate, Inc. Automated electronic document workflows
US10761876B2 (en) 2018-11-21 2020-09-01 Microsoft Technology Licensing, Llc Faster access of virtual machine memory backed by a host computing device's virtual memory
US20200175094A1 (en) 2018-12-03 2020-06-04 Bank Of America Corporation Document visualization and distribution layering system
US11243688B1 (en) 2018-12-05 2022-02-08 Mobile Heartbeat, Llc Bi-directional application switching with contextual awareness
US11157386B2 (en) 2018-12-18 2021-10-26 Sap Se Debugging rules based on execution events stored in an event log
US11113667B1 (en) 2018-12-18 2021-09-07 Asana, Inc. Systems and methods for providing a dashboard for a collaboration work management platform
US11610058B1 (en) 2019-01-29 2023-03-21 Hitps Llc Systems and methods for reflexive questionnaire generation
WO2020160551A1 (en) 2019-02-01 2020-08-06 L2F Inc. Beverage dispensing and monitoring system
US20200265112A1 (en) 2019-02-18 2020-08-20 Microsoft Technology Licensing, Llc Dynamically adjustable content based on context
US11573993B2 (en) * 2019-03-15 2023-02-07 Ricoh Company, Ltd. Generating a meeting review document that includes links to the one or more documents reviewed
US10452360B1 (en) 2019-03-19 2019-10-22 Servicenow, Inc. Workflow support for dynamic action input
US11100075B2 (en) 2019-03-19 2021-08-24 Servicenow, Inc. Graphical user interfaces for incorporating complex data objects into a workflow
US10929107B2 (en) 2019-03-19 2021-02-23 Servicenow, Inc. Workflow support for dynamic action output
WO2020187408A1 (en) 2019-03-20 2020-09-24 Sony Corporation Post-processing of audio recordings
US11263029B2 (en) 2019-03-27 2022-03-01 Citrix Systems, Inc. Providing user interface (UI) elements having scrollable content in virtual machine sessions at reduced latency and related methods
US20200327244A1 (en) 2019-04-12 2020-10-15 Walmart Apollo, Llc System for database access restrictions using ip addresses
DK180359B1 (en) 2019-04-15 2021-02-03 Apple Inc Accelerated scrolling and selection
JP6602500B1 (en) 2019-04-22 2019-11-06 Dendritik Design Co., Ltd. Database management system, database management method, and database management program
US11543943B2 (en) 2019-04-30 2023-01-03 Open Text Sa Ulc Systems and methods for on-image navigation and direct image-to-data storage table data capture
US20200356563A1 (en) 2019-05-08 2020-11-12 Datameer, Inc. Query performance model generation and use in a hybrid multi-cloud database environment
US10809696B1 (en) * 2019-05-09 2020-10-20 Micron Technology, Inc. Scanning encoded images on physical objects to determine parameters for a manufacturing process
US11366976B2 (en) 2019-05-09 2022-06-21 Micron Technology, Inc. Updating manufactured product life cycle data in a database based on scanning of encoded images
US20200374146A1 (en) 2019-05-24 2020-11-26 Microsoft Technology Licensing, Llc Generation of intelligent summaries of shared content based on a contextual analysis of user engagement
KR102301026B1 (en) 2019-05-30 2021-09-14 Delta PDS Co., Ltd. Task map providing apparatus and the method thereof
US11704494B2 (en) 2019-05-31 2023-07-18 Ab Initio Technology Llc Discovering a semantic meaning of data fields from profile data of the data fields
US11308100B2 (en) 2019-06-25 2022-04-19 Amazon Technologies, Inc. Dynamically assigning queries to secondary query processing resources
US11086894B1 (en) 2019-06-25 2021-08-10 Amazon Technologies, Inc. Dynamically updated data sheets using row links
US20210014136A1 (en) 2019-07-12 2021-01-14 SupportLogic, Inc. Assigning support tickets to support agents
JP2022541199A (en) 2019-07-16 2022-09-22 Nference, Inc. A system and method for inserting data into a structured database based on image representations of data tables.
US11196750B2 (en) 2019-07-18 2021-12-07 International Business Machines Corporation Fine-grained data masking according to classifications of sensitive data
US11650595B2 (en) 2019-07-30 2023-05-16 Caterpillar Inc. Worksite plan execution
US20210049524A1 (en) 2019-07-31 2021-02-18 Dr. Agile LTD Controller system for large-scale agile organization
US11379883B2 (en) 2019-08-09 2022-07-05 SOCI, Inc. Systems, devices, and methods for dynamically generating, distributing, and managing online communications
USD910077S1 (en) 2019-08-14 2021-02-09 Monday.com Ltd Display screen with graphical user interface
US11010031B2 (en) 2019-09-06 2021-05-18 Salesforce.Com, Inc. Creating and/or editing interactions between user interface elements with selections rather than coding
US11282297B2 (en) 2019-09-10 2022-03-22 Blue Planet Training, Inc. System and method for visual analysis of emotional coherence in videos
US11372947B2 (en) 2019-09-13 2022-06-28 Oracle International Corporation System and method for automatic selection for dynamic site compilation within a cloud-based content hub environment
US11010371B1 (en) 2019-09-16 2021-05-18 Palantir Technologies Inc. Tag management system
US11588764B2 (en) 2019-10-30 2023-02-21 Amazon Technologies, Inc. Extensible framework for constructing autonomous workflows
WO2021096944A1 (en) 2019-11-11 2021-05-20 Aveva Software, Llc Computerized system and method for generating and dynamically updating a dashboard of multiple processes and operations across platforms
US11175816B2 (en) 2019-11-18 2021-11-16 Monday.com Ltd Digital processing systems and methods for automatic user time zone updates in collaborative work systems
US11113273B2 (en) 2019-11-29 2021-09-07 Amazon Technologies, Inc. Managed materialized views created from heterogeneous data sources
US11748128B2 (en) 2019-12-05 2023-09-05 International Business Machines Corporation Flexible artificial intelligence agent infrastructure for adapting processing of a shell
GB201918084D0 (en) 2019-12-10 2020-01-22 Teambento Ltd System and method for facilitating complex document drafting and management
US11222167B2 (en) 2019-12-19 2022-01-11 Adobe Inc. Generating structured text summaries of digital documents using interactive collaboration
US20210264220A1 (en) 2020-02-21 2021-08-26 Alibaba Group Holding Limited Method and system for updating embedding tables for machine learning models
US11562129B2 (en) 2020-04-20 2023-01-24 Google Llc Adding machine understanding on spreadsheets data
US20220099454A1 (en) 2020-09-29 2022-03-31 International Business Machines Corporation Navigation street view tool
US20220121325A1 (en) 2020-10-21 2022-04-21 Lenovo (Singapore) Pte. Ltd. User interface customization per application
CA3105572C (en) 2021-01-13 2022-01-18 Ryan Smith Tracking device and system
WO2022153122A1 (en) * 2021-01-14 2022-07-21 Monday.com Ltd. Systems, methods, and devices for enhanced collaborative work documents
CN112929172B (en) 2021-02-08 2023-03-14 Industrial and Commercial Bank of China Ltd. System, method and device for dynamically encrypting data based on key bank
US11429384B1 (en) 2021-10-14 2022-08-30 Morgan Stanley Services Group Inc. System and method for computer development data aggregation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5220657A (en) * 1987-12-02 1993-06-15 Xerox Corporation Updating local copy of shared data in a collaborative system
US20090271696A1 (en) * 2008-04-28 2009-10-29 Microsoft Corporation Conflict Resolution
US20170076101A1 (en) * 2015-09-10 2017-03-16 Airwatch Llc Systems for modular document editing

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20240019992A1 (en) * 2022-07-13 2024-01-18 Beijing Paoding Technology Co., Ltd Document content point-and-select method, electronic apparatus and medium

Also Published As

Publication number Publication date
US20220222421A1 (en) 2022-07-14
US11782582B2 (en) 2023-10-10
US20220222153A1 (en) 2022-07-14
US11687216B2 (en) 2023-06-27
US20220222427A1 (en) 2022-07-14
US11449668B2 (en) 2022-09-20
US11392556B1 (en) 2022-07-19
US11893213B2 (en) 2024-02-06
US20220221966A1 (en) 2022-07-14
US11475215B2 (en) 2022-10-18
US20220222625A1 (en) 2022-07-14
US20220222428A1 (en) 2022-07-14
US11397847B1 (en) 2022-07-26
US20220222222A1 (en) 2022-07-14
US11531452B2 (en) 2022-12-20
US11481288B2 (en) 2022-10-25
US20220222431A1 (en) 2022-07-14
US20220222461A1 (en) 2022-07-14
US20220222425A1 (en) 2022-07-14
US11726640B2 (en) 2023-08-15
US20220222361A1 (en) 2022-07-14
US11928315B2 (en) 2024-03-12

Similar Documents

Publication Publication Date Title
US11397847B1 (en) Digital processing systems and methods for display pane scroll locking during collaborative document editing in collaborative work systems
US11301813B2 (en) Digital processing systems and methods for hierarchical table structure with conditional linking rules in collaborative work systems
CN109416704B (en) Network-based embeddable collaborative workspace
US9544307B2 (en) Providing a security mechanism on a mobile device
US20230055241A1 (en) Digital processing systems and methods for external events trigger automatic text-based document alterations in collaborative work systems
US11621936B2 (en) Integrating a communication platform into a third-party platform
WO2022153122A1 (en) Systems, methods, and devices for enhanced collaborative work documents
US11232145B2 (en) Content corpora for electronic documents
US11741071B1 (en) Digital processing systems and methods for navigating and viewing displayed content
US20230333728A1 (en) Digital processing systems and methods for display navigation mini maps
US10831812B2 (en) Author-created digital agents

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: MONDAY.COM LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HARAMATI, TAL;ZIONPOUR, RON;GREENHUT, GUY;AND OTHERS;SIGNING DATES FROM 20220407 TO 20220410;REEL/FRAME:060183/0610

STCF Information on status: patent grant

Free format text: PATENTED CASE