US20220043663A1 - Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form - Google Patents


Info

Publication number
US20220043663A1
Authority
US
United States
Prior art keywords
widget
interaction
widgets
computer
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/511,194
Inventor
Arash Massoudi
Sandra Irene Zylka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NextAxiom Tech Inc
Original Assignee
NextAxiom Tech Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NextAxiom Tech Inc filed Critical NextAxiom Tech Inc
Priority to US17/511,194 priority Critical patent/US20220043663A1/en
Assigned to NEXTAXIOM TECHNOLOGY, INC. reassignment NEXTAXIOM TECHNOLOGY, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ZYLKA, SANDRA IRENE, MASSOUDI, ARASH
Publication of US20220043663A1 publication Critical patent/US20220043663A1/en
Pending legal-status Critical Current

Classifications

    • H04L 9/32: Cryptographic mechanisms or cryptographic arrangements for secret or secure communications including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
    • H04L 9/3247: Cryptographic mechanisms or cryptographic arrangements involving digital signatures
    • H04L 2209/60: Digital content management, e.g. content distribution
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 9/453: Help systems
    • G06F 16/907: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/93: Document management systems
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/04895: Guidance during keyboard input operation, e.g. prompting

Definitions

  • Field workers must often manually sift through volumes of information pertaining to irrelevant scenarios in order to navigate to the applicable steps, diagrams and instructions. Not only does this reduce the “wrench time”, or the time spent actually performing the task at hand, but it may also lead to unintentional user errors. Paper procedures and instructions, or their equivalent digitized documents, require manual and time-consuming navigation to relevant information and the next step, and provide no computational support, placekeeping of instruction steps, verification of tools and equipment pertaining to the procedure/instructions, verified acknowledgment of Cautions and Notes within instructions or other content blocks, verified completion of signatures, or support for other aspects of job performance.
  • non-interactive written procedures/instructions allow job monitoring only once the job is complete, and only based on an employee's report of the job.
  • Various embodiments of the disclosure relate, generally, to computer implemented systems and methods to define user-computer interactions, computer guided navigation and application integration for digitized written procedures/instructions, instructional manuals and/or fillable forms (hereon referred to collectively or individually as the digitized document), in a structured metadata format containing the interaction and interface definitions together with the corresponding content blocks of the digitized document, and to perform the defined interactions and integrations guided through the structured metadata at job performance time.
  • the computer guided interaction and application interface definitions may be defined through an interaction authoring tool (Interaction Authoring Tool—IAT) by overlaying or superimposing user-computer interaction widgets (hereon referred to as interaction widgets) and application integration widgets (hereon referred to as interface widgets) over content blocks of the digitized document to encapsulate each content block through a corresponding widget, and then configuring the interaction and interface properties of said widget, including configuring its sequence of appearance, statically or conditionally based on dynamic data, relative to other widgets, for computer-aided navigation at job performance time.
  • the computer-guided interaction and application interface definitions (Interaction and Interface Definition Metadata—IIDM), together with content blocks encapsulated through the widgets, are stored in a persistent computer data store as structured XML or functionally and sufficiently equivalent data structures. In some embodiments of the disclosure, the IIDM may also be generated programmatically. At job performance time, computer guided interactions and application integration may be performed within an Interaction Performance Tool (IPT) driven by the interpretation of the IIDM.
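Since the disclosure leaves the concrete IIDM representation open (“structured XML or functionally and sufficiently equivalent data structures”), a minimal Python sketch may illustrate one possible serialization; all element and attribute names here are assumptions for illustration, not part of the disclosure:

```python
import xml.etree.ElementTree as ET

def widget_to_iidm(widget: dict) -> str:
    """Serialize one interaction widget definition to an IIDM XML fragment.

    The element/attribute names are illustrative assumptions; the disclosure
    only requires structured XML or an equivalent data structure.
    """
    el = ET.Element("InteractionWidget", {
        "type": widget["type"],          # e.g. "Section", "Caution", "Step"
        "sequence": widget["sequence"],  # order of appearance at job time
        "layering": widget.get("layering", "Superimpose"),
    })
    # Widget geometry with respect to the displayed digitized document.
    pos = ET.SubElement(el, "Position")
    for k in ("x", "y", "width", "height"):
        pos.set(k, str(widget["position"][k]))
    # The snagged content block travels with the definition.
    content = ET.SubElement(el, "ContentBlock")
    content.text = widget.get("content", "")
    return ET.tostring(el, encoding="unicode")

fragment = widget_to_iidm({
    "type": "Section",
    "sequence": "1.0",
    "position": {"x": 40, "y": 120, "width": 520, "height": 90},
    "content": "Purpose: inspect the heater treater pump.",
})
```

At job performance time an IPT would parse such fragments back into widget records to drive the guided interaction.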
  • Yet other embodiments of the disclosure relate, generally, to methods for generating IIDM programmatically or through IAT, capturing and aggregating information about the dynamic user-computer interactions during job performance to provide job performance monitoring, statistics and analytics and supporting computer-guided navigation from the contents encapsulated within one widget to contents encapsulated within another through static sequencing of widgets or dynamically through the automatic evaluation of conditional statements based on dynamic data. Furthermore, to methods accommodating advanced computer guidance for job requirements such as repeated action steps and other advanced interactions through different types of superimposed/overlaid and linked interaction widgets.
  • embodiments of the disclosure relate, generally, to methods of defining, and supporting at job performance time, the bidirectional integration of data and automatic dispatch of application functions from other applications to/from content blocks of an existing digital written procedure/instructions or fillable form, integrations which may be defined through the selection of software service interfaces with typed input/outputs such as standards-based Web Services (e.g., WSDL/SOAP or RESTful) that are automatically associated with a set of interface widgets that may be placed inside of any interaction or any widget encapsulating said content blocks (container widget).
  • some embodiments of the disclosure relate, generally, to providing integrated methods, through typed data interface widgets, to gather user inputs during job performance as they relate to the contents of one interaction widget, and to make such data referenceable by and available to subsequent widgets to support dynamic computer-aided navigation and reduce user errors during job performance.
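As a rough illustration of making widget-gathered inputs referenceable by later widgets, the following sketch keys captured values by the widget's sequence designator; the class and method names are hypothetical, not drawn from the disclosure:

```python
class JobDataStore:
    """Minimal sketch: user input captured by one interaction widget is keyed
    by the widget's sequence designator so later widgets (and conditional
    expressions) can reference it during job performance."""

    def __init__(self):
        self._values = {}

    def record(self, widget_seq: str, field: str, value):
        # Called when the user fills a typed data interface widget.
        self._values[(widget_seq, field)] = value

    def lookup(self, widget_seq: str, field: str):
        # Called by a later widget or conditional expression.
        return self._values.get((widget_seq, field))

store = JobDataStore()
store.record("4.2", "pump_pressure_psi", 118)
# A later Conditional Action widget can reference the earlier reading:
reading = store.lookup("4.2", "pump_pressure_psi")
```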
  • FIG. 1 shows a flowchart of an embodiment of a method to define user-computer interaction and computer-aided navigation through superimpose/overlay of interaction widgets on top of content blocks of any digitized written procedure/instructions, instructional manual and fillable forms, resulting in structured metadata definitions.
  • FIG. 2 shows a flowchart of an embodiment of a method to define application and data integration through superimpose/overlay of interface widgets on top of content blocks, or portions thereof, of any digitized written procedure/instructions, instructional manual and fillable forms, resulting in structured metadata definitions.
  • FIG. 3 shows a flowchart of a method to provide/perform, at job performance time, user-computer interaction and computer-aided navigation as well as application and data integration through superimpose/overlay of interaction and interface widgets on top of content blocks of any digitized written procedure/instructions and fillable forms based on structured metadata definitions.
  • FIG. 4 shows an example of an embodiment of the Interaction Authoring Tool (IAT), displaying an example of the digitized document and an example of an interaction widget palette, with a Section interaction widget selected from the palette and superimposed on the content block of the first section of the digitized document, together with an example of some of the widget's properties and attributes.
  • FIG. 5 shows the same digitized document as FIG. 4, with more superimposed and linked interaction widgets, where one of the interaction widgets contains child interaction widgets (or interaction sub-widgets) that correspond to instruction steps within the content blocks contained by the parent widget, and with more space added dynamically to the displayed digitized document to accommodate the Acknowledgment-related interactions of some superimposed interaction widgets.
  • FIG. 6 shows the use of a Conditional Action step widget, within IAT, to define automatic branching and navigation to the next interaction widget based on dynamic user input later provided at job performance time within the Interaction Performance Tool (IPT).
  • FIG. 7 shows an example of an embodiment of the Interaction Performance Tool (IPT), performing user-computer interaction, computer-guided navigation and placekeeping based on the superimposed/overlaid interaction widgets and their configured attributes as partially shown/defined in FIG. 6.
  • the efficiency of performing procedures/instructions may be significantly improved when dynamic information about the job, work environment or equipment/tools is presented, in a graphical user interface, to the person performing the job. Indeed, embodiments are able to provide information to a user that was not previously available. Some of this information resides in the core backend enterprise applications. Also, data entered during job performance, that may be captured using fillable forms or segments thereof, most often belong to the core backend enterprise applications such as Asset Management or Work Management systems. Furthermore, sometimes at job performance time, the ability to conveniently access a function of those backend applications while performing procedures/instructions, for example being able to create a work request through a Work Management system, may significantly improve job efficiency.
  • embodiments implement an improved computing system that is able to display a standardized procedure/instruction manual or fillable form, while also providing interactive access with additional enterprise data while the user is interacting with the manual or form.
  • Existing written procedures/instructions, instructional manuals or fillable forms, whether in paper form or as a digitized document do not provide such functionality.
  • existing computer systems providing such manuals and forms do not also include interactive instructions guiding a user through the form, nor do such systems provide access to additional enterprise data related to the form.
  • applying a method of embodiments of this invention minimizes change management costs and results in a non-intrusive, incremental approach to adding computer guided interaction and application integration to the performance of existing, as well as new, written procedures/instructions. It requires no significant change to the established existing processes of an organization; instead, it adds a layer of processes, and computer guided interaction, on top of what already exists.
  • the application of a method of embodiments of this invention provides a non-intrusive, practical and incremental approach and means to improve work efficiency and reduce user error through computer guided interaction for navigation to the relevant job scenario, information and steps, computational support, automatic placekeeping of instruction steps, verification of tools and equipment pertaining to the procedure/instructions, verified acknowledgment of Cautions and Notes within procedures, and for other aspects of performing a job.
  • a digitized form of the written procedure/instructions, instructional manual and/or fillable forms, in PDF with a rich content format or any other electronic format, referred to as the digitized document, is imported into an Interaction Authoring Tool (IAT) as illustrated at step 101 of FIG. 1 and displayed, as illustrated at step 102.1, in the background using a scrolled graphical user interface window on a display of a computing device.
  • the IAT 402 also displays, in a separate window, a rich palette of configurable interaction widgets, as illustrated at step 102.2, where each widget in the palette, generally or specifically, corresponds to a different content type within a typical digitized document.
  • FIG. 1 shows a flowchart that depicts the simplified steps, in one embodiment of this disclosure, used to define user-computer interaction and computer-aided navigation through superimpose/overlay of interaction widgets over the content blocks of the digitized document.
  • at steps 104 and 105, the person defining the interactions (the interaction designer) superimposes (to show the content underneath) or overlays (to block the content underneath or portions thereof) a selected widget, from said palette, over (or on top of) a corresponding content block and resizes the widget to contain and snag the desired content block (depicted in FIG. 1, step 106).
  • the interaction designer may superimpose a Caution interaction widget over the Caution content block of the underlying procedure/instructions, resize the widget to contain the written Caution section, and then set the attributes of the widget, such as the job performance sequence designator (e.g., sequence number or other ordered designator); whether this instance of Caution requires acknowledgment; where to place, and how to label, the means of acknowledgment inside the widget (with one label for pre-acknowledgment and another for post-acknowledgment); the display properties of the acknowledgment; and so forth. These attributes indicate that an acknowledgement of the Caution section is required before the user may move to the next content block of the procedure/instructions at job performance time. The interaction designer may also set other properties available on the widget, which may include look and feel properties.
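The Caution widget attributes just described might be modeled as in the following sketch; every field name here is an assumed illustration rather than the patent's actual schema:

```python
from dataclasses import dataclass

@dataclass
class CautionWidget:
    """Sketch of a Caution interaction widget's configurable attributes,
    mirroring the properties described above; field names are assumptions."""
    sequence: str                        # job performance sequence designator
    requires_acknowledgment: bool = True
    pre_ack_label: str = "Acknowledge"   # label shown before acknowledgment
    post_ack_label: str = "Acknowledged" # label shown after acknowledgment
    ack_position: str = "below-content"  # where to place the acknowledgment
    acknowledged: bool = False

    def acknowledge(self) -> str:
        # Performed by the user in the IPT at job performance time.
        self.acknowledged = True
        return self.post_ack_label

    def may_proceed(self) -> bool:
        # The user may not move to the next content block until acknowledged.
        return self.acknowledged or not self.requires_acknowledgment
```

A widget instance starts un-acknowledged, so `may_proceed()` gates navigation until the user acts.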
  • the interaction designer may superimpose Step interaction widgets over each step of the procedure/instructions, set their display attributes and sequence (e.g., through sequence attribute numbers), and link the step widgets based on the order of performance, or conditionally, through binary questions or more complex conditional statements, based on different job scenarios, and thus provide automatic, dynamic, computer guided navigation of instructions within the Interaction Performance Tool (IPT) 702 (see FIG. 7) at job performance time.
  • FIG. 4 shows one embodiment of IAT 402, where an example of written procedure/instructions, used for the inspection of a heater treater pump, is displayed as the digitized document 404 in a scrolled window, with a widget palette 406 displayed in a separate window in the top-right portion of FIG. 4.
  • an interaction designer had already selected a general-purpose Section widget 408 from the palette 406 and superimposed it over the first section of the digitized document 404, the section titled “Purpose”, and had configured some of the properties/attributes of the selected widget, displayed in a window 410, which is located below the widget palette display in FIG. 4.
  • the selected Section widget 408 is configured to have a “Section Name” attribute with a value of “Purpose” and to be a “Collapsible” widget, meaning that the widget may be displayed collapsed or expanded both within IAT 402, when defining interactions, and through IPT 702 (see FIG. 7), when interactions are being performed.
  • the “Layering Effect” of the selected Section widget 408 is set to “Superimpose” so the widget 408 is superimposed (being transparent) over the “Purpose” section, versus being overlaid (which would cover the content block under it).
  • the selected Section widget's sequence number (under Sequence->Current) is set to “1.0”, the previous widget's sequence number (Sequence->Previous) is not set since the selected widget is the first widget, and the sequence number of the next widget to be performed (Sequence->Next), at job performance time, is set to “2.0” (i.e. Sequence->Current+1) by default.
  • the interaction designer may change the Sequence->Next attribute to point to a widget with a different sequence number or to use a conditional widget to dynamically determine the “Next” widget to automatically navigate to at job performance time, as will be described later.
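The default-plus-override Next sequencing just described (Sequence->Next defaulting to Sequence->Current+1 unless the designer redirects it) can be sketched as follows; the dict-based widget shape is an assumption:

```python
def default_next(current: str) -> str:
    """Default Sequence->Next: the major part of the sequence number plus
    one, matching the "1.0" -> "2.0" example above (an assumed convention)."""
    major = int(current.split(".")[0])
    return f"{major + 1}.0"

def resolve_next(widget: dict) -> str:
    # An explicitly configured Next attribute overrides the default.
    return widget.get("next") or default_next(widget["sequence"])

assert resolve_next({"sequence": "1.0"}) == "2.0"          # default
assert resolve_next({"sequence": "2.0", "next": "5.0"}) == "5.0"  # override
```

A conditional widget (described later) would replace the static `next` value with a branch evaluated at job performance time.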
  • the selected widget is configured not to require an acknowledgement by the user of IPT 702 at job performance time, before the user may mark the widget as completed and proceed to the next widget containing the next content block to be considered/performed.
  • the selected Section widget, as may be the case with all interaction widgets, also snags the contents of the section or the content block contained within it as part of its metadata, together with the values of its attributes as configured by the interaction designer.
  • at step 107, if the digitized document is of a rich content format, such as PDF, the IAT 402 also captures the content of the content block contained by the selected widget, as illustrated at step 107.1.
  • at step 109, the interaction designer configures the relevant user-computer interaction provided by the widget to be dynamically enforced when the contained content block of the digitized document is being followed at job performance time.
  • at step 108, when a superimposed/overlaid widget needs to occupy a space that is larger than the space provided by the specific area of the displayed digitized document that is being contained by the widget, the displayed digitized document is modified (see steps 108.1 and 108.2) to add the required space, and the modified version is displayed instead of the original.
  • when a Caution widget is superimposed over a Caution section of a digitized document and requires acknowledgment at job performance time, if there is no room between the Caution section and the next section to insert an acknowledgment button, a modified version of the digitized document, containing the required added space, will be created, and the modified version will replace the last displayed digitized document in the associated display.
  • One way to accomplish the modification is to split the digitized document into two documents, where the first document contains all the content up to the end of said Caution section and the second document contains all the content of the digitized document after the Caution section. The desired blank space is then added to the end of the first document, and the first and second documents are merged back together while readjusting the coordinates of all widgets already superimposed/overlaid in locations of the second document to reflect the new coordinates resulting from the addition of the blank space. The merged document is then displayed instead of the last displayed digitized document.
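The coordinate readjustment in this split-and-merge approach can be sketched as follows; the widget record shape is an assumption:

```python
def insert_blank_space(widgets, split_y: float, added_height: float):
    """Sketch of the coordinate readjustment described above: after blank
    space of `added_height` is inserted at vertical offset `split_y` (the
    end of the Caution section), every widget positioned below the split
    is shifted down so it still overlays its content block."""
    adjusted = []
    for w in widgets:
        w = dict(w)                    # leave the original records intact
        if w["y"] >= split_y:          # widget lies in the second document
            w["y"] += added_height
        adjusted.append(w)
    return adjusted

widgets = [{"id": "caution", "y": 300}, {"id": "next-step", "y": 420}]
shifted = insert_blank_space(widgets, split_y=400, added_height=60)
# Only the widget below the split moves: "next-step" is now at y=480.
```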
  • at step 111, the interaction designer defines the order of appearance/activation of the widgets at job performance time.
  • One of the attributes in common across all widgets is the attribute that defines the order of appearance/activation across content types contained by the widgets, for example through a sequence number attribute, resulting in computer-guided navigation at job performance time.
  • the order of appearance may be defined by statically linking/pointing one widget to another, for example by one widget referencing another widget's sequence number as its next widget to navigate to, or conditionally, based on dynamically gathered data at job performance time, where one widget may point to one or more possible next widgets depending on dynamic evaluation of a conditional branch. For example, if a condition is evaluated to true, a widget that has just completed its interaction may point to the sequence number of one next widget, and if the condition is evaluated to false, to another next widget.
  • These expressions may be expressed through special types of interaction widgets such as a Conditional Action widget described in this disclosure.
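A Conditional Action widget's branch evaluation, as described above, might look like this sketch; representing the condition as a callable over job data is an assumed simplification:

```python
def evaluate_conditional_action(condition, true_next: str,
                                false_next: str, job_data: dict) -> str:
    """Sketch of a Conditional Action widget: the condition is evaluated
    against data gathered dynamically at job performance time, and the
    result selects the sequence number of the next widget to navigate to."""
    return true_next if condition(job_data) else false_next

next_seq = evaluate_conditional_action(
    condition=lambda d: d["pump_pressure_psi"] > 100,
    true_next="6.0",   # e.g. branch to the over-pressure steps
    false_next="5.2",  # e.g. continue the normal inspection
    job_data={"pump_pressure_psi": 118},
)
# next_seq == "6.0": the condition is true for the recorded reading.
```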
  • FIG. 5 shows the same digitized document 404 as FIG. 4, with more superimposed and linked interaction widgets, where one of the interaction widgets contains child interaction widgets, such as child interaction widget 412.1 (or interaction sub-widgets), that signify steps within the parent widget, and with more space added to the displayed digitized document to accommodate the Acknowledgment-related interactions needed for some of the superimposed interaction widgets configured to require acknowledgment.
  • the metadata for an interaction sub-widget may not be required to contain the snagged image or the content contained within said sub-widget, since the parent widget already contains said content as part of its metadata and the position of sub-widget may be stored relative to the position of its containing parent widget.
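Storing a sub-widget's position relative to its parent, as described above, means the absolute overlay position is recovered by a simple translation; a minimal sketch, with an assumed record shape:

```python
def absolute_position(parent: dict, sub_relative: dict) -> dict:
    """Sketch: a sub-widget stores only its offset (dx, dy) from its
    containing parent widget, so its absolute overlay position is the
    parent's position plus that offset."""
    return {
        "x": parent["x"] + sub_relative["dx"],
        "y": parent["y"] + sub_relative["dy"],
    }

parent = {"x": 40, "y": 500}
sub = {"dx": 12, "dy": 30}
# absolute_position(parent, sub) == {"x": 52, "y": 530}
```

This also means that if the parent moves (e.g. after blank space is inserted above it), the sub-widgets follow without any metadata change.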
  • the “Purpose” section is encapsulated by an interaction widget 408, as also shown in FIG. 4 and explained earlier.
  • the Section interaction widget 414 associated with the content block in the section titled “Scope” was configured to require acknowledgment of consideration/performance at job performance time. Since there was not enough space to insert an “Acknowledge” button (as a means of enforcing the acknowledgment at job performance time) between the sections titled “Scope” and “Responsibilities” in the original displayed digitized document 404, more space was added within the Section interaction widget 414 associated with “Scope” (refer to FIG. 1, step 108, and the explanation of that step provided earlier).
  • the sequence numbers correspond to the numbering of the sections/sub-sections of the digitized document. Accordingly, in this case, the Previous and Next widget references for each widget correspond to the previous and next section's numbering.
  • the Section interaction widget 412 associated with the fourth section of the displayed digitized document, titled “PRECAUTIONS & LIMITATIONS”, contains interaction sub-widgets corresponding to each step of said section. These sub-widgets have hierarchical sequence numbers, such as “4.1” and “4.2”, assigned based on the sequence number of their parent widget.
  • the saved information includes, but is not limited to, the superimposed/overlaid position of the graphical widgets with respect to the displayed digitized document; a snagged image corresponding to the content block of the digitized document that is contained by each widget (note that this may not be needed for interaction sub-widgets, as noted earlier); all the original content of the corresponding content block contained by the widget, if available and needed; and all of the widget properties, conditional and sequencing information, and other attributes. Also, a copy of the last displayed version of the displayed digitized document is saved with the metadata.
  • a set of interaction widgets is provided to correspond, more specifically, to the sections within typical procedure/instructions, including but not limited to, Purpose, Scope, Responsibilities, Precautions and Limitations, Prerequisites, Parts and Materials, Instructions, References, and Attachments, or a General Content widget that may contain any content block.
  • a general-purpose Section widget (as shown in FIG. 4), with a “Section Name” as one of its configurable attributes, may be provided that may be generically mapped to any section within the digitized document by the interaction designer, with its “Section Name” attribute set to the title of the mapped section.
  • the “Section Name” attribute of the widget may be set to “Purpose” by the interaction designer (or as discussed later through automation).
  • another set of widgets may be pre-configured to be associated with the typical subsections under the Instructions section, including but not limited to, Step or Action Step, Time Dependent Step, Continuous Action Step, Note, Caution, Warning, Conditional Action, Branch (or Go to), Component Verification, Reference (or Refer to), Bulleted list, Figure, Table, Critical Step, and Hold/Inspection Point or Independent Verification.
  • These typed widgets are examples of widgets that may be available in the displayed “Interaction Widget Palette” referred to in FIG. 1, step 102.2.
  • a general Section widget with general and further configurable interaction properties is mapped to a section of a content block when no explicit keyword association is configured to match with a title of what is assumed to be a section of the content block.
  • One of the attributes of the widget can capture the title, if any, of the section.
  • a General Container or Content widget with general and further configurable interaction properties is mapped to a content block when no explicit keyword association is configured and matched with the title of a content block.
  • One of the attributes of such a widget is designed to capture the title, if any, of the content block.
  • an interaction widget associated with some content type may contain child interaction widgets and those child interaction widgets may contain other child interaction widgets to any level of depth.
  • One approach to capturing the order of widget navigation is to associate hierarchical sequence numbers with widgets. For example, if the interaction widget associated with an Instruction section of the digitized document has a sequence number of 5, the first child Step interaction widget corresponding to the first step of the Instruction widget will have a sequence number of 5.1, the second child Step widget will have a sequence number of 5.2, and so on. This scheme for sequence numbering can accommodate any level of widget parent/child relationship. The content type associated with a parent interaction widget will be marked completed automatically when all its relevant child interaction widgets are marked completed.
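The hierarchical sequence-numbering and auto-completion scheme described above can be sketched as follows; the class and attribute names are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch of hierarchical widget sequence numbers; the
# Widget class and its attribute names are hypothetical.

class Widget:
    def __init__(self, seq):
        self.seq = seq          # e.g. "5", "5.1", "5.2.1"
        self.children = []
        self.completed = False

    def add_child(self):
        # A child's number extends the parent's number ("5" -> "5.1", "5.2", ...).
        child = Widget(f"{self.seq}.{len(self.children) + 1}")
        self.children.append(child)
        return child

    def check_auto_complete(self):
        # A parent is marked completed automatically once every child is completed.
        if self.children and all(c.completed for c in self.children):
            self.completed = True
        return self.completed

instruction = Widget("5")        # the Instruction section widget
step1 = instruction.add_child()  # sequence number "5.1"
step2 = instruction.add_child()  # sequence number "5.2"
step1.completed = True
step2.completed = True
instruction.check_auto_complete()
```

Nesting `add_child` calls extends the numbering to any depth, matching the parent/child relationship described above.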
  • the set of available widgets, their interaction and interface behavior and their association with the sections and content blocks of procedure/instructions may be configuration-driven.
  • One way to approach the configuration mechanism is to introduce a superset widget that provides a superset of capabilities, for all possible interactions, and properties to turn those capabilities on/off, say with defaults set to off, and then to allow the user to define any number of named custom widgets each with a relevant subset of those capabilities, by turning on each desired capability through configuration.
  • the user may define a widget named “Caution” widget, using an instance of said superset widget, and only turn on the “acknowledgment required” and placekeeping interaction capabilities of the associated instance of superset widget through an XML-based configuration mechanism.
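As a hedged sketch of such an XML-driven configuration (the element names, capability names, and schema here are assumptions, since the disclosure does not fix a format), a named “Caution” widget might be derived from the superset widget like so:

```python
# Hypothetical sketch: a superset widget whose capabilities all default
# to off, and an XML configuration that defines a named "Caution" widget
# by turning on only the capabilities it needs.
import xml.etree.ElementTree as ET

SUPERSET_CAPABILITIES = {         # defaults: everything off
    "acknowledgment_required": False,
    "placekeeping": False,
    "data_entry": False,
    "signature": False,
}

CONFIG = """
<widget name="Caution">
  <capability name="acknowledgment_required" enabled="true"/>
  <capability name="placekeeping" enabled="true"/>
</widget>
"""

def load_custom_widget(xml_text):
    root = ET.fromstring(xml_text)
    caps = dict(SUPERSET_CAPABILITIES)     # start from all-off defaults
    for cap in root.findall("capability"):
        caps[cap.get("name")] = cap.get("enabled") == "true"
    return root.get("name"), caps

name, caps = load_custom_widget(CONFIG)
```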
  • a dynamic Table of Contents (TOC) of all the widgets superimposed/overlaid on top of the digitized document is represented on a display, in the order defined by the links across the widgets, through their sequence numbers, or other means of sequencing, and with each entry in the TOC uniquely representing a superimposed/overlaid widget, in a manner that the TOC can be used to navigate to a particular widget in the display that contains the content blocks of the digitized document through hyperlinks or a similar manner both within IAT 402 and IIP.
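A minimal sketch of deriving such a TOC from widget sequence numbers follows; the tuple-based widget representation is an illustrative assumption:

```python
# Illustrative sketch: building a dynamic Table of Contents from
# widgets ordered by their hierarchical sequence numbers.

def seq_key(seq):
    # "4.10" must sort after "4.2", so compare numeric components.
    return tuple(int(p) for p in seq.split("."))

def build_toc(widgets):
    # widgets: list of (sequence_number, title) tuples
    return [f"{seq} {title}" for seq, title in
            sorted(widgets, key=lambda w: seq_key(w[0]))]

toc = build_toc([("4.2", "Step two"), ("4.10", "Step ten"), ("4.1", "Step one")])
```

Each TOC entry would then be rendered as a hyperlink to its widget, as described above.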
  • the interaction designer may define named views and then associate each superimposed/overlaid widget to one or more of said named views through an attribute provided on the widget for defining such associations such that, at job performance time, the user of the IPT 702 is able to switch to different views, where each selected view only displays the content of the widgets associated to said view.
  • the interaction designer can divide the digitized document into two parts: 1) Front Matter (usually containing sections such as Purpose, Scope, Responsibilities, Precautions and Limitations, and Prerequisites) vs Body (usually containing the procedure or instructions). And then require the person performing the job, at job performance time, to acknowledge the Front Matter before he/she may proceed to the Body.
  • the IAT 402 may provide a widget designed to contain multiple sections of the digitized document, say Section widget, in this case containing all the sections of the Front Matter, and turn on the acknowledgment interaction for that widget and link it to the first widget of the Body.
  • the procedure/instructions and the user-computer interaction widgets associated with its sections may also be presented with a different form factor on any device, such as a smart phone or smart glasses, at job performance time, without relying on/using the digitized document in the background: first determining the position of the next widget with the smallest sequence number on the device display and then displaying the content relative to the position of the containing widget on the display; then, based on widget sequence numbers, selecting a next widget, determining its position relative to the first widget, and displaying its content relative to the position of its containing widget; and so on until all content blocks are displayed relative to the positions of their containing widgets on the display. The first widget is then placed on the display of a computational device to signify the current content block under performance and to provide placekeeping and other interaction through the display of the widgets as work progresses at job performance time.
  • the widgets associated with some select sections of the procedure/instructions may be superimposed/overlaid on top of the saved digitized documents, and the performance of some other select sections, including but not limited to steps within select instructions, may be transferred to another device, such as smart glasses, without relying on/using the digitized document in the background.
  • a widget associated with a Conditional Action may support branching based on an IF/WHEN [condition], in order to link the widget to another specific widget thus causing automatic navigation to the specific widget at job performance time.
  • the condition may be expressed as a binary or a series of binary questions, for example with Yes/No answers, linked through logical operators such as AND/OR/NOT or EITHER/OR, or any number of logical operators and conditions.
  • a Conditional Action Widget may reference data that was previously entered by a user to make a determination of whether a widget and its content should be displayed at all when a prior completed widget pointing to it is marked as completed or Not Applicable (N/A).
  • FIG. 6 shows an example of the use of a Conditional Action step widget 416.1.3, within one embodiment of IAT 402, to define automatic branching and navigation to the next interaction widget based on dynamic user input later provided at job performance time within the IPT 702.
  • the interaction widget 416 associated with the “INSTRUCTIONS” section of the displayed digitized document contains a Step interaction sub-widget 416.1 associated to the first step of the instructions section.
  • the Step interaction sub-widget 416.1 itself contains five interaction Step sub-widgets: 416.1.1, 416.1.2, 416.1.3, 416.1.4, and 416.1.5.
  • the sub-widget 416.1.3 associated to the third sub-step is a Conditional Action step interaction widget overlaid on top of the corresponding content block of the displayed digitized document.
  • the original instruction in the digitized document read “If Pump cover is not installed, Go To Section 6.2”; however, because the Conditional Action widget was overlaid (vs. superimposed), the original content block is covered by the widget, and the interaction designer configures the Conditional Action widget to read: “Is Pump cover installed?”, provides two buttons corresponding to the two possible binary answers of “Yes” or “No”, and in this case configures the “Yes” answer to go to the widget 416.1.4.
  • when “Yes” is selected, the IPT 702 displays/activates the widget 416.1.4 associated with the next sequence number, 6.1.4; if “No” is selected, the IPT 702 branches and displays/activates the widget 416.2 with sequence number 6.2 (and, in this case, also its first child widget 416.2.1 at sequence number 6.2.1).
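The branching behavior in this example can be sketched as a simple answer-to-sequence-number table; the class below is a hypothetical illustration, not the disclosed implementation:

```python
# Illustrative sketch of a Conditional Action widget's branching table,
# mirroring the FIG. 6 example: "Is Pump cover installed?" with
# Yes -> sequence 6.1.4 and No -> sequence 6.2.

class ConditionalAction:
    def __init__(self, question, branches):
        self.question = question
        self.branches = branches   # answer -> next widget sequence number

    def next_widget(self, answer):
        # At job performance time, the user's answer selects the next
        # widget to activate, causing automatic navigation.
        return self.branches[answer]

step = ConditionalAction("Is Pump cover installed?",
                         {"Yes": "6.1.4", "No": "6.2"})
step.next_widget("No")   # branches to sequence 6.2
```

A series of binary questions joined by AND/OR could be evaluated the same way before the table lookup.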
  • a widget associated to a Repeat section is provided to support two basic forms of repeating: 1) by embedding a Conditional Action widget that causes automatic navigation through branching to a prior step/widget, 2) by being configured to repeat a specified number of times.
  • the user will be able to see the original performance results and its associated data for each step without obscuring the use of the steps for the repeat, for example, through multiple tabs, where each tab is associated to an iteration of the repeat with the current iteration in the first tab.
  • IPT 702 either navigates the user back to the origin or to the next sequential widget, depending on the rules configured through IAT 402 or the structured metadata.
  • an interaction widget, for example one associated with a simple Action Step, may be configured to enforce time requirements. These requirements may include, but are not limited to, minimum time, maximum time, or a time range for the performance of the widget.
  • the user is provided with a clock/stopwatch that he/she may manually start/stop/reset, where the clock changes visual state when the time requirement is met, after which the user may mark the step as completed.
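The time-requirement check described above can be sketched as follows; the function name and the threshold values are hypothetical:

```python
# Illustrative check of a widget's configured time requirement
# (minimum time, maximum time, or a range).

def time_requirement_met(elapsed_seconds, minimum=None, maximum=None):
    # Returns True only when the elapsed performance time satisfies
    # whichever bounds are configured on the widget.
    if minimum is not None and elapsed_seconds < minimum:
        return False
    if maximum is not None and elapsed_seconds > maximum:
        return False
    return True

# A step configured to take between 30 and 120 seconds:
time_requirement_met(45, minimum=30, maximum=120)   # requirement met
time_requirement_met(10, minimum=30)                # below the minimum
```

The clock/stopwatch would change visual state when this check first returns True, allowing the step to be marked completed.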
  • an interaction widget, for example one associated with a simple Action Step, may be configured to require and enforce Concurrent Verification (CV), where two users performing a step/section must be physically present together.
  • both workers will perform the step together and sign off, through initials or digital signature sub-widgets, after the performance of the step/section is completed and before they are allowed to proceed to the next step/section of the procedure/instructions.
  • IPT 702 may visually render a CV step/section distinctly to ensure that the main user and the concurrent verifier are both present before performing the step.
  • an interaction widget may be configured to require and enforce Independent Verification (IV), where the only difference is that the verifier does not have to be present at the time the step/section is being performed.
  • the IPT 702 will not yet progress to the next step; instead, it sends a notification, perhaps through an embedded Notification sub-widget, to the independent verifier with a link to the IV widget requiring verification, and only after the verification is provided will IPT 702 navigate the user to the widget associated with the next step/section.
  • IAT 402 providing procedure/instructions authoring capabilities through a palette 406 of interaction widgets used for authoring new procedures/instructions, containing widget types corresponding to the widget types used for superimposing/overlaying on top of the existing written procedure/instructions, except that these procedure/instructions authoring widgets provide means for authoring and creating new content blocks. Widget instances from this palette may be connected and combined, through sequence numbering, with the widgets used for superimposing interaction on top of existing content blocks.
  • any interaction widget may be added to a set of reusable widget templates, where reusable widgets templates are grouped by their type, for example reusable widget templates of type Caution, and made available within the IAT 402 so they may be combined with other interaction widgets.
  • the sequence number of the corresponding widget template becomes dynamic and, in some such embodiments, is only assigned when an instance of the widget template is connected to or embedded in other interaction widgets, based on the sequence number of the connected or containing widget; if the widget template contains child widgets, the hierarchical sequence numbers of the child widgets are adapted to the parent widget.
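The adaptation of a template's hierarchical sequence numbers at instantiation can be sketched as follows; the function and parameter names are assumptions:

```python
# Illustrative sketch: a reusable widget template's internal sequence
# numbers are dynamic and are re-based when an instance is embedded
# under a parent widget.

def resequence(template_seqs, parent_seq, position):
    # template_seqs: the template's internal numbers, e.g. ["1", "1.1", "1.2"]
    # The instance root becomes parent_seq.position; children follow suit.
    root = template_seqs[0]
    new_root = f"{parent_seq}.{position}"
    return [new_root + s[len(root):] for s in template_seqs]

resequence(["1", "1.1", "1.2"], parent_seq="6", position=3)
# the template instance becomes widgets 6.3, 6.3.1, and 6.3.2
```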
  • another embodiment of the disclosure relates, generally, to the bidirectional integration of data and automatic dispatch of application functions from other applications, such as enterprise asset or work management systems, to/from an existing written procedure/instructions, instructional manual or fillable form, in PDF or any digitized binary format (the digitized document), driven by the selection of a software service interface with typed input/outputs, including but not limited to, standards-based Web Services, whereby selecting an input or output of a service operation, based on its type, an interface widget, configured to be associated to that type, is provided within the IAT 402 that may be overlaid or superimposed on any part of the content block of the digitized document, where said any part of content block may be encapsulated within an existing interaction widget.
  • IAT 402 saves the interface widget together with information regarding the selected software service and its input/outputs as structured metadata in IIDM.
  • an interface widget associated with the input of a service is activated, the user will be prompted to provide the input within the area covered by said input interface widget and once all inputs to a service are provided, the service will be either invoked automatically, or its execution may be tied to a user trigger event such as the press of a button through a button widget.
  • a widget associated with a service output is activated, at job performance time, the area of the content block covered by the interface widget associated with that output will be automatically populated with the associated output data from the execution of the service.
  • the service may be executed, only after all input data is entered by the user. If an interface widget is contained within an interaction widget, it will activate only when the containing interaction widget displays.
  • There are a number of differences between this method of integration and other prior-art methods.
  • One major difference between this method and other previous methods is that the previous methods either allow the user to place an input data widget anywhere on the display or to relate the input data widget to a fixed, inflexible location of an underlying digitized document.
  • an input widget used to take user input at performance time, or an output widget, used to populate a portion of the display with data from other systems, is placed relative to the position of a containing interaction widget, or any container widget used to encapsulate and contain a content block of the digitized document, and regardless of the digitized document and without requiring the digitized document at performance time.
  • the associated software services are automatically executed resulting in direct integration with other software applications through their business logic layer, as opposed to simply saving the data in a database only to be integrated later.
  • the method of containing the input/output interface widgets within other linked interaction/container widgets supports new capabilities that not only do not require the inflexible association of the input/output widget to a hardcoded location of the digitized document, but also provide means for interactive computer-guided navigation at job performance time.
  • a Data Interface Widget for simple data entry by the user, without being tied to the inputs/outputs of a software service.
  • the Data Interface widget may be superimposed/overlaid on any part of a content block that is already contained within any other widget, and the data entered through such a widget is made available throughout the interactive instruction/procedure/form to be referenced by all widgets with sequence numbers greater than the sequence number of the Data Interface Widget or its containing widget.
  • the users of IAT 402 and IPT 702 may enter constants in interface widgets to support computation (e.g., min/max values). The location of a Data Interface Widget is saved relative to its containing widget just as with other interaction or interface sub-widgets.
  • FIG. 2 illustrates a flowchart for one embodiment of the method for defining application and data integration through superimpose/overlay of interface widgets on top of content blocks, or portions thereof, of written procedure/instructions, instructional manuals and/or fillable forms, resulting in structured metadata definitions saved as IIDM.
  • FIG. 2 illustrates obtaining a digitized document at step 201 and displaying the digitized document at step 202.
  • While FIG. 2, step 203, refers to a general “container widget” that encapsulates and contains (through a snagged image or by capturing the contents of) a content block (step 205) of the displayed digitized document such that it can be used independently from the digitized document at job performance time, an “interaction widget” described in earlier embodiments of this disclosure qualifies as a container widget, and thus the processes described in FIG. 1 and FIG. 2 may be combined to address both requirements for user-computer interaction and application integration in a single process.
  • FIG. 2, step 206, accommodates creating interface widgets either based on software services (see steps 206.1, 206.2, and 206.3), as described earlier, or by creating or selecting interface widgets (see step 207), referred to as simple Data Interface widgets, that do not require mapping to inputs/outputs of a software service but are meant to capture dynamic user input data during job performance and make such data available to other widgets for reference, for example in the evaluation of dynamic conditionals and for dynamically branching to the next widget.
  • the interaction designer superimposes/overlays the interface widget over an area of the content block contained within a container/interaction widget and the relative position of the interface widget with respect to the container/interaction widget is added to the widget metadata.
  • In step 210, if the interface widget needs to occupy a space that is larger than the space provided by the specific area of the displayed digitized document that is being contained by its containing container/interface widget, the digitized document may be modified (see steps 210.1 and 210.2) to add more space, following a process as explained earlier for FIG. 1, step 108.
  • Attributes of an interface widget are configured at step 211 .
  • FIG. 2 step 212 illustrates how a container/interaction widget may contain one or more child/sub widgets.
  • portions of the process can be repeated for other content blocks.
  • the process illustrated in FIG. 2 is concluded in steps 214, 215, and 216, where widgets are sequenced and all widget information and attributes are translated into IIDM and saved.
  • a configurable Notification interface widget is provided to cause the automatic publication of a notification to the person performing the job and/or to other users and applications through the dispatch of a software service associated to said widget.
  • the Notification widget may be configured to evaluate dynamic input/output data.
  • the notification message provides a link such that the person may launch an instance of the interactive procedure/instructions/form in an instance of the IPT 702 and automatically navigate to the same place where the notification was originated.
  • a configurable Equipment interface widget is provided to integrate events and data from specific Internet of Things (IoT) enabled equipment or to cause such equipment to perform a specific action at job performance time.
  • the Equipment widget can be configured to connect to specific equipment through its unique identifier, which may be a URL, in order to invoke specific APIs on the processor of that equipment.
  • the processes of FIG. 1 and FIG. 2 may easily be combined to create a single IIDM metadata containing definitions of both interface widgets and interaction widgets to provide interactions as well as integrations at job performance time for a single digitized document that represents written procedure/instructions, instructional manuals and/or fillable forms.
  • This may be accomplished, through one embodiment of this disclosure, generally, using interaction widgets defined through the process illustrated in FIG. 1 as “container widgets” referred to in Step 203 of FIG. 2 .
  • This can be simply accomplished by replacing Steps 201, 202, 203, 204, and 205 in FIG. 2 with Steps 101 through 110 of FIG. 1.
  • interaction widgets may be added inside of the interface widgets.
  • There are other ways of combining the processes illustrated in FIG. 1 and FIG. 2, for example by inserting steps 206 through 212 of FIG. 2 between steps 109 and 110 of FIG. 1 such that the creation of interface widgets and interaction widgets may be intermingled.
  • FIG. 1 and/or FIG. 2 or any flowchart that is derived by combining the processes in FIG. 1 and FIG. 2 into a single process resulting in a single IIDM that can be applied to the digitized document.
  • one variation may allow some interface widgets to be placed inside of other interface widgets or their sub-widgets.
  • the IPT 702 interprets the saved IIDM metadata resulting from the process in FIG. 1 and/or FIG. 2 or any new process resulting from combining the two processes, as mentioned earlier, and applying the combination to the digitized document.
  • the IPT 702 may display the content blocks encapsulated by interaction/container widgets through any of the following means, or other appropriate means: 1) by displaying a copy of the saved version of the digitized document in a scrolled window and providing placekeeping and interactions/integrations by superimposing/overlaying the first interaction/container widget corresponding to the first contained content block of the digitized document, based on the associated coordinates, sequencing, and other information saved in the metadata, and then, when the current widget is marked as completed, displaying the next interaction/container widget in the sequence, and so on until the last widget is displayed and its interactions are performed.
  • the widgets are linked and relative to the simulated placement of the first widget (without the display of the widget itself) on an empty computer display/window and then the simulated placement of other widgets relative to the first widget and based on their sequence number, without displaying or relying on the saved digitized document, and then superimposing the first widget over the already displayed content block contained by the first widget to activate the widget and signify the initial content block under performance, and when said initial content block is marked as complete, to display/activate the next relevant widget to signify the next content block to be performed, and repeat the process until all relevant widgets are displayed and marked as completed.
  • the widgets displayed based on the structured metadata are used to provide computer-guidance and application integration and to dynamically advance the user to the next section of the procedure/instructions relevant to the job.
  • FIG. 3 relates to a flowchart of a method to perform, user-computer interaction and computer-aided navigation as well as application and data integration at job performance time, based on the structured metadata IIDM resulting from combining the processes described in FIG. 1 and FIG. 2 and applying the combined process to the digitized document.
  • said resulting IIDM is imported and loaded into computer memory into data structures (that may include hash maps, linked lists, arrays or a combination of other conventional data structures) that are optimized for the implementation of the process in FIG. 3 .
  • step 302 relates to displaying all the content blocks contained by all the interface/container widgets, in the order of their sequence number, starting from the smallest sequence number using either of the means numbered “2)” or “3)” for displaying content blocks encapsulated by interaction/container widgets, described in the above paragraph without requiring a copy of the digitized document.
  • the widget associated to the first displayed content block is activated/displayed around said content block and marked as the “Current widget” (see step 304) to provide a placekeeping function, and all its sub-widgets are processed (see steps 305, 306, 306.
  • the widget may be marked as completed automatically (see step 307 ), for example, upon interaction requiring acknowledgment of the content block by the user, or manually by the user when automatically marking as completed is not an option.
  • IIP determines the next interface widget to be activated/displayed, if any, and marks the widget as “Current widget” (see steps 308, 309, 310, and 311).
  • the “Current widget” completes all its interactions and is marked as completed through step 307; when said widget points to a next interaction widget (again, such next widget may be determined conditionally), as determined in step 308 of FIG. 3, the next widget is displayed and activated, according to step 311 of FIG. 3, and the processing of that widget will loop back to step 304 of FIG. 3, thus said next interaction widget will be marked as the “Current widget”.
  • the first or any subsequent widget that was marked as the “Current widget”, through performing step 304 of FIG. 3 does have one or more relevant child/sub-widgets, where such relevance may be determined through dynamic conditional branching based on job data gathered so far through the process, as determined in step 305 of FIG. 3 , all the sub-widgets of the “Current widget” and, in the case where any sub-widget is yet another interaction widget, all of its sub-widgets, and so on and so forth will be processed by the embodiment of IPT 702 according to the step 306 of FIG. 3 together with its sub-steps.
  • One can modify the process described in FIG. 3 to allow processing of interface widgets that contain other interface widgets, to any level of depth.
  • each sub-widget of the “Current widget” is either an interface widget or an interaction widget.
  • Each relevant sub-widget of the “Current widget” is processed through the sub-steps of step 306 .
  • the processing continues through step 309 to step 310 of FIG. 3, where the “Current widget” that is the parent interaction widget of the sub-widget is placed on a Stack data structure so its processing continues after its sub-widget is processed.
  • In step 311, the interaction sub-widget is activated/displayed and processing loops back to step 304, where the sub-widget becomes the “Current widget”.
  • interaction sub-widgets within other interaction sub-widgets, and so on and so forth to any level of depth may be processed.
  • the parent widget of an interaction sub-widget is pushed on a processing “Stack”, such that it can be ‘popped’ after its interaction sub-widget completes its processing through step 307.c of FIG. 3, to continue its processing as the “Current widget”.
  • the use of a Stack data structure accommodates embedded interaction sub-widgets to any level of depth.
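The stack-based processing of nested interaction sub-widgets described for FIG. 3 can be sketched as follows; the dict-based widget shape and the activate/complete trace are illustrative assumptions, not the disclosed data model:

```python
# Sketch of the stack-based "Current widget" processing: a parent is
# pushed on a stack while its sub-widgets are processed, so nesting to
# any level of depth resumes correctly.

def perform(widgets):
    trace = []                 # ordered record of widget activity
    stack = []                 # parent widgets awaiting completion
    work = list(widgets)       # widgets still to activate, in sequence order
    while work or stack:
        if work:
            current = work.pop(0)
            trace.append("activate " + current["seq"])
            if current["children"]:
                # Park the parent and its pending siblings on the stack
                # so processing resumes after the children complete.
                stack.append((current, work))
                work = list(current["children"])
            else:
                trace.append("complete " + current["seq"])
        else:
            parent, pending = stack.pop()   # 'pop' the parent as "Current widget"
            trace.append("complete " + parent["seq"])
            work = pending
    return trace

trace = perform([
    {"seq": "6.1", "children": [
        {"seq": "6.1.1", "children": []},
        {"seq": "6.1.2", "children": []},
    ]},
    {"seq": "6.2", "children": []},
])
```

Note how widget 6.1 is only marked complete after both of its children, reproducing the auto-completion rule for parent widgets.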
  • a relevant sub-widget is an interface widget, as determined in step 306.1
  • the sub-widget is a Data Interface widget that, based on its defined type, is used either to gather data by prompting the user or to display data, and will be processed through step 306.6.
  • the processing of the interface sub-widget moves to step 306.3, where the widget is determined to be tied either to the input(s) or the output(s) of a software service.
  • In step 306.4, the associated software service is executed, if it was not already executed, and the associated service output data is displayed through the sub-widget within the area of the content block occupied by the sub-widget. Otherwise, if the sub-widget is tied to input(s) of the associated service, in step 306.5 the user is prompted for the referenced input data through the associated sub-widget and the service is executed when the last input data is gathered through an interface sub-widget. Steps 306.4, 306.5, and 306.6 all go through step 306.7 of FIG. 3, where all input and/or output data gathered is made available to be referenced and accessed by all subsequent widgets and sub-widgets, and processing continues to the next sub-widget, if any, through step 306. Once all sub-widgets of the “Current widget” are processed, processing moves to step 307, where all remaining interactions of the “Current widget” are completed and the widget is marked as complete; if there are any parent widgets on the Stack, each parent widget is ‘popped’ in order and its processing resumes as the “Current Widget”.
  • the process illustrated in FIG. 3 does not address processing of interface widgets as sub-widgets of other interface widgets. One with ordinary skill in the art acknowledges that this process may be easily modified to accommodate such processing.
  • If the first or any subsequent widget that was marked as the “Current widget” performing its interactions does not have any child/sub-widgets, as determined in step 305 of FIG. 3, and it does not point to a next interaction widget, as determined in step 308 of FIG. 3, there are no more widgets to be processed and thus all relevant interactions and integrations associated with the job have already been performed; processing moves to step 312, which signifies the end of processing of IIDM by this embodiment of IPT 702 according to FIG. 3.
  • FIG. 7 shows an example of an embodiment of the Interaction Performance Tool (IPT) 702 , performing user-computer interaction, computer-guided navigation and placekeeping as shown (and partially defined) in FIG. 6 .
  • In FIG. 7, the performance of steps 6.1 and 6.2 was already confirmed by the user through their associated Action Step interaction widgets, and thus they are marked as “Completed” by the IPT.
  • the user has selected the button labeled “No” when the overlaid widget 416.1.3 associated with step 6.1.3 was activated/displayed, and therefore the widgets 416.1.4 and 416.1.5 associated with steps 6.1.4 and 6.1.5
  • any of the widgets, regardless of type, provides a set of general interaction capabilities at job performance time through IPT 702, some of which may be configured through the IAT 402 or the structured metadata associated with interaction widgets and their sub-widgets.
  • These capabilities may include, but are not limited to: a) the ability to add comments to each widget, through say a Comment sub-widget that can be placed and relocated to any area within said interaction widget by the user, b) the ability for a user to provide feedback/change-request on a content-block associated with an interaction widget, for example through a Feedback sub-widget similar to the Comment sub-widget, such that the procedure/instructions writer may consider the feedback at a later time, c) the ability to flag any interaction widget to be added/removed to/from bookmarks for easy navigation, d) the ability to expand or collapse all or some of the interaction widgets, including their child interaction widgets, e) the ability to capture media (ex.
  • the person defining the interactions through widgets can specify what percentage or how many units of work are associated with each widget relative to the overall work, and for sub-widgets relative to the parent widget, i) the ability to enforce the navigation of the widgets associated with the numbered sections and steps of instruction, and the sub-widgets associated with bulleted sections in an interactive procedure/instructions, in order, such that sequential performance is enforced and a later step/widget is not allowed to be marked complete (or N/A if not applicable) until the prerequisite step/widget is marked complete, or out of sequence, such that the associated steps/widgets can be performed in any order and marked as complete (or N/A if not applicable), based on the selected widget properties designated through IAT 402 or expressed in the structured metadata associated with a procedure/instructions/form, j) the ability to automatically mark a parent widget complete when all its child widgets are dispositioned or marked as complete, k) the ability to automatically provide centralized menus or Table of Contents for all
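The sequential-performance enforcement described in capability (i) above can be sketched as follows; the dictionary-based data model and function names are assumptions for illustration only:

```python
# Hedged sketch (data model assumed) of capability (i): when sequential
# performance is enforced, a widget may be marked complete or N/A only
# after its prerequisite widget has been dispositioned; in out-of-sequence
# mode, any order is allowed.

def may_disposition(widget, statuses, enforce_sequence=True):
    if not enforce_sequence:
        return True                      # out-of-sequence: any order allowed
    prereq = widget.get("previous")      # prerequisite step's sequence id
    return prereq is None or statuses.get(prereq) in ("Completed", "N/A")

def mark_complete(widget, statuses, enforce_sequence=True):
    if not may_disposition(widget, statuses, enforce_sequence):
        raise ValueError("step %s blocked: prerequisite %s not dispositioned"
                         % (widget["sequence"], widget["previous"]))
    statuses[widget["sequence"]] = "Completed"
```

Capability (j), automatically completing a parent once all children are dispositioned, would follow the same pattern by checking the statuses of a parent's child sequence numbers.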
  • Another embodiment of the disclosure relates, generally, to capturing and aggregating information about the dynamic, user-computer interaction during job performance to provide job performance monitoring, statistics and analytics.
  • Examples of such data include, but are not limited to, the location of job performance through device GPS coordinates, the elapsed time between the completion of each step of instruction, and other dynamic data resulting from user-computer interaction throughout job performance.
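A minimal sketch of such capture and aggregation, with field names assumed rather than taken from the patent, might record one event per step disposition so that elapsed times can be derived later:

```python
# Minimal sketch (field names assumed) of aggregating job-performance data:
# each disposition event records when and, optionally, where a step was
# completed, so the elapsed time between steps can be derived for analytics.

import time

class PerformanceLog:
    def __init__(self):
        self.events = []                       # ordered disposition events

    def record(self, sequence, status, gps=None, timestamp=None):
        self.events.append({
            "sequence": sequence,
            "status": status,
            "gps": gps,                        # e.g. (lat, lon) from the device
            "timestamp": timestamp if timestamp is not None else time.time(),
        })

    def elapsed_between(self, seq_a, seq_b):
        """Seconds between the completion of two steps."""
        t = {e["sequence"]: e["timestamp"] for e in self.events}
        return t[seq_b] - t[seq_a]
```

Aggregating such logs across many job performances would then feed the monitoring, statistics and analytics the embodiment describes.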
  • Another embodiment of the disclosure relates, generally, to automatic generation of the IIDM by programmatically reading a stream of digitized written procedure/instructions, for example in PDF format, through a computer processor and into a computer memory, parsing and analyzing the contents, and dispatching subroutines to programmatically superimpose widget metadata corresponding to typical procedure/instructions labels encountered, where the label-to-widget associations may be configuration-driven. For example, when a Caution section is encountered in the stream, a rectangular Caution interaction widget is automatically created with its dimensions corresponding to the area needed to sufficiently contain the content of the associated Caution section, and a sequence number that is automatically assigned considering the sequence number of the last automatically generated widget, then saved within the IIDM.
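The configuration-driven label-to-widget dispatch described here can be sketched as follows; the mapping table, widget type names, and bounding-box representation are all assumptions for illustration:

```python
# Hypothetical sketch of configuration-driven label-to-widget dispatch for
# automatic IIDM generation: as labeled sections ("Caution", "Note", ...)
# are encountered in the parsed document stream, a widget record is created
# with an auto-assigned sequence number and a bounding box sized to the
# section's content area. All names are illustrative assumptions.

LABEL_TO_WIDGET = {            # configuration-driven association
    "Caution": "CautionWidget",
    "Note": "NoteWidget",
    "Warning": "WarningWidget",
}

def generate_widgets(sections):
    """sections: iterable of (label, bounding_box) from the parsed stream."""
    widgets, next_seq = [], 1
    for label, bbox in sections:
        widget_type = LABEL_TO_WIDGET.get(label)
        if widget_type is None:
            continue                       # unrecognized label: no widget
        widgets.append({"type": widget_type, "bbox": bbox,
                        "sequence": float(next_seq)})
        next_seq += 1                      # follows the last generated widget
    return widgets
```

Because the association table is plain configuration, new section labels can be supported without changing the parsing code.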
  • Another embodiment of the disclosure relates, generally, to adding palette(s) of widgets to IAT 402, designed for encapsulating commands that may drive one or more computerized machine(s) capable of carrying out a complex series of actions, where each command drives one or more action(s) and relates to a step of an instruction encapsulated with a widget from said additional palette(s).
  • One or more computerized machine(s) may be automatically driven to perform said step(s).
  • Another embodiment of the disclosure relates, generally, to automatic adjustment of the structured metadata in the IIDM associated with an original digitized document, based on a revised version of that document, by programmatically comparing the revised version with the original, identifying the sections deleted from and added to the original document, and then adjusting the IIDM to delete the widgets associated with the deleted sections and to re-compute the coordinates of the remaining widgets in the IIDM to account for the new coordinates of the corresponding sections.
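The adjustment step can be sketched as follows; the widget structure (a dictionary with a section identifier and a vertical coordinate) and the inputs are assumptions chosen only to illustrate the delete-then-recompute behavior:

```python
# Hedged sketch (structures assumed) of adjusting the IIDM for a revised
# document: widgets whose sections were deleted are dropped, and the
# coordinates of the remaining widgets are shifted to match the sections'
# new positions in the revision.

def adjust_iidm(widgets, deleted_sections, new_y_by_section):
    """widgets: list of dicts with 'section' and 'y' keys."""
    adjusted = []
    for w in widgets:
        if w["section"] in deleted_sections:
            continue                              # section removed: drop widget
        adjusted.append({**w, "y": new_y_by_section[w["section"]]})
    return adjusted
```

Identifying the deleted and added sections themselves would be done by the document comparison step the embodiment describes; this sketch covers only the metadata adjustment that follows.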
  • The methods illustrated above may be practiced by a computer system including one or more processors and computer-readable media such as computer memory.
  • The computer memory may store computer-executable instructions that, when executed by one or more processors, cause various functions to be performed, such as the acts recited in the embodiments.
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, as discussed in greater detail below.
  • Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
  • Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
  • Computer-readable media that store computer-executable instructions are physical storage media.
  • Computer-readable media that carry computer-executable instructions are transmission media.
  • Embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: physical computer-readable storage media and transmission computer-readable media.
  • Physical computer-readable storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage (such as CDs, DVDs, etc.), magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
  • A network or another communications connection can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above are also included within the scope of computer-readable media.
  • Program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission computer-readable media to physical computer-readable storage media (or vice versa).
  • Program code means in the form of computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer-readable physical storage media at a computer system.
  • Computer-readable physical storage media can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
  • The invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like.
  • The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
  • Program modules may be located in both local and remote memory storage devices.
  • The functionality described herein can be performed, at least in part, by one or more hardware logic components.
  • Illustrative types of hardware logic components include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.

Abstract

Various embodiments described in this disclosure relate to methods, and computer-based systems that implement those methods, for overlaying or superimposing computer-user interaction widgets and application interface widgets on top of content blocks contained within digitized documents representing written procedures or instructions, instructional manuals, and fillable forms, in order to provide dynamic user-computer interaction and computer-guided navigation, as well as application function and data integration, driven through structured interaction and integration metadata definitions relating to said content blocks, during job performance.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 16/353,895 filed on Mar. 14, 2019, entitled “Method Of Defining And Performing Dynamic User-Computer Interaction, Computer Guided Navigation, And Application Integration For Any Procedure, Instructions, Instructional Manual, Or Fillable Form,” which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/676,197 filed on May 24, 2018 and entitled “Method of Defining and Performing Dynamic Human-Computer Interaction, Computer Guided Navigation, And Application Integration for Any Procedure, Instructions, Instructional Manual, or Fillable Form,” which applications are expressly incorporated herein by reference in their entirety.
  • BACKGROUND
  • Across any industry that requires a structured approach to job performance, many jobs performed by people are guided by written procedures and work instructions, which ensure reliable and consistent performance of tasks and minimize the risk of user errors. Even instructional manuals for goods requiring setup and maintenance contain instructions that need to be followed by a person using such goods.
  • Both instructional manuals as well as written procedures/instructions direct the actions of a person performing the job in a defined sequence, while, optionally, providing other information and documents such as diagrams, drawings, illustrations, images, and fillable forms.
  • Instructional manuals, written procedures and work instructions are often presented to one who performs a job in the form of paper documents or an equivalent digitized document, such as a PDF, on a computer device.
  • In more mission-critical industries, such as Nuclear or Oil & Gas, field workers often carry a large stack of paper work packages, or an equivalent digitized set of documents on a computer device, to the job performance site just to complete a single work order. Often such procedures and instructions encompass multiple different scenarios and thus require complex navigation of documents, whether physical documents or electronic, to the applicable scenario by the field worker.
  • Field workers must often manually sift through volumes of information pertaining to irrelevant scenarios in order to navigate to the applicable steps, diagrams and instructions. Not only does this reduce the “wrench time”, or the time spent actually performing the task at hand, but it may also lead to unintentional user errors. Paper procedures and instructions, or their digitized equivalents, require manual and time-consuming navigation to relevant information and the next step, and lack computational support, placekeeping of instruction steps, verification of tools and equipment pertaining to the procedure/instructions, verified acknowledgment of Cautions and Notes within instructions or other content blocks, verified completion of signatures, and support for other aspects of job performance.
  • Furthermore, non-interactive written procedures/instructions only allow for job monitoring once the job is complete, and only based on an employee's report of the job.
  • There are various innovative approaches to defining and performing user-computer interaction and computer-guided navigation when it comes to the performance of written procedures/instructions. However, any approach that requires fundamentally changing the way procedures/instructions are authored will suffer from significant change management costs and user adoption issues. For example, a typical nuclear utility invests tens of millions of dollars in the process of writing, reviewing, approving, and organizing tens of thousands of work instructions and procedures. In a nuclear utility, procedure writers currently use advanced, mature and full-featured text authoring tools such as Microsoft Word or Adobe FrameMaker to define procedures and work instructions. Abruptly replacing those with a new tool for authoring instructions as well as user-computer interaction is not only a colossal undertaking but also carries tremendous change management costs in adopting the new process and tools for procedure writing. Beyond the change management costs, such approaches pose a significant risk to adoption and user acceptance, as procedure writers are currently accustomed to mature software for authoring instructions (without the interaction or application integration).
  • The subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.
  • BRIEF SUMMARY
  • The scope of the present invention is not limited to any degree by the statements within this summary.
  • Various embodiments of the disclosure relate, generally, to computer-implemented systems and methods to define user-computer interactions, computer-guided navigation and application integration for digitized written procedures/instructions, instructional manuals and/or fillable forms (hereon referred to collectively or individually as the digitized document), in a structured metadata format containing the interaction and interface definitions together with the corresponding content blocks of the digitized document, and to perform the defined interactions and integrations, guided through the structured metadata, at job performance time. The computer-guided interaction and application interface definitions may be defined through an interaction authoring tool (Interaction Authoring Tool—IAT) by overlaying or superimposing user-computer interaction widgets (hereon referred to as interaction widgets) and application integration widgets (hereon referred to as interface widgets) over content blocks of the digitized document to encapsulate each content block through a corresponding widget, and then configuring the interaction and interface properties of said widget, including configuring its sequence of appearance statically or conditionally based on dynamic data, relative to other widgets, for computer-aided navigation at job performance time. The computer-guided interaction and application interface definitions (Interaction and Interface Definition Metadata—IIDM), together with the content blocks encapsulated through the widgets, are stored in a persistent computer data store as structured XML or functionally and sufficiently equivalent data structures. Based on some embodiments of the disclosure, the IIDM may also be generated programmatically. At job performance time, computer-guided interactions and application integration may be performed within an Interaction Performance Tool (IPT) driven by the interpretation of the IIDM.
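As one illustration of storing widget definitions as structured XML, a widget record could be serialized roughly as below; the element and attribute names are assumptions, not the patent's actual IIDM schema:

```python
# Illustrative sketch of persisting a widget definition as structured XML,
# as the summary describes for the IIDM; element and attribute names are
# assumptions, not the patent's actual schema.

import xml.etree.ElementTree as ET

def widget_to_xml(widget):
    el = ET.Element("InteractionWidget", {
        "type": widget["type"],
        "sequence": widget["sequence"],
        "next": widget.get("next", ""),
    })
    content = ET.SubElement(el, "ContentBlock")   # the snagged content block
    content.text = widget["content"]              # travels with the metadata
    return ET.tostring(el, encoding="unicode")

xml_text = widget_to_xml({"type": "Section", "sequence": "1.0",
                          "next": "2.0", "content": "Purpose"})
```

The point of the sketch is only that the widget's configured attributes and its encapsulated content block are persisted together, so the IPT can later reconstruct both the interaction and the content it applies to.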
  • Other embodiments of the disclosure relate, generally, to defining configuration driven interaction widgets which encapsulate content blocks from the digitized document with each widget supporting specific types of user-computer interactions, as it relates to its corresponding content block, and embedding select widgets within other widgets for more granular encapsulation and context sensitive interactions.
  • Yet other embodiments of the disclosure relate, generally, to methods for generating the IIDM programmatically or through the IAT; capturing and aggregating information about the dynamic user-computer interactions during job performance to provide job performance monitoring, statistics and analytics; and supporting computer-guided navigation from the contents encapsulated within one widget to contents encapsulated within another, through static sequencing of widgets or dynamically through the automatic evaluation of conditional statements based on dynamic data. Further embodiments relate to methods accommodating advanced computer guidance for job requirements, such as repeated action steps and other advanced interactions, through different types of superimposed/overlaid and linked interaction widgets.
  • Yet other embodiments of the disclosure relate, generally, to methods of defining, and supporting at job performance time, the bidirectional integration of data and automatic dispatch of application functions from other applications to/from content blocks of an existing digital written procedure/instructions or fillable form, integrations which may be defined through the selection of software service interfaces with typed inputs/outputs, such as standards-based Web Services (e.g., WSDL/SOAP or RESTful), that are automatically associated with a set of interface widgets that may be placed inside of any interaction widget or any widget encapsulating said content blocks (container widget). Furthermore, some embodiments of the disclosure relate, generally, to providing integrated methods, through typed data interface widgets, to gather user inputs during job performance as they relate to the contents of one interaction widget, and making such data referenceable by and available to subsequent widgets to support dynamic computer-aided navigation and reduce user errors during job performance.
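The typed-data capture and cross-widget referencing described here can be sketched as follows; the dotted addressing scheme, the function names, and the example values are hypothetical:

```python
# Hedged sketch (API assumed) of a typed data-interface widget capturing a
# user input during job performance and making it referenceable by later
# widgets, e.g. to drive a conditional navigation expression. The widget
# ids and field names below are illustrative, not from the patent.

job_data = {}                              # shared, widget-addressable data

def capture(widget_id, field, value, expected_type):
    if not isinstance(value, expected_type):   # typed input: reject bad data
        raise TypeError("%s.%s expects %s"
                        % (widget_id, field, expected_type.__name__))
    job_data["%s.%s" % (widget_id, field)] = value

def evaluate_condition(expression):
    """Evaluate a condition over previously captured, typed values."""
    return expression(job_data)

capture("416.1.2", "pressure_psi", 42.5, float)
branch_to_6_2 = evaluate_condition(
    lambda d: d["416.1.2.pressure_psi"] < 50.0)   # drives next-widget choice
```

Typing the captured value at entry time is one way such a design could reduce user errors before the data is used in later navigation conditions or passed to a backend service.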
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
  • Additional features and advantages will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the teachings herein. Features and advantages of the invention may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. Features of the present invention will become more fully apparent from the following description and appended claims, or may be learned by the practice of the invention as set forth hereinafter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above, and other aspects, features, and advantages of several embodiments of the present disclosure will be more apparent to one of ordinary skill in the art from the following Detailed Description as presented in conjunction with the following several appended drawings, understanding that these drawings depict only typical embodiments and are not therefore to be considered to be limiting in scope, including:
  • FIG. 1 shows a flowchart of an embodiment of a method to define user-computer interaction and computer-aided navigation through superimpose/overlay of interaction widgets on top of content blocks of any digitized written procedure/instructions, instructional manual and fillable forms, resulting in structured metadata definitions.
  • FIG. 2 shows a flowchart of an embodiment of a method to define application and data integration through superimpose/overlay of interface widgets on top of content blocks, or portions thereof, of any digitized written procedure/instructions, instructional manual and fillable forms, resulting in structured metadata definitions.
  • FIG. 3 shows a flowchart of a method to provide/perform, at job performance time, user-computer interaction and computer-aided navigation as well as application and data integration through superimpose/overlay of interaction and interface widgets on top of content blocks of any digitized written procedure/instructions and fillable forms based on structured metadata definitions.
  • FIG. 4 shows an example of an embodiment of the Interaction Authoring Tool (IAT), displaying an example of the digitized document, and an example of an interaction widget palette, with a Section interaction widget, selected from the palette and, superimposed on the content block of the first section of the digitized document together with an example of some of the widget's properties and attributes.
  • FIG. 5 shows the same digitized document as FIG. 4, with more superimposed and linked interaction widgets, where one of the interaction widgets contains child interaction widgets (or interaction sub-widgets) that correspond to instruction steps within the content blocks contained by the parent widget and with more space being added, dynamically, to the displayed digitized document to accommodate space for the Acknowledgment related interactions of some superimposed interaction widgets.
  • FIG. 6 shows the use of a Conditional Action step widget, within IAT, to define automatic branching and navigation to the next interaction widget based on dynamic user input later provided at job performance time within the Interaction Performance Tool (IPT).
  • FIG. 7 shows an example of an embodiment of the Interaction Performance Tool (IPT), performing user-computer interaction, computer-guided navigation and placekeeping based on the superimposed/overlaid interaction widgets and their configured attributes as partially shown/defined in FIG. 6.
  • Elements in the several figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. Also, common, but well-understood elements that are useful or necessary in commercially feasible embodiments are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • The efficiency of performing procedures/instructions may be significantly improved when dynamic information about the job, work environment or equipment/tools is presented, in a graphical user interface, to the person performing the job. Indeed, embodiments are able to provide information to a user that was not previously available. Some of this information resides in the core backend enterprise applications. Also, data entered during job performance, that may be captured using fillable forms or segments thereof, most often belong to the core backend enterprise applications such as Asset Management or Work Management systems. Furthermore, sometimes at job performance time, the ability to conveniently access a function of those backend applications while performing procedures/instructions, for example being able to create a work request through a Work Management system, may significantly improve job efficiency. Thus, embodiments implement an improved computing system that is able to display a standardized procedure/instruction manual or fillable form, while also providing interactive access with additional enterprise data while the user is interacting with the manual or form. Existing written procedures/instructions, instructional manuals or fillable forms, whether in paper form or as a digitized document do not provide such functionality. For example, existing computer systems providing such manuals and forms do not also include interactive instructions guiding a user through the form, nor do such systems provide access to additional enterprise data related to the form.
  • By separating the concerns of instruction authoring from interaction/interface authoring, applying a method of embodiments of this invention minimizes change management costs and results in a non-intrusive, incremental approach to adding computer-guided interaction and application integration to the performance of existing, as well as new, written procedures/instructions, without requiring any significant change to the established existing processes of an organization; instead, it adds a layer of processes, and computer-guided interaction, on top of what already exists.
  • The application of a method of embodiments of this invention provides a non-intrusive, practical and incremental approach and means to improve work efficiency and reduce user error through computer-guided interaction for navigation to the relevant job scenario, information and steps, computational support, automatic placekeeping of instruction steps, verification of tools and equipment pertaining to the procedure/instructions, verified acknowledgment of Cautions and Notes within procedures, and other aspects of performing a job.
  • The following description is meant to enable one of ordinary skill in the art to practice the disclosed embodiments by the way of example and specific details of its key elements. The use of specific details, arrangement of components, specified features, examples or the like do not signify a preferred embodiment and are not intended to limit the scope of this invention to said details or examples. In addition, a subset of the components of this embodiment may be arranged by one of ordinary skill in the art in different configurations than specified and all drawings accompanying this disclosure are for illustrative purposes only. Therefore, one skilled in the relevant art will recognize, however, that the embodiments of the present disclosure can be practiced without one or more of the specific details, or with conventional methods used in the industry or other methods, components, and so forth.
  • In general, well-known structures and operations are not shown or described in detail to avoid obscuring aspects of the present disclosure. Other elements not described herein relate to various conventional methods, components and acts that one with ordinary skill in the art may incorporate in accord with the embodiments of this disclosure. For example, details concerning exception-handling, timing and other similar considerations have been omitted where such detail is not necessary for obtaining a complete understanding of this disclosure by one of ordinary skill in the art. Furthermore, any alterations and further modifications in the illustrated methods, and such further application of the principles of the invention as illustrated herein, are contemplated as would normally occur to one skilled in the art incorporating the method of this invention.
  • As noted above, the following description is not to be taken in a limiting sense, but is made merely for the purpose of describing the general principles of exemplary embodiments. Many additional embodiments of this invention are possible. It is understood that no limitation of the scope of the invention is thereby intended. Reference throughout this specification to “one embodiment,” “an embodiment,” or similar language means that a particular feature, structure, or characteristic that is described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment. Also, the phrases “at least one,” “one or more,” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation.
  • Moreover, no requirement exists for a system or method to address each and every problem sought to be resolved by the present disclosure, for such to be encompassed by the present claims. Furthermore, no element, component, or method step in the present disclosure is intended to be dedicated to the public regardless of whether the element, component, or method step is explicitly recited in the claims. However, that various changes and modifications in form, composition of components and so forth may be made, without departing from the spirit and scope of the present disclosure, as set forth in the appended claims, as may be apparent to those of ordinary skill in the art, are also encompassed by the present disclosure.
  • In one embodiment of the disclosure, generally, a digitized form of the written procedure/instructions, instructional manual and/or fillable forms, in PDF with a rich content format or any other electronic format, referred to as the digitized document, is imported into an Interaction Authoring Tool (IAT) as illustrated at step 101 of FIG. 1 and displayed as illustrated at step 102.1 in the background using a scrolled graphical user interface window on a display of a computing device. The IAT 402 also displays, in a separate window, a rich palette of configurable interaction widgets as illustrated at step 102.2 where each widget in the palette, generally or specifically, corresponds to a different content type within a typical digitized document. FIG. 1 shows a flowchart that depicts the simplified steps, in one embodiment of this disclosure, used to define user-computer interaction and computer-aided navigation through superimpose/overlay of interaction widgets over the content blocks of the digitized document. In FIG. 1, steps 104 and 105, the person defining the interactions (the interaction designer) superimposes (to show the content underneath) or overlays (to block the content underneath or portions thereof) a selected widget, from said palette, over (or on top of) a corresponding content block and resizes the widget to contain and snag the desired content block (depicted in FIG. 1, step 106).
  • For example, the interaction designer may superimpose a Caution interaction widget over the Caution content block of the underlying procedure/instructions, resize the widget to contain the written Caution section, then set the attributes of the widget, such as the job performance sequence designator (e.g., sequence number or other ordered designator), whether this instance of Caution requires acknowledgment, where to place and how to label the means of acknowledgment inside the widget (with one label for pre-acknowledgment and another for post-acknowledgment), the display properties of the acknowledgment, and so forth, to indicate that an acknowledgement of the Caution section is required before the user may move to the next content block of the procedure/instructions at job performance time, and to set other properties available on the widget, which may include look-and-feel properties. Similarly, the interaction designer may superimpose Step interaction widgets over each step of the procedure/instructions, set their display attributes and sequence (through, e.g., sequence attribute numbers), and link the step widgets based on the order of performance, or conditionally through binary questions or more complex conditional statements based on different job scenarios, thus providing automatic, dynamic, computer-guided navigation of instructions within the Interaction Performance Tool (IPT) 702 (see FIG. 7) at job performance time.
  • FIG. 4 shows one embodiment of IAT 402, where an example of written procedure/instructions, used for the inspection of a heater treater pump, is displayed as the digitized document 404 in a scrolled window, with a widget palette 406 displayed in a separate window in the top-right portion of FIG. 4. In this example, an interaction designer had already selected a general-purpose Section widget 408 from the palette 406 and superimposed it over the first section of the digitized document 404, the section titled "Purpose", and had configured some of the properties/attributes of the selected widget, displayed in a window 410 located below the widget palette display in FIG. 4. For example, the selected Section widget 408 is configured to have a "Section Name" attribute with a value of "Purpose" and to be a "Collapsible" widget, meaning that the widget may be displayed collapsed or expanded both within IAT 402, when defining interactions, and through IPT 702 (see FIG. 7), when interactions are being performed. The "Layering Effect" of the selected Section widget 408 is set to "Superimpose" so the widget 408 is superimposed (being transparent) over the "Purpose" section versus being overlaid (which would cover the content block under it). The selected Section widget's sequence number (under Sequence->Current) is set to "1.0"; the previous widget's sequence number (Sequence->Previous) is not set, since the selected widget is the first widget; and the sequence number of the next widget to be performed (Sequence->Next), at job performance time, is set to "2.0" (i.e., Sequence->Current+1) by default. The interaction designer may change the Sequence->Next attribute to point to a widget with a different sequence number or to use a conditional widget to dynamically determine the "Next" widget to automatically navigate to at job performance time, as will be described later. 
Furthermore, the selected widget is configured not to require an acknowledgement by the user of IPT 702 at job performance time before the user may mark the widget as completed and proceed to the next widget containing the next content block to be considered/performed. The selected Section widget, as may be the case with all interaction widgets, also snags the contents of the section or the content block contained within it as part of its metadata, together with the values of its attributes as configured by the interaction designer. One with ordinary skill in the relevant art will acknowledge that, considering the embodiments described in this disclosure, various interaction attributes and IAT 402 designs may be conceived utilizing conventional methods of implementation available in the industry.
  • As illustrated in FIG. 1, step 107, if the digitized document is of a rich content format, such as PDF, the IAT 402 also captures the content of the content block contained by the selected widget as illustrated at step 107.1. In FIG. 1, step 109, the interaction designer configures the relevant user-computer interaction provided by the widget to be dynamically enforced when the contained content block of the digitized document is being followed at job performance time.
  • As illustrated in FIG. 1, step 108, when a superimposed/overlaid widget needs to occupy a space that is larger than the space provided by the specific area of the displayed digitized document that is being contained by the widget, the displayed digitized document is modified, see steps 108.1 and 108.2, to add the required space, and the modified version is displayed instead of the original. As an example, when a Caution widget is superimposed over a Caution section of a digitized document, requiring acknowledgment at job performance time, and there is no room between the Caution section and the next section to insert an acknowledgment button, a modified version of the digitized document, containing the required added space, will be created, and the modified version will replace the last displayed digitized document in the associated display. One way to accomplish the modification is to split the digitized document into two documents, where the first document contains all the content up to the end of said Caution section and the second document contains all the content of the digitized document after the Caution section; to add the desired blank space to the end of the first document; to merge the first and second documents back together, while readjusting the coordinates of all widgets already superimposed/overlaid at locations within the second document to reflect the new coordinates resulting from the addition of the blank space; and to display the merged document instead of the last displayed digitized document.
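The coordinate readjustment step above can be sketched as follows. This is a minimal illustrative sketch, not the disclosure's actual implementation: the dict-based widget records, field names, and the single vertical axis are all assumptions made for the example. When blank space is inserted into the displayed document at a given vertical offset, every widget positioned at or below that offset is shifted down by the inserted amount.

```python
# Illustrative sketch (assumed data model): shift widgets below an
# insertion point down by the amount of blank space added.
def insert_blank_space(widgets, insert_at_y, extra_height):
    """Return a new widget list with y-coordinates readjusted for the
    blank space inserted at insert_at_y."""
    adjusted = []
    for w in widgets:
        if w["y"] >= insert_at_y:
            # Widget lies in the "second document": shift it down.
            w = {**w, "y": w["y"] + extra_height}
        adjusted.append(w)
    return adjusted

widgets = [
    {"name": "Caution", "y": 100, "height": 60},
    {"name": "NextSection", "y": 160, "height": 40},
]
# Insert 30 units of space after the Caution block (which ends at y=160)
# to make room for an acknowledgment button.
shifted = insert_blank_space(widgets, insert_at_y=160, extra_height=30)
```

A real implementation would perform the same readjustment in the coordinate space of the merged PDF pages rather than a single flat axis.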
  • Once the interaction designer has repeated steps 104 through 110 of FIG. 1 for each desired content block or sub-content block (i.e., content within a content block already contained within an interaction widget), repeating until all the desired content/sub-content blocks of the displayed digitized document are contained, the interaction designer moves to step 111 to define the order of appearance/activation of the widgets at job performance time. One of the attributes in common across all widgets is the attribute that defines the order of appearance/activation across the content types contained by the widgets, for example through a sequence number attribute, resulting in computer-guided navigation at job performance time. The order of appearance may be defined by statically linking/pointing one widget to another, for example by one widget referencing another widget's sequence number as its next widget to navigate to; or, conditionally and based on data gathered dynamically at job performance time, one widget may point to one or more possible next widgets depending on dynamic evaluation of a conditional branch. For example, if a condition is evaluated to true, a widget that has just completed its interaction may point to the sequence number of one next widget, and if the condition is evaluated to false, to another next widget. Such conditions may be expressed through special types of interaction widgets, such as the Conditional Action widget described in this disclosure.
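The static and conditional sequencing described above can be sketched as a small navigation table. This is a hedged illustration only: the dict-of-widgets structure, the field names, and the lambda-based condition are assumptions, not the disclosure's data model.

```python
# Illustrative sketch: each widget either names its next widget
# statically, or carries a condition evaluated against data gathered
# at job performance time (field names are assumed for this example).
widgets = {
    "1.0": {"next": "2.0"},                         # static link
    "2.0": {                                        # conditional branch
        "condition": lambda data: data["pump_cover_installed"],
        "next_if_true": "3.0",
        "next_if_false": "5.0",
    },
    "3.0": {"next": None},
    "5.0": {"next": None},
}

def next_widget(seq, data):
    """Return the sequence number of the next widget to activate."""
    w = widgets[seq]
    if "condition" in w:
        return w["next_if_true"] if w["condition"](data) else w["next_if_false"]
    return w["next"]
```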
  • FIG. 5 shows the same digitized document 404 as FIG. 4, with more superimposed and linked interaction widgets, where one of the interaction widgets contains child interaction widgets (or interaction sub-widgets), such as child interaction widget 412.1, that signify steps within the parent widget, and with more space added to the displayed digitized document to accommodate the Acknowledgment-related interactions needed for some of the superimposed interaction widgets configured to require acknowledgment. Note that the metadata for an interaction sub-widget may not be required to contain the snagged image or the content contained within said sub-widget, since the parent widget already contains said content as part of its metadata, and the position of a sub-widget may be stored relative to the position of its containing parent widget.
  • As shown in FIG. 5, the "Purpose" section is encapsulated by the Section widget 408, as also shown in FIG. 4 and explained earlier. Explaining further, the Section widget 414 associated with the content block in the section titled "Scope" was configured to require acknowledgment of consideration/performance at job performance time, and since there was not enough space in the original displayed digitized document 404 to insert an "Acknowledge" button (as a means for enforcing the acknowledgment at job performance time) between the sections titled "Scope" and "Responsibilities", within the Section widget 414 associated with "Scope", more space was added (refer to FIG. 1, step 108 and the explanation of the step provided earlier). Furthermore, the sequence numbers (signified by the Sequence->Current attribute shown in FIG. 4) associated with the widgets, in this case, correspond to the numbering of the sections/sub-sections of the digitized document. Accordingly, in this case, the Previous and Next widget references for each widget correspond to the previous and next section's numbering. Furthermore, as demonstrated in FIG. 5, the interaction Section widget 412 associated with the fourth section of the displayed digitized document, titled "PRECAUTIONS & LIMITATIONS", contains interaction sub-widgets corresponding to each step of said section. These sub-widgets have hierarchical sequence numbers, such as "4.1" and "4.2", assigned based on the sequence number of their parent widget.
  • Once the interaction designer has contained all the desired content (or sub-content) blocks through corresponding interaction widgets (or sub-widgets) and finished the configuration and the static, or dynamic and conditional, sequencing of the widgets, information associated with all the widgets is converted to structured metadata and saved as IIDM (or other appropriate data) as illustrated in FIG. 1, steps 112 and 113. The information includes, but is not limited to, the superimposed/overlaid position of the graphical widgets with respect to the displayed digitized document; a snagged image corresponding to the content block of the digitized document that is contained by each widget (note that this may not be needed for interaction sub-widgets, as noted earlier); all the original content of the corresponding content block contained by the widget, if available and needed; and all of the widget properties, conditional and sequencing information, and other attributes. Also, a copy of the last displayed version of the digitized document is saved with the metadata.
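One way to picture the structured metadata described above is as a serializable record per widget. The JSON shape, key names, and file names below are purely illustrative assumptions; the disclosure does not specify the IIDM encoding, only the kinds of information it carries (positions, snagged images, content, attributes, and sequencing).

```python
# Hedged sketch of IIDM-style structured metadata (all keys assumed).
import json

iidm = {
    "document": "heater_treater_inspection.pdf",   # hypothetical file name
    "widgets": [
        {
            "type": "Section",
            "attributes": {"Section Name": "Purpose", "Collapsible": True,
                           "Layering Effect": "Superimpose"},
            "sequence": {"current": "1.0", "previous": None, "next": "2.0"},
            "position": {"x": 40, "y": 120, "width": 500, "height": 90},
            "snagged_image": "purpose_section.png",
            "content": "This procedure describes ...",
        }
    ],
}

serialized = json.dumps(iidm)   # saved alongside the last displayed document
restored = json.loads(serialized)
```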
  • Yet, in another embodiment of the disclosure, generally, a set of interaction widgets are provided to correspond, more specifically, to the sections within typical procedure/instructions, including but not limited to, Purpose, Scope, Responsibilities, Precautions and Limitations, Prerequisites, Parts and Materials, Instructions, References, and Attachments or a General Content widget that may contain any content block. Alternatively, or in conjunction with such specialized widgets associated with each typical section or content block, a general-purpose Section widget (as shown in FIG. 5), with a “Section Name” as one of its configurable attributes, may be provided that may be generically mapped to any section within the digitized document by the interaction designer and its “Section Name” attribute can be set to the title of the mapped section.
  • For example, if the Section widget is mapped to a section of the digitized document titled “Purpose”, then the “Section Name” attribute of the widget may be set to “Purpose” by the interaction designer (or as discussed later through automation). Furthermore, another set of widgets (or sub-widgets) may be pre-configured to associate to the typical subsections under the Instructions section, including but not limited to, Step or Action Step, Time Dependent Step, Continuous Action Step, Note, Caution, Warning, Conditional Action, Branch (or Go to), Component Verification, Reference (or Refer to), Bulleted list, Figure, Table, Critical Step, and Hold/Inspection Point or Independent Verification. These typed widgets are examples of widgets that may be available in the displayed “Interaction Widget Palette” referred to in FIG. 1, step 102.2.
  • In some embodiments, a general Section widget with general and further configurable interaction properties is mapped to a section of a content block when no explicit keyword association is configured to match with a title of what is assumed to be a section of the content block. One of the attributes of the widget can capture the title, if any, of the section.
  • In some embodiments, a General Container or Content widget with general and further configurable interaction properties is mapped to a content block when no explicit keyword association is configured and matched with the title of a content block. One of the attributes of such a widget is designed to capture the title, if any, of the content block.
  • Yet, in another embodiment of the disclosure, generally, an interaction widget associated with some content type may contain child interaction widgets, and those child interaction widgets may contain other child interaction widgets, to any level of depth. One approach to capturing the order of widget navigation is to associate hierarchical sequence numbers with widgets. For example, if the interaction widget associated with an Instruction section of the digitized document has a sequence number of 5, the first child Step interaction widget corresponding to the first step of the Instruction widget will have a sequence number of 5.1, the second child Step widget will have a sequence number of 5.2, and so on. This sequence-numbering scheme can accommodate any level of widget parent/child relationship. The content type associated with a parent interaction widget will be marked completed automatically when all its relevant child interaction widgets are marked completed.
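The hierarchical numbering and the parent-completion rule above can be sketched in a few lines. The function names are assumptions for illustration; the scheme itself (dotted numbers to arbitrary depth, parent complete when all children are complete) is as described.

```python
# Sketch of hierarchical sequence numbers and automatic parent completion.
def child_sequence(parent_seq, index):
    """Sequence number for the index-th (1-based) child of parent_seq."""
    return f"{parent_seq}.{index}"

def parent_complete(children_done):
    """A parent widget completes exactly when every relevant child has."""
    return all(children_done)

# The Instruction section with sequence "5" gets children 5.1, 5.2, 5.3;
# nesting works to any depth (e.g., children of 5.1 are 5.1.1, 5.1.2, ...).
kids = [child_sequence("5", i) for i in (1, 2, 3)]
```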
  • Yet, in another embodiment of the disclosure, generally, the set of available widgets, their interaction and interface behavior and their association with the sections and content blocks of procedure/instructions may be configuration-driven. One way to approach the configuration mechanism is to introduce a superset widget that provides a superset of capabilities, for all possible interactions, and properties to turn those capabilities on/off, say with defaults set to off, and then to allow the user to define any number of named custom widgets each with a relevant subset of those capabilities, by turning on each desired capability through configuration. For example, the user may define a widget named “Caution” widget, using an instance of said superset widget, and only turn on the “acknowledgment required” and placekeeping interaction capabilities of the associated instance of superset widget through an XML-based configuration mechanism.
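The XML-based configuration mechanism described above might look like the following. The element and attribute names (`customWidgets`, `widget`, `capability`, `enabled`) are assumptions invented for this sketch; the disclosure specifies only that a named custom widget is derived from a superset widget by switching individual capabilities on, with defaults off.

```python
# Hedged sketch of defining a "Caution" custom widget from a superset
# widget via an XML configuration (schema is assumed, not specified).
import xml.etree.ElementTree as ET

config = """
<customWidgets>
  <widget name="Caution" base="SupersetWidget">
    <capability name="acknowledgmentRequired" enabled="true"/>
    <capability name="placekeeping" enabled="true"/>
    <capability name="timeRequirement" enabled="false"/>
  </widget>
</customWidgets>
"""

def enabled_capabilities(xml_text, widget_name):
    """Return the set of capabilities turned on for a named custom widget."""
    root = ET.fromstring(xml_text)
    for w in root.iter("widget"):
        if w.get("name") == widget_name:
            return {c.get("name") for c in w.iter("capability")
                    if c.get("enabled") == "true"}
    return set()
```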
  • Yet, in another embodiment of the disclosure, generally, a dynamic Table of Contents (TOC) of all the widgets superimposed/overlaid on top of the digitized document is represented on a display, in the order defined by the links across the widgets, through their sequence numbers or other means of sequencing, with each entry in the TOC uniquely representing a superimposed/overlaid widget, in a manner that the TOC can be used, through hyperlinks or similar means, to navigate to a particular widget in the display that contains the content blocks of the digitized document, both within IAT 402 and IPT 702.
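Building such a TOC amounts to sorting the widgets by sequence number and emitting one entry per widget. A sketch follows, with the widget record shape assumed; note that hierarchical numbers such as "4.2" and "4.10" must be compared numerically per component, not as strings.

```python
# Illustrative sketch: order TOC entries by hierarchical sequence number.
def toc_entries(widgets):
    """widgets: list of dicts with 'sequence' and 'title' keys (assumed).
    Returns (sequence, title) pairs in navigation order."""
    def key(w):
        # "4.10" -> (4, 10) so it sorts after "4.2" -> (4, 2).
        return tuple(int(part) for part in w["sequence"].split("."))
    return [(w["sequence"], w["title"]) for w in sorted(widgets, key=key)]

entries = toc_entries([
    {"sequence": "4.10", "title": "Final check"},
    {"sequence": "1.0", "title": "Purpose"},
    {"sequence": "4.2", "title": "Limitations"},
])
```

In an actual IAT/IPT each entry would additionally carry a hyperlink target resolving to the widget's position in the display.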
  • Yet, in another embodiment of the disclosure, generally, the interaction designer may define named views and then associate each superimposed/overlaid widget to one or more of said named views through an attribute provided on the widget for defining such associations such that, at job performance time, the user of the IPT 702 is able to switch to different views, where each selected view only displays the content of the widgets associated to said view.
  • Yet, in another embodiment of the disclosure, generally, the interaction designer can divide the digitized document into two parts: 1) the Front Matter (usually containing sections such as Purpose, Scope, Responsibilities, Precautions and Limitations, and Prerequisites) versus 2) the Body (usually containing the procedure or instructions), and then require the person performing the job, at job performance time, to acknowledge the Front Matter before he/she may proceed to the Body. Alternatively, the IAT 402 may provide a widget designed to contain multiple sections of the digitized document, say a Section widget, in this case containing all the sections of the Front Matter, turn on the acknowledgment interaction for that widget, and link it to the first widget of the Body.
  • Yet, in another embodiment of the disclosure, generally, when the digitized document is expressed in a rich content type, by extracting and storing the original contents of the content block contained by a widget with the associated widget metadata, the procedure/instructions and the user-computer interaction widgets associated to its sections may also be presented with a different form factor on any device, such as a smart phone or smart glasses, at job performance time, without relying on the digitized document in the background. This is done by first determining the position, on the device display, of the widget with the smallest sequence number and displaying its content relative to the position of that containing widget; then, based on widget sequence numbers, selecting the next widget, determining its position relative to the first widget, and displaying its content relative to the position of its containing widget; and so on, until all content blocks are displayed relative to the positions of their containing widgets on the display. The first widget is then placed on the display of a computational device to signify the current content block under performance and to provide placekeeping and other interaction through the display of the widgets as work progresses at job performance time.
  • Yet, in another embodiment of the disclosure, generally, the widgets associated with some select sections of the procedure/instructions may be superimposed/overlaid on top of the saved digitized documents, and the performance of some other select sections, including but not limited to steps within select instructions, may be transferred to another device, such as smart glasses, without relying on the digitized document in the background.
  • Yet, in one embodiment of this disclosure, generally, a widget associated with a Conditional Action may support branching based on an IF/WHEN [condition], in order to link the widget to another specific widget thus causing automatic navigation to the specific widget at job performance time. The condition may be expressed as a binary or a series of binary questions, for example with Yes/No answers, linked through logical operators such as AND/OR/NOT or EITHER/OR, or any number of logical operators and conditions. A Conditional Action Widget may reference data that was previously entered by a user to make a determination of whether a widget and its content should be displayed at all when a prior completed widget pointing to it is marked as completed or Not Applicable (N/A).
  • FIG. 6 shows an example of the use of a Conditional Action step widget 416.1.3, within one embodiment of IAT 402, to define automatic branching and navigation to the next interaction widget based on dynamic user input later provided at job performance time within the IPT 702. As shown, the interaction widget 416 associated with the "INSTRUCTIONS" section of the displayed digitized document contains a Step interaction sub-widget 416.1 associated to the first step of the instructions section. The Step interaction sub-widget 416.1, itself, contains five interaction Step sub-widgets 416.1.1, 416.1.2, 416.1.3, 416.1.4, and 416.1.5. The sub-widget 416.1.3 associated to the third sub-step (with sequence number 6.1.3) is a Conditional Action step interaction widget overlaid on top of the corresponding content block of the displayed digitized document. The original instruction in the digitized document read "If Pump cover is not installed, Go To Section 6.2"; however, because the Conditional Action widget was overlaid (versus superimposed), the original content block is covered by the widget, and the interaction designer configures the Conditional Action widget to read "Is Pump cover installed?" and provides two buttons corresponding to the two possible binary answers of "Yes" or "No", in this case configuring the "Yes" answer to go to the widget 416.1.4 with the next sequence number and "No" to branch to a widget 416.2 with sequence number 6.2. At job performance time, as shown in FIG. 7 and explained later in this disclosure, if the user presses the button labeled "Yes", the IPT 702 displays/activates the widget 416.1.4 associated with the next sequence number, 6.1.4, and if "No" is selected, the IPT 702 branches and displays/activates the widget 416.2 with sequence number 6.2 (and, in this case, also its first child widget 416.2.1 at sequence number 6.2.1).
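The FIG. 6 branching behavior can be sketched as follows. The record and function names are illustrative assumptions; the sequence numbers and question text come from the example above.

```python
# Sketch of the Conditional Action branch from the FIG. 6 example:
# "Yes" continues to the next step 6.1.4, "No" branches to section 6.2.
conditional_widget = {
    "sequence": "6.1.3",
    "question": "Is Pump cover installed?",
    "on_yes": "6.1.4",
    "on_no": "6.2",
}

def branch(widget, answer):
    """Return the sequence number to navigate to for a Yes/No answer."""
    return widget["on_yes"] if answer == "Yes" else widget["on_no"]
```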
  • Yet, in one embodiment of this disclosure, generally, a widget associated to a Repeat section is provided to support two basic forms of repeating: 1) by embedding a Conditional Action widget that causes automatic navigation through branching to a prior step/widget, 2) by being configured to repeat a specified number of times. In either case, at job performance time, the user will be able to see the original performance results and the associated data for each step, without obscuring the use of the steps for the repeat, for example through multiple tabs, where each tab is associated with an iteration of the repeat, with the current iteration in the first tab. When a Repeat section is complete, IPT 702 either navigates the user back to the origin or to the next sequential widget, depending on the rules configured through IAT 402 or the structured metadata.
  • Yet, in another embodiment of the disclosure, generally, an interaction widget, for example one associated with a simple Action Step, may be configured to enforce time requirements. These requirements may include, but are not limited to, a minimum time, a maximum time, or a time range for the performance of the widget. At job performance time, the user is provided with a clock/stopwatch that he/she may manually start/stop/reset, where the clock changes visual state when the time requirement is met, after which the user may mark the step as completed.
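The time-requirement check above reduces to a simple bounds test on elapsed performance time. This sketch uses assumed names; a real IPT would drive it from the on-screen clock/stopwatch rather than a bare number.

```python
# Illustrative sketch: minimum, maximum, or range enforcement on a step.
def time_requirement_met(elapsed_seconds, minimum=None, maximum=None):
    """True when the elapsed performance time satisfies the configured
    bounds; passing both bounds expresses a time range."""
    if minimum is not None and elapsed_seconds < minimum:
        return False
    if maximum is not None and elapsed_seconds > maximum:
        return False
    return True
```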
  • Yet, in one embodiment of this disclosure, generally, an interaction widget, for example one associated with a simple Action Step, may be configured to require and enforce Concurrent Verification (CV), where two users performing a step/section must be physically present together. In this case, both workers will perform the step together and sign off, through initials or digital signature sub-widgets, after the performance of the step/section is completed and before they are allowed to proceed to the next step/section of the procedure/instructions. IPT 702 may visually render a CV step/section distinctly to ensure that the main user and the concurrent verifier are both present before performing the step. Similarly, an interaction widget may be configured to require and enforce Independent Verification (IV), where the only difference is that the verifier does not have to be present at the time the step/section is being performed. When an interaction widget configured as IV is marked as complete, the IPT 702 will not yet progress to the next step; instead, it sends a notification, for example through an embedded Notification sub-widget, to the independent verifier with a link to the IV widget requiring verification, and only then will IPT 702 navigate the user to the widget associated with the next step/section.
  • Yet, another embodiment of the disclosure relates, generally, to IAT 402 providing procedure/instructions authoring capabilities through a palette 406 of interaction widgets used for authoring new procedures/instructions, containing widget types corresponding to the widget types used for superimposing/overlaying on top of the existing written procedure/instructions, except that these procedure/instructions authoring widgets provide means for authoring and creating new content blocks. Widget instances from this palette may be connected and combined, through sequence numbering, with the widgets used for superimposing interaction on top of existing content blocks.
  • Yet, in one embodiment of this disclosure, generally, any interaction widget may be added to a set of reusable widget templates, where reusable widget templates are grouped by their type, for example reusable widget templates of type Caution, and made available within the IAT 402 so they may be combined with other interaction widgets. When a widget is made reusable, the sequence number of the corresponding widget template becomes dynamic and, in some such embodiments, is only assigned when an instance of the widget template is connected to or embedded in other interaction widgets, based on the sequence number of the connected or containing widget; if the widget template contains child widgets, the hierarchical sequence numbers of the child widgets are adapted to the parent widget.
  • Yet, another embodiment of the disclosure relates, generally, to the bidirectional integration of data and automatic dispatch of application functions from other applications, such as enterprise asset or work management systems, to/from an existing written procedure/instructions, instructional manual or fillable form, in PDF or any digitized binary format (the digitized document), driven by the selection of a software service interface with typed inputs/outputs, including but not limited to, standards-based Web Services, whereby, upon selecting an input or output of a service operation, based on its type, an interface widget configured to be associated with that type is provided within the IAT 402 that may be overlaid or superimposed on any part of the content block of the digitized document, where said part of the content block may be encapsulated within an existing interaction widget. IAT 402 saves the interface widget, together with information regarding the selected software service and its inputs/outputs, as structured metadata in IIDM. At job performance time, when an interface widget associated with the input of a service is activated, the user will be prompted to provide the input within the area covered by said input interface widget, and once all inputs to a service are provided, the service will either be invoked automatically, or its execution may be tied to a user trigger event such as the press of a button through a button widget. When a widget associated with a service output is activated, at job performance time, the area of the content block covered by the interface widget associated with that output will be automatically populated with the associated output data from the execution of the service. In the case in which both input(s) and output(s) of a service are associated with content blocks through multiple interface widgets, the service may be executed only after all input data is entered by the user. 
If an interface widget is contained within an interaction widget, it will activate only when the containing interaction widget displays.
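The "invoke only when all inputs are present" rule above can be sketched as a small gate in front of the service call. The service record, its field names, and the stub call are assumptions for illustration; a real implementation would dispatch a Web Service (e.g., SOAP or REST) rather than a local lambda.

```python
# Hedged sketch: a service bound to input interface widgets executes
# only after every declared input has been entered by the user.
def try_invoke(service, inputs):
    """Invoke the service if all declared inputs have values; otherwise
    return None to signal that the IPT should keep waiting for input."""
    if any(inputs.get(name) is None for name in service["inputs"]):
        return None  # some input widget has not been filled in yet
    return service["call"](inputs)

# Hypothetical service with two typed inputs and one output.
service = {
    "inputs": ["pump_id", "pressure"],
    "call": lambda d: {"status": f"logged {d['pump_id']} at {d['pressure']} psi"},
}
```

The returned output dict would then populate the areas covered by the corresponding output interface widgets.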
  • There are a number of differences between this method of integration and prior art methods. One major difference is that previous methods either allow the user to place an input data widget anywhere on the display or relate the input data widget to a fixed, inflexible location of an underlying digitized document. Instead, in this method, an input widget, used to take user input at performance time, or an output widget, used to populate a portion of the display with data from other systems, is placed relative to the position of a containing interaction widget, or any container widget used to encapsulate and contain a content block of the digitized document, independent of the digitized document and without requiring the digitized document at performance time. Another notable difference is that the inputs and outputs of a software service, such as a standards-based Web Service (e.g., WSDL/SOAP or REST), are implicitly associated to input/output widgets at design time. Furthermore, at performance time the associated software services are automatically executed, resulting in direct integration with other software applications through their business logic layer, as opposed to simply saving the data in a database only to be integrated later. The method of containing the input/output interface widgets within other linked interaction/container widgets supports new capabilities that not only do not require the inflexible association of the input/output widget to a hardcoded location of the digitized document, but also provide means for interactive computer-guided navigation at job performance time.
  • Yet, another embodiment of the disclosure relates, generally, to a Data Interface Widget for simple data entry by the user, without being tied to the inputs/outputs of a software service. The Data Interface widget may be superimposed/overlaid on any part of a content block that is already contained within any other widget, and the data entered through such a widget is made available throughout the interactive instruction/procedure/form to be referenced by all widgets with sequence numbers greater than the sequence number of the Data Interface Widget or its containing widget. Furthermore, the users of IAT 402 and IPT 702 may enter constants in interface widgets to support computation (e.g., min/max values). The location of a Data Interface Widget is saved relative to its containing widget, just as with other interaction or interface sub-widgets.
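The visibility rule above, that entered data may be referenced only by widgets later in the sequence, can be sketched as a numeric comparison on hierarchical sequence numbers. Function names are assumptions for this illustration.

```python
# Sketch: data from a Data Interface Widget is visible only to widgets
# whose hierarchical sequence number is greater than the data widget's.
def seq_key(seq):
    """'6.1.3' -> (6, 1, 3), so comparisons are numeric per component."""
    return tuple(int(part) for part in seq.split("."))

def can_reference(reader_seq, data_widget_seq):
    """True when the reading widget comes after the data widget."""
    return seq_key(reader_seq) > seq_key(data_widget_seq)
```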
  • FIG. 2 illustrates a flowchart for one embodiment of the method for defining application and data integration through superimpose/overlay of interface widgets on top of content blocks, or portions thereof, of written procedure/instructions, instructional manuals and/or fillable forms, resulting in structured metadata definitions saved as IIDM. FIG. 2 illustrates obtaining a digitized document at step 201 and displaying the digitized document at step 202. While FIG. 2, step 203, refers to a general "container widget" that encapsulates and contains (through a snagged image or capturing the contents of) a content block (step 205) of the displayed digitized document such that it can be used independent of the digitized document at job performance time, an "interaction widget" described in earlier embodiments of this disclosure qualifies as a container widget, and thus the processes described in FIG. 1 and FIG. 2 may be combined to address both requirements for user-computer interaction and application integration in a single process. FIG. 2, step 206 accommodates creating interface widgets either based on software services (see steps 206.1, 206.2, and 206.3), as described earlier, or by creating or selecting interface widgets (see step 207), referred to as simple Data Interface widgets, that do not require mapping to the inputs/outputs of a software service but are meant to capture dynamic user input data, during job performance, and make such data available to other widgets for reference, for example, in the evaluation of dynamic conditionals and for dynamically branching to the next widget. Regardless of the type of interface widget, in FIG. 2, steps 208 and 209, the interaction designer superimposes/overlays the interface widget over an area of the content block contained within a container/interaction widget, and the relative position of the interface widget with respect to the container/interaction widget is added to the widget metadata.
  • As illustrated in FIG. 2, step 210, if the interface widget needs to occupy a space that is larger than the space provided by the specific area of the displayed digitized document that is contained by its containing container/interaction widget, the digitized document may be modified (see steps 210.1 and 210.2) to add more space, following the process explained earlier for FIG. 1, step 108. Attributes of an interface widget are configured at step 211. FIG. 2, step 212 illustrates how a container/interaction widget may contain one or more child/sub-widgets. One can modify the process illustrated in FIG. 2 to allow interface widgets to contain other interface widgets or to allow container/interaction widgets to contain other container/interaction widgets. As illustrated at step 213, portions of the process can be repeated for other content blocks. The process illustrated in FIG. 2 is concluded in steps 214, 215 and 216, where widgets are sequenced, and all widget information and attributes are translated into IIDM and saved.
  • Yet, in one embodiment of this disclosure, generally, a configurable Notification interface widget is provided to cause the automatic publication of a notification to the person performing the job and/or to other users and applications through the dispatch of a software service associated with said widget. The Notification widget may be configured to evaluate dynamic input/output data. When another person is notified, the notification message provides a link such that the person may launch an instance of the interactive procedure/instructions/form in an instance of the IPT 702 and automatically navigate to the same place where the notification originated.
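  • A Notification widget's deep link back to the originating place in the IPT might be composed as in the following sketch, where the URL path and query-parameter names are purely illustrative assumptions:

```python
from urllib.parse import urlencode

def build_notification(base_url: str, procedure_id: str,
                       widget_id: str, message: str) -> dict:
    """Compose a notification whose link reopens the interactive
    procedure in an IPT instance and navigates to the originating
    widget. The '/ipt' path and the 'procedure'/'widget' parameter
    names are assumptions for illustration."""
    query = urlencode({"procedure": procedure_id, "widget": widget_id})
    return {"message": message, "link": f"{base_url}/ipt?{query}"}

note = build_notification("https://example.invalid", "proc-42", "w-6.1.3",
                          "Step 6.1.3 requires your sign-off")
```

When the recipient follows `note["link"]`, the receiving IPT instance would parse the query parameters and activate the named widget, restoring placekeeping at the point where the notification was dispatched.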
  • Yet, in one embodiment of this disclosure, generally, a configurable Equipment interface widget is provided to integrate events and data from specific Internet of Things (IoT) enabled equipment or to cause such equipment to perform a specific action at job performance time. The Equipment widget can be configured to connect to specific equipment through its unique identifier, which may be a URL, in order to invoke specific APIs on the processor of that equipment.
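  • An Equipment widget's invocation of a device API might be sketched as below. Because real equipment APIs vary, the sketch only builds (and does not send) an HTTP request; the `/api/actions` endpoint and the JSON body shape are assumptions for illustration.

```python
import json
import urllib.request

def equipment_request(device_url: str, action: str,
                      params: dict) -> urllib.request.Request:
    """Build an HTTP request invoking an action on IoT-enabled
    equipment addressed by its unique URL. The endpoint path and
    body shape are illustrative assumptions; a real deployment
    would follow the equipment vendor's API."""
    body = json.dumps({"action": action, "params": params}).encode()
    return urllib.request.Request(
        f"{device_url}/api/actions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST")

req = equipment_request("https://valve-17.example.invalid", "close",
                        {"torque": "nominal"})
# At job performance time the IPT would send this request, e.g. via
# urllib.request.urlopen(req), and surface the response in the widget.
```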
  • One of ordinary skill in the related art may easily combine the processes of FIG. 1 and FIG. 2 to create a single IIDM metadata containing definitions of both interface widgets and interaction widgets, providing interactions as well as integrations at job performance time for a single digitized document that represents written procedure/instructions, instructional manuals and/or fillable forms. This may be accomplished, through one embodiment of this disclosure, generally, by using interaction widgets defined through the process illustrated in FIG. 1 as the “container widgets” referred to in Step 203 of FIG. 2. This can be accomplished simply by replacing Steps 201, 202, 203, 204 and 205 in FIG. 2 with Steps 101 through 110 of FIG. 1, such that, once all interaction widgets are defined, interface widgets may be added inside of them. There are other ways of combining the processes illustrated in FIG. 1 and FIG. 2. For example, steps 206 through 212 of FIG. 2 may be inserted between steps 109 and 110 of FIG. 1, such that the creation of interface widgets and interaction widgets may be intermingled. Furthermore, one can create variations of the steps in FIG. 1 and/or FIG. 2, or of any flowchart derived by combining the processes in FIG. 1 and FIG. 2 into a single process, resulting in a single IIDM that can be applied to the digitized document. For example, one variation may allow some interface widgets to be placed inside of other interface widgets or their sub-widgets.
  • In one or more embodiments of this disclosure, at job performance time, the IPT 702 interprets the saved IIDM metadata resulting from the process in FIG. 1 and/or FIG. 2, or from any new process resulting from combining the two processes as mentioned earlier, as applied to the digitized document. In one embodiment, the IPT 702 may display the content blocks encapsulated by interaction/container widgets through any of the following means, or other appropriate means: 1) by displaying a copy of the saved version of the digitized document in a scrolled window and providing placekeeping and interactions/integrations by superimposing/overlaying the first interaction/container widget corresponding to the first contained content block of the digitized document, based on the associated coordinates, sequencing and other information saved in the metadata, and then, when the current widget is marked as completed, displaying the next interaction/container widget in the sequence, and so on until the last widget is displayed and its interactions are performed; 2) by displaying all the snagged images of the content blocks associated with each widget, in the sequence the widgets are linked, on an empty computer display/window, starting with the simulated placement of the first widget (without the display of the widget itself) and then the simulated placement of the other widgets relative to the first widget based on their sequence numbers, without displaying or relying on the saved digitized document, and then superimposing the first widget over the already displayed content block contained by the first widget to activate the widget and signify the initial content block under performance, and, when said initial content block is marked as complete, displaying/activating the next relevant widget to signify the next content block to be performed, repeating the process until all relevant widgets are displayed and marked as completed; or 3) similar to 2) above but, instead of displaying the image, displaying the original content contained by the widget, optionally in different form factors appropriate for display on different computing devices. Regardless of the means of display, at job performance time, the widgets displayed based on the structured metadata are used to provide computer guidance and application integration and to dynamically advance the user to the next section of the procedure/instructions relevant to the job.
  • Yet, in another embodiment of this disclosure, FIG. 3 relates to a flowchart of a method to perform user-computer interaction and computer-aided navigation, as well as application and data integration, at job performance time, based on the structured metadata IIDM resulting from combining the processes described in FIG. 1 and FIG. 2 and applying the combined process to the digitized document. In step 301 of FIG. 3, said resulting IIDM is imported and loaded into computer memory into data structures (that may include hash maps, linked lists, arrays or a combination of other conventional data structures) that are optimized for the implementation of the process in FIG. 3. FIG. 3, step 302, relates to displaying all the content blocks contained by all the interaction/container widgets, in the order of their sequence numbers, starting from the smallest sequence number, using either of the means numbered “2)” or “3)” for displaying content blocks encapsulated by interaction/container widgets described in the above paragraph, without requiring a copy of the digitized document. After all the content blocks are displayed in step 302, moving to step 303 of FIG. 3, the widget associated with the first displayed content block is activated/displayed around said content block and marked as the “Current widget” (see step 304) to provide a placekeeping function, to process all its sub-widgets (see steps 305, 306, 306.1, 306.2, 306.3, 306.4, 306.5, 306.6, 306.7, 306.8, 306.9), if any, and also to enforce its defined interactions. Once the first content block is considered/performed by the user and its enforced interactions completed, the widget may be marked as completed automatically (see step 307), for example, upon an interaction requiring acknowledgment of the content block by the user, or manually by the user when automatically marking as completed is not an option.
After the completion of the first widget, which includes the processing of its relevant sub-widgets described in step 306 and the sub-steps thereof of FIG. 3, if any, the IPT determines the next interaction widget to be activated/displayed, if any, and marks that widget as the “Current widget” (see steps 308, 309, 310, and 311).
  • If the first or any subsequent widget that was marked as the “Current widget” through performing step 304 of FIG. 3 does not have any child/sub-widgets relevant to the job scenario being performed, as determined in step 305 of FIG. 3 (where such determination may be made based on conditionally linked widgets that may result in skipping the performance of all or some of its sub-widgets, if any, by branching to another, non-sequential widget), said “Current widget” completes all its interactions and is marked as completed through step 307. When said widget points to a next interaction widget (again, such next widget may be determined conditionally), as determined in step 308 of FIG. 3, the next widget is displayed and activated according to step 311 of FIG. 3, and the processing of that widget loops back to step 304 of FIG. 3; thus said next interaction widget will be marked as the “Current widget”.
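  • The conditional determination of the next widget can be pictured as evaluating branch conditions against the job data gathered so far, falling back to the sequential successor when no condition matches. Encoding a condition as a (field, expected value, target) triple is an assumption made for illustration; a real IIDM could express conditionals in any equivalent form.

```python
from typing import Optional

def next_widget_id(widget: dict, job_data: dict) -> Optional[str]:
    """Resolve the next widget for a completed "Current widget":
    conditional branches are tried in order against data gathered so
    far; otherwise the sequential successor (if any) is used."""
    for field, expected, target in widget.get("branches", []):
        if job_data.get(field) == expected:
            return target          # conditional, non-sequential branch
    return widget.get("next")      # sequential successor, or None at the end

# A step that branches to section 6.2.1 when the user answered "No".
step = {"next": "w-6.1.4",
        "branches": [("pump_running", "No", "w-6.2.1")]}
branched = next_widget_id(step, {"pump_running": "No"})
sequential = next_widget_id(step, {"pump_running": "Yes"})
```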
  • If the first or any subsequent widget that was marked as the “Current widget” through performing step 304 of FIG. 3 does have one or more relevant child/sub-widgets, where such relevance may be determined through dynamic conditional branching based on job data gathered so far through the process, as determined in step 305 of FIG. 3, all the sub-widgets of the “Current widget” and, in the case where any sub-widget is yet another interaction widget, all of its sub-widgets, and so on to any depth, will be processed by the embodiment of IPT 702 according to step 306 of FIG. 3 together with its sub-steps. One can modify the process described in FIG. 3 to allow processing of interface widgets that contain other interface widgets, to any level of depth.
  • In step 306.1, according to one embodiment of the disclosure illustrated in FIG. 3, each sub-widget of the “Current widget” is either an interface widget or an interaction widget. Each relevant sub-widget of the “Current widget” is processed through the sub-steps of step 306. For each relevant next sub-widget being processed, when the sub-widget is not an interface widget, as determined in step 306.1, it is implied that the sub-widget is an interaction widget, in which case the processing continues through step 309 to step 310 of FIG. 3, where the “Current widget”, which is the parent interaction widget of the sub-widget, is placed on a Stack data structure so that its processing continues after its sub-widget is processed. Then, through step 311, the interaction sub-widget is activated/displayed and processing loops back to step 304, where the sub-widget becomes the “Current widget”. Using this process, interaction sub-widgets within other interaction sub-widgets, and so on to any level of depth, may be processed. The parent widget of an interaction sub-widget is pushed on a processing “Stack”, such that it can be ‘popped’ after its interaction sub-widget completes its processing through step 307.c of FIG. 3, to continue its processing as the “Current widget”. The use of a Stack data structure accommodates embedded interaction sub-widgets to any level of depth. To keep the process flowchart in FIG. 3 easy to understand, many conventional details associated with error-handling or special cases have been omitted. One with ordinary skill in the art will acknowledge that such details can be addressed through conventional data structures and processes common in the relevant art.
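  • The stack-based walk described above can be sketched as follows. This is an illustrative simplification of FIG. 3 — error handling, conditional branching and the individual 306.x sub-steps are omitted — using dictionaries with assumed keys `id`, `type` and `children`:

```python
def perform(root, handle_interface):
    """Process a tree of interaction widgets with an explicit stack,
    mirroring the loop of FIG. 3: interface sub-widgets are handled in
    place, an interaction sub-widget becomes the "Current widget" while
    its parent waits on the stack, and a parent is marked complete once
    all of its children are dispositioned."""
    completed = []
    stack = []            # frames of (parent, remaining interaction siblings)
    current = root
    while current is not None:
        subs = current.get("children", [])
        interface_subs = [s for s in subs if s.get("type") == "interface"]
        interaction_subs = [s for s in subs if s.get("type") != "interface"]
        for sub in interface_subs:
            handle_interface(sub)              # steps 306.x, simplified
        if interaction_subs:
            # push the parent (step 310) and descend (steps 311/304)
            stack.append((current, interaction_subs[1:]))
            current = interaction_subs[0]
            continue
        completed.append(current["id"])        # step 307
        current = None
        while stack and current is None:       # pop parents (step 307.c)
            parent, siblings = stack.pop()
            if siblings:
                stack.append((parent, siblings[1:]))
                current = siblings[0]
            else:
                completed.append(parent["id"])
    return completed

seen = []
root = {"id": "w1", "children": [
    {"id": "f1", "type": "interface"},
    {"id": "w1.1", "type": "interaction",
     "children": [{"id": "f2", "type": "interface"}]},
    {"id": "w1.2", "type": "interaction"},
]}
done = perform(root, lambda w: seen.append(w["id"]))
```

Because each parent frame carries its remaining siblings, the sketch supports embedded interaction sub-widgets to any depth, as the text requires of the Stack data structure.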
  • Referring once again to FIG. 3, when a relevant sub-widget is an interface widget, as determined in step 306.1, if the definition of that sub-widget is not tied to a software service, as determined in step 306.2, the sub-widget is a Data Interface widget that, based on its defined type, is used either to gather data by prompting the user or to display data, and it is processed through step 306.6. If the sub-widget is tied to a software service, as determined in step 306.2, the processing of the interface sub-widget moves to step 306.3, where the widget is determined to be tied either to the input(s) or to the output(s) of a software service. If it is tied to the output(s), through step 306.4, the associated software service is executed, if it was not already executed, and the associated service output data is displayed through the sub-widget within the area of the content block occupied by the sub-widget. Otherwise, if the sub-widget is tied to input(s) of the associated service, in step 306.5, the user is prompted for the referenced input data through the associated sub-widget, and the service is executed when the last input data is gathered through an interface sub-widget. All of steps 306.4, 306.5, and 306.6 go through step 306.7 of FIG. 3, where all input and/or output data gathered is made available to be referenced and accessed by all subsequent widgets and sub-widgets, and processing moves to step 306.8, where the processing continues to the next sub-widget, if any, through step 306. When all sub-widgets of the “Current widget” are processed, processing moves to step 307, where all remaining interactions of the “Current widget” are completed and the widget is marked as complete, and if there are any parent widgets on the Stack, each parent widget is ‘popped’ in order and its processing is resumed as the “Current widget”. The process illustrated in FIG. 3 does not address processing of interface widgets as sub-widgets of other interface widgets. One with ordinary skill in the art acknowledges that this process may be easily modified to accommodate such processing.
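  • The dispatch among steps 306.2 through 306.6 for a single interface sub-widget might be sketched as below, with `prompt`, `display` and `execute` supplied as callbacks by the hosting IPT. The widget keys (`service`, `binding`, `field`, `mode`, `last_input`) are illustrative assumptions, not a schema fixed by the disclosure.

```python
def process_interface(widget, job_data, prompt, display, execute):
    """Dispatch one interface sub-widget per steps 306.1-306.7:
    a plain Data Interface widget prompts or displays directly;
    a widget bound to a service's outputs executes the service on
    first use and displays the referenced output; a widget bound to
    a service's inputs gathers data and executes the service once
    the last input arrives. Gathered data lands in job_data, where
    subsequent widgets may reference it (step 306.7)."""
    svc = widget.get("service")
    if svc is None:                             # 306.2: Data Interface widget
        if widget.get("mode") == "display":
            display(widget["field"], job_data.get(widget["field"]))
        else:
            job_data[widget["field"]] = prompt(widget["field"])
    elif widget.get("binding") == "output":     # 306.4: output-bound
        if svc["name"] not in job_data:
            job_data[svc["name"]] = execute(svc["name"], job_data)
        display(widget["field"], job_data[svc["name"]].get(widget["field"]))
    else:                                       # 306.5: input-bound
        job_data[widget["field"]] = prompt(widget["field"])
        if widget.get("last_input"):
            job_data[svc["name"]] = execute(svc["name"], job_data)

job, shown = {}, {}
# A Data Interface widget gathering a value from the user.
process_interface({"field": "tag", "mode": "prompt"}, job,
                  prompt=lambda f: "PUMP-7",
                  display=shown.__setitem__,
                  execute=lambda name, data: {})
# An output-bound widget displaying a service result.
process_interface({"field": "pressure", "binding": "output",
                   "service": {"name": "readGauge"}}, job,
                  prompt=lambda f: "",
                  display=shown.__setitem__,
                  execute=lambda name, data: {"pressure": 42})
```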
  • If the first or any subsequent widget that was marked as the “Current widget” performing its interactions does not have any child/sub-widgets, as determined in step 305 of FIG. 3, and it does not point to a next interaction widget, as determined in step 308 of FIG. 3, there are no more widgets to be processed and thus all relevant interactions and integrations associated with the job have already been performed; processing then moves to step 312, which signifies the end of processing of the IIDM by this embodiment of IPT 702 according to FIG. 3.
  • FIG. 7 shows an example of an embodiment of the Interaction Performance Tool (IPT) 702 performing user-computer interaction, computer-guided navigation and placekeeping as shown (and partially defined) in FIG. 6. As shown in FIG. 7, the performance of steps 6.1 and 6.2 was already confirmed by the user through their associated Action Step interaction widgets, and thus they are marked as “Completed” by the IPT. Furthermore, the user selected the button labeled “No” when the overlaid widget 416.1.3 associated with step 6.1.3 was activated/displayed; therefore, the widgets 416.1.4 and 416.1.5 associated with steps 6.1.4 and 6.1.5 were automatically marked as “N/A”, displaying the configured text supplied for the “N/A” state of the corresponding widgets, and the widget 416.2.1 with sequence number 6.2.1, corresponding to section 6.2.1 of the document, is automatically activated/displayed and navigated to as the next relevant step that needs performance, to be performed by the user.
  • Yet, in another embodiment of the disclosure, generally, any of the widgets, regardless of type, provides a set of general interaction capabilities at job performance time through the IPT 702, some of which may be configured through the IAT 402 or the structured metadata associated with interaction widgets and their sub-widgets. These capabilities may include, but are not limited to: a) the ability to add comments to each widget, through, say, a Comment sub-widget that can be placed and relocated to any area within said interaction widget by the user; b) the ability for a user to provide feedback/change-requests on a content block associated with an interaction widget, for example through a Feedback sub-widget similar to the Comment sub-widget, such that the procedure/instructions writer may consider the feedback at a later time; c) the ability to flag any interaction widget to be added/removed to/from bookmarks for easy navigation; d) the ability to expand or collapse all or some of the interaction widgets, including their child interaction widgets; e) the ability to capture media (e.g. video, pictures, sound) through Media sub-widgets that can be associated with any interaction widget; f) the ability for the user to override the sequence that the computer navigation is designed to follow based on the sequence numbers associated with the interaction widgets or conditional branching, under certain circumstances, requiring justification in the form of an enforced Comment/Feedback and the initials of the user; g) the ability to jump to the current step or section of the interactive digitized document with one click; h) the ability for the user to see the progress status of the completed steps against the total required steps, local to interaction widgets associated with an instructions widget and its child widgets, as well as the overall progress with respect to the entire procedure/instructions/form.
Note that using the IAT 402, the person defining the interactions through widgets can specify what percentage or how many units of work is associated with each widget relative to the overall work, and for sub-widgets relative to the parent widget; i) the ability to enforce the navigation of the widgets associated with the numbered sections and steps of instruction, and the sub-widgets associated with bulleted sections in an interactive procedure/instructions, either in order, such that sequential performance is enforced and a later step/widget is not allowed to be marked complete (or N/A if not applicable) until the prerequisite step/widget is marked complete, or out of sequence, such that the associated steps/widgets can be performed in any order and marked as complete (or N/A if not applicable), based on the selected widget properties designated through the IAT 402 or expressed in the structured metadata associated with a procedure/instructions/form; j) the ability to automatically mark a parent widget complete when all its child widgets are dispositioned or marked as complete; k) the ability to automatically provide centralized menus or a Table of Contents for all widgets and sub-widgets associated with the sections and sub-sections, as well as for the references and figures, such that the user may directly navigate to any of them through the menus; l) the ability to support full-text search as well as search/view by section type, such as all Cautions or all the interface widgets used for data entry, through which search/view the user may directly navigate to the content of the widgets associated with the search results; m) the ability for the user to pause the performance of a section/step such that the associated widget timer freezes, no step can be dispositioned, and the display becomes read-only until resumed; n) the ability to zoom in/out, pan and annotate figures/diagrams; o) the ability to support four different states for each widget and sub-widget: Not-started, Under-performance, Completed or N/A (Not-applicable); p) the ability to highlight Critical steps distinctly; and q) the ability to provide an indexed view for each widget/sub-widget type, such as Caution, Comment, Feedback, Media or other widgets and sub-widgets associated with each content block, so it can be reviewed separately, and to provide easy navigation to the interaction widget against which the comment was entered and thus to the content block contained by that interaction widget.
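  • The progress computation described in capability h), weighted by the work units assigned to each widget, could be sketched recursively as below; the keys `units`, `done` and `children` are assumptions for illustration.

```python
def progress(widget) -> float:
    """Fraction of work complete for a widget subtree, weighting each
    child by its configured work units (defaulting to one unit each).
    A leaf contributes 1.0 when dispositioned, 0.0 otherwise."""
    children = widget.get("children")
    if not children:
        return 1.0 if widget.get("done") else 0.0
    total = sum(c.get("units", 1) for c in children)
    return sum(c.get("units", 1) * progress(c) for c in children) / total

# A section whose first step carries three of its four work units.
section = {"children": [
    {"units": 3, "done": True},
    {"units": 1, "done": False},
]}
```

Here `progress(section)` yields 0.75: the heavier completed step dominates, matching the idea that the interaction designer apportions units of work relative to the whole.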
  • Yet, another embodiment of the disclosure relates, generally, to capturing and aggregating information about the dynamic user-computer interaction during job performance to provide job performance monitoring, statistics and analytics. Examples of such data include, but are not limited to, the location of job performance through device GPS coordinates, the elapsed time between the completion of each step of instruction, and other dynamic data resulting from user-computer interaction throughout job performance.
  • Yet, another embodiment of the disclosure relates, generally, to automatic generation of IIDM by programmatically reading a stream of digitized written procedure/instructions through a computer processor and into a computer memory, for example in PDF format, parsing and analyzing the contents, and dispatching subroutines to programmatically superimpose widget metadata corresponding to typical procedure/instructions labels encountered, where the label-to-widget associations may be configuration-driven. For example, when a Caution section is encountered in the stream, a rectangular Caution interface widget is automatically created with its dimensions corresponding to the area needed to sufficiently contain the content type of the associated Caution section, and with a sequence number that is automatically assigned considering the sequence number of the last automatically generated interface widget, and is then saved within the IIDM.
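  • A configuration-driven pass of this kind might look like the following sketch, which scans parsed text lines for configured labels and emits widget records with incrementing sequence numbers. The record shape and the simple height heuristic are assumptions for illustration; a production implementation would work from the PDF layout rather than plain lines.

```python
import re

def auto_widgets(lines, label_map):
    """Scan a parsed document stream and emit widget metadata for
    each configured label (e.g. 'Caution' -> a Caution widget),
    assigning each widget the next sequence number after the last
    automatically generated one."""
    widgets, seq = [], 0
    for n, line in enumerate(lines):
        for label, widget_type in label_map.items():
            if re.match(rf"\s*{label}\b", line, re.IGNORECASE):
                seq += 1
                widgets.append({
                    "type": widget_type,
                    "sequence": seq,
                    "line": n,
                    # crude height estimate: one row per ~60 characters
                    "height": max(1, len(line) // 60 + 1),
                })
    return widgets

doc = ["6.1 Close the isolation valve.",
       "CAUTION: Relieve pressure before opening the flange.",
       "6.2 Open the drain."]
meta = auto_widgets(doc, {"Caution": "caution"})
```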
  • Yet, another embodiment of the disclosure relates, generally, to adding palette(s) of widgets to the IAT 402, designed for encapsulating commands that may drive one or more computerized machine(s) capable of carrying out a complex series of actions, where each command drives one or more action(s) and relates to a step of an instruction encapsulated with a widget from said additional palette(s). With the practice of this embodiment, instead of a user performing one or more of the steps within an instruction, one or more computerized machine(s) may be automatically driven to perform said step(s).
  • Yet, another embodiment of the disclosure relates, generally, to automatic adjustment of the structured metadata in the IIDM associated with an original digitized document, based on a revised version of the original digitized document, by programmatically comparing the revised version with the original, identifying the sections deleted from and added to the original document, and then adjusting the IIDM to delete the widgets associated with the deleted sections and to re-compute the coordinates of the remaining widgets in the IIDM to account for the new coordinates of the corresponding sections.
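  • The revision comparison could be approximated with a sequence diff over section texts, dropping widgets anchored to deleted sections and re-anchoring the rest to the sections' new positions. Keying widgets by section index is an assumption for illustration; a full implementation would also recompute pixel coordinates from the re-laid-out document.

```python
import difflib

def adjust_iidm(widgets, old_sections, new_sections):
    """Delete widgets whose sections disappeared in a revision and
    re-anchor the remaining widgets to their sections' new indices,
    using difflib's matching blocks as the old-to-new mapping."""
    matcher = difflib.SequenceMatcher(a=old_sections, b=new_sections)
    old_to_new = {}
    for block in matcher.get_matching_blocks():
        for k in range(block.size):
            old_to_new[block.a + k] = block.b + k
    return [{**w, "section": old_to_new[w["section"]]}
            for w in widgets if w["section"] in old_to_new]

old = ["intro", "caution A", "step 1", "step 2"]
new = ["intro", "step 1", "step 2", "step 3"]   # "caution A" deleted
widgets = [{"id": "w1", "section": 1},          # anchored to "caution A"
           {"id": "w2", "section": 2}]          # anchored to "step 1"
adjusted = adjust_iidm(widgets, old, new)
```

In this example the widget on the deleted Caution section is dropped, and the surviving widget is re-anchored from old index 2 to new index 1.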
  • Further, the methods illustrated above may be practiced by a computer system including one or more processors and computer-readable media such as computer memory. In particular, the computer memory may store computer-executable instructions that when executed by one or more processors cause various functions to be performed, such as the acts recited in the embodiments.
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: physical computer-readable storage media and transmission computer-readable media.
  • Physical computer-readable storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage (such as CDs, DVDs, etc.), magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmissions media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above are also included within the scope of computer-readable media.
  • Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission computer-readable media to physical computer-readable storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer-readable physical storage media at a computer system. Thus, computer-readable physical storage media can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
  • Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
  • Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
  • The present invention may be embodied in other specific forms without departing from its spirit or characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (8)

What is claimed is:
1. A method of performing multi-entity job performance using computer guided navigation, the method comprising:
at a first instance of an interaction performance tool at a first computing system, obtaining a set of typed dynamic graphical user interface interaction widgets ordered and organized based on one or more of the widgets obtaining steps from a digitized document comprising a written procedure, such that the widgets capture contents of content blocks of the digitized document;
at the first instance of the interaction performance tool at the first computing system, receiving user interaction at one of the widgets by a first user indicating that the user has completed a task from the written procedure; and
as a result of receiving the user interaction at the one of the widgets by the first user, causing a notification widget to send a notification to a second user at a second computing system and to cause the second computing system to navigate in a second instance of the interaction performance tool to a location in the interaction performance tool that generated the notification to allow the second user to continue interaction with the set of typed dynamic graphical user interface interaction widgets to perform additional tasks in the written procedure.
2. The method of claim 1, further comprising, using an equipment interface widget associated with the typed dynamic graphical user interface interaction widgets, causing Internet of Things enabled equipment to perform a specific action from the written procedure.
3. The method of claim 1, further comprising receiving feedback from the first user in a widget associated with the typed dynamic graphical user interface interaction widgets with respect to a content block, and as a result, providing the feedback back to the creator of the written procedure.
4. The method of claim 1, further comprising at the first computing system capturing media using a widget associated with the typed dynamic graphical user interface interaction widgets.
5. The method of claim 1, further comprising capturing information about actions performed from the written procedure.
6. The method of claim 5, wherein the information comprises location of action performance.
7. The method of claim 5, wherein the information comprises elapsed time between completion of steps from the written procedure.
8. The method of claim 1, further comprising receiving user input from the first user to pause performance of a section or step of the written procedure such that an associated widget timer freezes.
US17/511,194 2018-05-24 2021-10-26 Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form Pending US20220043663A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/511,194 US20220043663A1 (en) 2018-05-24 2021-10-26 Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862676197P 2018-05-24 2018-05-24
US16/353,895 US11175934B2 (en) 2018-05-24 2019-03-14 Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form
US17/511,194 US20220043663A1 (en) 2018-05-24 2021-10-26 Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/353,895 Continuation US11175934B2 (en) 2018-05-24 2019-03-14 Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form

Publications (1)

Publication Number Publication Date
US20220043663A1 true US20220043663A1 (en) 2022-02-10

Family

ID=68614032

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/353,895 Active 2039-05-28 US11175934B2 (en) 2018-05-24 2019-03-14 Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form
US17/511,194 Pending US20220043663A1 (en) 2018-05-24 2021-10-26 Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/353,895 Active 2039-05-28 US11175934B2 (en) 2018-05-24 2019-03-14 Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form

Country Status (1)

Country Link
US (2) US11175934B2 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11538123B1 (en) * 2019-01-23 2022-12-27 Wells Fargo Bank, N.A. Document review and execution on mobile devices
US11356845B1 (en) * 2019-07-10 2022-06-07 Sprint Communications Company L.P. Trusted operating system in an internet of things (IoT) device
US11960881B2 (en) * 2020-01-24 2024-04-16 Silicon Laboratories Inc. System and method for the delivery of software documentation
US20220091707A1 (en) 2020-09-21 2022-03-24 MBTE Holdings Sweden AB Providing enhanced functionality in an interactive electronic technical manual
US20220261530A1 (en) 2021-02-18 2022-08-18 MBTE Holdings Sweden AB Providing enhanced functionality in an interactive electronic technical manual
US11947906B2 (en) 2021-05-19 2024-04-02 MBTE Holdings Sweden AB Providing enhanced functionality in an interactive electronic technical manual
CN115373791B (en) * 2022-10-25 2022-12-30 北京航天驭星科技有限公司 Method and device for compiling automatic remote control operation of spacecraft

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130226907A1 (en) * 2012-02-23 2013-08-29 Applied Materials, Inc. Providing dynamic content in context of particular equipment
US20140137005A1 (en) * 2012-11-15 2014-05-15 Samsung Electronics Co., Ltd. User function operation method and electronic device supporting the same
US20140281967A1 (en) * 2013-03-15 2014-09-18 David Bodnick Systems, methods, and media for presenting interactive checklists
US20150042852A1 (en) * 2013-08-09 2015-02-12 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20150331568A1 (en) * 2013-01-25 2015-11-19 Mitsubishi Electric Corporation Program and electronic-manual display apparatus
US20160253298A1 (en) * 2014-02-26 2016-09-01 Empire Technology Development Llc Photo and Document Integration
US20180239959A1 (en) * 2017-02-22 2018-08-23 Anduin Transactions, Inc. Electronic data parsing and interactive user interfaces for data processing
US20180336791A1 (en) * 2017-05-18 2018-11-22 Pearson Education, Inc. Multi-level executive functioning tasks
US10621237B1 (en) * 2016-08-01 2020-04-14 Amazon Technologies, Inc. Contextual overlay for documents

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6636242B2 (en) * 1999-08-31 2003-10-21 Accenture Llp View configurer in a presentation services patterns environment
US7315848B2 (en) * 2001-12-12 2008-01-01 Aaron Pearse Web snippets capture, storage and retrieval system and method
US8316309B2 (en) * 2007-05-31 2012-11-20 International Business Machines Corporation User-created metadata for managing interface resources on a user interface
US20120063684A1 (en) * 2010-09-09 2012-03-15 Fuji Xerox Co., Ltd. Systems and methods for interactive form filling
RU2014110393A (en) * 2011-08-19 2015-09-27 Эппл Инк. INTERACTIVE CONTENT FOR DIGITAL BOOKS
US20130346195A1 (en) * 2012-06-26 2013-12-26 Digital Turbine, Inc. Method and system for recommending content
US9256072B2 (en) * 2013-10-02 2016-02-09 Philip Scott Lyren Wearable electronic glasses that detect movement of a real object copies movement of a virtual object
US10552525B1 (en) * 2014-02-12 2020-02-04 Dotloop, Llc Systems, methods and apparatuses for automated form templating
US20180018613A1 (en) * 2016-07-13 2018-01-18 eNex Solutions LLC Providing software for customizing on-site services

Also Published As

Publication number Publication date
US11175934B2 (en) 2021-11-16
US20190361721A1 (en) 2019-11-28

Similar Documents

Publication Publication Date Title
US20220043663A1 (en) Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form
AU2018236912B2 (en) Managing and automatically linking data objects
US20230385033A1 (en) Storing logical units of program code generated using a dynamic programming notebook user interface
RU2498391C2 (en) Exchange of information between user interface of inner space of document editor and user interface of outer space of document editor
RU2409844C2 (en) Markup-based extensibility for user interfaces
CN1821956B (en) Using existing content to generate active content wizard executables for execution of tasks
US10008009B1 (en) Method for generating dynamic vector graphics
US7913225B2 (en) Error handling using declarative constraints in a graphical modeling tool
US8745581B2 (en) Method and system for selectively copying portions of a document contents in a computing system (smart copy and paste
US9910641B2 (en) Generation of application behaviors
US20130262968A1 (en) Apparatus and method for efficiently reviewing patent documents
CN104584062A (en) Automatic creating of tables of content for web pages
CN109976729B (en) Storage and computing display globally configurable data analysis software architecture design method
CN112799718A (en) Enumerated document generation method and device, electronic equipment and storage medium
US20070198915A1 (en) Document Processing Device And Document Processing Method
US20070208995A1 (en) Document Processing Device and Document Processing Method
WO2007081017A1 (en) Document processor
CN117215556A (en) Modularized page rapid construction method, system, equipment and medium
US9146913B2 (en) Specifications automation system and method
CN114253516A (en) UI (user interface) development method and device for data filling system and storage medium
WO2005098698A1 (en) Document processing device
Vieritz et al. Access to UML diagrams with the HUTN
CN112667210A (en) Modularization customizing method and device for geographic information system software
JP2007094453A (en) Program development support system, program development support method and program
US11526336B2 (en) Community-oriented, cloud-based digital annealing platform

Legal Events

Date Code Title Description
AS Assignment

Owner name: NEXTAXIOM TECHNOLOGY, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MASSOUDI, ARASH;ZYLKA, SANDRA IRENE;SIGNING DATES FROM 20190311 TO 20190312;REEL/FRAME:057919/0516

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED