CN114968454B - Flow arrangement, display method, head-mounted display device, and computer-readable medium - Google Patents

Flow arrangement, display method, head-mounted display device, and computer-readable medium

Info

Publication number
CN114968454B
CN114968454B (application CN202210470109.6A)
Authority
CN
China
Prior art keywords
task
user
workflow
node
head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210470109.6A
Other languages
Chinese (zh)
Other versions
CN114968454A (en)
Inventor
苏诚龙
杨远远
韩亚豪
王文兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Companion Technology Co ltd
Original Assignee
Hangzhou Companion Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Companion Technology Co ltd filed Critical Hangzhou Companion Technology Co ltd
Priority to CN202210470109.6A priority Critical patent/CN114968454B/en
Publication of CN114968454A publication Critical patent/CN114968454A/en
Application granted granted Critical
Publication of CN114968454B publication Critical patent/CN114968454B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G06F 9/453 Help systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to an output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Embodiments of the present disclosure disclose a flow orchestration method, a display method, a head-mounted display device, and a computer-readable medium. One embodiment of the method comprises the following steps: generating a workflow according to a flow node information set and a flow node connection information set corresponding to the flow node information set, wherein the flow node information set corresponds to a start node, at least one flow node, and an end node, and the workflow corresponds to a set of task interfaces; and, in response to detecting a publishing operation for the workflow, determining the publishing state of the workflow as a published state, wherein each task interface corresponding to the workflow is displayed in the head-mounted display device corresponding to the workflow, and the flow node information corresponding to a flow node comprises an instruction for starting an executable unit of the head-mounted display device. With this embodiment, an operator can carry out actual work without interruption and without filling in data manually, and the data produced during the work can be saved by the head-mounted display device.

Description

Flow arrangement, display method, head-mounted display device, and computer-readable medium
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular, to a process arrangement, a display method, a head-mounted display device, and a computer readable medium.
Background
A workflow guides the actual work of an operator. Currently, operators generally work in the following manner: the work is performed by following a workflow in the form of a paper or electronic document.
However, this manner often suffers from the following technical problems: the operator must interrupt the current work process to consult the paper or electronic workflow; moreover, a paper workflow requires the operator to fill in data manually, and the filled-in data is difficult to store.
Disclosure of Invention
This Summary is provided to introduce concepts in a simplified form that are further described below in the Detailed Description. It is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a flow arrangement, a display method, a head mounted display device and a computer readable medium to solve one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a flow orchestration method, the method comprising: generating a workflow according to a flow node information set and a flow node connection information set corresponding to the flow node information set, wherein the flow node information set corresponds to a start node, at least one flow node, and an end node, each piece of flow node connection information in the flow node connection information set corresponds to two pieces of flow node information in the flow node information set, and the workflow corresponds to a set of task interfaces; and, in response to detecting a publishing operation for the workflow, determining the publishing state of the workflow as a published state, wherein each task interface corresponding to the workflow is displayed in the head-mounted display device corresponding to the workflow, and the flow node information corresponding to a flow node comprises an instruction for starting an executable unit of the head-mounted display device.
In a second aspect, some embodiments of the present disclosure provide a flow display method applied to a head-mounted display device, the method comprising: in response to receiving a work task, or identifying a work task corresponding to a work task identification code, displaying, in a display screen of the head-mounted display device, a task interface corresponding to the workflow of the received or identified work task, wherein the workflow is generated by the method described in any implementation of the first aspect; determining whether the user's operation on the task interface meets the jump condition corresponding to the task interface; and, in response to detecting that the user's operation on the task interface meets the jump condition, jumping to the next task interface corresponding to the task interface and the operation.
In a third aspect, some embodiments of the present disclosure provide an electronic device comprising: one or more processors; one or more display screens for imaging in front of the eyes of a user wearing the head mounted display device; and a storage device having one or more programs stored thereon, which when executed by the one or more processors cause the one or more processors to implement the method described in any of the implementations of the second aspect described above.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium having a computer program stored thereon, wherein the program when executed by a processor implements the method described in any of the implementations of the first or second aspects above.
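For illustration only, the interface-jump logic of the second aspect may be sketched as follows. The dictionary keys (`source`, `target`, `button`, `startNodeId`) and helper names are assumptions of this sketch, not terms defined by the disclosure:

```python
# Hypothetical sketch of the flow display method of the second aspect:
# display the task interface of the current node, and jump to the next
# interface only when the user's operation meets the jump condition.

def next_node_id(workflow, node_id, operation):
    """Return the latter node for (node, operation), or None when the
    operation does not meet any jump condition of the current node."""
    for link in workflow["linksInputList"]:
        # a link without a "button" key is an unconditional jump
        if link["source"] == node_id and link.get("button") in (None, operation):
            return link["target"]
    return None

def run_workflow(workflow, operations):
    """Walk the workflow, staying on the current interface whenever an
    operation fails the jump condition; return the visited node ids."""
    visited = []
    node_id = workflow["startNodeId"]
    for operation in operations:
        visited.append(node_id)
        nxt = next_node_id(workflow, node_id, operation)
        if nxt is not None:
            node_id = nxt
    visited.append(node_id)
    return visited
```

In this sketch an operation that fails the jump condition simply leaves the operator on the current task interface, mirroring the re-scan behavior described for failed task items.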
The above embodiments of the present disclosure have the following advantageous effects: with the flow orchestration method of some embodiments of the present disclosure, an operator can carry out actual work without interruption and without filling in data manually, and the data produced during the work can be saved by the head-mounted display device. Specifically, the reasons why an operator must interrupt the work process, fill in data manually, and struggle to store the data are as follows: the operator has to interrupt the current work process to consult a paper or electronic workflow, and a paper workflow requires the operator to fill in data manually, so the filled-in data is difficult to store. On this basis, the flow orchestration method of some embodiments of the present disclosure first generates a workflow according to a flow node information set and a flow node connection information set corresponding to that set. The flow node information set corresponds to a start node, at least one flow node, and an end node; each piece of flow node connection information corresponds to two pieces of flow node information; and the workflow corresponds to a set of task interfaces. The generated workflow can thus represent the task items that an operator can directly consult and execute during actual work, and each task item can be presented to the operator directly in the form of a task interface. Then, in response to detecting a publishing operation for the workflow, the publishing state of the workflow is determined as a published state, and each task interface corresponding to the workflow is displayed in the head-mounted display device corresponding to the workflow.
The flow node information corresponding to a flow node includes an instruction for starting an executable unit of the head-mounted display device. Therefore, the workflow in the published state can be displayed directly in the corresponding head-mounted display device, so that an operator wearing the device can view the displayed task interfaces in sequence and carry out the corresponding actual operations in the order of the workflow. Moreover, because the workflow is displayed in the head-mounted display device in the form of task interfaces, the operator can follow the device's guidance and work directly, without separately consulting a paper or electronic workflow. Data can also be entered and saved through the head-mounted display device. As a result, the operator can work without interruption and without filling in data manually, and the data produced during the work can be saved by the head-mounted display device.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is an architecture diagram of an exemplary system in which some embodiments of the present disclosure may be applied;
FIG. 2 is a flow chart of some embodiments of a flow orchestration method according to the present disclosure;
FIG. 3 is a flow chart of further embodiments of a flow orchestration method according to the present disclosure;
FIG. 4 is a flow chart of some embodiments of a flow display method according to the present disclosure;
fig. 5 is a schematic structural diagram of a head mounted display device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings. Embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "a plurality of" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
FIG. 1 illustrates an exemplary system architecture 100 to which the flow orchestration method or flow display method of some embodiments of the present disclosure may be applied.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as a web browser application, a search class application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be hardware or software. When they are hardware, they may be electronic devices having a display screen and supporting information display; for example, the terminal devices 101 and 102 include, but are not limited to, cell phones and computers, and the terminal device 103 includes, but is not limited to, AR glasses and MR glasses. When the terminal devices 101, 102, 103 are software, they can be installed in the electronic devices listed above and may be implemented as a plurality of pieces of software or software modules (for example, to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
The server 105 may be a server providing various services, such as a background server providing support for information displayed on the terminal devices 101, 102, 103. The background server can analyze and process the received data such as the request and the like, and feed back the processing result to the terminal equipment.
It should be noted that the flow scheduling method provided by the embodiment of the present disclosure may be executed by the server 105 or the terminal device 101 or 102. The flow display method provided by the embodiments of the present disclosure may be performed by the terminal device 103.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster formed by a plurality of servers or as a single server. When the server is software, it may be implemented as a plurality of pieces of software or software modules (for example, to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to fig. 2, a flow 200 of some embodiments of the flow orchestration method according to the present disclosure is shown. The flow orchestration method comprises the following steps:
step 201, generating a workflow according to the flow node information set and the flow node connection information set corresponding to the flow node information set.
In some embodiments, the execution body of the flow orchestration method (e.g., the server or the terminal device 101/102 shown in fig. 1) may generate a workflow according to the flow node information set and the flow node connection information set corresponding to the flow node information set. The flow node information set may consist of the pieces of flow node information determined by the flow orchestration user. The flow node connection information set may consist of the pieces of flow node connection information determined by the flow orchestration user for the flow node information set. The flow node information set and the flow node connection information set can be pushed by the flow orchestration user through an API interface.
The flow node information may be attribute-related information of a node corresponding to the workflow. The flow node information set may correspond to a start node, at least one flow node, and an end node. The flow node information corresponding to the start node and the end node may include a node type and a node identification. The flow nodes may be understood as intermediate nodes. The flow node information corresponding to a flow node may include a node type and flow node attribute information. The flow node attribute information may be information configured for the service data of the flow node and may include, but is not limited to: the node identification, task interface information for display in the corresponding task interface, an identification of an API for receiving the operator's operation data, and an identification of an API characterizing the task execution logic of the node. The task interface information may include body text and may further include, but is not limited to, at least one of: a title, button configuration information.
It should be noted that the flow node attribute information includes, but is not limited to, the configuration-related information above; the flow node attribute information of a flow node corresponds to a task item in actual work, and the flow node attribute information of different flow nodes may include different configuration-related information. For example, the flow node attribute information of a flow node corresponding to a voice input task item may include: a node identification, a title, body text, a voice input type (which may be text or numbers), and button configuration information (which may include button display text and button color). As an example, the flow node attribute information of the voice input task item may be "node identification: 02, title: voice input, body text: please read out the number displayed at A, voice input type: number, button configuration information: confirm, cancel, next". The flow node attribute information of a photographing task item may be "node identification: 02, title: photographing, body text: please take a complete image of B, button configuration information: photograph, next". The flow node attribute information of a flow node corresponding to a scan task item may include a check result. The check result may be scan configuration information against which the scan result is compared to determine whether the two are the same. For example, the check result may be "xxx1b". If the scan result is "xxx2b", it differs from the check result, the scan task item fails, and the operator needs to scan again.
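By way of illustration, the flow node attribute information described above could be encoded as follows. The field names and the `check_scan` helper are assumptions of this sketch, not structures mandated by the disclosure:

```python
# Hypothetical encoding of the flow node attribute information of a
# voice input task item and a scan task item as described above.

voice_input_node = {
    "nodeId": "02",
    "title": "voice input",
    "bodyText": "please read out the number displayed at A",
    "voiceInputType": "number",          # may be "text" or "number"
    "buttons": ["confirm", "cancel", "next"],
}

scan_node = {
    "nodeId": "02",
    "checkResult": "xxx1b",              # verification value for the scan
}

def check_scan(node, scanned):
    """A scan task item succeeds only when the scan result matches the
    check result; otherwise the operator must scan again."""
    return scanned == node["checkResult"]
```

For example, a scan result of "xxx2b" fails against the check result "xxx1b" in the scenario described above.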
The flow node connection information may be information for connecting two nodes and may include a former-node identification and a latter-node identification. The correspondence between them is as follows: a connecting line leads from the node corresponding to the former-node identification to the node corresponding to the latter-node identification. When the node corresponding to the former-node identification is a flow node whose flow node attribute information includes button configuration information, the flow node connection information may further include a jump button identification. The jump button identification indicates that when the operator clicks the corresponding button in the task interface of the former node, the display jumps to the task interface of the latter node. The jump button identification may be any identifier that uniquely represents the button, such as the button's display text or a button identification code.
In practice, the execution body may create a workflow represented by the flow node information set and the flow node connection information set. The workflow may be information describing the flow that an operator follows to execute the corresponding operations according to the ordered flow node information. Any two connected pieces of flow node information in the workflow are connected according to the flow node connection information corresponding to them: the first piece corresponds to the former-node identification included in the connection information, and the second piece corresponds to the latter-node identification.
As an example, the workflow may be expressed as:
{ workflowName: two-line every-other-day inspection procedure of assembly;
  linksInputList: [ workflowNodeIdSouce: 001, workflowNodeIdTarget: 002;
                    workflowNodeIdSouce: 002, workflowNodeIdTarget: 003 ];
  nodeInputList: [ workflowNodeId: 001, workflowNodeName: equipment photographing, info: "text: please take a whole-body photograph of the equipment";
                   workflowNodeId: 002, workflowNodeName: equipment scanning, info: "text: please scan the equipment";
                   workflowNodeId: 003, workflowNodeName: voice input, info: "text: please read out the meter display value" ] }
Here, workflowName is the workflow name; linksInputList is the flow node connection information set; nodeInputList is the flow node information set; workflowNodeIdSouce is the former-node identification; workflowNodeIdTarget is the latter-node identification; workflowNodeId is the node identification; workflowNodeName is the node name; and info is the task interface information.
It should be noted that, in one or more embodiments of the present application, the workflow may be in a data exchange format, for example, but not limited to, one of the following: JSON, XML.
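Since the workflow may be carried in a data exchange format such as JSON, the assembly of the two sets into one workflow can be sketched as below. The function and key names are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of generating a workflow from a flow node
# information set and a flow node connection information set: every
# piece of connection information must reference two nodes from the
# flow node information set.

def build_workflow(name, nodes, links):
    node_ids = {n["nodeId"] for n in nodes}
    for link in links:
        if link["source"] not in node_ids or link["target"] not in node_ids:
            raise ValueError("connection references an unknown node")
    return {
        "workflowName": name,
        "nodeInputList": nodes,      # the flow node information set
        "linksInputList": links,     # the flow node connection information set
    }
```

The resulting dictionary can be serialized directly with a JSON library, matching the data exchange formats the disclosure names.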
Alternatively, before step 201, the execution body may determine the flow node templates selected by the user from a flow node template library as target flow node templates. The flow node template library may be a set of flow node templates corresponding to various task items and may include, but is not limited to: a start node template, an end node template, a condition judgment node template, a photographing node template, and a voice input node template. The library may further include custom node templates, i.e., node templates created by the user according to their own requirements. For example, the flow node template library may be displayed on a workflow editing page, and the flow orchestration user may select a flow node template by dragging it into the editing area. The workflow editing page may be a page for editing the nodes of the workflow and the connecting lines between nodes.
Then, the flow node attribute information configured by the user for each of the target flow node templates may be determined as flow node information, to obtain a flow node information set. Each connecting line drawn by the user between target flow node templates may then be determined as a target connecting line, where each connecting line is a directed line drawn by the user from one target flow node template to another.
Finally, for each target connecting line, the two node identifications corresponding to the line and the jump button configuration information corresponding to the first node identification are combined into flow node connection information, to obtain a flow node connection information set. The target connecting line points from the target flow node template corresponding to the first node identification to the target flow node template corresponding to the second node identification. The jump button configuration information may be information configuring the button used to jump to the next target flow node template and may include a jump button identification. When the flow node attribute information corresponding to the first node identification does not include a jump button identification, the jump button configuration information is null. In this way, the flow orchestration user can configure, by dragging, the flow node information set and the flow node connection information set used to generate the workflow.
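The combination step above can be sketched as follows; the field names and the flat `jumpButton` attribute are simplifying assumptions of this sketch:

```python
# Hypothetical sketch of combining the two node identifications of a
# target connecting line with the former node's jump button
# configuration into one piece of flow node connection information.

def connection_info(former_node, latter_node):
    return {
        "source": former_node["nodeId"],
        "target": latter_node["nodeId"],
        # null (None) when the former node's attribute information
        # includes no jump button identification
        "jumpButton": former_node.get("jumpButton"),
    }
```

For instance, a former node whose attribute information carries the jump button "next" yields connection information with `jumpButton` set to "next", while a start node without buttons yields a null jump button.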
Optionally, before determining the connecting lines drawn by the user as target connecting lines, the execution body may determine the first target flow node template selected by the user as the former flow node template. Then, in response to determining that the number of connections of the former flow node template is less than its connection-count threshold, the second target flow node template selected by the user is determined as the latter flow node template. The number of connections may be the number of connecting lines already led out from the former flow node template, and the connection-count threshold may be the maximum number of connecting lines that may be led out from it. The former flow node template and the latter flow node template may then be connected by a connecting line.
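This connection-count check can be sketched as below; the function names and list-of-dicts representation are assumptions of this illustration:

```python
# Hypothetical sketch of the connection-count check: a former template
# may be connected to a latter template only while its number of
# outgoing connecting lines is below its connection-count threshold.

def can_connect(connections, former_id, threshold):
    outgoing = sum(1 for c in connections if c["source"] == former_id)
    return outgoing < threshold

def connect(connections, former_id, latter_id, threshold):
    if not can_connect(connections, former_id, threshold):
        raise ValueError("connection-count threshold reached for " + former_id)
    connections.append({"source": former_id, "target": latter_id})
    return connections
```

A plain flow node template might have a threshold of 1, while a condition judgment node template would allow several outgoing lines.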
step 202, in response to detecting the publishing operation for the workflow, determining the publishing state of the workflow as a published state.
In some embodiments, the execution body may determine the publishing state of the workflow as a published state in response to detecting a publishing operation for the workflow. The publishing operation may be an operation by which the flow orchestration user determines to publish the workflow; for example, it may be an operation in which the flow orchestration user confirms publication through an API. The task interfaces of a workflow in the published state may be displayed in the corresponding head-mounted display device. The head-mounted display device corresponding to the workflow may be a head-mounted display device that receives the workflow or one that recognizes the workflow; the number of such devices may be one or more. The display order of the task interfaces in the head-mounted display device is determined by the execution logic of the workflow.
The head-mounted display device may be a device through which the wearing user views a presented virtual scene, and it may receive data entered by the user, including, but not limited to, voice, images, and video. For example, the head-mounted display device may be AR glasses or MR glasses. The flow node information corresponding to a flow node may include an instruction for starting an executable unit of the head-mounted display device.
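The publishing step of step 202 can be sketched as a simple state change; the key name `publishState` and the helper names are assumptions of this sketch:

```python
# Hypothetical sketch of step 202: detecting a publishing operation
# sets the workflow's publishing state to "published", and only then
# may its task interfaces be displayed on the corresponding
# head-mounted display devices.

def publish(workflow):
    """Determine the publishing state of the workflow as published."""
    workflow["publishState"] = "published"
    return workflow

def displayable(workflow):
    """Task interfaces are displayed only for published workflows."""
    return workflow.get("publishState") == "published"
```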
With further reference to FIG. 3, a flow 300 of further embodiments of a flow orchestration method is shown. The process 300 of the process orchestration method comprises the steps of:
step 301, generating a workflow according to the workflow global configuration information, the flow node information set and the flow node connection information set.
In some embodiments, the execution body of the flow orchestration method (e.g., the server or terminal device 101/102 shown in fig. 1) may generate a workflow according to the workflow global configuration information, the above-described flow node information set, and the above-described flow node connection information set. The workflow global configuration information may be configuration-related information set by the flow orchestration user for the nodes corresponding to the flow node information set. For example, the workflow global configuration information may include, but is not limited to: an identification of whether remote collaboration is supported, an identification of whether video recording is supported, an identification of an API for reporting a single-step operation result, an identification of an API for reporting the total task operation result, an identification of an API for reporting a static resource, and an attachment file or website for the operator to view. The workflow global configuration information can be pushed by the flow orchestration user through an API, or configured by the flow orchestration user through a workflow editing page. In practice, the execution body may create a workflow represented by the workflow global configuration information, the flow node information set, and the flow node connection information set. The workflow may be uniquely represented by a workflow identification.
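The workflow global configuration information and workflow creation described above can be sketched, purely as a non-limiting illustration, in Python. All field names, the example URLs, and the use of a random hexadecimal identifier are assumptions of this sketch; the embodiments do not prescribe a concrete schema.

```python
import uuid

# Illustrative workflow global configuration information; every field name
# and value here is an assumption of this sketch, not a prescribed schema.
workflow_global_config = {
    "supports_remote_collaboration": True,    # identification of remote-collaboration support
    "supports_video_recording": False,        # identification of video-recording support
    "single_step_result_api": "https://example.com/api/step-result",   # reports a single-step operation result
    "total_task_result_api": "https://example.com/api/task-result",    # reports the total task operation result
    "static_resource_api": "https://example.com/api/static-resource",  # reports a static resource
    "attachments": ["manual.pdf", "https://example.com/guide"],        # files/websites for the operator to view
}

def create_workflow(global_config, flow_node_infos, flow_node_connections):
    """Create a workflow represented by the global configuration, the flow
    node information set, and the flow node connection information set."""
    return {
        "workflow_id": uuid.uuid4().hex,  # the workflow is uniquely represented by this identification
        "global_config": global_config,
        "nodes": flow_node_infos,
        "connections": flow_node_connections,
        "publish_state": "unpublished",   # becomes "published" after the publishing operation
    }
```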
Step 302, in response to detecting the publishing operation for the workflow, determining the publishing state of the workflow to be the published state.
In some embodiments, the specific implementation of step 302 and the technical effects thereof may refer to step 202 in those embodiments corresponding to fig. 2, which are not described herein.
Step 303, determining a workflow selected by the user from the workflows whose publishing state is the published state as a target workflow.
In some embodiments, the execution body may determine a workflow selected by the user from among the workflows whose publishing state is the published state as the target workflow. Here, the user may be the flow orchestration user. When the execution body is a server, the user may select a workflow from among the published workflows through an API. When the execution body is a terminal device, the user may select a workflow from the published workflows displayed in the terminal device.
Optionally, before step 303, the execution body may, in response to detecting user binding information sent by any head-mounted display device, determine the binding user corresponding to the user binding information as a candidate task-receiving user. The user binding information may be information characterizing the binding between the user and that head-mounted display device, and may include a user identification of the binding user and a device identification of that head-mounted display device. That head-mounted display device may then be determined to be the head-mounted display device corresponding to the candidate task-receiving user. A candidate task-receiving user may be selected by the flow orchestration user as a task-receiving user. Thus, when a newly accessed head-mounted display device is detected, the corresponding user can be bound to that head-mounted display device.
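The binding handling described above can be sketched, as a non-limiting illustration, in Python; the dictionary keys and the in-memory registry are assumptions of this sketch.

```python
# Registry mapping each candidate task-receiving user to the head-mounted
# display device bound to that user (an in-memory stand-in for illustration).
candidate_task_receivers = {}

def on_user_binding(user_binding_info):
    """Handle user binding information sent by a head-mounted display device:
    the binding user becomes a candidate task-receiving user, and the sending
    device is recorded as that user's corresponding device."""
    user_id = user_binding_info["user_id"]      # user identification of the binding user
    device_id = user_binding_info["device_id"]  # device identification of the device
    candidate_task_receivers[user_id] = device_id
    return user_id
```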
Alternatively, the execution body may determine the determined candidate task-receiving users as a candidate task-receiving user set. Then, in response to detecting a selection operation of the user with respect to the candidate task-receiving users in the above-described candidate task-receiving user set, the candidate task-receiving users selected by the user may be determined as task-receiving users. The selection operation may be an operation on a group of candidate task-receiving users, or an operation on a single candidate task-receiving user.
Step 304, generating a work task according to the target workflow and at least one task-receiving user selected by the user.
In some embodiments, the execution body may generate a work task according to the target workflow and at least one task-receiving user selected by the user. A task-receiving user may be an operator who performs the task. The at least one task-receiving user may be selected by the user by selecting a user group, or may be selected one by one. A task-receiving user may be represented in the form of a user identification. The above-mentioned work task may be a task that enables a designated operator to perform an actual job in accordance with a workflow. One work task corresponds to one workflow. One workflow may correspond to at least one work task. In practice, the execution body may create a work task represented by the identification of the target workflow and the at least one task-receiving user. The work task may include the workflow identification of the target workflow and the user identification of each of the at least one task-receiving user. It should be noted that, in one or more embodiments of the present application, the work task may be expressed in a data exchange format, for example, but not limited to, one of the following: JSON, XML.
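As a non-limiting sketch of the data exchange format mentioned above, a work task could be serialized in JSON roughly as follows (the key names are assumptions of this illustration; XML would be an equally valid choice):

```python
import json

def serialize_work_task(workflow_id, task_receiver_ids):
    """Serialize a work task carrying the workflow identification of the
    target workflow and the user identifications of its task-receiving users."""
    work_task = {
        "workflow_id": workflow_id,
        "task_receivers": task_receiver_ids,
    }
    return json.dumps(work_task)
```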
In some optional implementations of some embodiments, the execution body may generate the work task according to the target workflow, the at least one task-receiving user, and the task execution mode and task network type selected by the user. The task execution mode may be the execution mode of the work task. The task execution mode may be, but is not limited to, one of the following: a disposable task, a permanent task, a periodic task. The task network type may be the network connection type supported when the work task is performed. The task network type may be, but is not limited to, one of the following: an online task, a semi-offline task. An online task requires a networked state to be maintained while the work task is performed. A semi-offline task may be executed in a disconnected network state, with the data uploaded to the data storage end once the network is reconnected. In practice, the execution body may create a work task represented by the identification of the target workflow, the at least one task-receiving user, the task execution mode, and the task network type. The work task may be uniquely represented by a work task identification.
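The optional implementation above, a work task carrying an execution mode and a network type, can be sketched as follows; the enumeration value names and field names are assumptions of this illustration.

```python
import uuid

# The value names below are assumptions standing in for the modes/types
# named above (disposable/permanent/periodic; online/semi-offline).
TASK_EXECUTION_MODES = {"disposable", "permanent", "periodic"}
TASK_NETWORK_TYPES = {"online", "semi_offline"}

def create_work_task(workflow_id, task_receivers, execution_mode, network_type):
    """Create a work task uniquely represented by a work task identification."""
    if execution_mode not in TASK_EXECUTION_MODES:
        raise ValueError(f"unknown task execution mode: {execution_mode}")
    if network_type not in TASK_NETWORK_TYPES:
        raise ValueError(f"unknown task network type: {network_type}")
    return {
        "work_task_id": uuid.uuid4().hex,
        "workflow_id": workflow_id,
        "task_receivers": task_receivers,
        "execution_mode": execution_mode,
        "network_type": network_type,
    }
```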
Step 305, in response to detecting a publishing operation for the work task, performing publishing processing on the work task.
In some embodiments, the execution body may perform publishing processing on the work task in response to detecting a publishing operation for the work task. The publishing operation may be an operation by which the flow orchestration user confirms publishing of the work task. For example, when the execution body is a server, the publishing operation may be a confirmation issued by the flow orchestration user through an API. When the execution body is a terminal device, the publishing operation may be a selection operation by the flow orchestration user on a publishing control corresponding to the work task. In practice, the execution body may send the work task to each head-mounted display device corresponding to the at least one task-receiving user, so that each head-mounted display device displays a task interface among the task interfaces corresponding to the target workflow. The publishing state of the work task may also be determined as the published state.
In some optional implementations of some embodiments, first, the execution body may generate a work task identification code according to the work task. The work task identification code may be scanned by each head-mounted display device corresponding to the at least one task-receiving user, so as to display a task interface among the task interfaces corresponding to the target workflow. The work task identification code may be, but is not limited to: a two-dimensional code, a bar code. It may also be any other type of identification code. The work task identification code may then be sent to an associated terminal device. The associated terminal device may be any terminal device associated with the execution body. For example, the associated terminal device may be a device at the site where the actual job is performed, and may directly display the work task identification code. A staff member may also print the work task identification code stored in the associated terminal device as a paper version and post it at the job site for scanning by the head-mounted display device. Thus, the head-mounted display device can identify the work task by means of the work task identification code.
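Only as an assumption-laden sketch: the payload carried by a work task identification code might be composed and parsed as below. Rendering the payload as a two-dimensional code or bar code would be delegated to an external QR/barcode library, and the "worktask://" scheme is invented here for illustration.

```python
_SCHEME = "worktask://"  # hypothetical scheme; not specified by the embodiments

def make_task_code_payload(work_task_id):
    """Compose the text payload to be encoded into the identification code."""
    return _SCHEME + work_task_id

def parse_task_code_payload(payload):
    """Recover the work task identification from a scanned payload."""
    if not payload.startswith(_SCHEME):
        raise ValueError("not a work task identification code")
    return payload[len(_SCHEME):]
```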
Alternatively, the execution body may receive task execution information records, corresponding to the work task, sent by the head-mounted display devices among the respective head-mounted display devices. A task execution information record may be the operation data generated by a task-receiving user (operator) wearing a head-mounted display device while executing the work task. The task execution information record may include, but is not limited to: the operation time, a node identification, and the operated button. When the operated node corresponds to a task item that requires data to be entered, the task execution information record may also include the entered data. The entered data may be, but is not limited to, the following forms: image, video, voice.
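A task execution information record with the fields listed above could look like this (the field names are assumptions of this sketch):

```python
import time

def make_task_execution_record(node_id, operation_button, entry_data=None):
    """Build one task execution information record: operation time, node
    identification, operated button, and, only for nodes whose task item
    requires data entry, the entered data (e.g. an image/video/voice reference)."""
    record = {
        "operation_time": time.time(),
        "node_id": node_id,
        "operation_button": operation_button,
    }
    if entry_data is not None:
        record["entry_data"] = entry_data
    return record
```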
The execution body may display a task record page of the work task in response to detecting a task record viewing operation of the user for the work task. The task record viewing operation may be an operation of viewing the task execution information records of the work task. The task record page displays the task execution information record set corresponding to the work task. It should be noted that here the execution body is a terminal device.
The execution body may display a task report file of the work task in response to detecting a task report viewing operation of the user for the work task. The task report viewing operation may be an operation of viewing the task report file of the work task. The task report file may be a report file presenting the task execution information record set. It should be noted that here the execution body is a terminal device.
As can be seen from fig. 3, compared with the description of some embodiments corresponding to fig. 2, the flow 300 of the flow orchestration method in some embodiments corresponding to fig. 3 embodies an expanded step of generating work tasks. Thus, the solutions described by these embodiments can generate work tasks from already published workflows, so that the task-receiving user corresponding to a work task can perform the work task through the head-mounted display device.
With further reference to fig. 4, a flow 400 of some embodiments of a flow display method is shown. The process 400 of the process display method is applied to a head-mounted display device, and comprises the following steps:
Step 401, in response to receiving a work task or identifying the work task corresponding to a work task identification code, displaying, on a display screen of the head-mounted display device, a task interface corresponding to the workflow corresponding to the received or identified work task.
In some embodiments, the execution body of the flow display method (such as the head-mounted display device shown in fig. 1) may, in response to receiving a work task or identifying the work task corresponding to a work task identification code, display, on the display screen of the head-mounted display device, a task interface corresponding to the workflow corresponding to the received or identified work task. The workflow may be generated using the steps in the embodiments corresponding to fig. 2 or fig. 3. The work task may be generated using the steps in the embodiments corresponding to fig. 3. The execution body can identify the work task corresponding to the work task identification code through a camera. In practice, the execution body may present different task interfaces according to the execution logic of the workflow.
Step 402, determining whether the operation of the user on the task interface meets the jump condition corresponding to the task interface.
In some embodiments, the execution body may determine whether the operation of the user on the task interface satisfies a jump condition corresponding to the task interface. The jump condition may be a condition for determining whether to jump to the next task interface. In practice, the executing body may determine that the operation of the user on the task interface meets the skip condition corresponding to the task interface in response to determining that the task interface is a scanning task interface and detecting a scanning result for a live-action. The scan task interface may be an interface for scanning a live-action. In practice, the executing body may further determine that the operation of the user on the task interface meets the skip condition corresponding to the task interface in response to determining that the task interface is a scanning task interface, and the detected scanning result for the live-action is the same as the check result preconfigured for the flow node corresponding to the task interface. The execution body may further determine that the operation of the user with respect to the task interface satisfies the skip condition corresponding to the task interface in response to determining that the task interface is a voice input task interface and detecting voice input by the user. The voice input task interface may be an interface for prompting a user to input voice. The executing body may further determine that the operation of the user on the task interface meets the skip condition corresponding to the task interface in response to determining that the task interface is a shooting task interface and detecting an image or video shot by the user. The shooting task interface may be an interface for prompting a user to shoot an image or video.
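The interface-specific jump checks described above can be condensed, as an illustrative sketch only, into one dispatch function; the interface-type names and event fields are assumptions of this illustration.

```python
def jump_condition_met(interface_type, event):
    """Return True when the user's operation (described by 'event')
    satisfies the jump condition for the given task interface type."""
    if interface_type == "scan":
        # a scan interface jumps when a live-action scan result is detected;
        # if a check result was preconfigured for the flow node, it must match
        result = event.get("scan_result")
        expected = event.get("expected_result")
        return result is not None and (expected is None or result == expected)
    if interface_type == "voice_input":
        # a voice-input interface jumps once the user's voice input is detected
        return bool(event.get("voice"))
    if interface_type == "shoot":
        # a shooting interface jumps once an image or video has been captured
        return bool(event.get("image") or event.get("video"))
    return False
```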
In some optional implementations of some embodiments, the executing body may control the associated camera to capture an image in response to determining that the task interface is an image capturing task interface and detecting a selection operation of a user on a capturing control displayed in the image capturing task interface. Then, in response to detecting a confirmation operation of the user on the image, it is determined that the operation of the user on the task interface satisfies the skip condition. The confirmation operation may be a selection operation performed by a user on an image confirmation control corresponding to the image. Thus, after confirming the shot image, the user can automatically jump to the next task interface.
In some optional implementations of some embodiments, the execution body may determine that the operation of the user with respect to the task interface satisfies the jump condition in response to detecting that a user wearing the head-mounted display device performs a selection operation on a jump control, or detecting that a voice command uttered by the user is the same as the voice command corresponding to the jump control. The jump control is displayed in the task interface. The jump control may be a control for receiving a selection operation of the user so as to jump to the next task interface. For example, the jump control may be displayed as "next". The user may select the jump control through a knob provided on the head-mounted display device. For example, the jump control may be hovered over via the knob for a preset duration to select it. For another example, the knob may be pressed while hovering over the jump control to select it. The voice command may be the voice form of the display text corresponding to the jump control. For example, the voice command may be "next". When the user says "next", it may be determined that the user's operation on the task interface satisfies the jump condition.
Step 403, in response to detecting that the operation of the user on the task interface satisfies the jump condition, jumping to the next task interface corresponding to the task interface and the operation.
In some embodiments, the execution body may jump to a next task interface corresponding to the task interface and the operation in response to detecting that the operation of the user with respect to the task interface satisfies the jump condition. The next task interface may be a task interface arranged behind the task interface, where the operation is directed, under the execution logic of the workflow.
Optionally, the executing body may send a task execution information record generated by the user with respect to the operation of the task interface to the associated storage end and/or the coordination end in response to determining that the current network state is an online state. The storage terminal may be a terminal for storing task execution information records. For example, the storage may be an execution body of the flow scheduling method corresponding to fig. 2 or fig. 3. The cooperating terminal may be a terminal cooperating with the head-mounted display device. For example, the operator can call the cooperative end through the worn head-mounted display device, so that the user of the cooperative end can conduct remote guidance on the actual operation of the operator, and at the moment, the operator directly communicates with the user of the cooperative end through the worn head-mounted display device. The task performance information record may then be stored locally in response to determining that the current network state is an offline state. For example, the task execution information record may be stored in a local memory. Then, in response to determining that the current network state is an online state, the locally stored task execution information record that is not sent to the storage terminal may be sent to the storage terminal. Thus, the task execution information recording can be synchronized in real time in the online state. The task execution information record may be temporarily stored locally while in an offline state. The task execution information record temporarily stored locally can be synchronized to the storage side when the offline state is changed to the online state.
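The online/offline handling above amounts to a store-and-forward pattern, sketched here with in-memory stand-ins for the device's local storage and the remote storage end:

```python
class RecordSync:
    """Send task execution information records immediately while online;
    buffer them locally while offline; flush the buffer to the storage
    end when the network state changes back to online."""

    def __init__(self):
        self.online = True
        self.local_buffer = []  # stands in for local memory on the device
        self.storage_end = []   # stands in for the remote storage end

    def submit(self, record):
        if self.online:
            self.storage_end.append(record)
        else:
            self.local_buffer.append(record)

    def set_online(self, online):
        self.online = online
        if online and self.local_buffer:
            # synchronize records that were not yet sent to the storage end
            self.storage_end.extend(self.local_buffer)
            self.local_buffer.clear()
```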
The above embodiments of the present disclosure have the following advantageous effects: by the flow display method of some embodiments of the present disclosure, an operator can perform the actual operation without interruption or manual data filling, and the data generated during the operation can be saved by the head-mounted display device. Specifically, the reasons why an operator has to interrupt the progress of the job, fill in data manually, and has difficulty storing that data are as follows: the operator needs to interrupt the current working process to check the workflow in the form of a paper or electronic document, and a workflow in paper form requires the operator to fill in data manually, so the filled-in data is difficult to store. Based on this, in the flow display method of some embodiments of the present disclosure, first, in response to receiving a work task or identifying the work task corresponding to a work task identification code, a task interface corresponding to the workflow corresponding to the received or identified work task is displayed on the display screen of the head-mounted display device. Thus, the operator can directly refer to the information displayed on the task interface during actual work. It is then determined whether the operation of the user with respect to the task interface satisfies the jump condition corresponding to the task interface. Then, in response to detecting that the operation of the user on the task interface satisfies the jump condition, a jump is made to the next task interface corresponding to the task interface and the operation. Therefore, after the task item of the current task interface is completed, the display can automatically jump to the next task interface for the operator to view.
Also, because the workflow can be displayed in the head-mounted display device in the form of task interfaces, the operator can follow the guidance of the head-mounted display device and directly perform the actual work, without additionally viewing the workflow in the form of a paper or electronic document. Data may also be entered and saved via the head-mounted display device. Therefore, the operator can perform the actual operation without interruption or manual data filling, and the data generated during the operation can be stored through the head-mounted display device.
Referring now to fig. 5, a schematic diagram of a head mounted display device (e.g., the head mounted display device of fig. 1) 500 suitable for use in implementing some embodiments of the present disclosure is shown. The head mounted display devices in some embodiments of the present disclosure may include, but are not limited to, AR glasses, MR glasses. The head mounted display device shown in fig. 5 is only one example and should not impose any limitation on the functionality and scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the head mounted display device 500 may include a processing means (e.g., a central processor, a graphics processor, etc.) 501 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the head mounted display device 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, at least one display screen for imaging in front of the user's eyes (e.g., the display screen may include a micro display screen and optical elements), a speaker, a vibrator, etc.; and communication means 509. The communication means 509 may allow the head mounted display device 500 to communicate wirelessly or by wire with other devices to exchange data. While fig. 5 shows a head mounted display device 500 having various means, it should be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 5 may represent one device or a plurality of devices as needed.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communications device 509, or from the storage device 508, or from the ROM 502. The above-described functions defined in the methods of some embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
It should be noted that, the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the head mounted display device; or may be present alone without being fitted into the head mounted display device. The computer readable medium carries one or more programs which, when executed by the head mounted display device, cause the head mounted display device to: responding to the received work task or identifying the work task corresponding to the work task identification code, and displaying a task interface corresponding to the workflow corresponding to the received or identified work task in a display screen of the head-mounted display device; determining whether the operation of a user aiming at the task interface meets the jump condition corresponding to the task interface; and in response to detecting that the operation of the user on the task interface meets the jump condition, jumping to the next task interface corresponding to the task interface and the operation.
Computer program code for carrying out operations for some embodiments of the present disclosure may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, C++, Python, Ruby, NodeJS and JavaScript, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
The foregoing description is only of preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, technical solutions formed by substituting the above features with (but not limited to) features having similar functions disclosed in the embodiments of the present disclosure.

Claims (16)

1. A process orchestration method, comprising:
determining each process node template selected by a user from a process node template library as each target process node template;
Determining flow node attribute information configured by a user aiming at each target flow node template in the target flow node templates as flow node information to obtain a flow node information set;
determining each connecting line connected by a user aiming at each target process node template as each target connecting line;
combining two node identifiers corresponding to each target connecting line in the target connecting lines and jump button configuration information corresponding to a first node identifier into flow node connection information to obtain a flow node connection information set, wherein the target connecting lines are connecting lines pointing from a target flow node template corresponding to the first node identifier to a target flow node template corresponding to a second node identifier, and the jump button configuration information is information for configuring a button for jumping to a next target flow node template;
generating a workflow according to the flow node information set and a flow node connection information set corresponding to the flow node information set, wherein the flow node information set corresponds to a start node, at least one flow node and an end node, each piece of flow node connection information in the flow node connection information set corresponds to two pieces of flow node information in the flow node information set, the workflow corresponds to respective task interfaces, and each piece of flow node information in the flow node information set corresponds to one task interface;
in response to detecting a release operation for the workflow, determining a release state of the workflow as a released state, wherein each task interface corresponding to the workflow is to be displayed in a head-mounted display device corresponding to the workflow, the flow node information corresponding to a flow node comprises an instruction for starting an executable unit of the head-mounted display device, the display order of the task interfaces in the head-mounted display device is determined according to the execution logic of the workflow, and the head-mounted display device is configured to, in response to detecting that an operation of a user on a displayed task interface satisfies a jump condition, jump to a next task interface corresponding to that task interface and the operation.
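Outside the claim language itself, the workflow construction of claim 1 can be pictured as assembling a small directed graph from node records and connection records. The following Python sketch is purely illustrative; every name in it (`FlowNode`, `Connection`, `Workflow`, `generate_workflow` and the field names) is an assumption made for exposition, not terminology from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class FlowNode:
    node_id: str
    attributes: dict          # flow node attribute information configured by the user
    task_interface: str       # each flow node corresponds to one task interface

@dataclass
class Connection:
    first_node_id: str        # source of the directed connecting line
    second_node_id: str       # target of the directed connecting line
    jump_button_config: dict  # configures the button for jumping to the next node

@dataclass
class Workflow:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)
    published: bool = False   # becomes True on the release operation

def generate_workflow(node_infos, connection_infos):
    """Build a workflow from a flow node information set and its
    corresponding flow node connection information set."""
    wf = Workflow()
    for info in node_infos:
        wf.nodes[info.node_id] = info
    for conn in connection_infos:
        # each connection references two nodes from the node info set
        assert conn.first_node_id in wf.nodes and conn.second_node_id in wf.nodes
        wf.edges.append(conn)
    return wf
```

A start node, intermediate flow nodes and an end node would simply be three `FlowNode` records linked by two `Connection` records in this model.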
2. The method of claim 1, wherein the method further comprises:
determining a workflow selected by a user from all workflows whose release states are released states as a target workflow;
generating a work task according to the target workflow and at least one task-receiving user selected by the user, wherein the work task corresponds to the target workflow;
and in response to detecting the release operation for the work task, performing release processing on the work task.
3. The method of claim 2, wherein the generating a workflow from a set of flow node information and a set of flow node connection information corresponding to the set of flow node information comprises:
and generating a workflow according to the workflow global configuration information, the flow node information set and the flow node connection information set.
4. The method of claim 1, wherein prior to the determining each connecting line drawn by the user between the target flow node templates as each target connecting line, the method further comprises:
determining a first target flow node template selected by the user as a previous flow node template;
in response to determining that the number of connections corresponding to the previous flow node template is less than a connection number threshold corresponding to the previous flow node template, determining a second target flow node template selected by the user as a subsequent flow node template;
and connecting the previous flow node template and the subsequent flow node template with a connecting line.
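Claim 4's guard amounts to a bounded fan-out check: a new connecting line may leave a node template only while its outgoing connection count is below that template's threshold. A minimal sketch, with all names (`can_connect`, `connect`, the dictionary shapes) being assumptions for illustration rather than the patent's terms:

```python
def can_connect(prev_template_id, connection_counts, thresholds):
    # True while the previous flow node template's current number of
    # connections is below its connection number threshold
    return connection_counts.get(prev_template_id, 0) < thresholds[prev_template_id]

def connect(prev_template_id, next_template_id, connection_counts, thresholds, edges):
    # draw the connecting line only when the threshold check passes
    if can_connect(prev_template_id, connection_counts, thresholds):
        edges.append((prev_template_id, next_template_id))
        connection_counts[prev_template_id] = connection_counts.get(prev_template_id, 0) + 1
        return True
    return False
```

With a threshold of 1, a second outgoing line from the same template would simply be rejected instead of being added to the edge list.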
5. The method of claim 2, wherein the generating a work task according to the target workflow and the at least one task-receiving user selected by the user comprises:
generating a work task according to the target workflow, the at least one task-receiving user, and a task execution mode and a task network type selected by the user.
6. The method of claim 2, wherein the publishing the work task comprises:
and sending the work task to each head-mounted display device corresponding to the at least one task-receiving user, so that each head-mounted display device displays a task interface among the task interfaces corresponding to the target workflow.
7. The method of claim 2, wherein the publishing the work task comprises:
generating a work task identification code according to the work task, wherein the work task identification code is to be scanned by each head-mounted display device corresponding to the at least one task-receiving user, so as to display a task interface among the task interfaces corresponding to the target workflow;
and sending the work task identification code to an associated terminal device.
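Claims 6 and 7 describe two publishing paths: pushing the work task directly to every bound head-mounted display device, or deriving an identification code that the devices later scan. A hypothetical sketch of both paths (the function names and the use of a hash digest as the identification code are assumptions, not details from the patent):

```python
import hashlib

def publish_by_push(work_task, devices):
    # claim 6 path: send the work task to each head-mounted display
    # device corresponding to the task-receiving users
    return {device: work_task for device in devices}

def publish_by_code(work_task):
    # claim 7 path: derive a stable work task identification code;
    # scanning it on a device resolves back to the work task
    return hashlib.sha256(work_task.encode()).hexdigest()[:12]
```

The code path is useful when the set of receiving devices is not known at publish time, since any bound device that scans the code can resolve the same work task.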
8. The method according to claim 6 or 7, wherein the method further comprises:
receiving task execution information records, corresponding to the work task, sent by head-mounted display devices among the head-mounted display devices;
in response to detecting a task record viewing operation of a user for the work task, displaying a task record page of the work task, wherein a task execution information record set corresponding to the work task is displayed in the task record page;
and in response to detecting a task report viewing operation of a user for the work task, displaying a task report file of the work task.
9. The method of claim 2, wherein prior to the determining a workflow selected by the user from all workflows whose release states are released states as the target workflow, the method further comprises:
in response to detecting user binding information sent by any head-mounted display device, determining a bound user corresponding to the user binding information as a candidate task-receiving user;
and determining that head-mounted display device as the head-mounted display device corresponding to the candidate task-receiving user.
10. The method of claim 9, wherein prior to the generating a work task according to the target workflow and the at least one task-receiving user selected by the user, the method further comprises:
determining each of the determined candidate task-receiving users as a candidate task-receiving user set;
in response to detecting a selection operation of the user on a candidate task-receiving user in the candidate task-receiving user set, determining the candidate task-receiving user selected by the user as the task-receiving user.
11. A flow display method, applied to a head-mounted display device, comprising:
in response to receiving a work task or identifying a work task corresponding to a work task identification code, displaying, in a display screen of the head-mounted display device, a task interface corresponding to a workflow of the received or identified work task, wherein the workflow is generated by the method according to any one of claims 1-10;
determining whether an operation of a user on the task interface satisfies a jump condition corresponding to the task interface;
and in response to detecting that the operation of the user on the task interface satisfies the jump condition, jumping to a next task interface corresponding to the task interface and the operation.
12. The method of claim 11, wherein the determining whether the operation of the user on the task interface satisfies the jump condition corresponding to the task interface comprises:
in response to determining that the task interface is an image capturing task interface and detecting a selection operation of the user on a capturing control displayed in the image capturing task interface, controlling an associated camera to capture an image;
in response to detecting a confirmation operation of the user on the captured image, determining that the operation of the user on the task interface satisfies the jump condition.
13. The method of claim 11, wherein the determining whether the operation of the user on the task interface satisfies the jump condition corresponding to the task interface comprises:
in response to detecting a selection operation, performed by a user wearing the head-mounted display device, on a jump control displayed in the task interface, or detecting that a voice password uttered by the user is the same as a voice command corresponding to the jump control, determining that the operation of the user on the task interface satisfies the jump condition.
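Claim 13 names two independent triggers for the jump condition: selecting the jump control, or uttering a voice password that matches the control's voice command. Sketched as a single predicate (the event dictionary shape and the function name are assumptions, not the patent's terms):

```python
def jump_condition_met(event, jump_control_voice_command):
    """Return True when the wearer's operation satisfies the jump
    condition of claim 13: selecting the jump control, or speaking a
    password equal to the control's voice command."""
    if event.get("type") == "select" and event.get("target") == "jump_control":
        return True
    if event.get("type") == "voice" and event.get("utterance") == jump_control_voice_command:
        return True
    return False
```

Either branch alone suffices; a non-matching utterance or a selection of some other control leaves the current task interface displayed.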
14. The method according to any one of claims 11-13, wherein the method further comprises:
in response to determining that a current network state is an online state, sending a task execution information record, generated by an operation of the user on the task interface, to an associated storage end and/or a collaborator end;
in response to determining that the current network state is an offline state, storing the task execution information record locally;
and in response to determining that the current network state is an online state, sending locally stored task execution information records that have not yet been sent to the storage end.
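The network-state handling of claim 14 is essentially a store-and-forward queue: send records while online, queue them locally while offline, and flush the queue once connectivity returns. An illustrative sketch (the class and method names are assumptions for exposition):

```python
class TaskRecordSender:
    """Send task execution information records while online; store them
    locally while offline and flush the backlog on reconnection."""

    def __init__(self):
        self.pending = []  # records stored locally while offline

    def handle(self, record, online):
        if online:
            # deliver previously unsent records first, then the new one
            batch = self.pending + [record]
            self.pending = []
            return batch  # records delivered to the storage end
        self.pending.append(record)
        return []
```

After a reconnection, a single `handle` call with `online=True` drains everything queued during the offline period, matching the claim's "locally stored records that have not been sent" step.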
15. A head mounted display device comprising:
one or more processors;
one or more display screens for imaging in front of the eyes of a user wearing the head mounted display device;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 11-14.
16. A computer-readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1-10 or 11-14.
CN202210470109.6A 2022-04-28 2022-04-28 Flow arrangement, display method, head-mounted display device, and computer-readable medium Active CN114968454B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210470109.6A CN114968454B (en) 2022-04-28 2022-04-28 Flow arrangement, display method, head-mounted display device, and computer-readable medium


Publications (2)

Publication Number Publication Date
CN114968454A (en) 2022-08-30
CN114968454B (en) 2024-04-12

Family

ID=82978618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210470109.6A Active CN114968454B (en) 2022-04-28 2022-04-28 Flow arrangement, display method, head-mounted display device, and computer-readable medium

Country Status (1)

Country Link
CN (1) CN114968454B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115145560B (en) * 2022-09-06 2022-12-02 北京国电通网络技术有限公司 Business orchestration method, apparatus, device, computer-readable medium, and program product
CN116701181B (en) * 2023-05-10 2024-02-02 海南泽山软件科技有限责任公司 Information verification flow display method, device, equipment and computer readable medium
CN117132245B (en) * 2023-10-27 2024-02-06 北京国电通网络技术有限公司 Method, device, equipment and readable medium for reorganizing online article acquisition business process

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106406907A (en) * 2016-10-11 2017-02-15 传线网络科技(上海)有限公司 Application program flow execution control method and device
WO2018000606A1 (en) * 2016-06-30 2018-01-04 乐视控股(北京)有限公司 Virtual-reality interaction interface switching method and electronic device
CN108089696A (en) * 2016-11-08 2018-05-29 罗克韦尔自动化技术公司 For the virtual reality and augmented reality of industrial automation
CN110853115A (en) * 2019-10-14 2020-02-28 平安国际智慧城市科技股份有限公司 Method and equipment for creating development process page
CN110910081A (en) * 2018-09-17 2020-03-24 上海宝信软件股份有限公司 Workflow configuration implementation method and system based on laboratory information management system
CN111930372A (en) * 2020-08-06 2020-11-13 科大国创云网科技有限公司 Service arrangement solution method and system realized through draggable flow chart
CN112685036A (en) * 2021-01-13 2021-04-20 北京三快在线科技有限公司 Front-end code generation method and device, computer equipment and storage medium
CN113220118A (en) * 2021-04-20 2021-08-06 杭州灵伴科技有限公司 Virtual interface display method, head-mounted display device and computer readable medium
US11151898B1 (en) * 2020-04-15 2021-10-19 Klatt Works, Inc. Techniques for enhancing workflows relating to equipment maintenance
CN114035884A (en) * 2021-12-07 2022-02-11 深圳市锐思华创技术有限公司 UI interaction design method of AR HUD train control system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7100147B2 (en) * 2001-06-28 2006-08-29 International Business Machines Corporation Method, system, and program for generating a workflow
US10824310B2 (en) * 2012-12-20 2020-11-03 Sri International Augmented reality virtual personal assistant for external representation


Also Published As

Publication number Publication date
CN114968454A (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN114968454B (en) Flow arrangement, display method, head-mounted display device, and computer-readable medium
CN109816447B (en) Intelligent monitoring method, device and storage medium for cabinet advertisement
US11164160B2 (en) System and method of providing to-do list of user
CN107967143B (en) Method, device and system for acquiring update indication information of source code of client application program
JP7230465B2 (en) ERROR DISPLAY SYSTEM, ERROR DISPLAY METHOD, INFORMATION PROCESSING DEVICE
KR20130009932A (en) Method, system, and apparatus for process management
CN110634220B (en) Information processing method and device
US20150088981A1 (en) Integrated social media server and architecture
EP3120249A1 (en) Information processing system, data process control method, program, and recording medium
CN112668283B (en) Document editing method and device and electronic equipment
CN112015654A (en) Method and apparatus for testing
CN110232091A (en) Mthods, systems and devices for synchrodata
CN113626002A (en) Service execution method and device
CN110618768A (en) Information presentation method and device
KR20120095325A (en) Method and system for providing social-computing-based location information using mobile augmented reality
US20200244592A1 (en) Resource reservation system, setting method, and non-transitory computer readable storage medium
CN112769848B (en) Message sending method and device
US11567758B2 (en) Configuration properties management for software
CN114092169A (en) Ordering test method and system and equipment for executing ordering test method
CN113407229A (en) Method and device for generating offline script
CN111143740A (en) Information processing method and device and electronic equipment
CN114253520B (en) Interface code generation method and device
CN112883697B (en) Workflow form generation method, device, electronic equipment and computer readable medium
JP7375727B2 (en) Data management system, terminal device, program, data input method, information processing system
US20240152504A1 (en) Data interaction method, apparatus, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant