CN111143004A - Scene guide method and device, electronic equipment and storage medium - Google Patents

Scene guide method and device, electronic equipment and storage medium

Info

Publication number
CN111143004A
Authority
CN
China
Prior art keywords
scene
flow
resource data
current
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911357650.0A
Other languages
Chinese (zh)
Other versions
CN111143004B (en)
Inventor
杨斌 (Yang Bin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd
Priority to CN201911357650.0A
Publication of CN111143004A
Application granted
Publication of CN111143004B
Current legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a scene guide method and device, electronic equipment and a storage medium. When a trigger operation on the current scene is detected, the scene flow of the current scene is determined and the resource data of the scene flow are acquired; the scene flow corresponds to the operation steps of a plurality of flow nodes, and the operation step of each flow node includes the resource data to be acquired. The scene flow and the resource data of the scene flow are then displayed in sequence. This solves the prior-art problem of low business processing efficiency caused by having to complete the operation flow of a scene according to the operation sequence described in a manual and by relying on rich experience, achieves the purpose of automatically guiding the user's operation in the current scene through interaction between the displayed scene flow and resource data and the user, and improves business processing efficiency.

Description

Scene guide method and device, electronic equipment and storage medium
Technical Field
Embodiments of the present invention relate to computer technologies, and in particular, to a scene guidance method and apparatus, an electronic device, and a storage medium.
Background
A medical device system contains a variety of tools and applications that are self-contained yet have business associations between them. When a user completes one service in one scene and switches to the next service, switching between the different services involves a plurality of tools and applications.
In the prior art, when switching services the user has to complete the operation flow of the scene according to the operation sequence described in a manual and by relying on rich experience, which makes the operation inefficient and cumbersome.
Disclosure of Invention
The embodiment of the invention provides a scene guidance method and apparatus, an electronic device and a storage medium, which are used to automatically guide the operation flow of a scene and thereby improve the convenience and efficiency of the operation.
In a first aspect, an embodiment of the present invention provides a scene guidance method, where the method includes:
when the trigger operation of the current scene is detected, determining the scene flow of the current scene;
acquiring resource data of the scene flow;
and sequentially displaying the scene flow and the resource data of the scene flow, wherein the scene flow and the resource data corresponding to the scene flow are used for guiding the operation of a user in the current scene.
In a second aspect, an embodiment of the present invention further provides a scene guidance apparatus, where the scene guidance apparatus includes:
the scene flow determining module is used for determining the scene flow of the current scene when the triggering operation of the current scene is detected;
the resource data acquisition module is used for acquiring the resource data of the scene flow;
and the display module is used for sequentially displaying the scene flow and the resource data of the scene flow, wherein the scene flow and the resource data corresponding to the scene flow are used for guiding the operation of a user in the current scene.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the scene guidance method according to any one of the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present invention further provides a storage medium containing computer-executable instructions, where the computer-executable instructions, when executed by a computer processor, implement the scene guidance method according to any one of the first aspect.
According to the technical solution provided by the embodiment of the invention, when a trigger operation on the current scene is detected, the scene flow of the current scene is determined and the resource data of the scene flow are acquired; the scene flow corresponds to a plurality of operation steps, and each operation step includes the resource data to be acquired. The scene flow and the resource data of the scene flow are then displayed in sequence. This solves the prior-art problem of low business processing efficiency caused by having to complete the operation flow of a scene according to the operation sequence described in a manual and by relying on rich experience, achieves the purpose of automatically guiding the user's work in the current scene through interaction between the displayed scene flow and resource data and the user, and improves business processing efficiency.
Drawings
Fig. 1 is a schematic flowchart of a scene guidance method according to an embodiment of the present invention;
fig. 2 is a logic diagram of a scene guidance method according to an embodiment of the present invention;
Fig. 3 is a flowchart illustrating a scene guidance method according to a second embodiment of the present invention;
fig. 4 is a display interface diagram of a scene guidance method according to a second embodiment of the present invention;
fig. 5 is an interaction diagram of a scene guidance method according to a second embodiment of the present invention;
fig. 6 is a schematic display diagram of a scene guidance method according to a second embodiment of the present invention;
fig. 7 is a schematic structural diagram of a scene guidance device according to a third embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a scene guidance method according to an embodiment of the present invention, where the embodiment is applicable to a case of automatically guiding a workflow of a scene, and the method may be executed by a scene guidance device, where the device may be implemented by software and/or hardware and is generally integrated in a terminal. Referring specifically to fig. 1, the method may include the steps of:
s110, when the trigger operation of the current scene is detected, the scene flow of the current scene is determined.
It should be noted that, in the scene flow development stage, a scene flow needs to be configured for each scene, so that when a trigger operation on the current scene is detected, the scene flow of the current scene is obtained. Optionally, the configuration of the scene flow may be implemented by:
(a) acquiring an operation step set of each scene, wherein the operation step set is based on historical operation steps recorded under each scene and/or a preset operation step set;
(b) for any scene, extracting corresponding steps from the operation step set based on the description information of the current scene, and combining the extracted steps based on execution logic to generate a scene flow of the current scene. The description information includes key information of the scene, for example, an operation purpose, a name of a selected operation device, and the like.
The historical operation steps form a set combined from the historical operation steps entered by the user for each scene, or from the operation steps of the user recorded and stored in each scene; the preset operation step set may be a combination of the operation steps that the user enters for each scene according to an operation manual. It can be understood that, in the scene flow development stage, the developer sets at least one operation step, description information and execution logic for each scene. When the electronic device obtains the description information of each scene, it extracts the historical operation steps and/or the preset operation step set corresponding to each scene, then extracts the operation steps matching the description information of each scene, and combines those operation steps according to the execution logic of each scene to obtain the scene flow of each scene.
It should be noted that, in some scenes, the scene flow needs to be combined with certain resource data to execute a service. To this end, in the scene flow development stage, the developer may input to the electronic device the operation results of the historical operation steps related to each scene, or pre-entered resource data, and the electronic device establishes a correspondence between the acquired operation results or pre-entered resource data and the scene flow according to the description information and the execution logic. In this way, when the electronic device detects a trigger operation on the current scene input by the user, it can guide the user to complete the workflow according to the scene flow of the current scene and the corresponding resource data.
The flow guidance logic set in the scene flow development stage is explained with reference to the schematic diagram shown in fig. 2. As shown in fig. 2, the developer inputs the operation step sets of each scene to the electronic device, and the electronic device intelligently assembles them according to its own processing logic. Because some operation steps of a scene may need resource data to complete the service processing, the assembled step set is combined with the resource data to generate the scene flow route, and the scene flow and the resource data serve as a guidance tool that guides the user through visual interaction until the service in the scene has been executed.
Illustratively, the current scene is a CT scan, and the description information of the scene is "start CT scanning device to perform CT imaging". The electronic device extracts the operation steps according to the description information, including moving the scan control frame, adjusting the scan window width, changing the size of the field of view, and the like, and then combines the operation steps according to the execution logic of CT imaging, for example: adjust the frame and the inclination angle -> start exposure -> move the scan control frame -> adjust the scan window width or change the field-of-view size. The combined operation steps are taken as the scene flow of the current scene, and a correspondence is established between the operation results or resource data and the operation steps, so that when the user starts the CT scanning device to perform CT imaging, the electronic device guides the user's work in the scene according to the scene flow and the resource data corresponding to the scene flow.
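To make the assembly in steps (a) and (b) concrete, the following is a minimal sketch of one way such a configuration could look; it is not taken from the patent, and all names (OperationStep, SceneFlow, build_scene_flow, the CT step strings) are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class OperationStep:
    name: str                                   # e.g. "move the scan control frame"
    keywords: List[str]                         # description keywords this step matches
    resource: Dict[str, str] = field(default_factory=dict)  # corresponding resource data

@dataclass
class SceneFlow:
    scene: str
    nodes: List[OperationStep]                  # operation steps ordered as flow nodes

def build_scene_flow(scene: str, description: str,
                     step_set: List[OperationStep],
                     execution_logic: List[str]) -> SceneFlow:
    """Step (b): pick the steps matching the description, order them by execution logic."""
    matched = [s for s in step_set if any(k in description for k in s.keywords)]
    order = {name: i for i, name in enumerate(execution_logic)}
    matched.sort(key=lambda s: order.get(s.name, len(order)))
    return SceneFlow(scene=scene, nodes=matched)

# Illustrative use for the CT-imaging example above:
steps = [
    OperationStep("start exposure", ["CT imaging"]),
    OperationStep("move the scan control frame", ["CT"]),
    OperationStep("adjust the frame and the inclination angle", ["CT scanning device"]),
]
flow = build_scene_flow(
    scene="CT scan",
    description="start CT scanning device to perform CT imaging",
    step_set=steps,
    execution_logic=["adjust the frame and the inclination angle",
                     "start exposure",
                     "move the scan control frame"],
)
# flow.nodes is now ordered: adjust frame/angle -> start exposure -> move control frame
```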
S120, acquiring resource data of the scene flow.
Wherein the resource data includes: at least one of application tools, picture data, text data, and audio/video data.
Optionally, the resource data of the scene flow may be acquired by:
(A) analyzing scene description keywords contained in a scene flow;
(B) if the scene description keyword contains a link address, importing an application tool corresponding to the current scene according to the link address;
(C) and if the scene description keyword contains attribute information of the picture data, the text data or the audio/video data, importing the picture data, the text data or the audio/video data corresponding to the current scene according to the attribute information.
Illustratively, the current scene is CT image data analysis. The electronic device acquires the scene flow of the current scene, which includes receiving the original scan data, filtering, extracting feature data, generating a chart, and the like. For each operation step, the scene description keywords contained in the scene flow are analyzed. For the step of extracting feature data, the scene description keyword resolves to extracting the feature data, and the electronic device jumps to data processing software (e.g., Matlab) according to the link address contained in the scene description keyword and performs the data processing in conjunction with that software. For the step of generating the chart, if the scene description keyword resolves to data processing, the computer imports the anatomical data of the same scan object, or the historical CT scan data of the scan object, according to the attribute information contained in the scene description keyword, and combines the original scan data with the anatomical data or the historical CT scan data to generate the chart. In this way, after the scene flow of the current scene is combined with its resource data, the user can be guided through the processing of the service in this scene.
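As an illustration of the keyword dispatch in steps (A)-(C), the sketch below imports an application tool when a keyword looks like a link address and otherwise treats the keyword as media attribute information; the detection rule and the helper names (looks_like_link, load_tool, load_media) are assumptions, not the patent's implementation.

```python
from typing import Any, Dict, List
from urllib.parse import urlparse

def looks_like_link(keyword: str) -> bool:
    """Treat a scene description keyword as a link address if it parses as a URL."""
    parts = urlparse(keyword)
    return bool(parts.scheme and parts.netloc)

def load_tool(address: str) -> Dict[str, Any]:
    # Placeholder: would launch or import the external application tool (step (B)).
    return {"type": "application_tool", "address": address}

def load_media(attributes: str) -> Dict[str, Any]:
    # Placeholder: would import picture/text/audio-video data by its attributes (step (C)).
    return {"type": "media", "attributes": attributes}

def acquire_resource_data(scene_keywords: List[str]) -> List[Dict[str, Any]]:
    """Step (A): walk the keywords parsed from the scene flow and import resources."""
    resources: List[Dict[str, Any]] = []
    for keyword in scene_keywords:
        if looks_like_link(keyword):
            resources.append(load_tool(keyword))
        else:
            resources.append(load_media(keyword))
    return resources

# e.g. acquire_resource_data(["https://example.org/data-processing-tool",
#                             "historical CT scan data of the same scan object"])
```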
S130, displaying the scene flow and the resource data of the scene flow in sequence.
The scene flow and the resource data are used to guide the user's operation in the current scene; by displaying the scene flow and the resource data, the user and the device interactively complete the service processing under the automatic guidance of the electronic device.
The embodiment of the invention provides a scene guidance method. When a trigger operation on the current scene is detected, the scene flow of the current scene is determined and the resource data of the scene flow are acquired; the scene flow corresponds to a plurality of operation steps, and each operation step includes the resource data to be acquired. The scene flow and the resource data of the scene flow are then displayed in sequence. This solves the prior-art problem of low business processing efficiency caused by having to complete the operation flow of a scene according to the operation sequence described in a manual and by relying on rich experience, achieves the purpose of automatically guiding the user's work in the current scene through interaction between the displayed scene flow and resource data and the user, and improves business processing efficiency.
Example two
Fig. 3 is a flowchart illustrating a scene guidance method according to a second embodiment of the present invention. The technical solution of this embodiment refines the above embodiment. Optionally, displaying the scene flow and the resource data corresponding to the scene flow in sequence includes: determining the flow node corresponding to the current operation; and displaying the node information corresponding to that flow node together with the resource data, where the node information includes at least one of the operation description language contained in each flow node and the guidance information between flow nodes. Referring specifically to fig. 3, the method may include the following steps:
s310, when the trigger operation of the current scene is detected, the scene flow of the current scene is determined. The scene flow comprises at least one of an operation flow chart and a flow description language corresponding to the current scene.
S320, acquiring resource data of the scene flow. Wherein the resource data includes: at least one of application tools, picture data, text data, and audio/video data.
S330, determining the flow node corresponding to the current operation.
S340, displaying the node information and resource data corresponding to the flow node corresponding to the current operation, where the node information includes at least one of the operation description language contained in each flow node and the guidance information between flow nodes.
In the scene flow development stage, the developer configures operation buttons and guidance information at each flow node; the guidance information may include the operation description language and the guidance information between flow nodes. When the user wants to work in a certain scene, an operation instruction is input by clicking an operation button of the electronic device. The electronic device determines the corresponding flow node from the operation button and displays the operation description language of that flow node and the guidance information between flow nodes; the user then selects, step by step, the operations to be executed according to the operation description language and the guidance information. When the step of a flow node has been executed, the execution result is recorded, and the user may select a rollback operation or select the operation button of another flow node to continue processing the service.
As shown in fig. 4, the display interface provided in this embodiment includes, on its left side, operation buttons such as "view environment", "adjust environment", "view empty calibration table", "execute empty calibration" and "measure artifact radius", together with operation buttons such as "reset" and "exit". The guidance information includes, for example, "environment abnormal" between "view environment" and "adjust environment", and "image abnormal" between "adjust environment" and "view empty calibration table". Illustratively, if the flow node corresponding to the current operation is nonlinear analysis, the user processes the service by viewing the guidance information and the description language and selecting the reset button, the exit button, or the operation button of the next flow node.
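A minimal sketch of the node-by-node guidance described above: each flow node carries an operation description and guidance text toward the next node, executing a node records its result, and a rollback undoes the last node. The FlowNode and GuidedSession names and the session logic are illustrative assumptions only, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class FlowNode:
    button: str                    # operation button label, e.g. "execute empty calibration"
    description: str               # operation description language of the node
    guidance: str = ""             # guidance information toward the next node

class GuidedSession:
    def __init__(self, nodes: List[FlowNode]):
        self.nodes = nodes
        self.results: List[str] = []   # recorded execution results, one per finished node
        self.position = 0

    def current(self) -> Optional[FlowNode]:
        """Flow node corresponding to the current operation (None when finished)."""
        return self.nodes[self.position] if self.position < len(self.nodes) else None

    def execute(self, run: Callable[[FlowNode], str]) -> None:
        """Run the current node, record its execution result, advance to the next node."""
        node = self.current()
        if node is not None:
            self.results.append(run(node))
            self.position += 1

    def rollback(self) -> None:
        """Undo the last executed node, as the rollback operation would."""
        if self.results:
            self.results.pop()
            self.position -= 1
```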
Optionally, in the scene flow development stage, the developer can also configure a self-guiding function for any scene. An automatic jump function may be configured for all flow nodes in a scene, the automatic jump function being used to control the automatic jump of the flow nodes; or an automatic jump function may be configured for some flow nodes in a scene while the other flow nodes are configured to jump only when triggered by the user. Further, when the user processes a service by using the scene flow, the scene guidance method further includes: receiving a self-guiding trigger operation, determining the self-guiding steps according to a self-guiding rule, then controlling the operation device to execute the self-guiding steps in the scene flow in sequence, and receiving execution feedback information from the operation device, where the execution feedback information includes the operation result and operation completion information. The self-guiding rule may include the guiding order of the flow nodes, whether to display the feedback information of the flow nodes, the number of self-guided flow nodes, and the like.
For example, in the application scene corresponding to the display interface shown in fig. 4, the developer may configure the automatic jump function for all flow nodes. When the user uses the electronic device to control and process the service of the application scene, a self-guiding trigger operation is input to the electronic device; the electronic device determines the self-guiding steps in sequence according to the self-guiding rule and controls the operation device to execute each self-guiding step in turn until the service of the application scene has been executed completely. As shown in fig. 5, each application scene corresponds one-to-one to a scene flow; if all the operation steps of an application scene are set to be self-guided, the user can carry out the service processing through the scene flow corresponding to that scene until the job is completed.
Alternatively, in the application scene corresponding to the display interface shown in fig. 4, the developer may configure the automatic jump function only for the first four flow nodes and configure the remaining flow nodes to jump when triggered by the user. When the first four steps are executed, the electronic device determines the self-guiding steps in sequence according to the self-guiding rule and controls the operation device to execute each self-guiding step in turn until the first four flow nodes of the application scene have been executed, and then displays the feedback information corresponding to the fourth flow node. For the next flow node, the user may execute the step of measuring the artifact radius based on the feedback information shown on the display interface, and then execute the operation of the following flow node according to the feedback information of that step, until all flow nodes have been executed.
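The partially self-guided behavior described above can be pictured with the following sketch, in which a self-guiding rule states how many flow nodes jump automatically and whether their feedback is displayed; the SelfGuideRule fields and the run_self_guided helper are assumptions used only for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class SelfGuideRule:
    auto_node_count: int           # e.g. 4: the first four flow nodes jump automatically
    show_feedback: bool = True     # whether to display each node's execution feedback

def run_self_guided(nodes: List[str], rule: SelfGuideRule,
                    run: Callable[[str], str],
                    display: Callable[[str], None]) -> List[str]:
    """Execute the first rule.auto_node_count nodes without user triggers and
    return the remaining node names, which wait for the user's manual operation."""
    for node in nodes[:rule.auto_node_count]:
        feedback = run(node)                   # execution feedback information
        if rule.show_feedback:
            display(feedback)
    return nodes[rule.auto_node_count:]

# For a fig. 4 style flow:
# run_self_guided(["view environment", "adjust environment",
#                  "view empty calibration table", "execute empty calibration",
#                  "measure artifact radius"],
#                 SelfGuideRule(auto_node_count=4),
#                 run=lambda n: n + ": done", display=print)
```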
It can be understood that, to save display space and conveniently guide the user's operation, the display of the scene flow and its corresponding resource data can be switched at any time between a main-window mode and a hover (floating) mode. Optionally, the method further includes: when a hover display operation on the scene flow is detected, displaying the information corresponding to the hover display operation in a floating manner. As shown in fig. 6, in hover mode the guidance page may be expanded, collapsed or locked, and the guidance page may include the scene flow, the resource data, the operation description language, or the guidance information between flow nodes.
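As a rough illustration of the display-mode switching (main-window mode versus hover mode, with an expandable and lockable guidance page), one might model the panel state as below; the GuidancePanel class is purely hypothetical.

```python
from dataclasses import dataclass

@dataclass
class GuidancePanel:
    mode: str = "main_window"      # "main_window" or "hover"
    expanded: bool = True
    locked: bool = False

    def toggle_mode(self) -> None:
        """Switch the guidance display between main-window mode and hover mode."""
        self.mode = "hover" if self.mode == "main_window" else "main_window"

    def set_expanded(self, expanded: bool) -> None:
        """Expand or collapse the guidance page; a locked page keeps its state."""
        if not self.locked:
            self.expanded = expanded
```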
The embodiment of the invention provides a scene guidance method. By determining the flow node corresponding to the current operation and displaying the node information and resource data corresponding to that flow node, the user can process the service step by step through the displayed information, achieving the purpose of automatically guiding the user's operation in the current scene through interaction between the displayed scene flow and resource data and the user. Alternatively, the scene guidance method receives a self-guiding trigger operation, determines the self-guiding steps according to the self-guiding rule, controls the operation device to execute the self-guiding steps in the scene flow in sequence, and receives the execution feedback information of the operation device, so that the service can be processed fully self-guided without further user triggers. In addition, the method provides a hover display mode that saves display space and improves the user's operating experience.
EXAMPLE III
Fig. 7 is a schematic structural diagram of a scene guidance device according to a third embodiment of the present invention. Referring to fig. 7, the apparatus includes: a scene flow determination module 71, a resource data acquisition module 72, and a display module 73.
The scene flow determining module 71 is configured to determine the scene flow of the current scene when a trigger operation on the current scene is detected; the resource data acquisition module 72 is configured to acquire the resource data of the scene flow; and the display module 73 is configured to sequentially display the scene flow and the resource data of the scene flow, where the scene flow and the resource data corresponding to the scene flow are used to guide the user's operation in the current scene.
On the basis of the technical schemes, the scene flow comprises at least one of an operation flow chart and a flow description language corresponding to the current scene;
the resource data includes: at least one of application tools, picture data, text data, and audio/video data.
On the basis of the above technical solutions, the resource data obtaining module 72 is further configured to analyze scene description keywords included in the scene flow;
if the scene description keyword contains a link address, importing an application tool corresponding to the current scene according to the link address;
and if the scene description keyword contains attribute information of the picture data, the text data or the audio/video data, importing the picture data, the text data or the audio/video data corresponding to the current scene according to the attribute information.
On the basis of the above technical solutions, the apparatus further includes: the device comprises an operation step set acquisition module and a scene flow generation module. The operation step set acquisition module is used for acquiring an operation step set of each scene, wherein the operation step set is based on historical operation steps and/or a preset operation step set recorded under each scene; and the scene flow generation module is used for extracting corresponding steps from the operation step set based on the description information of the current scene for any scene, and combining the extracted steps based on execution logic to generate the scene flow of the current scene.
On the basis of the above technical solutions, the apparatus further includes: an acquisition module and a corresponding relation establishing module; the acquisition module is used for acquiring operation results of historical operation steps or pre-input resource data; and the corresponding relation establishing module is used for establishing a corresponding relation between the operation result or the resource data and each operation step in the scene flow.
On the basis of the above technical solutions, the display module 73 is further configured to determine the flow node corresponding to the current operation;
and to display the node information and resource data corresponding to the flow node corresponding to the current operation, where the node information includes at least one of the operation description language contained in each flow node and the guidance information between flow nodes.
On the basis of the above technical solutions, the apparatus further includes a suspension display module, configured to display in a floating manner the information corresponding to a hover display operation when the hover display operation on the scene flow is detected.
On the basis of the above technical solutions, the apparatus further includes a self-guiding module, configured to receive a self-guiding trigger operation and determine the self-guiding steps according to a self-guiding rule;
and to control the operation device to execute the self-guiding steps in the scene flow in sequence and receive the execution feedback information of the operation device, where the execution feedback information includes the operation result and operation completion information.
The embodiment of the invention provides a scene guidance apparatus. When a trigger operation on the current scene is detected, the scene flow of the current scene is determined and the resource data of the scene flow are acquired; the scene flow corresponds to a plurality of operation steps, and each operation step includes the resource data to be acquired. The scene flow and the resource data of the scene flow are then displayed in sequence. This solves the prior-art problem of low business processing efficiency caused by having to complete the operation flow of a scene according to the operation sequence described in a manual and by relying on rich experience, achieves the purpose of automatically guiding the user's operation in the current scene through interaction between the displayed scene flow and resource data and the user, and improves business processing efficiency.
Example four
Fig. 8 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. FIG. 8 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiment of the present invention.
As shown in FIG. 8, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 8, and commonly referred to as a "hard drive"). Although not shown in FIG. 8, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. The memory 28 may include at least one program product having a set of program modules (e.g., the scene flow determination module 71, the resource data acquisition module 72, and the display module 73 of the scene guidance apparatus) configured to perform the functions of the embodiments of the invention.
A program/utility 44 having a set of program modules 46 (e.g., the scene flow determination module 71, the resource data acquisition module 72, and the display module 73 of the scene guidance apparatus) may be stored, for example, in memory 28. Such program modules 46 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, and each of these examples, or some combination thereof, may comprise an implementation of a network environment. Program modules 46 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., network card, modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by running the programs stored in the system memory 28, for example implementing the scene guidance method provided by an embodiment of the present invention, the method including:
when the trigger operation of the current scene is detected, determining the scene flow of the current scene;
acquiring resource data of a scene flow;
and sequentially displaying the scene flow and the resource data of the scene flow, wherein the scene flow and the resource data corresponding to the scene flow are used for guiding the operation of a user in the current scene.
The processing unit 16 thus executes various functional applications and data processing by running the programs stored in the system memory 28, thereby implementing the scene guidance method provided by the embodiments of the present invention.
Of course, those skilled in the art can understand that the processor can also implement the technical solution of the scene guidance method provided by any embodiment of the present invention.
EXAMPLE five
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a scene guidance method provided in an embodiment of the present invention, where the method includes:
when the trigger operation of the current scene is detected, determining the scene flow of the current scene;
acquiring resource data of a scene flow;
and sequentially displaying the scene flow and the resource data of the scene flow, wherein the scene flow and the resource data corresponding to the scene flow are used for guiding the operation of a user in the current scene.
Of course, the computer program stored on the computer-readable storage medium provided by the embodiments of the present invention is not limited to the above method operations, and may also perform related operations of the scene guidance method provided by any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example code carrying the scene flow, the resource data, and the like. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that, in the embodiment of the scene guide apparatus, the modules included in the embodiment are only divided according to the functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (11)

1. A method of scene guidance, comprising:
when the trigger operation of the current scene is detected, determining the scene flow of the current scene;
acquiring resource data of the scene flow;
and sequentially displaying the scene flow and the resource data of the scene flow, wherein the scene flow and the resource data are used for guiding the operation of a user in the current scene.
2. The method of claim 1, wherein the scene flow comprises at least one of an operation flow chart and a flow description language corresponding to the current scene;
the resource data includes: at least one of application tools, picture data, text data, and audio/video data.
3. The method of claim 2, wherein obtaining resource data of the scene flow comprises:
analyzing scene description keywords contained in the scene flow;
if the scene description keyword contains a link address, importing an application tool corresponding to the current scene according to the link address;
and if the scene description keyword contains attribute information of picture data, text data or audio/video data, importing the picture data, the text data or the audio/video data corresponding to the current scene according to the attribute information.
4. The method of claim 1, further comprising:
acquiring an operation step set of each scene, wherein the operation step set is based on historical operation steps recorded under each scene and/or a preset operation step set;
for any scene, extracting corresponding steps from the operation step set based on the description information of the current scene, and combining the extracted steps based on execution logic to generate a scene flow of the current scene.
5. The method of claim 4, further comprising:
acquiring operation results of historical operation steps or pre-input resource data;
and establishing a corresponding relation between the operation result or the resource data and each operation step in the scene flow.
6. The method of claim 1, wherein displaying the scene flow and the resource data corresponding to the scene flow in sequence comprises:
determining a flow node corresponding to the current operation;
and displaying node information corresponding to the flow node corresponding to the current operation and the resource data, wherein the node information comprises at least one of an operation description language contained in each flow node and guidance information among the flow nodes.
7. The method of claim 6, further comprising:
and when the suspension display operation of the scene flow is detected, performing suspension display on information corresponding to the suspension display operation.
8. The method of claim 1, further comprising:
receiving a self-guiding trigger operation, and determining a self-guiding step according to a self-guiding rule;
and controlling the operation equipment to sequentially execute the self-guiding steps in the scene flow and receive execution feedback information of the operation equipment, wherein the execution feedback information comprises an operation result and operation completion information.
9. A scene guidance apparatus, comprising:
the scene flow determining module is used for determining the scene flow of the current scene when the triggering operation of the current scene is detected;
the resource data acquisition module is used for acquiring the resource data of the scene flow;
and the display module is used for sequentially displaying the scene flow and the resource data of the scene flow, wherein the scene flow and the resource data corresponding to the scene flow are used for guiding the operation of a user in the current scene.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements a scene guidance method according to any one of claims 1 to 8 when executing the computer program.
11. A storage medium containing computer-executable instructions which, when executed by a computer processor, implement the scene guidance method as claimed in any one of claims 1 to 8.
CN201911357650.0A 2019-12-25 2019-12-25 Scene guiding method and device, electronic equipment and storage medium Active CN111143004B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911357650.0A CN111143004B (en) 2019-12-25 2019-12-25 Scene guiding method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911357650.0A CN111143004B (en) 2019-12-25 2019-12-25 Scene guiding method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111143004A true CN111143004A (en) 2020-05-12
CN111143004B CN111143004B (en) 2024-02-27

Family

ID=70520037

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911357650.0A Active CN111143004B (en) 2019-12-25 2019-12-25 Scene guiding method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111143004B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101311939A (en) * 2007-05-23 2008-11-26 徐海军 Operating system self-guiding, safe access control storage technology realization method
CN101847228A (en) * 2010-03-29 2010-09-29 清华大学 Workflow static planning method based on process mode
CN107004044A (en) * 2014-11-18 2017-08-01 皇家飞利浦有限公司 The user guidance system and method for augmented reality equipment, use
CN108230804A (en) * 2017-12-25 2018-06-29 郑玉宣 A kind of virtual reality mine emergency rehearsal and operative skill Training Methodology and system
JP2019136068A (en) * 2018-02-06 2019-08-22 グリー株式会社 Game processing system, game processing method, and game processing program

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112199135A (en) * 2020-09-01 2021-01-08 北京达佳互联信息技术有限公司 Information guiding method, device, electronic equipment and storage medium
CN112435149A (en) * 2020-12-03 2021-03-02 郑州捷安高科股份有限公司 Simulation method, device, equipment and storage medium based on scene guidance prompt
CN112634093A (en) * 2020-12-30 2021-04-09 北京金堤科技有限公司 Method, application and device for acquiring guide problem node graph for generating contract
CN112634093B (en) * 2020-12-30 2023-11-03 北京金堤科技有限公司 Method, application and device for acquiring guide problem node diagram for generating contract
CN112837159A (en) * 2021-02-24 2021-05-25 中国工商银行股份有限公司 Transaction guiding method and device based on scene elements, electronic equipment and medium
CN112837159B (en) * 2021-02-24 2024-04-02 中国工商银行股份有限公司 Transaction guiding method and device based on scene element, electronic equipment and medium
CN113220272A (en) * 2021-04-27 2021-08-06 支付宝(杭州)信息技术有限公司 Method, device and equipment for accessing open capability of service platform
WO2023125998A1 (en) * 2021-12-30 2023-07-06 上海联影医疗科技股份有限公司 Medical equipment operation guidance method and system
CN114625467A (en) * 2022-03-21 2022-06-14 浙江网商银行股份有限公司 Operation guiding method and device

Also Published As

Publication number Publication date
CN111143004B (en) 2024-02-27

Similar Documents

Publication Publication Date Title
CN111143004B (en) Scene guiding method and device, electronic equipment and storage medium
CN110286896B (en) Visual editing method, device, equipment and storage medium
CN111008520B (en) Annotating method and device, terminal equipment and storage medium
CN106303723B (en) Video processing method and device
CN113963770A (en) Report file generation method and device, computer equipment and storage medium thereof
CN110532159A (en) Data decryptor method, apparatus, equipment and computer readable storage medium
CN109726380A (en) Table edit method and device
CN112799656B (en) Script file configuration method, device, equipment and storage medium for automation operation
CN114415871A (en) Graphic code management method and device
CN106878773B (en) Electronic device, video processing method and apparatus, and storage medium
CN113593677A (en) Image processing method, device, equipment and computer readable storage medium
CN111736825B (en) Information display method, device, equipment and storage medium
CN113312036A (en) Large-screen display method, device and equipment of Web page and storage medium
CN111638787A (en) Method and device for displaying information
CN108563485B (en) Input panel display method and device
CN110958243A (en) Network vulnerability submitting method and device, storage medium and electronic equipment
CN112631537B (en) Remote data display method and device, electronic equipment and storage medium
CN104516860A (en) Methods and systems for selecting text within a displayed document
CN109190097B (en) Method and apparatus for outputting information
CN114047863A (en) Page interaction method and device
CN114779994B (en) Selective editing method, apparatus, electronic device, and computer-readable storage medium
CN110703971A (en) Method and device for publishing information
CN111104026A (en) Method and device for recommending service
CN111124387A (en) Modeling system, method, computer device and storage medium for machine learning platform
CN113031891B (en) Screen selection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201807 Shanghai City, north of the city of Jiading District Road No. 2258

Applicant after: Shanghai Lianying Medical Technology Co.,Ltd.

Address before: 201807 Shanghai City, north of the city of Jiading District Road No. 2258

Applicant before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.

GR01 Patent grant