CN114035725B - Teaching method and device of ultrasonic equipment, ultrasonic imaging equipment and storage medium - Google Patents


Info

Publication number
CN114035725B
CN114035725B (application CN202110987077.2A)
Authority
CN
China
Prior art keywords
node
logic
teaching
teaching task
display
Prior art date
Legal status
Active
Application number
CN202110987077.2A
Other languages
Chinese (zh)
Other versions
CN114035725A (en)
Inventor
Zhu Hao (朱皓)
Current Assignee
Wuhan United Imaging Healthcare Co Ltd
Original Assignee
Wuhan United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan United Imaging Healthcare Co Ltd filed Critical Wuhan United Imaging Healthcare Co Ltd
Priority to CN202110987077.2A priority Critical patent/CN114035725B/en
Publication of CN114035725A publication Critical patent/CN114035725A/en
Priority to PCT/CN2022/112781 priority patent/WO2023024974A1/en
Application granted granted Critical
Publication of CN114035725B publication Critical patent/CN114035725B/en
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/02Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The application relates to a teaching method and apparatus for an ultrasound device, an ultrasound imaging device, and a storage medium. The method comprises the following steps: displaying a teaching task list in response to a use instruction triggered on a display interface, the list comprising a plurality of candidate teaching tasks, each created in advance via an editing interface; acquiring a selection instruction triggered by a user via the teaching task list; and determining a target teaching task from the candidate teaching tasks according to the selection instruction and running it. The method helps the user become familiar with the ultrasound device and ensures that the user practices its functions.

Description

Teaching method and device of ultrasonic equipment, ultrasonic imaging equipment and storage medium
Technical Field
The present application relates to the technical field of ultrasonic devices, and in particular, to a teaching method and apparatus for an ultrasonic device, an ultrasonic imaging device, and a storage medium.
Background
With the development of ultrasound imaging technology, novices who cannot yet use an ultrasound imaging apparatus proficiently must first learn how to operate it.
In conventional approaches, teaching for ultrasound imaging equipment relies mainly on methods such as reading the product instruction manual, studying teaching groups or preset parameters built into the device, or automatic image-feature recognition with prompts.
However, these conventional teaching methods suffer from low teaching efficiency.
Disclosure of Invention
In view of the above technical problems, it is necessary to provide a teaching method and apparatus for an ultrasound device, an ultrasound imaging device, and a storage medium capable of improving teaching efficiency.
A method of teaching an ultrasound device, the method comprising:
displaying a teaching task list in response to a use instruction triggered on a display interface, the teaching task list comprising a plurality of candidate teaching tasks, each created in advance via an editing interface;
acquiring a selection instruction triggered by the user via the teaching task list; and
determining a target teaching task from the candidate teaching tasks according to the selection instruction, and running the target teaching task.
In one embodiment, the editing interface includes a creation area, and creating each candidate teaching task in the creation area includes:
acquiring a creation instruction via the creation area, and displaying a node menu in response to the creation instruction;
acquiring a node selection instruction via the node menu, the node selection instruction comprising a node identifier of each selected course node;
configuring the target course node corresponding to each node identifier in the creation area to obtain configuration information of each target course node; and
creating the candidate teaching task according to each target course node and its configuration information.
In one embodiment, configuring the target course node corresponding to each node identifier in the creation area to obtain configuration information of each target course node includes:
displaying a configuration frame for each target course node according to the node selection instruction; and
obtaining the configuration information of each target course node via its configuration frame.
In one embodiment, the node menus include a base node menu and an ultrasound application node menu, and acquiring the creation instruction via the creation area and displaying the node menu in response includes:
acquiring the creation instruction via the creation area, and displaying the base node menu and the ultrasound application node menu in response to the creation instruction.
In one embodiment, the node selection instruction includes a first node selection instruction and a second node selection instruction, and acquiring the node selection instruction via the node menu includes:
acquiring the first node selection instruction via the base node menu, the first node selection instruction comprising a base node identifier of the selected course node; and
acquiring the second node selection instruction via the ultrasound application node menu, the second node selection instruction comprising an ultrasound application node identifier of the selected course node.
In one embodiment, the configuration information includes: type information, logic information, setting information, position information, logic pin connection information, and data pin connection information of each target course node, as well as script attribute information and configuration information of each target course script.
In one embodiment, the editing interface further includes an information prompt area, and the method further comprises:
acquiring a setting instruction via the editing interface and/or the information prompt area; and
setting node data and node attributes of each target course node in response to the setting instruction.
In one embodiment, the editing interface further comprises an editing region, and the method further comprises:
acquiring an editing instruction via the editing region, the editing instruction indicating a node to be edited;
displaying an editing interface for the node to be edited in the editing region in response to the editing instruction; and
acquiring editing information via that editing interface, and performing at least one of a verification, storage, setting, or search operation on the node to be edited according to the editing information.
A teaching apparatus for an ultrasound device, the apparatus comprising:
a first display module, configured to display a teaching task list in response to a use instruction triggered on a display interface, the teaching task list comprising a plurality of candidate teaching tasks, each created in advance via an editing interface;
a first acquisition module, configured to acquire a selection instruction triggered by the user via the teaching task list; and
an operation module, configured to determine a target teaching task from the candidate teaching tasks according to the selection instruction, and to run the target teaching task.
An ultrasound imaging device, comprising a memory storing a computer program and a processor which, when executing the computer program, performs the following steps:
displaying a teaching task list in response to a use instruction triggered on a display interface, the teaching task list comprising a plurality of candidate teaching tasks, each created in advance via an editing interface;
acquiring a selection instruction triggered by the user via the teaching task list; and
determining a target teaching task from the candidate teaching tasks according to the selection instruction, and running the target teaching task.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the following steps:
displaying a teaching task list in response to a use instruction triggered on a display interface, the teaching task list comprising a plurality of candidate teaching tasks, each created in advance via an editing interface;
acquiring a selection instruction triggered by the user via the teaching task list; and
determining a target teaching task from the candidate teaching tasks according to the selection instruction, and running the target teaching task.
With the above teaching method and apparatus of the ultrasound device, computer device, and storage medium, a teaching task list can be displayed in response to a use instruction triggered on the display interface. The list comprises a plurality of candidate teaching tasks, each created in advance via the editing interface, so the candidates can be browsed and selected visually: according to the user-triggered instruction, a target teaching task is intuitively chosen from the candidates in the list. Because the selected target teaching task was created via the visual interface, it can then be run, helping the user become familiar with the ultrasound device and ensuring that the user practices its functions.
Drawings
FIG. 1 is an application environment diagram of a teaching method of an ultrasound device in one embodiment;
FIG. 2 is a flow chart of a method of teaching an ultrasound device in one embodiment;
FIG. 3 is a flow chart of a teaching method of an ultrasound apparatus in another embodiment;
FIG. 3a is a schematic diagram of an editing interface of an ultrasound device in one embodiment;
FIG. 3b is a schematic diagram of features of a base node in one embodiment;
FIG. 3c is a schematic representation of features of an ultrasound application node in one embodiment;
FIG. 4 is a flow chart of a teaching method of an ultrasound apparatus in another embodiment;
FIG. 5 is a flow chart of a teaching method of an ultrasound apparatus in another embodiment;
FIG. 6 is a flow chart of a teaching method of an ultrasound apparatus in another embodiment;
FIG. 7 is a schematic diagram of an editing interface of an ultrasound device in one embodiment;
FIG. 8 is a block diagram of a teaching apparatus of an ultrasound device in one embodiment;
Reference numerals:
1: a first region; 2: a second region;
3: a third region; 4: node creation menu.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present application more apparent, the application is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The teaching method of the ultrasound device provided by the application can be applied to the ultrasound imaging device shown in FIG. 1. The ultrasound imaging device comprises an ultrasound host, a camera device, a display, an ultrasound probe, a driver, and a mechanical arm; the ultrasound host is connected to each of the camera device, the display, the ultrasound probe, the driver, and the mechanical arm, and the camera device is mounted on the mechanical arm. The ultrasound host comprises a processor, an image processor, a probe connection port, and a controller connected through a system bus, and the processor can execute the steps of the method embodiments below. The processor of the ultrasound host provides computing and control capabilities, and the probe connection port communicates with the ultrasound probe through a network connection. Optionally, the ultrasound host may be a server, a personal computer, a personal digital assistant, or another terminal device such as a tablet computer or mobile phone, or a cloud or remote server; the embodiments of the application do not limit the specific form of the ultrasound host.
An ultrasound imaging device generally consists of ultrasound signal transmitting and receiving components, ultrasound imaging and display components, and ultrasound signal processing components. Existing ultrasound imaging and display components may also serve as user input devices, such as touch screens. Because the device combines user input with image output and display, it is complex and difficult to master quickly. At present, usage teaching for ultrasound imaging devices relies mainly on the product instruction manual, teaching groups built into the ultrasound imaging system, or teaching methods that automatically recognize image features and prompt the user. These methods have the following drawbacks: 1) the product manual is largely composed of descriptive text, is not easy to learn from quickly, and makes searching for knowledge points time-consuming, so learning is inefficient; 2) the teaching groups and preset parameters built into the system may not cover what the current user needs to learn, or may be heavily redundant with content already mastered; and while the preset parameters may improve the image, the user does not learn the principles and logic behind those settings and so cannot thoroughly grasp how the ultrasound imaging device works; 3) teaching methods that prompt by automatically recognizing image features do not support comprehensive learning, but only help the user memorize particular image features, without covering the whole ultrasound system. Accordingly, it is necessary to provide a teaching method and apparatus, computer device, and storage medium capable of improving teaching efficiency.
In one embodiment, as shown in FIG. 2, a teaching method of an ultrasound device is provided. Taking its application to the ultrasound imaging device in FIG. 1 as an example, the method includes the following steps:
s201, responding to a use instruction triggered based on a display interface, and displaying a teaching task list; the teaching task list comprises a plurality of candidate teaching tasks; each candidate teaching task is created in advance based on the editing interface.
The teaching task list may be arranged vertically or horizontally and displays a plurality of candidate teaching tasks related to operating the ultrasound imaging device, for example imaging tasks and adjustment tasks; each candidate task is created in advance via an editing interface. Optionally, each candidate teaching task may include a plurality of course nodes for executing the task and the logic relationships among those nodes. The logic relationships reflect the execution logic and execution content of the corresponding candidate teaching task, so the task can be executed according to that logic and content.
Specifically, when a user needs the teaching function of the ultrasound imaging device, the user triggers a use instruction on the display interface; the device responds by retrieving the teaching task list and displaying it. Optionally, on a touch-screen device, the user can trigger the teaching instruction with a long press, and the teaching task list pops up at the pressed position for display; on a non-contact device, the user can trigger it through gestures, voice, eye movement, or other input modalities. The teaching instruction can also be triggered with a mouse, keyboard, trackball, and so on.
S202, acquiring a selection instruction triggered by a user based on the teaching task list.
Specifically, after the ultrasound imaging device displays the teaching task list, the user selects the teaching task to be executed from the list; selecting a task in the list generates a selection instruction, which the device then acquires.
S203, determining a target teaching task from the candidate teaching tasks according to the selection instruction, and operating the target teaching task.
Specifically, according to the user-triggered selection instruction, the ultrasound imaging device determines the candidate teaching task selected in the teaching task list as the target teaching task, and runs it by executing its course nodes according to the logic relationships among them. Optionally, the determined target may be a single candidate teaching task from the list or several candidate teaching tasks.
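The S201–S203 flow above (display the list, acquire the selection instruction, run the target task by following the logic relationships among its course nodes) can be illustrated with a minimal, hypothetical sketch; all class and function names are invented for illustration and are not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class CourseNode:
    """A single course node of a teaching task."""
    name: str
    next_nodes: list = field(default_factory=list)  # logic relationships to later nodes

@dataclass
class TeachingTask:
    """A candidate teaching task: course nodes linked by logic relationships."""
    title: str
    entry: CourseNode

def show_task_list(tasks):
    """S201: display the teaching task list on the interface."""
    return [t.title for t in tasks]

def run_target_task(tasks, selection_index):
    """S202/S203: resolve the selection instruction and run the target task
    by following the logic relationships from the entry node."""
    target = tasks[selection_index]
    visited, node = [], target.entry
    while node is not None:
        visited.append(node.name)
        # follow the first logic link (linear task for simplicity)
        node = node.next_nodes[0] if node.next_nodes else None
    return visited

# Example: an imaging task with three course nodes
end = CourseNode("end node")
show = CourseNode("display node", [end])
task = TeachingTask("imaging task", CourseNode("event receiving node", [show]))
print(show_task_list([task]))      # ['imaging task']
print(run_target_task([task], 0))  # ['event receiving node', 'display node', 'end node']
```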
In this teaching method of the ultrasound device, the teaching task list is displayed in response to a use instruction triggered on the display interface. Because the list comprises a plurality of candidate teaching tasks, each created in advance via the editing interface, the candidates can be browsed and selected visually: according to the user-triggered instruction, a target teaching task is intuitively chosen from the list. The selected task, having been created via the visual interface, can then be run, helping the user become familiar with the ultrasound device and ensuring that the user practices its functions.
In the scenario above, the teaching task list displayed by the ultrasound imaging device in response to the user's use instruction contains a plurality of candidate teaching tasks created in advance. In one embodiment, the editing interface includes a creation area, and, as shown in FIG. 3, the process of creating each candidate teaching task in the creation area includes:
S301, acquiring a creation instruction based on the creation area, and displaying a node menu in response to the creation instruction.
Illustratively, the editing interface may be as shown in FIG. 3a, with the creation area corresponding to the second region in FIG. 3a. Specifically, the ultrasound imaging device acquires a creation instruction triggered by the user in the creation area and displays a node menu in response; the displayed node menu may be as indicated by reference numeral 4 in FIG. 3a. Optionally, the displayed node menus may include a base node menu and an ultrasound application node menu, and the device may display both in response to the creation instruction.
Optionally, the base node menu may include, but is not limited to, the following node types: ① Event receiving node: when an event provided by the ultrasound system or a user-defined event is received (for example, the user clicks an interface button, opens an interface, or completes an input), the nodes connected after the event node are triggered and executed. ② Event issuing node: defines a custom event and sends it out together with the node's input data. ③ Mathematical variable node: covers Boolean, numeric, string, vector, and object types. These types are also the data input and output types of other node kinds, and a variable node can serve on its own as an input to other nodes. ④ Logical compute node: performs logical computation on data, such as addition, subtraction, multiplication, and division of numeric and vector types, true/false evaluation of Booleans, magnitude and equality comparison of numbers, and equality comparison of strings. ⑤ Function execution node: executes a selected function of the ultrasound device, such as mode switching, freezing, or measurement package selection. ⑥ Display node: shows an interface on the screen to prompt the user how to operate, using text, pictures, or video. ⑦ Logic waiting node: execution continues only after the waiting node's logic input arrives. ⑧ End node: finishes the teaching task. Illustratively, the node controls of the base node menu may appear as in FIG. 3b: each node is a rectangular outline, possibly with different border or background colors per node type, divided into an upper region showing the node name (which may be user-defined) and a lower region showing the node data, i.e., the node's logic and data input/output pins, with the data object names displayed beside the pins. The pin areas carry distinguishing markers: in this example, logic connection pins are marked with triangles and data connection pins with dots. The left side of the lower region carries the logic and data input pins, and the right side the logic and data output pins. Each node has at least one of a logic input, logic output, data input, or data output; for example, an event node has no logic or data input pins, and an end node has no logic or data output pins.
Optionally, the ultrasound application node menu may include, but is not limited to, the following node types: ① Mode switching node: triggers a mode switch and, after the corresponding mode parameters are entered, changes the parameter configuration of the ultrasound machine. ② Measurement package switching node: switches the measurement package so that the user operates under the specified package. ③ Probe simulation data node: selects raw ultrasound image data saved on the machine as simulation data for the teaching task; after this node executes, the ultrasound device takes the simulation data, rather than real data from the connected probe, as input. ④ Image parameter setting node: changes the parameter data of the ultrasound machine by setting the parameter data under the node. ⑤ Function activation node: activates a function of the ultrasound device, such as freezing, saving an image, viewing a patient, or viewing a replay. Illustratively, the node controls of the ultrasound application node menu may appear as in FIG. 3c, with the same appearance conventions as the base nodes: a rectangular outline with type-dependent border or background colors; an upper region with the (possibly user-defined) node name; a lower region with the node data representing the logic and data input/output pins, data object names displayed beside the pins; triangle markers on logic connection pins and dots on data connection pins; inputs on the left and outputs on the right; and at least one of a logic input, logic output, data input, or data output per node (for example, an event node has no logic or data input pins, and an end node has no logic or data output pins).
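The pin conventions just described (each node exposes at least one of a logic input, logic output, data input, or data output; an event node lacks input pins, an end node lacks output pins) can be modeled in a small, hypothetical sketch; the `Node` class and the pin names are assumptions, not from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """Node control: a name area plus logic/data input and output pins."""
    name: str
    logic_in: bool = True
    logic_out: bool = True
    data_in: list = field(default_factory=list)   # data input pin names
    data_out: list = field(default_factory=list)  # data output pin names

# Base nodes: an event receiving node has no logic/data input pins,
# an end node has no logic/data output pins.
event_node = Node("event receiving node", logic_in=False)
end_node = Node("end node", logic_out=False)

# Ultrasound application node: a mode switching node that carries the
# target mode as a data input (an illustrative pin name).
mode_switch = Node("mode switching node", data_in=["target mode"])

def pin_summary(node):
    """Report which of the four pin kinds a node exposes."""
    return {
        "logic input": node.logic_in,
        "logic output": node.logic_out,
        "data input": bool(node.data_in),
        "data output": bool(node.data_out),
    }

print(pin_summary(event_node)["logic input"])  # False
print(pin_summary(end_node)["logic output"])   # False
```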
S302, acquiring a node selection instruction based on a node menu; the node selection instructions include node identifications of the selected course nodes.
Specifically, after the ultrasound imaging device displays the node menu in response to the user's creation instruction, the user selects nodes from it, and the device acquires a node selection instruction that includes the node identifier of each selected course node. Optionally, the node identifier may be the node's name or its node ID, where the name corresponds to the name shown in the node menu; for example, if the selected course node is an event node, its node name is "event node". Optionally, given that the node menu includes a base node menu and an ultrasound application node menu, the node selection instruction may comprise a first node selection instruction, acquired via the base node menu and carrying a base node identifier of the selected course node, and a second node selection instruction, acquired via the ultrasound application node menu and carrying an ultrasound application node identifier of the selected course node.
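As a purely illustrative sketch (the dictionary shape and field names are assumptions, not the patent's format), the first and second node selection instructions described above, each carrying the identifier of the selected course node, might be modeled as:

```python
# Hypothetical node selection instructions: the first comes from the base
# node menu, the second from the ultrasound application node menu; each
# carries the identifier of the selected course node.
first_selection = {"menu": "base", "node_id": "event receiving node"}
second_selection = {"menu": "ultrasound", "node_id": "mode switching node"}

def selected_node_ids(instructions):
    """Extract the node identifiers from acquired node selection instructions."""
    return [i["node_id"] for i in instructions]

print(selected_node_ids([first_selection, second_selection]))
# ['event receiving node', 'mode switching node']
```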
S303, configuring the target course nodes corresponding to the node identifications based on the creation areas to obtain configuration information of each target course node.
Each selected course node's identifier corresponds to one target course node, and the ultrasound imaging device can configure the target course node corresponding to each node identifier in the creation area to obtain its configuration information. Optionally, the configuration information for each target course node includes: type information, logic information, setting information, position information, logic pin connection information, and data pin connection information of the node, plus script attribute information and configuration information of each target course script. The configuration information thus covers both node attributes and node data: the logic information, logic pin connection information, and data pin connection information can be treated as node data, while the type information, setting information, position information, script attribute information, and script configuration information can be treated as node attributes. In other words, the candidate teaching task is created by selecting target course nodes via their node identifiers (the names of nodes such as event and variable nodes in the node menu, or the IDs and strings corresponding to them) and then configuring each target course node's attributes and data.
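A hypothetical record grouping the configuration fields enumerated above can make the structure concrete; the key names are invented for illustration and do not come from the patent:

```python
def make_node_config(node_id):
    """Build an empty per-node configuration record mirroring the fields
    listed above: node attributes plus node data."""
    return {
        "id": node_id,
        "type": None,              # type information (node attribute)
        "logic": None,             # internal logic information (node data)
        "settings": {},            # setting information (node attribute)
        "position": (0, 0),        # position in the creation area (node attribute)
        "logic_pins": [],          # logic pin connection information (node data)
        "data_pins": [],           # data pin connection information (node data)
        "script_attributes": {},   # script attribute information (node attribute)
        "script_config": {},       # script configuration information (node attribute)
    }

cfg = make_node_config("mode_switch_01")
print(cfg["id"])  # mode_switch_01
```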
Optionally, when configuring the target course node corresponding to the node identifier of a selected course node, the ultrasonic imaging device processes each node in turn along the logic pin connection lines, starting from the node corresponding to the first node identifier, and first analyzes the internal logic and setting information of each node. It then checks the input data pin connection information of each node; if a pin is connected, it searches backwards along the input connection line for the output data pin connection information of the preceding nodes until all preceding data meet the requirements, thereby obtaining the configuration information of the target course node corresponding to each node identifier. Optionally, when an existing teaching task is opened in the teaching task editor of the ultrasonic imaging device, the device decodes the teaching task script into a graphical interface: the processor module of the ultrasonic imaging device obtains the nodes in turn according to the node logic pin connection information in the teaching task script, and the rendering module renders each node type to its corresponding position according to the type information and position information of the node, connects the nodes with rendered line segments according to the internal logic and setting information of the nodes, and renders the names, input and output logic pins, and data pins of the nodes in the editor.
S304, creating candidate teaching tasks according to the target course nodes and the configuration information of the target course nodes.
Specifically, the candidate teaching task includes a plurality of course nodes and logic relations among the course nodes, configuration information of each target course node can be obtained by configuring the target course node corresponding to the node identifier of the course node selected by the user, and then the logic relations among the target course nodes can be obtained, so that the candidate teaching task can be created according to the configuration information of each target course node and each target course node. Alternatively, the ultrasound imaging device may use each target course node as a course node included in the candidate teaching task, and use configuration information of each target course node as a logical relationship between course nodes included in the candidate teaching task, so as to obtain the candidate teaching task.
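The assembly described in S304, collecting each configured target course node and deriving the logical relations between them from the logic pin connection information, can be sketched as follows. The CourseNode and TeachingTask classes and all field names are assumptions for illustration; the patent does not fix a concrete schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a candidate teaching task assembled from
# configured target course nodes (S304). Field names are assumptions.

@dataclass
class CourseNode:
    node_id: str     # node identification from the node menu
    attributes: dict # node attributes: type, settings, position, ...
    data: dict       # node data: logic info, logic/data pin connections

@dataclass
class TeachingTask:
    name: str
    nodes: dict = field(default_factory=dict)  # node_id -> CourseNode

    def add_node(self, node: CourseNode) -> None:
        self.nodes[node.node_id] = node

    def logic_edges(self):
        # The logical relations between course nodes are carried by the
        # logic pin connection information inside each node's configuration.
        for node in self.nodes.values():
            for target in node.data.get("logic_pins", {}).values():
                yield (node.node_id, target)

task = TeachingTask("pw_mode_tutorial")
task.add_node(CourseNode("event_1", {"type": "ClickEvent"},
                         {"logic_pins": {"out": "display_1"}}))
task.add_node(CourseNode("display_1", {"type": "DisplayVideo"},
                         {"logic_pins": {"out": "end_1"}}))
```

In this reading, the course nodes themselves come from the node selection instruction, while the logical relationship between them is recovered purely from the per-node configuration information.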
In this embodiment, the creation instruction can be acquired based on the creation area included in the editing interface, and the node menu can be displayed in response to the creation instruction. A user can thus select course nodes in a self-defined manner through the intuitively displayed node menu, allowing the ultrasonic imaging device to acquire a node selection instruction including the node identifiers of the selected course nodes based on the node menu, and then accurately configure the target course nodes corresponding to those node identifiers based on the creation area to obtain the configuration information of each target course node. Candidate teaching tasks can therefore be quickly created from the selected target course nodes and their configuration information, improving the efficiency of creating candidate teaching tasks.
In the scenario where the target course node corresponding to the node identifier of the selected course node is configured based on the creation area to obtain the configuration information of each target course node, in one embodiment, as shown in fig. 4, the step S303 includes:
S401, displaying a configuration frame of each target course node according to the node selection instruction.
Each node identification of a selected course node has a configuration frame corresponding to its target course node, and the configuration frame of each target course node contains the configuration information of that node, such as the node's type information, logic information, and the like. Specifically, when configuring the target course node corresponding to a selected node identifier, the ultrasonic imaging device may display the configuration frame of each target course node according to the node selection instruction and obtain the configuration information of each target course node.
S402, based on the configuration frames of the target course nodes, the configuration information of the target course nodes is obtained.
The configuration frame of each target course node comprises the configuration information of that node, and the ultrasonic imaging device can obtain the configuration information of each target course node based on its configuration frame. Alternatively, the ultrasound imaging device may determine the configuration information selected by the user in the configuration frame of each target course node as the configuration information of that node.
In this embodiment, the ultrasound imaging device can display the configuration frame of each target course node according to the node selection instruction, so that the user can intuitively see the configuration information of each target course node through its configuration frame. The ultrasound imaging device can then accurately obtain the configuration information selected by the user based on the configuration frame of each target course node, improving the accuracy of obtaining the configuration information of each target course node.
In some scenarios, the ultrasound imaging device may further set node data and node attributes of the target course node, where, based on the above embodiment, in one embodiment, the editing interface further includes an information prompt area, as shown in fig. 5, the method further includes:
S501, acquiring a setting instruction based on an editing interface and/or an information prompt area.
Specifically, the ultrasound imaging apparatus may acquire the setting instruction based on the information prompt area included in the editing interface, or based on the editing interface itself. Alternatively, the information prompt area may be the third area shown in fig. 3a; that is, the ultrasound imaging apparatus may acquire the setting instruction from the third area shown in fig. 3a, or from the entire editing interface shown in fig. 3a.
S502, setting node data and node attributes of each target course node in response to the setting instruction.
Specifically, the ultrasound imaging apparatus sets the node data and node attributes of each target course node in response to the setting instruction obtained above. Alternatively, the node data of each target course node may be input data or output data of the node; for example, the input data of a node may be "PW Mode". Alternatively, the node attribute of each target course node may be the type of the node, for example ClickEvent, etc.
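As an illustrative sketch, the effect of a setting instruction on a configured target course node might look as follows. The dictionary layout and the apply_setting helper are assumptions; the "PW Mode" input data and ClickEvent node type follow the examples given above.

```python
# Hypothetical sketch of S502: applying a setting instruction that carries
# node attributes and node data to a target course node's configuration.

def apply_setting(node: dict, instruction: dict) -> dict:
    """Merge the node attributes and node data carried by a setting
    instruction into an existing node configuration."""
    node.setdefault("attributes", {}).update(instruction.get("attributes", {}))
    node.setdefault("data", {}).update(instruction.get("data", {}))
    return node

# A ClickEvent-type node whose input data is set to the "PW Mode" string.
node = {"attributes": {"type": "ClickEvent"}, "data": {}}
apply_setting(node, {"data": {"input": "PW Mode"}})
```

The split between "attributes" and "data" mirrors the distinction drawn earlier between node attributes (type, settings, position) and node data (pin values and connections).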
It should be noted that, on the basis of the above embodiment, the working state of the editor of the ultrasound imaging device may be displayed in the editing interface or the information prompt area, for example, the current working state of the editor of the ultrasound imaging device may be displayed as a verification course script state, or the current working state of the editor of the ultrasound imaging device may be displayed as a saved course script state.
In this embodiment, the ultrasound imaging apparatus can acquire the setting instruction based on the editing interface and/or the information prompt area and set the node data and node attributes of each target course node in response to it, so that the node data and node attributes of each target course node can be set accurately and the configured target course nodes can adapt to a wider range of application scenarios.
In some scenarios, the ultrasound imaging device may further edit the node that needs to be edited, where, based on the above embodiment, in one embodiment, the editing interface further includes an editing area, as shown in fig. 6, and the method further includes:
S601, acquiring an editing instruction based on an editing area; the edit instruction indicates a node to be edited.
The user can edit the nodes to be edited in an editing area included in the editing interface of the ultrasonic imaging device, and the editing instruction indicates the node to be edited. Illustratively, the editing area may be the first area shown in FIG. 3a. Specifically, in this embodiment, the ultrasound imaging device may acquire an editing instruction triggered by the user in the editing area of the editing interface and determine the node to be edited according to that instruction.
S602, displaying an editing interface of the node to be edited on the editing area in response to the editing instruction.
Specifically, in response to the editing instruction, the ultrasound imaging apparatus displays the editing interface (the first area) of the node to be edited in the editing area when the editing instruction is received. Optionally, this editing interface includes editing information; that is, the ultrasound imaging device may display the editing information while displaying the editing interface of the node to be edited.
S603, acquiring editing information based on the editing interface, and performing at least one of verification operation, storage operation, setting operation and searching operation on the node to be edited according to the editing information.
Specifically, the displayed editing interface includes editing information, and the ultrasonic imaging device may acquire the editing information based on the editing interface, and edit the node to be edited according to the acquired editing information. Optionally, the editing of the node to be edited by the ultrasonic imaging apparatus according to the acquired editing information may include performing at least one of a verification operation, a save operation, a setting operation, and a search operation on the node to be edited.
In this embodiment, the ultrasound imaging apparatus can acquire an edit instruction indicating a node to be edited based on the edit area, and display an edit interface of the node to be edited on the edit area by responding to the edit instruction, so that edit information can be acquired based on the edit interface of the node to be edited, and the node to be edited is edited according to the edit information, so that the node to be edited can be edited and adjusted in real time, and the edited node can adapt to a wider application scenario.
It can be appreciated that, based on the teaching method of the ultrasonic device, an ultrasonic device may be provided that includes a processor module, a storage module, an input module, an output module, a rendering module, and a network communication module. The processor module is responsible for the interactive logic processing of the graphical interface, compiling the graphical interface into a teaching task script, decoding a teaching task script into graphical interface elements, and executing the teaching task script so that the ultrasonic machine operates according to the specified steps and data. The storage module is responsible for storing local teaching task scripts. The input module is responsible for collecting the user's operations on the teaching task editor and collecting the operations of the teaching task user. The output module is responsible for displaying the user's operations on the teaching task editor and displaying information on executing teaching tasks. The rendering module is responsible for running the algorithms and programs related to visual rendering technology and rendering the interface elements decoded and output by the processor module to the output module for display. The network communication module is responsible for communication with the server and for synchronously uploading or downloading the teaching task related data in the account.
The teaching method of the ultrasonic device provided by the application is described below by a specific embodiment, and the example is described by taking an ultrasonic imaging device with a full touch screen as an example:
S1, after a user logs in to the ultrasonic imaging device, the device automatically requests the user's teaching task data list from the server. The locally available teaching tasks are updated after the list data is obtained. The list data contains teaching tasks created by the user's own account as well as tasks from accounts set as shared within the rights domain. Already activated teaching tasks and inactive teaching tasks can both be seen in the list.
S2, the user checks whether a teaching task meeting the current requirements exists. If so, the user selects and activates the teaching task, which will be executed when its conditions are met. If not, the user clicks the new teaching task or edit teaching task button on the interface to enter the creation and editing interface for teaching tasks.
S3, the user edits and modifies an existing teaching task selected from the previous list, or creates and names a new teaching task; after selecting either operation, the user enters the visual editing interface of the teaching task;
S4, as shown in FIG. 7, a new node is created in the second area of the visual editing interface by long-pressing the screen on the canvas, or by clicking an output pin of an existing node and dragging out a line.
S5, click the node and set its node name. Specifically, if the node name is selected as a click event, the event is triggered when the user clicks a control on the screen, and the node outputs the name of the event sent by the control.
S6, click the EventName pin of the event node, long-press and drag to lead a line segment out to a suitable position, and end the touch; a new-node menu pops up, and a logic node is selected. A logic node is thus newly created and connected to the EventName pin of the previous node, and the input data of the logic node is automatically typed as EventName. Long-press a blank area to pop up the new-node menu again, create a variable node, click the variable node to set its input data value to "PW Mode", and connect the output pin of this node to the second input pin of the logic node. When the logic node judges that the EventName character string is "PW Mode", its Result pin outputs true; otherwise it outputs false.
S7, using a method similar to S4-S6, create a display node, connect the logic output pin of the event node to the logic input pin of the display node, set the attribute of the display node to play a video, set the condition for playing the video to a bool-type variable, and set the content of the video to play, the display position of the video control, and so on. Connect the Result output pin of the previous logic node to the Condition input pin of the video display node. Finally, connect an end node to end the teaching task. This teaching task is: when the user triggers a click on the PW Mode button, a teaching video is played, and the teaching task ends.
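The graph wired up in S4 to S7 can be written down as a plain adjacency structure. The encoding below and the reverse data-pin lookup are assumptions for illustration; the node and pin names (EventName, string, Result, Condition) follow the text, and the video file name is hypothetical.

```python
# Hypothetical adjacency-dict encoding of the node graph built in S4-S7.
graph = {
    "ClickEvent": {
        "data_out": {"EventName": ("isEqual", "EventName")},  # S6
        "logic_out": "DisplayVideo",                          # S7
    },
    "PWModeVariable": {
        "value": "PW Mode",                                   # S6
        "data_out": {"out": ("isEqual", "string")},
    },
    "isEqual": {
        "inputs": ["EventName", "string"],
        "data_out": {"Result": ("DisplayVideo", "Condition")},  # S7
    },
    "DisplayVideo": {
        "inputs": ["Condition"],
        "settings": {"action": "play video",
                     "content": "pw_tutorial.mp4"},  # hypothetical file name
        "logic_out": "EndTask",                      # S7: end node
    },
    "EndTask": {},  # ends the teaching task
}

def data_sources(graph, node, pin):
    """Reverse lookup: which node/output pin feeds a given input pin.
    This mirrors the backward search along input connection lines
    described for node configuration."""
    return [(src, out) for src, cfg in graph.items()
            for out, (tgt, tgt_pin) in cfg.get("data_out", {}).items()
            if tgt == node and tgt_pin == pin]
```

A reverse lookup like data_sources is what lets an editor or compiler verify that every input pin of every node is fed before the task is saved.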
S8, after editing the complete teaching task, click save and upload. The system compiles the teaching task; if the compilation passes, uploading starts automatically, and if it does not pass, the system prompts which part of the teaching task has an error. After compilation is complete, the system uploads the teaching task script to the cloud server under the user's account. If the task is set as shared within the rights domain, other accounts in the domain can view and use the teaching task after it is uploaded.
S9, returning to the interface for selecting the available teaching task list, and activating the selected teaching task.
Based on steps S1 to S9, after the teaching task is created, the created teaching task needs to be executed. How the ultrasound imaging device executes the script is described in detail below, and may include the following steps:
S1, after the user logs in to the system, the installed teaching task script, i.e., the script of the teaching task created in the above embodiment, is downloaded.
S2, when the user clicks the PW mode switching button, the system sends a click event; at this point the receiving event node of the teaching task script is triggered, the ClickEvent node is pushed onto the stack, and the output pin EventName of the node is set by the system event to "PW Mode".
S3, the ClickEvent node is fetched from the stack; the node has no input data, and its node logic is executed (in fact, the event receiving node has no additional node logic).
S4, the node has output: the DisplayVideo node connected to its logic output is put on the stack, and the isEqual node connected to its data output pin is put on the stack.
S5, there are two unfetched nodes on the stack; the isEqual node at the top of the stack is fetched first.
S6, the isEqual node has two input pins, of which EventName has already been assigned by the output of the ClickEvent node, so the node connected to its string input pin, a variable node, is put on the stack.
S7, the variable node at the top of the stack is fetched; the node has no input data and no additional node logic, and outputs a character string with the value "PW Mode", which is assigned to the string input pin of the isEqual node; the isEqual node is then put back on the stack.
S8, the isEqual node at the top of the stack is fetched.
S9, since both input pins of the node have been assigned, the node logic is executed to judge whether the values of the two input pins are equal. Here they are equal, so the Result output pin value is true. The DisplayVideo node connected to this pin was already put on the stack earlier, so it is not pushed again.
S10, the DisplayVideo node at the top of the stack is fetched; its data input pin Condition has a value, the value is true, and the node logic is executed.
S11, the DisplayVideo node logic is: when the value of the input pin Condition is true, the video configured in the node is triggered to play at the corresponding position on the ultrasound device screen.
S12, the EndTask node connected to the logic output of the DisplayVideo node is put on the stack. This node is the termination node of the task, and there are no other nodes in the stack, so the teaching task is completed.
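The stack-driven evaluation walked through in S1 to S12 can be sketched as a small runnable interpreter. The Node class and the scheduling details below are assumptions; the node names and pin names mirror the text.

```python
# Hypothetical sketch of the stack-based node evaluation in S1-S12.
class Node:
    def __init__(self, name, logic=None):
        self.name = name
        self.logic = logic or (lambda inputs: {})  # inputs -> outputs
        self.inputs = {}    # assigned data input pin values
        self.needs = {}     # input pin -> source Node feeding it
        self.data_out = []  # (output pin, target Node, target pin)
        self.logic_out = [] # logic-pin successors

played = []  # records video playback triggered by DisplayVideo

click = Node("ClickEvent", lambda i: {"EventName": i["__event"]})
is_equal = Node("isEqual", lambda i: {"Result": i["EventName"] == i["string"]})
variable = Node("PWModeVariable", lambda i: {"out": "PW Mode"})
display = Node("DisplayVideo",
               lambda i: played.append("video") if i["Condition"] else None)
end = Node("EndTask")

click.data_out = [("EventName", is_equal, "EventName")]    # S2/S4
click.logic_out = [display]                                # S4
is_equal.needs = {"EventName": click, "string": variable}  # S6
variable.data_out = [("out", is_equal, "string")]          # S7
is_equal.data_out = [("Result", display, "Condition")]     # S9
display.needs = {"Condition": is_equal}
display.logic_out = [end]                                  # S12

def run(event_node, event_name):
    event_node.inputs["__event"] = event_name
    stack, staged = [event_node], {event_node}
    while stack:
        node = stack.pop()                       # S3/S5/S8/S10
        staged.discard(node)
        missing = [p for p in node.needs if p not in node.inputs]
        if missing:                              # S6: wait for input data,
            src = node.needs[missing[0]]         # push the feeding node
            if src not in staged:
                stack.append(src); staged.add(src)
            continue
        outputs = node.logic(node.inputs) or {}  # execute the node logic
        for succ in node.logic_out:              # S4/S12: logic successors
            if succ not in staged:
                stack.append(succ); staged.add(succ)
        for pin, tgt, tgt_pin in node.data_out:  # S7/S9: propagate data
            tgt.inputs[tgt_pin] = outputs[pin]
            if tgt not in staged:                # S9: no duplicate pushes
                stack.append(tgt); staged.add(tgt)

run(click, "PW Mode")  # S2: the PW Mode button is clicked
```

Running the sketch plays the teaching video once and drains the stack, matching the completion condition in S12.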
It should be understood that, although the steps in the flowcharts of fig. 2-6 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited to that order, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2-6 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and the execution order of these sub-steps or stages is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 8, there is provided a teaching apparatus of an ultrasonic device, including: the system comprises a first display module, a first acquisition module and an operation module, wherein:
The first display module is used for responding to the use instruction triggered based on the display interface and displaying a teaching task list; the teaching task list comprises a plurality of candidate teaching tasks; each candidate teaching task is created in advance based on the editing interface.
The first acquisition module is used for acquiring a selection instruction triggered by a user based on the teaching task list.
And the operation module is used for determining a target teaching task from the candidate teaching tasks according to the selection instruction and operating the target teaching task.
The teaching device of the ultrasonic equipment provided in this embodiment may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
Optionally, on the basis of the foregoing embodiment, the editing interface includes a creation area, and the apparatus further includes: the system comprises a second display module, a second acquisition module, a configuration module and a creation module, wherein:
and the second display module is used for acquiring a creation instruction based on the creation area and displaying a node menu in response to the creation instruction.
The second acquisition module is used for acquiring a node selection instruction based on the node menu; the node selection instructions include node identifications of the selected course nodes.
And the configuration module is used for configuring the target course nodes corresponding to the node identifications based on the creation areas to obtain the configuration information of each target course node.
And the creation module is used for creating candidate teaching tasks according to the configuration information of each target course node and each target course node.
Optionally, the configuration information includes: type information of each target course node, logic information of each target course node, setting information of each target course node, position information of each target course node, logic pin connection information of each target course node, data pin connection information of each target course node, script attribute information of each target course script, and configuration information of each target course script.
The teaching device of the ultrasonic equipment provided in this embodiment may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
On the basis of the above embodiment, optionally, the above configuration module includes: a display unit and an acquisition unit, wherein:
And the display unit is used for displaying the configuration frame of each target course node according to the node selection instruction.
And the acquisition unit is used for acquiring the configuration information of each target course node based on the configuration frame of each target course node.
The teaching device of the ultrasonic equipment provided in this embodiment may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
On the basis of the above embodiment, optionally, the node menu includes a base node menu and an ultrasound application node menu; the second display module includes: a display unit, wherein:
And the display unit is used for acquiring a creation instruction based on the creation area and displaying a basic node menu and an ultrasonic application node menu in response to the creation instruction.
The teaching device of the ultrasonic equipment provided in this embodiment may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
On the basis of the above embodiment, optionally, the node selection instruction includes a first node selection instruction and a second node selection instruction, and the display unit is configured to obtain the first node selection instruction based on a basic node menu; the first node selection instruction comprises a basic node identification of the selected course node; acquiring a second node selection instruction based on the ultrasonic application node menu; the second node selection instruction includes an ultrasound application node identification of the selected course node.
The teaching device of the ultrasonic equipment provided in this embodiment may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
On the basis of the above embodiment, optionally, the editing interface further includes an information prompt area; the device further comprises: a third obtaining module and a setting module, wherein:
And the third acquisition module is used for acquiring the setting instruction based on the editing interface and/or the information prompt area.
And the setting module is used for setting the node data and the node attribute of each target course node in response to the setting instruction.
The teaching device of the ultrasonic equipment provided in this embodiment may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
On the basis of the above embodiment, optionally, the above editing interface further includes an editing area; the device further comprises: the system comprises a fourth acquisition module, a third display module and an editing module, wherein:
The fourth acquisition module is used for acquiring an editing instruction based on the editing area; the edit instruction indicates a node to be edited.
And the third display module is used for displaying an editing interface of the node to be edited on the editing area in response to the editing instruction.
The editing module is used for acquiring editing information based on the editing interface, and performing at least one of verification operation, storage operation, setting operation and searching operation on the node to be edited according to the editing information.
The teaching device of the ultrasonic equipment provided in this embodiment may execute the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein.
For specific limitations of the teaching apparatus of the ultrasonic device, reference may be made to the above limitations of the teaching method of the ultrasonic device, which are not repeated here. The above modules in the teaching apparatus of the ultrasonic device may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware form in, or independent of, a processor in the computer device, or may be stored in software form in a memory in the computer device, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, an ultrasound imaging apparatus is provided comprising a memory and a processor, the memory having stored therein a computer program which when executed by the processor performs the steps of:
Responding to a use instruction triggered based on a display interface, and displaying a teaching task list; the teaching task list comprises a plurality of candidate teaching tasks; each candidate teaching task is created in advance based on an editing interface;
acquiring a selection instruction triggered by a user based on the teaching task list;
and determining a target teaching task from the candidate teaching tasks according to the selection instruction, and operating the target teaching task.
The ultrasound imaging apparatus provided in the above embodiment has similar implementation principles and technical effects to those of the above method embodiment, and will not be described herein.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
Responding to a use instruction triggered based on a display interface, and displaying a teaching task list; the teaching task list comprises a plurality of candidate teaching tasks; each candidate teaching task is created in advance based on an editing interface;
acquiring a selection instruction triggered by a user based on the teaching task list;
and determining a target teaching task from the candidate teaching tasks according to the selection instruction, and operating the target teaching task.
The computer readable storage medium provided in the above embodiment has similar principle and technical effects to those of the above method embodiment, and will not be described herein.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, or the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory. By way of illustration, and not limitation, RAM can be in various forms such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), etc.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (10)

1. A method of teaching an ultrasound device, the method comprising:
responding to a use instruction triggered based on a display interface, and displaying a teaching task list, wherein the teaching task list comprises a plurality of candidate teaching tasks, each candidate teaching task being created in advance based on an editing interface;
acquiring a selection instruction triggered by a user based on the teaching task list; and
determining a target teaching task from the candidate teaching tasks according to the selection instruction, and running the target teaching task;
wherein the target teaching task comprises a pre-established event node, logic node, variable node, display node and ending node, the node name of the event node being a click event; a data output pin of the event node is connected with a first data input pin of the logic node, and a logic output pin of the event node is connected with a logic input pin of the display node; a data output pin of the variable node is connected with a second data input pin of the logic node; a data output pin of the logic node is connected with a data input pin of the display node; a logic output pin of the display node is connected with a logic input pin of the ending node, and the ending node is used for ending the target teaching task;
wherein running the target teaching task comprises:
triggering the event node when an interface button is clicked, pushing the event node onto a stack, and setting an output value of the data output pin of the event node to a character string of the event name;
when the event node has output, pushing the display node onto the stack, and then pushing the logic node onto the stack;
popping the logic node from the top of the stack, assigning the first data input pin of the logic node the output value of the data output pin of the event node, and pushing the variable node onto the stack;
popping the variable node from the top of the stack, assigning the second data input pin of the logic node the input data value of the variable node, and pushing the logic node onto the stack;
popping the logic node from the top of the stack, and judging whether the values of the two data input pins of the logic node are equal;
popping the display node from the top of the stack, and executing the display node when the data input pin of the display node has a value and that value is true;
pushing the ending node onto the stack; and
completing the target teaching task when the ending node is the ending node of the target teaching task and no other nodes remain in the stack.
2. The method of claim 1, wherein the editing interface comprises a creation area, and wherein creating each of the candidate teaching tasks in the creation area comprises:
acquiring a creation instruction based on the creation area, and displaying a node menu in response to the creation instruction;
acquiring a node selection instruction based on the node menu, wherein the node selection instruction comprises a node identifier of the selected course node;
configuring the target course node corresponding to the node identifier based on the creation area, to obtain configuration information of each target course node; and
creating the candidate teaching task according to each target course node and the configuration information of each target course node.
3. The method of claim 2, wherein configuring the target course node corresponding to the node identifier based on the creation area to obtain the configuration information of each target course node comprises:
displaying a configuration frame of each target course node according to the node selection instruction; and
obtaining the configuration information of each target course node based on the configuration frame of each target course node.
4. The method of claim 2, wherein the node menu comprises a basic node menu and an ultrasound application node menu, and wherein acquiring the creation instruction based on the creation area and displaying the node menu in response to the creation instruction comprises:
acquiring the creation instruction based on the creation area, and displaying the basic node menu and the ultrasound application node menu in response to the creation instruction.
5. The method of claim 4, wherein the node selection instruction comprises a first node selection instruction and a second node selection instruction, and wherein acquiring the node selection instruction based on the node menu comprises:
acquiring the first node selection instruction based on the basic node menu, wherein the first node selection instruction comprises a basic node identifier of the selected course node; and
acquiring the second node selection instruction based on the ultrasound application node menu, wherein the second node selection instruction comprises an ultrasound application node identifier of the selected course node.
6. The method of claim 2, wherein the editing interface further comprises an information prompt area, the method further comprising:
acquiring a setting instruction based on the editing interface and/or the information prompt area; and
setting node data and node attributes of each target course node in response to the setting instruction.
7. The method of any one of claims 1-6, wherein the editing interface further comprises an editing area, the method further comprising:
acquiring an editing instruction based on the editing area, wherein the editing instruction indicates a node to be edited;
displaying an editing interface of the node to be edited in the editing area in response to the editing instruction; and
acquiring editing information based on the editing interface, and performing at least one of a verification operation, a storage operation, a setting operation and a search operation on the node to be edited according to the editing information.
8. A teaching apparatus for an ultrasound device, the apparatus comprising:
a first display module, configured to display a teaching task list in response to a use instruction triggered based on a display interface, wherein the teaching task list comprises a plurality of candidate teaching tasks, each candidate teaching task being created in advance based on an editing interface;
a first acquisition module, configured to acquire a selection instruction triggered by a user based on the teaching task list; and
a running module, configured to determine a target teaching task from the candidate teaching tasks according to the selection instruction and to run the target teaching task;
wherein the target teaching task comprises a pre-established event node, logic node, variable node, display node and ending node, the node name of the event node being a click event; a data output pin of the event node is connected with a first data input pin of the logic node, and a logic output pin of the event node is connected with a logic input pin of the display node; a data output pin of the variable node is connected with a second data input pin of the logic node; a data output pin of the logic node is connected with a data input pin of the display node; a logic output pin of the display node is connected with a logic input pin of the ending node, and the ending node is used for ending the target teaching task;
wherein the running module is further configured to perform:
triggering the event node when an interface button is clicked, pushing the event node onto a stack, and setting an output value of the data output pin of the event node to a character string of the event name;
when the event node has output, pushing the display node onto the stack, and then pushing the logic node onto the stack;
popping the logic node from the top of the stack, assigning the first data input pin of the logic node the output value of the data output pin of the event node, and pushing the variable node onto the stack;
popping the variable node from the top of the stack, assigning the second data input pin of the logic node the input data value of the variable node, and pushing the logic node onto the stack;
popping the logic node from the top of the stack, and judging whether the values of the two data input pins of the logic node are equal;
popping the display node from the top of the stack, and executing the display node when the data input pin of the display node has a value and that value is true;
pushing the ending node onto the stack; and
completing the target teaching task when the ending node is the ending node of the target teaching task and no other nodes remain in the stack.
9. An ultrasound imaging device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
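The stack-driven node evaluation recited in claims 1 and 8 can be sketched in code. The sketch below is an illustrative reconstruction, not code from the patent: the `Node` class, pin names (`a`, `b`, `cond`), node names, and `run_task` are all hypothetical, and popping the event node once its output is set is our interpretation of how the stack ends up empty at completion.

```python
class Node:
    """One course node; pins are modeled as plain attributes/dicts."""
    def __init__(self, name, kind, value=None):
        self.name = name        # e.g. "click_event" (hypothetical name)
        self.kind = kind        # "event" | "variable" | "logic" | "display" | "end"
        self.value = value      # variable nodes carry a stored data value
        self.inputs = {}        # data input pins, filled in during execution
        self.output = None      # data output pin


def run_task(clicked_event_name):
    """Run the five-node task of claim 1 for one button click."""
    event = Node("click_event", "event")
    variable = Node("expected_event", "variable", value="click_event")
    logic = Node("equals", "logic")
    display = Node("show_hint", "display")
    end = Node("end", "end")

    executed = []   # names of display nodes actually executed
    stack = []

    # 1. Clicking the interface button triggers the event node: push it
    #    and set its data output pin to the event-name string.
    stack.append(event)
    event.output = clicked_event_name

    # 2. The event node now has output; pop it (interpretation: its work is
    #    done), then stack the display node and the logic node.
    stack.pop()
    stack.append(display)
    stack.append(logic)

    # 3. Pop the logic node, wire its first data input pin to the event
    #    node's output, and stack the variable node.
    node = stack.pop()
    node.inputs["a"] = event.output
    stack.append(variable)

    # 4. Pop the variable node, wire the logic node's second data input pin
    #    to the variable's value, and stack the logic node again.
    stack.pop()
    logic.inputs["b"] = variable.value
    stack.append(logic)

    # 5. Pop the logic node and compare its two data input pins; its output
    #    feeds the display node's data input pin.
    node = stack.pop()
    node.output = node.inputs["a"] == node.inputs["b"]
    display.inputs["cond"] = node.output

    # 6. Pop the display node; execute it only when its data input pin has
    #    a value and that value is true.
    node = stack.pop()
    if node.inputs.get("cond") is True:
        executed.append(node.name)

    # 7. Stack the ending node; the task completes when the popped node is
    #    the task's ending node and no other nodes remain on the stack.
    stack.append(end)
    node = stack.pop()
    finished = node is end and not stack
    return executed, finished
```

With these assumptions, `run_task("click_event")` executes the display node and completes the task, while `run_task("other_event")` completes without executing it, matching the conditional-display behaviour the claims describe.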
CN202110987077.2A 2021-08-26 2021-08-26 Teaching method and device of ultrasonic equipment, ultrasonic imaging equipment and storage medium Active CN114035725B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110987077.2A CN114035725B (en) 2021-08-26 2021-08-26 Teaching method and device of ultrasonic equipment, ultrasonic imaging equipment and storage medium
PCT/CN2022/112781 WO2023024974A1 (en) 2021-08-26 2022-08-16 Medical device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110987077.2A CN114035725B (en) 2021-08-26 2021-08-26 Teaching method and device of ultrasonic equipment, ultrasonic imaging equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114035725A CN114035725A (en) 2022-02-11
CN114035725B true CN114035725B (en) 2024-06-25

Family

ID=80139937

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110987077.2A Active CN114035725B (en) 2021-08-26 2021-08-26 Teaching method and device of ultrasonic equipment, ultrasonic imaging equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114035725B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023024974A1 (en) * 2021-08-26 2023-03-02 武汉联影医疗科技有限公司 Medical device and system

Citations (3)

Publication number Priority date Publication date Assignee Title
CN107222621A (en) * 2017-05-27 2017-09-29 珠海市魅族科技有限公司 A kind of information processing method and device, computer installation and storage medium
CN111193960A (en) * 2019-09-27 2020-05-22 腾讯科技(深圳)有限公司 Video processing method and device, electronic equipment and computer readable storage medium
CN111901665A (en) * 2020-08-28 2020-11-06 完美世界控股集团有限公司 Teaching resource playing method and device and storage medium

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US9285971B2 (en) * 2012-06-20 2016-03-15 Google Inc. Compartmentalized image editing system
CN112130937A (en) * 2019-06-24 2020-12-25 北京京东尚科信息技术有限公司 Page display method and device, storage medium and electronic equipment
CN111862699B (en) * 2020-07-08 2022-05-27 天津洪恩完美未来教育科技有限公司 Method and device for visually editing teaching courses, storage medium and electronic device
CN112084315B (en) * 2020-09-07 2024-06-11 腾讯科技(深圳)有限公司 Question-answer interaction method, device, storage medium and equipment

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN107222621A (en) * 2017-05-27 2017-09-29 珠海市魅族科技有限公司 A kind of information processing method and device, computer installation and storage medium
CN111193960A (en) * 2019-09-27 2020-05-22 腾讯科技(深圳)有限公司 Video processing method and device, electronic equipment and computer readable storage medium
CN111901665A (en) * 2020-08-28 2020-11-06 完美世界控股集团有限公司 Teaching resource playing method and device and storage medium

Also Published As

Publication number Publication date
CN114035725A (en) 2022-02-11

Similar Documents

Publication Publication Date Title
CN111260545B (en) Method and device for generating image
CN108170611A (en) Automated testing method and device, storage medium, electronic equipment
CN104268006B (en) The back method and device of key mouse script
US7420556B2 (en) Information processing method and information processing apparatus
CN110090444B (en) Game behavior record creating method and device, storage medium and electronic equipment
US10810113B2 (en) Method and apparatus for creating reference images for an automated test of software with a graphical user interface
US10628133B1 (en) Console and method for developing a virtual agent
CN113449877B (en) Method and system for demonstrating machine learning modeling process
CN112558824A (en) Page display method and device and computer storage medium
TW201604719A (en) Method and apparatus of controlling a smart device
US20220318077A1 (en) Data engine
US20180124453A1 (en) Dynamic graphic visualizer for application metrics
WO2020057255A1 (en) Terminal interface recognition-based voice control method and system, and intelligent terminal
WO2023109525A1 (en) Quick setting method and apparatus for electronic device, and storage medium and electronic device
CN114035725B (en) Teaching method and device of ultrasonic equipment, ultrasonic imaging equipment and storage medium
CN111190826A (en) Testing method and device for virtual reality immersive tracking environment, storage medium and equipment
JP2017146839A (en) Component information retrieval device, component information retrieval method and program
CN108052506B (en) Natural language processing method, device, storage medium and electronic equipment
CN114629800B (en) Visual generation method, device, terminal and storage medium for industrial control network target range
CN112988304B (en) Recording method and device of operation mode, electronic equipment and storage medium
US12026084B2 (en) Automated testing of mobile devices using visual analysis
CN113282268B (en) Sound effect configuration method and device, storage medium and electronic equipment
CN109684525B (en) Document display method and device, storage medium and test equipment
CN113468069A (en) Application testing method and device, computer equipment and storage medium
CN110209242A (en) Button function binding method, button function calling method, button function binding device, button function calling device and projection control equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant