CN114035713A - Ultrasonic scanning flow control method and system - Google Patents


Info

Publication number
CN114035713A
CN114035713A
Authority
CN
China
Prior art keywords
protocol
scanning
executed
user
list
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110962335.1A
Other languages
Chinese (zh)
Inventor
宋剑伟 (Song Jianwei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan United Imaging Healthcare Co Ltd
Original Assignee
Wuhan United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan United Imaging Healthcare Co Ltd
Priority to CN202110962335.1A
Publication of CN114035713A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS

Abstract

Embodiments of the present specification provide an ultrasound scanning flow control method and system. The method comprises the following steps: acquiring a preset scanning protocol for a target object, and displaying the preset scanning protocol in a candidate list on an interactive interface; determining a scanning protocol to be executed that is selected by a user from the candidate list, the scanning protocol to be executed being displayed in a selected list on the interactive interface; and scanning the target object according to the positional order of the scanning protocol to be executed in the selected list.

Description

Ultrasonic scanning flow control method and system
Technical Field
The present specification relates to the field of medical technology, and in particular, to an ultrasound scanning flow control method and system.
Background
When a doctor uses an ultrasound system to scan a patient, each position of the patient's target organ must be scanned item by item. Because the various tissues of the human body differ in form and structure, different organ tissues, and different positions of the same organ tissue, reflect, refract, and absorb ultrasound to different degrees. Thus, each organ, and each position within an organ, may require different scanning parameters, e.g., the deflection angle of the scan line, scanning depth, power, compounding parameters, etc. However, because doctors differ in experience, scanning technique, etc., some regions of the patient's target organ may be missed during scanning, and/or the scanning parameters for each position of the organ may not be determined accurately.
Therefore, it is desirable to provide an ultrasound scanning flow control method that gives doctors a reference for standardized scanning.
Disclosure of Invention
One aspect of the present specification provides an ultrasound scanning flow control method. The method comprises the following steps: acquiring a preset scanning protocol for a target object, and displaying the preset scanning protocol in a candidate list on an interactive interface; determining a scanning protocol to be executed that is selected by a user from the candidate list, the scanning protocol to be executed being displayed in a selected list on the interactive interface; and scanning the target object according to the positional order of the scanning protocol to be executed in the selected list.
Another aspect of the present specification provides an ultrasound scanning flow control system, comprising: an acquisition module configured to acquire a preset scanning protocol for a target object and display the preset scanning protocol in a candidate list on an interactive interface; a determination module configured to determine a scanning protocol to be executed that is selected by a user from the candidate list, the scanning protocol to be executed being displayed in a selected list on the interactive interface; and an execution module configured to perform scanning on the target object according to the positional order of the scanning protocol to be executed in the selected list.
In some embodiments, the preset scanning protocol comprises a plurality of protocol units, and the scanning position corresponding to each protocol unit covers at least part of the target object.
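The notion of a preset scanning protocol composed of protocol units, each carrying the scanning parameters for one position of the target object, can be sketched as a simple data structure. This is an illustrative sketch only; all field names (`deflection_angle`, `depth_cm`, `power_pct`, `compound_params`) are assumptions, not parameters defined by this disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class ProtocolUnit:
    """One protocol unit: the scanning parameters for one position of the target object."""
    name: str                 # e.g. "left lobe, longitudinal" (hypothetical label)
    deflection_angle: float   # deflection angle of the scan line, in degrees
    depth_cm: float           # scanning depth
    power_pct: float          # acoustic power, as a percentage
    compound_params: dict = field(default_factory=dict)  # compounding parameters

@dataclass
class PresetScanningProtocol:
    """A preset scanning protocol for one target object (e.g. an organ)."""
    target_object: str
    units: list = field(default_factory=list)

# A hypothetical liver protocol with two protocol units.
liver = PresetScanningProtocol("liver", [
    ProtocolUnit("left lobe, longitudinal", deflection_angle=0.0, depth_cm=14.0, power_pct=80.0),
    ProtocolUnit("right lobe, intercostal", deflection_angle=15.0, depth_cm=16.0, power_pct=85.0),
])
```

Since each protocol unit covers at least part of the target object, a complete preset protocol for an organ is simply the list of its units.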
In some embodiments, the determining a scanning protocol to be executed that is selected by the user from the candidate list comprises: in response to the user's selection instruction for a preset scanning protocol in the candidate list, displaying the corresponding protocol units of that preset scanning protocol in the selected list; and determining the protocol units in the selected list as the scanning protocol to be executed.
In some embodiments, the displaying, in response to the user's selection instruction for a preset scanning protocol in the candidate list, the corresponding protocol units in the selected list includes: acquiring a touch point or trigger point at which the user acts on the interactive interface; determining whether the touch point or trigger point is within the display icon of one of the protocol units in the candidate list; in response to the touch point or trigger point being within the display icon of that protocol unit, displaying that protocol unit in the selected list when the touch point or trigger point leaves the display icon and the position to which the protocol unit has been dragged is within the border of the selected list; and/or, when the touch point or trigger point leaves the display icon and the position to which the protocol unit has been dragged is not within the border of the selected list, controlling that protocol unit to automatically return to the candidate list in an animated manner.
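The drop decision described above, i.e. whether a dragged protocol unit lands inside the border of the selected list or is returned to the candidate list, reduces to a rectangle-containment test on the release position. A minimal sketch under assumed conventions (the rectangle tuple layout and return values are hypothetical; the return animation itself is left to the UI framework):

```python
def handle_drop(unit, drop_point, selected_rect, selected_list):
    """Decide where a dragged protocol unit lands when the touch/trigger
    point leaves its display icon.

    selected_rect is the border of the selected list as (left, top, right,
    bottom). Inside the border: the unit is shown in the selected list.
    Outside: the UI would animate the icon back to the candidate list; the
    underlying data is left unchanged here.
    """
    x, y = drop_point
    left, top, right, bottom = selected_rect
    if left <= x <= right and top <= y <= bottom:
        selected_list.append(unit)
        return "selected"
    return "returned"
```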
In some embodiments, the determining whether the touch point or trigger point is within the display icon of one of the protocol units in the candidate list includes: determining a reference coordinate point and a preset range according to the shape of the display icon of that protocol unit; and determining whether the touch point or trigger point is within the display icon based on the reference coordinate point, the preset range, and the coordinates of the touch point or trigger point.
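The hit test above, deciding from a reference coordinate point and a preset range whether the touch or trigger point lies within a display icon, can be sketched for two common icon shapes. The shape names and parameter conventions here are assumptions, not details given by the disclosure:

```python
import math

def point_in_icon(point, icon_shape, ref_point, preset_range):
    """Return True if the touch/trigger point is inside the display icon.

    For a circular icon, the reference coordinate point is the center and
    the preset range is the radius; for a rectangular icon, the reference
    point is the top-left corner and the preset range is (width, height).
    """
    px, py = point
    rx, ry = ref_point
    if icon_shape == "circle":
        return math.hypot(px - rx, py - ry) <= preset_range
    if icon_shape == "rect":
        width, height = preset_range
        return rx <= px <= rx + width and ry <= py <= ry + height
    raise ValueError(f"unsupported icon shape: {icon_shape!r}")
```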
In some embodiments, the performing scanning on the target object according to the positional order of the scanning protocol to be executed in the selected list includes: following the positional order of the scanning protocol to be executed in the selected list, automatically executing the next protocol unit in the selected list in response to an indication that scanning of the currently executed protocol unit has finished.
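The automatic hand-off from one protocol unit to the next can be sketched as a loop over the selected list, where each scan call returns only once the device signals that the current protocol unit has finished. The `scan_one` callback is a hypothetical stand-in for the device-side scan routine:

```python
def run_selected_list(selected_list, scan_one):
    """Execute protocol units in the positional order of the selected list.

    scan_one(unit) performs one scan and returns when the current protocol
    unit signals completion; the next unit then starts automatically, with
    no further user interaction required.
    """
    results = []
    for unit in selected_list:  # positional order == execution order
        results.append(scan_one(unit))
    return results
```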
In some embodiments, the method further comprises: when a protocol unit in the preset scanning protocol is selected, displaying the information content of that protocol unit.
In some embodiments, the method further comprises: if the user is interrupted while selecting the scanning protocol to be executed from the preset scanning protocol, loading the scanning protocol to be executed that was selected before the interruption into the selected list when the ultrasound device resumes operation.
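Surviving an interruption amounts to persisting the selected list whenever it changes and reloading it when the device resumes operation. A minimal sketch using a JSON file; the file name and format are assumptions, and an actual device might persist to its storage device 140 instead:

```python
import json
import os

STATE_FILE = "selected_protocol_state.json"  # hypothetical location

def save_selected(selected_names, path=STATE_FILE):
    """Persist the names of the to-be-executed protocol units, e.g. on every
    change, so an interruption (power loss, crash) loses no selection."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(selected_names, f)

def load_selected(path=STATE_FILE):
    """On resume, reload the selection made before the interruption;
    an absent state file means nothing had been selected."""
    if not os.path.exists(path):
        return []
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```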
In some embodiments, the method further comprises: in response to the user's editing instruction for the preset scanning protocol and/or the protocol units of the scanning protocol to be executed, adjusting the positional order of the protocol units of the scanning protocol to be executed in the selected list and/or adding new protocol units.
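The two edits described here, reordering protocol units within the selected list and adding new ones, can be sketched as ordinary list operations (the function names are hypothetical):

```python
def move_unit(selected_list, src, dst):
    """Adjust the positional order: move the unit at index src to index dst.

    Because execution follows positional order, this directly changes the
    order in which the protocol units will be scanned.
    """
    unit = selected_list.pop(src)
    selected_list.insert(dst, unit)
    return selected_list

def add_unit(selected_list, unit, position=None):
    """Add a new protocol unit, at the end by default or at a given index."""
    if position is None:
        selected_list.append(unit)
    else:
        selected_list.insert(position, unit)
    return selected_list
```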
In some embodiments, the method further comprises: in response to a selection instruction for the scanning mode of the target object, entering a free scanning flow or a protocol selection flow for the target object.
Another aspect of the present specification provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method as described above when executing the computer program.
Another aspect of the present specification provides a computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method as described above.
The ultrasound scanning flow control method provided in the embodiments of the present specification may define or formulate, for each target object, a preset scanning protocol comprising a plurality of protocol units; display the preset scanning protocol of the target object in a candidate list on an interactive interface; display the protocol units selected by the user in a selected list as the scanning protocol to be executed; and scan the target object based on the positional order of the scanning protocol to be executed in the selected list. In this way, the scanning protocol corresponding to the target object is refined, and a finer-grained scanning protocol reference is provided to the user; scanning parameters for the target object are standardized; missed scans are effectively prevented; and both parameter accuracy during scanning and scanning convenience are improved, thereby improving the accuracy of scanning results and scanning efficiency.
Drawings
The present description will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an exemplary application scenario of an ultrasound scanning procedure control system according to some embodiments herein;
FIG. 2 is an exemplary hardware/software schematic of an ultrasound device shown in accordance with some embodiments herein;
FIG. 3 is an exemplary block diagram of an ultrasound scanning procedure control system shown in accordance with some embodiments of the present description;
FIG. 4 is an exemplary flow diagram of an ultrasound scanning flow control method according to some embodiments described herein;
FIG. 5 is a schematic diagram of a pre-set scanning protocol, shown in accordance with some embodiments herein;
FIGS. 6-10 are schematic diagrams of scanning protocols displayed on an exemplary interactive interface, according to some embodiments described herein;
FIG. 11 is a schematic illustration of a scanned data record of an exemplary target object, shown in accordance with some embodiments of the present description;
FIG. 12 is an exemplary flow diagram of an ultrasound scanning process shown in accordance with some embodiments herein;
FIG. 13 is an exemplary flow diagram illustrating the determination of a scanning protocol to be performed according to some embodiments of the present description;
FIG. 14 is an exemplary flow diagram illustrating the determination of an ultrasound scanning protocol according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system," "apparatus," "unit," and/or "module" as used herein is one way of distinguishing different components, elements, parts, portions, or assemblies at different levels. However, these words may be replaced by other expressions that accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" may include plural referents unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate that the explicitly identified steps and elements are included; those steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used in this specification to illustrate operations performed by systems according to embodiments of the specification, with accompanying descriptions to facilitate a better understanding of medical imaging methods and/or systems. It should be understood that the operations are not necessarily performed exactly in the order shown. Rather, the various steps may be processed in reverse order or simultaneously. Moreover, other operations may be added to these processes, or one or more steps may be removed from them.
The technical solutions disclosed in the present specification will be explained in detail by the description of the drawings.
Fig. 1 is a schematic diagram of an exemplary application scenario of an ultrasound scanning procedure control system according to some embodiments of the present description.
In some embodiments, the ultrasound scanning procedure control system 100 may be used in the field of medical imaging, for example, may be applied to a handheld ultrasound inspection scenario. In some embodiments, the system 100 may define and store in the storage device 140 a plurality of different scanning protocol units for different locations of each organ; when scanning, the scanning protocol of the corresponding organ can be displayed on the interactive interface of the ultrasound device 110 according to the current situation of the patient; the user can select one or more protocol units to place in the selected list of the interactive interface; the ultrasound device 110 may scan according to the order of the positions of the protocol units in the selected list.
As shown in fig. 1, in some embodiments, the ultrasound scanning procedure control system 100 may include an ultrasound device 110, a processing device 120, a terminal 130, a storage device 140, and a network 150.
The ultrasound device 110 may be used to scan a target object, or a portion thereof, located within its examination region, and generate an image relating to the target object or portion thereof. In some embodiments, the target object may include a body, a substance, or the like, or any combination thereof. In some embodiments, the target object may include a patient or another medical subject (e.g., another animal, such as a laboratory mouse), or the like.
In some embodiments, ultrasound device 110 may include a one-dimensional ultrasound device, a two-dimensional ultrasound device, and/or a three-dimensional ultrasound device. For example, the one-dimensional ultrasound device may include A-mode ultrasound, M-mode ultrasound, D-mode ultrasound, and the like, or any combination thereof. The two-dimensional ultrasound device may include a sector-scan B-mode ultrasound device, a linear-scan B-mode ultrasound device, a convex-scan B-mode ultrasound device, or the like, or any combination thereof. The three-dimensional ultrasound device may include a stereo ultrasound device or the like. In some embodiments, the ultrasound device 110 may be a handheld ultrasound device.
In some embodiments, the ultrasound device 110 may include a display, an input device, a processor, a storage device, etc., and further details may be found in fig. 2 and its associated description, which are not repeated herein.
Processing device 120 may process data and/or information obtained from ultrasound device 110, terminal 130, and/or storage device 140. For example, the processing device 120 may process image information detected by the ultrasound device 110 to obtain an ultrasound image. For another example, the processing device 120 may update the parameters of the preset scanning protocol according to the user's editing of the scanning protocol. In some embodiments, the processing device 120 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 120 may be local or remote. For example, processing device 120 may access information and/or data from ultrasound device 110, terminal 130, and/or storage device 140 via network 150. As another example, processing device 120 may be directly connected to ultrasound device 110, terminal 130, and/or storage device 140 to access information and/or data. In some embodiments, the processing device 120 may be implemented on a cloud platform. For example, the cloud platform may include one or a combination of private cloud, public cloud, hybrid cloud, community cloud, distributed cloud, cross-cloud, multi-cloud, and the like. In some embodiments, the processing device 120 may be part of the ultrasound device 110.
The terminal 130 may include a mobile device 131, a tablet computer 132, a notebook computer 133, and the like, or any combination thereof. In some embodiments, the terminal 130 may interact with other components in the ultrasound scanning procedure control system 100 through the network 150. For example, the terminal 130 may send basic information of the patient to the ultrasound device 110 through the network 150 to guide the user to set scanning parameters of the ultrasound device 110 according to the information. As another example, the terminal 130 may also receive a scanned image acquired by the ultrasound device 110 via the network 150 and display the scanned image for analysis and confirmation by an operator. In some embodiments, the mobile device 131 may include smart home devices, wearable devices, mobile devices, virtual reality devices, augmented reality devices, and the like, or any combination thereof.
The storage device 140 may store data (e.g., scan data for a target object, preset scan protocols, etc.), instructions, and/or any other information. In some embodiments, the storage device 140 may store data obtained from the ultrasound device 110, the terminal 130, and/or the processing device 120. For example, the storage device 140 may store scan protocols to be performed, scan data of a target object, and the like, obtained from the ultrasound device 110. In some embodiments, storage device 140 may store data and/or instructions that processing device 120 may execute or use to perform the exemplary methods described herein. In some embodiments, storage device 140 may include one or a combination of mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like. In some embodiments, the storage device 140 may be implemented by a cloud platform as described herein.
In some embodiments, the storage device 140 may be connected to a network 150 to enable communication with one or more components (e.g., ultrasound device 110, processing device 120, terminal 130, etc.) in the ultrasound scan flow control system 100. One or more components in the ultrasound scanning procedure control system 100 may read data or instructions in the storage device 140 via the network 150. In some embodiments, storage device 140 may be part of processing device 120 or may be separate and directly or indirectly coupled to processing device 120.
The network 150 may include any suitable network capable of facilitating information and/or data exchange for the ultrasound scanning procedure control system 100. In some embodiments, one or more components of the ultrasound scanning procedure control system 100 (e.g., the ultrasound device 110, the processing device 120, the terminal 130, the storage device 140, etc.) may exchange information and/or data with one or more components of the ultrasound scanning procedure control system 100 via the network 150. In some embodiments, the network 150 may include one or a combination of a public network (e.g., the internet), a private network (e.g., a Local Area Network (LAN), a Wide Area Network (WAN)), etc.), a wired network (e.g., ethernet), a wireless network (e.g., an 802.11 network, a wireless Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a Virtual Private Network (VPN), a satellite network, a telephone network, a router, a hub, a server computer, etc. In some embodiments, network 150 may include one or more network access points.
Fig. 2 is an exemplary hardware/software schematic of an ultrasound device shown in accordance with some embodiments herein.
As shown in fig. 2, in some embodiments, the ultrasound device 200 may include a processor 210, a non-volatile storage medium 220, an internal memory 250, a communication port 260, a display 270, an input device 280, and a bus 290.
The processor 210 may be configured to execute computer instructions (e.g., program code) to implement the ultrasound scanning procedure control method provided by the embodiments of the present specification. The computer instructions may include, for example, programs, objects, components, data structures, procedures, modules, and functions that perform the particular functions described herein, such as the computer program 230. In some embodiments, the processor 210 may be used to process and display the signals received back from the ultrasound probe, for example, on the display 270. In some embodiments, the processor 210 may interpret and/or execute instructions of the various functional units of the ultrasound device 200. For example, the processor 210 may process image data acquired from the ultrasound device 200 and/or any other component of the ultrasound scanning procedure control system 100. In some embodiments, the processor 210 may have a structure the same as or similar to that of the processing device 120, and further details may be referred to in the description of the processing device 120 and are not repeated herein.
For illustration only, only one processor is depicted in the ultrasound device 200. It should be noted, however, that the ultrasound device 200 in this specification may also include multiple processors, and thus operations and/or method steps performed by one processor described in this specification may also be performed by multiple processors, either jointly or individually. For example, if the processor of the ultrasound device 200 performs operations A and B in this description, it should be understood that operations A and B may also be performed by two or more different processors in the ultrasound device 200, either jointly or individually (e.g., a first processor performing operation A, a second processor performing operation B, or a first processor and a second processor performing operations A and B jointly).
The non-volatile storage medium 220 may store data related to computer software, including one or more computer programs 230 and an operating system 240 (e.g., iOS™, Android™, Windows Phone™, etc.). In some embodiments, the computer program may include a browser or other suitable application for querying the scanning protocol corresponding to the target object, presetting scanning parameter tables, setting the initial interface, managing a local database, retrieving image information of the patient, and the like.
The internal memory 250 may temporarily store data/information acquired from the ultrasound device 200 and/or any other component of the ultrasound scan flow control system 100. In some embodiments, the internal memory 250 is similar to the storage device 140 in structure and function, and further details can be found in the description of the storage device 140 and are not repeated here. In some embodiments, the computer program 230 may be run in the internal memory 250 and store relevant data generated by the running process.
The communication port 260 may be connected to a network (e.g., network 150) to facilitate data communication. The communication port 260 may establish a connection between the ultrasound device 200 and other devices, such as the processing device 120, the terminal 130, and/or the storage device 140. The connection may be a wired connection, a wireless connection, any other communication connection that may enable data transmission and/or reception, and/or a combination of such connections. The wired connection may include, for example, a USB interface, a network interface, a PCI bus interface, an HDMI interface, and the like, or any combination thereof. In some embodiments, a USB interface may be used for the connection of the keyboard, a PCI bus interface for the connection of the probe device, and an HDMI interface for the connection of the touch screen (e.g., display 270). The wireless connection may include, for example, a bluetooth connection, a Wi-Fi connection, a WiMax connection, a WLAN connection, a zigbee connection, a mobile network connection (e.g., 3G, 4G, 5G), and the like, or combinations thereof.
Display 270 may be used to output information and/or data. For example, the display 270 may display image information scanned by the ultrasound device, preset scanning protocols, and the like. In some embodiments, display 270 may be used to receive user input of information and/or data. For example, display 270 may receive instructions from a user to select and/or delete a protocol. In some embodiments, the display 270 may include a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) based display, a flat-panel back-panel display, a curved screen, or the like, or any combination thereof. In some embodiments, the display 270 may include a touch screen.
The input device 280 may be used to input signals, data, information, etc. In some embodiments, the input device 280 may enable user interaction with the ultrasound apparatus 200. In some embodiments, input device 280 may include one or more ultrasound operating panels. For example, the ultrasound operating panel may include input via a keyboard, touch screen input, ultrasound parameter adjustment knobs, keys, voice input, eye tracking input, brain monitoring system, or any other similar input port. Input information received by input device 280 may be transmitted, for example, via bus 290, to another component (e.g., processor 210) for further processing. Other types of input devices may include cursor control devices such as a mouse, a trackball, or cursor direction keys, among others.
FIG. 3 is an exemplary block diagram of an ultrasound scanning procedure control system shown in accordance with some embodiments of the present description.
As shown in fig. 3, in some embodiments, the ultrasound scanning procedure control system 300 may include an acquisition module 310, a determination module 320, and an execution module 330.
The obtaining module 310 may be configured to obtain a preset scanning protocol of the target object, and display the preset scanning protocol in the candidate list of the interactive interface. For more details, reference may be made to fig. 4 and its related description, which are not repeated herein.
The determination module 320 may be configured to determine a scanning protocol to be performed that is selected by a user from the candidate list. In some embodiments, the scanning protocol to be performed may be presented in a selected list of the interactive interface. For more details, reference may be made to fig. 4 and its related description, which are not repeated herein.
The execution module 330 may be configured to perform scanning on the target object according to the position order of the scanning protocols to be performed in the selected list. In some embodiments, performing a scan of the target object may be accomplished by a handheld ultrasound device. For more details, reference may be made to fig. 4 and its related description, which are not repeated herein.
It should be noted that the above description of the ultrasound scanning procedure control system 100, the ultrasound apparatus 200, and the ultrasound scanning procedure control system 300 is for illustrative purposes only, and is not intended to limit the scope of the present description. Various modifications and adaptations may occur to those skilled in the art in light of this disclosure. However, such changes and modifications do not depart from the scope of the present specification. For example, in some embodiments, the ultrasound device 200 may further include an ultrasound probe for transmitting and receiving ultrasound and performing conversion between electrical and acoustic signals: the probe may convert an electrical signal sent from a host computer into a high-frequency oscillating ultrasound signal, and convert an ultrasound signal reflected from a tissue or organ (e.g., a target object) into an electrical signal to be displayed on a display (e.g., the display 270) of the host computer. For another example, the ultrasound scanning procedure control system 300 may include one or more additional modules, such as a memory module for data storage.
FIG. 4 is an exemplary flow diagram of an ultrasound scanning flow control method according to some embodiments described herein.
The process 400 may be performed by the processing device 120, the ultrasound scanning process control system 300, or the processor 210. For example, the process 400 may be implemented as a set of instructions (e.g., the computer program 230) stored in a storage device (e.g., the storage device 140, the internal memory 250, or the non-volatile storage medium 220), which may be external to the ultrasound scanning process control system 100 or the ultrasound device 200 and accessible by them. The processing device 120 or the processor 210 may execute the set of instructions and, when executing the instructions, may be configured to perform the process 400. The operational schematic of the process 400 presented below is illustrative. In some embodiments, the process may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 400 are illustrated in FIG. 4 and described below is not intended to be limiting.
Step 410, acquiring a preset scanning protocol of the target object, and displaying the preset scanning protocol in the candidate list of the interactive interface. In some embodiments, step 410 may be performed by the processing device 120, the ultrasound device 200, or the acquisition module 310.
The target object may be a body part or organ of a subject (e.g., a patient) that is about to be scanned. In some embodiments, the target object may include a particular part of the body, such as the head, chest, abdomen, etc., or any combination thereof. In some embodiments, the target object may include a specific organ, such as a liver, a heart, a stomach, a ureter, a uterus, a fallopian tube, a thyroid, a gall bladder, a small intestine, a colon, a bladder, or the like, or any combination thereof.
In some embodiments, the preset scanning protocol may be a predefined scanning protocol containing one or more parameters required when the target object is scanned.
In some embodiments, the preset scanning protocol of the target object may comprise a plurality of protocol units. In some embodiments, the scanning position corresponding to each protocol unit may be a part of the target object. In some embodiments, the scanning positions corresponding to the plurality of protocol units included in the preset scanning protocol of the target object may be the same or different, and two or more protocol units corresponding to the same scanning position may correspond to different scanning parameters. For example only, as shown in fig. 5, scanning protocols corresponding to different organs (i.e., target objects) of a human body, such as the liver, kidney, heart, and thyroid, may be predefined. Each group of scanning protocols may include a plurality of protocol units corresponding to a plurality of different or identical scanning positions of the organ; for example, the liver scanning protocol may include N protocol units, such as liver scanning protocol 1, liver scanning protocol 2, liver scanning protocol 3, …, liver scanning protocol N.
In some embodiments, the number of protocol units included in the preset scanning protocol corresponding to each target object may be the same or different. For example, N of the liver scanning protocol may be 5, which indicates that the number of protocol units of the preset scanning protocol corresponding to the liver part is 5 in total; the O of the cardiac scanning protocol may be 3, which indicates that the number of protocol units of the preset scanning protocol corresponding to the cardiac region is 3.
In some embodiments, each protocol unit may include at least one of the following information: scanning positions, protocol numbers, scanning line deflection angles, scanning depths, probe transmitting power, composite parameters, digital gray curves, ultrasonic echo gains, ultrasonic transmitting frequency and probe linear density.
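The fields listed above can be sketched as a simple data structure. The following Python sketch is illustrative only: the field names, example values, and the organ-keyed table are assumptions for illustration, not the actual data model of the system described here.

```python
# Hypothetical sketch of a protocol unit; all names and values are assumed.
from dataclasses import dataclass

@dataclass
class ProtocolUnit:
    scan_site: str                    # scanning position, e.g. "liver"
    protocol_number: int              # number within the organ's preset protocol
    scan_line_deflection_deg: float   # scanning line deflection angle
    scan_depth_cm: float              # scanning depth
    probe_transmit_power_db: float    # probe transmit power
    compound_imaging_on: bool         # "composite parameters"
    digital_gray_curve: str           # digital gray curve identifier
    echo_gain: int                    # ultrasonic echo gain
    transmit_frequency_mhz: float     # ultrasonic transmit frequency
    probe_line_density: str           # e.g. "low" / "medium" / "high"

# A preset scanning protocol groups several protocol units per organ.
preset_protocols = {
    "liver": [
        ProtocolUnit("liver", n, 12.0, 11.0, 5.0, True, "linear", 180, 5.0, "medium")
        for n in range(1, 6)   # N = 5 units for the liver, as in the example above
    ],
}
```

A lookup such as `preset_protocols["liver"][0]` then yields the unit numbered 1 with all of its scanning parameters.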
The scanning part may refer to a name of a target object scanned when the user scans the patient with the ultrasound apparatus, such as a liver, a kidney, and a heart.
The protocol number may be the number of a protocol unit in the preset scanning protocol of a target object, provided by the system after the target object is selected by the user. For example, as shown in fig. 7, N protocol units corresponding to the preset scanning protocol for the liver are shown in the candidate protocol area (i.e., the candidate list), where the numbers 1, 2, …, N are protocol numbers. In some embodiments, the numbers may be represented by Arabic numerals (e.g., 1, 2, 3 …), letters (e.g., a, b, c …), Roman numerals (e.g., i, ii, iii …), or other characters suitable for numbering.
In some embodiments, a preset scanning protocol may be defined or formulated based on the historical scanning data, for example, a preset scanning protocol for each organ may be defined or formulated based on information such as scanning positions and used scanning parameters of the same organ in the historical scanning data, or scanning positions and used scanning parameters of the same lesion in the historical scanning data.
In some embodiments, the processing device may retrieve information such as scan protocols or scan parameter preset tables (e.g., the preset scan protocol table shown in fig. 5) of organs stored in the storage device to obtain the preset scan protocols of the target object.
In some embodiments, the processing device may present the plurality of protocol units in the acquired preset scanning protocol of the target object in the interactive interface of the ultrasound device (e.g., the ultrasound device 110 or the ultrasound device 200) in the order of their protocol numbers. In some embodiments, the interactive interface of the ultrasound device may include a title bar, a scanning image display area, an image parameter display area, an ultrasound device scanning information display area, a scanning protocol display area, and the like, or any combination thereof. In some embodiments, the scanning protocol display area may include a candidate list and a selected list. The selected list can show the scanning protocol units to be executed, being executed, and/or already executed that the user has selected from the candidate list. In some embodiments, the scanning protocol display area may further include a recommendation list for presenting one or more protocol units recommended by the system and/or the execution order of the protocol units. The protocol units recommended in the recommendation list can be selected by the user; the user may select some or all of the recommended protocol units, or none of them.
For example only, as shown in fig. 6, the top of the interactive interface is a title bar; an ultrasound image corresponding to the current scan may be displayed below the title bar on the left; image parameters of the ultrasound image may be displayed to its right; the current parameters of the ultrasound device and the currently executed scanning protocol may be displayed below the ultrasound image; and the scanning protocols may be displayed at the bottom. A circular icon with a number in the candidate protocol area (i.e., the candidate list) represents a protocol unit of the preset scanning protocol of the target object, and the numbers 1 to N represent the N protocol units corresponding to the current target object.
It can be understood that the positions of the display areas in the interactive interface can be any reasonable layout, and the present specification does not limit this. In some embodiments, protocol units corresponding to different organs may be presented using different shaped presentation icons, for example, protocol units corresponding to kidneys may be presented using an oval icon and protocol units corresponding to hearts may be presented using a heart icon.
In some embodiments, the processing device may display all or a part of the acquired preset scanning protocol of the target object in the interactive interface of the ultrasound device based on the historical data. For example, the processing device 120 may display, based on the historical scanning data of the current patient to be scanned, only protocol units in the preset scanning protocol corresponding to the historical scanning data of the patient in the interactive interface of the ultrasound device 110.
In some embodiments, the processing device may recommend a plurality of protocol units exhibiting a preset scanning protocol based on a historical protocol selection record and/or a historical protocol execution record of the user. For example, the processing device 120 may recommend, by different colors or graphs, executed, selected, and unexecuted protocol units of the protocol units showing the preset scanning protocol, respectively, based on the protocol units selected and/or scanned by the doctor for the target object during the last scanning and/or the historical scanning record of the patient to be scanned currently or similar to the patient. For another example, the processing device 120 may count protocol units that must be executed in each scan or protocol units with higher frequency of scans in each organ based on historical scan data, and highlight the protocol units in the interactive interface.
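The frequency-based highlighting described above could be approximated by counting how often each protocol unit was executed in past scans of the same organ. A minimal sketch, assuming a history record shaped as `(organ, executed protocol numbers)` pairs (the record shape and threshold are illustrative assumptions):

```python
from collections import Counter

def frequent_units(history, organ, min_freq=0.8):
    """Return protocol numbers executed in at least `min_freq` of past
    scans of `organ`; these could be highlighted in the interface."""
    organ_scans = [executed for o, executed in history if o == organ]
    if not organ_scans:
        return []
    counts = Counter(n for executed in organ_scans for n in executed)
    return sorted(n for n, c in counts.items() if c / len(organ_scans) >= min_freq)

history = [
    ("liver", {1, 2, 3}),
    ("liver", {1, 3, 5}),
    ("liver", {1, 3}),
    ("heart", {2}),
]
highlighted = frequent_units(history, "liver")  # units 1 and 3 appear in every liver scan
```

The same counting could back the color/graph distinctions mentioned above, with highly frequent units drawn with a highlighted icon.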
In some embodiments, the processing device may use two or more different shaped icons and/or different colors to expose multiple protocol units of a preset scanning protocol. For example, a circle may show the protocol units that have been selected this time or previously, and a square may show the unselected protocol units. For another example, the protocol units that have been selected in the current scanning may be shown in red, the protocol units that have been selected in the previous scanning may be shown in yellow, and the protocol units that have not been selected may be shown in green.
In some embodiments, the processing device may present information such as parameters and protocol attributes of a protocol unit in a preset scanning protocol through different icons. For example, attribute information such as the number and the scanning position of the protocol unit can be displayed by a circular icon, and parameter information such as scanning depth, probe emission power and probe linear density corresponding to the scanning position of the protocol unit can be displayed by a rectangular icon.
In some embodiments, when one or more protocol units in the candidate list and/or the selected list are selected, information content corresponding to the selected protocol units may be presented. For example, fig. 7 represents the interface state when protocol unit 1 in the candidate list is selected: when the user selects the icon of protocol unit 1, the ultrasound device may show the important parameters corresponding to protocol unit 1 (e.g., the scanning position, an illustration of the scanning position, the protocol number, the probe transmit power, the scanning depth, etc.) in the area pointed to by the short arrow in the interactive interface (e.g., the oval area). By further clicking the double-arrow icon in the circular area, the parameters corresponding to the scanning position may be shown in detail in the area pointed to by the long arrow (e.g., the rectangular area), such as those shown in fig. 7: scanning line deflection angle: 12°; probe transmit power: 5 dB; scanning depth: 11 cm; composite parameters: on; ultrasonic echo gain: 180; ultrasonic transmit frequency: 5 MHz; probe line density: medium. In some embodiments, after the protocol unit is selected, the detailed content of the protocol unit may be displayed directly, i.e., all the content of both the oval area and the rectangular area. Displaying the information content of the protocol units helps the user understand the contents of the scanning protocol more clearly and intuitively.
In some embodiments, the user may edit the preset scanning protocol. For example, the user may modify parameters of one or more protocol units in the preset scanning protocol through the interactive interface after obtaining the authorization, and/or add a new protocol unit, update the preset scanning protocol, and the like. In some embodiments, the user may edit the preset scanning protocol at any feasible time after the scanning is finished, when the system is idle, when the protocol is selected, or before the scanning is executed.
Step 420, determining the scanning protocol to be executed that is selected by the user from the candidate list. In some embodiments, step 420 may be performed by the processing device 120, the ultrasound device 200, or the determination module 320.
The scanning protocol to be executed may be a sequence of one or more protocol units that need to be executed for the target object selected by the user, which may be presented in a selected list of an interactive interface (e.g., display 270) of the ultrasound device. The selected list refers to the area of the interactive interface showing the protocol units selected by the user from the candidate list, such as the selected-protocol box in the interactive interfaces shown in figs. 6-10.
In some embodiments, in response to a user's selection instruction for the preset scanning protocol in the candidate list, the corresponding protocol units are displayed in the selected list, and one or more protocol units in the selected list are determined as the scanning protocol to be executed. In some embodiments, a user may select, by clicking, dragging, and the like, the protocol units to be executed from the plurality of protocol units of the preset scanning protocol displayed in the candidate list, as the scanning protocol to be executed. In some embodiments, the position order of the protocol units in the selected list can be adjusted and/or new protocol units can be added in response to the user's editing instruction for the preset scanning protocol and/or the protocol units of the scanning protocol to be executed. That is, the user may edit the protocol units in the selected list: for example, the user may adjust the positions of the protocol units in the selected list, delete selected protocol units, or add new protocol units when protocol selection is finished, when execution of one of the protocol units is finished, when all the selected protocol units have been executed, during execution of the protocol units, and/or during protocol selection. In some embodiments, an added new protocol unit may be the same as a selected protocol unit, for example, with the same scanning position and the same scanning parameters, or with the same scanning position and different scanning parameters.
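The candidate-list/selected-list interaction described above can be sketched as a small model. The class name and methods are illustrative assumptions, not an actual API:

```python
class ScanProtocolLists:
    """Minimal sketch of the candidate and selected lists (assumed structure)."""

    def __init__(self, candidate_numbers):
        self.candidate = list(candidate_numbers)  # protocol numbers shown in the candidate list
        self.selected = []                        # the scanning protocol to be executed

    def select(self, number):
        # Duplicates are allowed: the same unit may be selected more than once,
        # e.g. to scan the same position with different parameters later.
        if number in self.candidate:
            self.selected.append(number)

    def deselect(self, index):
        if 0 <= index < len(self.selected):
            del self.selected[index]

    def move(self, index, new_index):
        # Adjust the position order of a unit in the selected list (e.g. by dragging).
        unit = self.selected.pop(index)
        self.selected.insert(new_index, unit)

lists = ScanProtocolLists(range(1, 12))
for n in (2, 7, 11, 10, 9):      # the example order used later in the document
    lists.select(n)
lists.move(1, 0)                 # drag protocol unit 7 to the front
```

Selecting units copies their numbers into the selected list without removing them from the candidate list, matching the behavior where a unit may be re-added later.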
In some embodiments, the user may edit the scan parameters of the protocol units in the selected list. For example, the user may change the parameters of the scan line deflection angle, scan depth, etc. of one or more protocol units in the selected list. In some embodiments, the processing device may maintain an edit record of protocol unit scan parameters. In some embodiments, the processing device may update the scanning parameters of the corresponding protocol units in the candidate list based on the user's edit record of the scanning parameters of the protocol units in the selected list.
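Propagating parameter edits from the selected list back to the candidate list, as described above, might look like the following sketch (the dict-based parameter model and edit-record shape are assumptions):

```python
def sync_candidate_params(candidate_params, edit_record):
    """Apply the user's parameter edits (kept as an edit record) to the
    matching protocol units in the candidate list."""
    for number, params in edit_record:
        if number in candidate_params:
            candidate_params[number].update(params)
    return candidate_params

# Candidate-list parameters keyed by protocol number (illustrative values).
params = {1: {"scan_depth_cm": 11.0}, 2: {"scan_depth_cm": 9.0}}
# The user changed the scanning depth of unit 2 in the selected list.
sync_candidate_params(params, [(2, {"scan_depth_cm": 10.5})])
```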
In some embodiments, the user may edit the same numbered protocol units in the selected list separately. For example, if there are two or more protocol units numbered 2 in the selected list, the user may adjust the parameter in each protocol unit 2 to different values. In some embodiments, the adjusted protocol unit may be marked. For example, for two protocol units with the same number, when the parameter of one of the protocol units is adjusted, the two protocol units may be marked with different colors or symbols, or the color of the adjusted protocol unit may be changed or the display pattern may be changed. By the mode, the same scanning position of the same organ can be scanned for multiple times by adopting different scanning parameters, so that the scanning flexibility is improved.
For example only, as shown in fig. 14, a user may edit a protocol unit in the selected list through a touch-screen operation, for example, by dragging one of the protocol units in the selected list. After the touch point leaves the interactive interface (i.e., leaves the display icon of the protocol unit), the ultrasound device may determine whether the position of the protocol unit is within the selected list. If it is not, the protocol unit is deleted from the selected list, an editing record for the protocol unit is saved, and the next round of the editing process begins; when the user finishes editing the protocol units in the selected list, the current protocol unit setting process ends. If the position of the protocol unit is within the selected list, the device may further determine whether the position of the protocol unit in the selected list has changed; if it has, the new order of the protocol units is saved along with the editing record for the protocol unit. If the position has not changed, the next round of the editing process begins, and the protocol unit setting process ends when the user finishes editing the protocol units in the selected list. In some embodiments, the user may be asked whether editing of the protocol units in the selected list is complete through a voice prompt, a pop-up window prompt, or the like; the process ends in response to the user completing the editing, and the next round of the editing process begins in response to the editing not being complete.
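The drag-release decision flow described above (delete when dropped outside the selected list, reorder when dropped at a new position, save an edit record either way) might be sketched as follows; the function signature and record encoding are assumptions:

```python
def handle_drag_release(selected, unit_index, drop_in_selected, drop_index, edit_log):
    """Decide what happens when the touch point leaves a dragged unit's icon.

    `selected` is the list of protocol numbers in the selected list;
    `drop_in_selected` says whether the drop landed inside the list;
    `drop_index` is the position it landed at (ignored when outside)."""
    unit = selected[unit_index]
    if not drop_in_selected:
        # Dropped outside the selected list: delete and record the edit.
        del selected[unit_index]
        edit_log.append(("delete", unit))
    elif drop_index != unit_index:
        # Dropped at a new position: save the changed order and the edit.
        selected.pop(unit_index)
        selected.insert(drop_index, unit)
        edit_log.append(("reorder", unit, drop_index))
    # Otherwise the position is unchanged and nothing is recorded.
    return selected

log = []
units = handle_drag_release([2, 7, 11, 10, 9], 1, False, None, log)  # drag unit 7 out
```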
In the ultrasonic scanning flow control method provided in some embodiments of the present description, the preset scanning protocol of a target object is displayed in the candidate list of the interactive interface for the user to select, the one or more protocol units selected by the user are displayed in the selected list, and the user can edit the positions, scanning parameters, number, and the like of the protocol units in the selected list. This provides the user with more standard scanning protocol parameters and improves the flexibility and convenience of editing the scanning protocol to be executed.
In some embodiments, a user may select and edit protocol units through touch-screen operations, mouse operations (e.g., mouse-click selection or mouse dragging), cognition-based operations (e.g., electroencephalogram control, eye-movement tracking control, etc.), gesture operations (e.g., making a corresponding gesture or hand motion-sensing control), voice operations, and the like.
In some embodiments, the processing device may determine one or more protocol units in the selected list as scanning protocols to be performed. For more details on selecting the protocol unit that needs to be executed, refer to fig. 13 and the related description thereof, which are not described herein again.
In some embodiments, the processing device may determine that a scanning protocol is to be performed based on preset information of a user. For example, a doctor may preset protocol units and/or execution sequences of protocol units that must be used in a next scanning and/or protocol units that cannot be used in the next scanning at the end of a previous scanning, and the processing device may display the preset protocol units in a selected list as scanning protocols to be executed or in a recommended list for a user to select according to the preset information of the doctor during the next scanning.
In some embodiments, the processing device may determine that the scanning protocol is to be executed based on an operation instruction of the recommended protocol unit by the user. For example, the processing device may determine the protocol unit selected by the user from the recommendation list as the scanning protocol to be executed, or determine the recommended protocol unit as the scanning protocol to be executed after the user clicks an associated button confirming selection of the recommended protocol unit.
In some embodiments, the processing device may recommend one or more protocol units to be executed based on historical scan data and/or protocol selection records. For example, the processing device 120 may recommend protocol units that may need to be executed for the scan based on a patient's historical scan record, historical scan protocol execution records for similar or identical scan sites, and/or a physician's historical protocol selection record. For another example, if the user performed or selected one of the protocol units multiple times during the last scan, the processing device may recommend the protocol unit during the next scan.
In some embodiments, the processing device may recommend one or more protocol units to be executed based on the patient information. For example, the processing device 120 may recommend the protocol units that may need to be executed by the patient for the scan based on the basic information and lesion information recorded by the patient in the hospital. For another example, the processing device 120 may match out patients similar to the patient through big data analysis based on the age, sex, occupation, and other basic information of the patient to be scanned, and recommend scanning protocols of similar patients. The big data comprises historical scanning protocol data of a plurality of patients, operation, evaluation and the like of doctors in the scanning protocol setting process of the history, scanning protocols set or selected by the doctors in the history, operation, evaluation and the like of the doctors in the scanning process of the history, and/or scanning results of the history and the like.
In some embodiments, the processing device may recommend one or more protocol units based on user ratings/feedback, or user actions on the protocol units. For example, if the user feeds back that a certain protocol unit is unnecessary, the protocol unit may not be recommended any more in the next scanning; if the user feeds back that a certain protocol unit must be executed in each scanning, the protocol unit can be recommended in each scanning. For another example, if the user suspends or terminates the execution process of a certain protocol unit in the middle of the execution of the protocol unit, the protocol unit may not be recommended any more in the next scanning.
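The feedback-driven adjustments described above could be expressed as a small rule over a recommendation set. The verdict encoding below is an illustrative assumption:

```python
def apply_feedback(recommended, feedback):
    """Adjust the recommended protocol units from user feedback/actions:
    'always' pins a unit into the recommendations, while 'unnecessary'
    (explicit feedback) and 'aborted' (the user paused/terminated the
    unit mid-execution) drop it."""
    out = set(recommended)
    for unit, verdict in feedback:
        if verdict == "always":
            out.add(unit)
        elif verdict in ("unnecessary", "aborted"):
            out.discard(unit)
    return sorted(out)

adjusted = apply_feedback(
    [1, 3, 5],
    [(3, "unnecessary"), (7, "always"), (5, "aborted")],
)
```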
By adaptively recommending protocol units that may need to be executed based on data such as the user's historical scanning operations, scanning feedback/evaluations, and protocol selection records, as well as information such as patient data, the convenience and efficiency of scanning can be improved. Analyzing and recommending protocol units that the current patient may need based on big data can provide the user with a more accurate scanning protocol and allow the protocol units that need to be scanned to be located quickly.
In some embodiments, in response to the user being interrupted when selecting a scanning protocol to be executed from the preset scanning protocols, the processing device may load the scanning protocol to be executed selected by the user before the interruption into the selected list when the ultrasound device resumes running, for example, as shown in fig. 14. By saving the selection record of the scanning protocol to be executed by the user and adding the system interrupt protection program, the data security is improved, and the convenience of protocol selection is improved.
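The interrupt-protection behavior described above amounts to persisting the partially built selected list and restoring it on resume. A minimal sketch, assuming a JSON file as the persistence format (the document does not specify one):

```python
import json
import os
import tempfile

def save_selection(path, selected):
    # Persist the in-progress selected list so it survives an interruption.
    with open(path, "w") as f:
        json.dump({"selected": selected}, f)

def restore_selection(path):
    # On resume, reload the selection made before the interruption;
    # an absent file means there is nothing to restore.
    if not os.path.exists(path):
        return []
    with open(path) as f:
        return json.load(f).get("selected", [])

state_file = os.path.join(tempfile.mkdtemp(), "selection.json")
save_selection(state_file, [2, 7, 11])      # user had selected three units
restored = restore_selection(state_file)    # loaded back into the selected list
```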
Step 430, scanning the target object according to the position order of the scanning protocol to be executed in the selected list. In some embodiments, step 430 may be performed by the processing device 120, the ultrasound device 200, or the execution module 330.
In some embodiments, the ultrasound device may automatically perform a scanning of the target object according to a position order of protocol units in the scanning protocol to be performed in the selected list. For example, as shown in fig. 8, the order of the positions of the scanning protocols to be executed in the selected protocols (i.e., the selected list) is protocol unit 2, protocol unit 7, protocol unit 11, protocol unit 10, and protocol unit 9, and the ultrasound apparatus 200 may automatically and sequentially scan the target object according to the order of the 5 protocol units, and according to the positions of the scanning sites and the scanning parameters in each protocol unit. In some embodiments, the next protocol unit in the selected list may be automatically executed in response to an instruction that scanning of the currently scanned protocol unit ends according to the position sequence of the scanning protocol to be executed in the selected list.
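The automatic sequential execution described above can be sketched as a loop over the selected list, advancing when each scan reports completion. `scan_one` stands in for the actual scan routine and is an assumption:

```python
def run_selected(selected, scan_one):
    """Execute protocol units in their position order in the selected list,
    automatically moving to the next unit when the current scan ends."""
    executed = []
    for unit in list(selected):   # iterate over a snapshot of the order
        scan_one(unit)            # runs until the scan-end signal for this unit
        executed.append(unit)
    return executed

# The example order from fig. 8: units 2, 7, 11, 10, 9.
order = run_selected([2, 7, 11, 10, 9], scan_one=lambda u: None)
```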
In some embodiments, the scanning protocol to be performed may provide a scanning reference for a user, who may set parameters of the ultrasound device according to information in each protocol unit to perform a scan. For example, the user may adjust parameters of the ultrasound device according to suggested values of scan line deflection angle, scan depth, probe transmit power, composite parameters, digital gray scale curve, ultrasound echo gain, ultrasound transmit frequency, probe line density, etc. in the protocol unit, and perform a scan based on the adjusted parameters. In some embodiments, the current parameter information of the ultrasound apparatus may be presented in an interactive interface of the ultrasound apparatus, so as to provide a reference for a user to an adjustment process or an ultrasound scanning process of the ultrasound apparatus parameters.
In some embodiments, the selected list may include a "current scanning protocol scan end" button (as shown in figs. 6-10) that the user may click to end the scan of the currently executing protocol unit. In some embodiments, the user may click the "current scanning protocol scan end" button after the system completes the scan of the currently executing protocol unit, or during that scan. In some embodiments, after clicking the "current scanning protocol scan end" button, the user may input feedback on the protocol unit just executed; for example, the user may input feedback such as "this protocol unit is not essential and need not be used in the next scan" or "this protocol unit must be executed in every scan". In some embodiments, as shown in FIG. 12, the system may automatically switch to the scan of the next protocol unit in the selected list after the user clicks the "current scanning protocol scan end" button.
In some embodiments, a free scanning process or a protocol selection process for the target object may be entered in response to a selection instruction for the scanning mode of the target object. For example only, as shown in fig. 12, the ultrasound device may determine whether the selected list is empty before starting scanning or during scanning. When the selected list is empty, the device may ask the user (e.g., a doctor) whether to perform free scanning via a message prompt, a voice prompt, a pop-up window prompt, or the like. If the user selects "yes", the free scanning process is entered, that is, the user manually adjusts the scanning parameters of the ultrasound device according to the actual situation and performs scanning on the target object; if the doctor selects "no", the protocol selection process is entered, that is, the user selects the protocol units to be executed from the candidate list.
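The mode decision from the flow described above might be sketched as follows; the mode names and the `ask_free_scan` prompt callback are illustrative assumptions:

```python
def choose_mode(selected, ask_free_scan):
    """Decide the scanning mode: an empty selected list prompts the user,
    where 'yes' enters free scanning and 'no' returns to protocol
    selection; a non-empty list starts automatic scanning."""
    if not selected:
        return "free_scan" if ask_free_scan() else "protocol_selection"
    return "automatic_scan"
```

The prompt callback abstracts over the message/voice/pop-up prompt mentioned above, so the decision logic stays independent of the prompt mechanism.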
The ultrasonic scanning flow control method provided in some embodiments of the present description supports two scanning modes, namely "user free scanning" (corresponding to free scanning flow) and "system automatic scanning" (corresponding to protocol selection flow), so that flexibility and convenience of scanning are increased, more detailed and diversified scanning data can be obtained, and more accurate diagnosis results can be obtained.
In some embodiments, when the selected list is not empty, the ultrasound device may start to perform scanning on the target object according to the position order of the scanning protocol to be performed in the selected list, and mark the currently performed protocol unit.
In some embodiments, currently executed protocol units, unexecuted protocol units, executed protocol units, and/or the like may be marked in the interactive interface. In some embodiments, the indicia may include words, symbols, letters, graphics, and the like, or any combination thereof. For example, the currently executed protocol unit may be marked on or beside the protocol unit icon of the selected list in text form such as "in scan" or "scanned" or the like. For another example, as shown in fig. 9, the protocol unit 7 may be marked in the icon of the selected protocol unit by a character form such as "…" as the scanning protocol being executed, and the protocol units 2, 5, 8, and 11 may be marked in the icon of the protocol unit by a graphic form such as an oval. In some embodiments, the currently executed protocol unit, the non-executed protocol unit, and the executed protocol unit may be marked by different colors, different icon sizes, or different icon lines, respectively.
In some embodiments, the ultrasound device may store the corresponding scan information in a database (e.g., the storage device 140) after the scan of a scanning protocol to be performed is completed. For example, the ultrasound apparatus 200 may store the scanning information corresponding to a protocol unit into the database (e.g., the storage device 140 or the internal memory 250) in response to the user pressing the "scan protocol end" button (i.e., the end button described in the flowchart of fig. 12), or, in response to the user pressing the "scan procedure end" button, associate the scanning information of each protocol unit executed in the current scan with its protocol unit and store the associated pairs into the database. For example, the information corresponding to a protocol unit may be associated with the target object information and stored in the storage device 140 or the internal memory 250. In some embodiments, the information may include a patient ID (e.g., a patient's identification number, a patient's file number, etc.), a patient name, a visit time, scanned image information, a protocol unit number, etc. (e.g., as in the table shown in fig. 11), or any combination thereof.
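For illustration only, the association of scan information with the target object might be sketched with an in-memory SQLite table; the schema, field names, and sample values below are assumptions for the sketch, not taken from the patent:

```python
import sqlite3

# Hypothetical schema; the columns mirror the kinds of fields named in the
# text (patient ID, patient name, visit time, protocol unit number, image).
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE scan_record (
           patient_id   TEXT,
           patient_name TEXT,
           visit_time   TEXT,
           protocol_no  INTEGER,
           image_path   TEXT
       )"""
)

def save_scan_info(record):
    """Persist the scan information of one executed protocol unit."""
    conn.execute(
        "INSERT INTO scan_record VALUES (:patient_id, :patient_name,"
        " :visit_time, :protocol_no, :image_path)",
        record,
    )
    conn.commit()

save_scan_info({
    "patient_id": "P001", "patient_name": "Zhang San",
    "visit_time": "2021-08-20 09:30", "protocol_no": 7,
    "image_path": "/data/scans/p001_7.dcm",
})
rows = conn.execute(
    "SELECT protocol_no FROM scan_record WHERE patient_id = 'P001'"
).fetchall()
```

Keying each record by patient ID makes it possible to later retrieve all protocol units executed for one target object, as the association described above requires.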
In some embodiments, the system may record information such as the user's execution feedback on each protocol unit (e.g., feedback or evaluation on the protocol execution, input through the interactive interface after the scanning process, or the execution of one protocol unit, is finished), the user's actions on each protocol unit during execution (e.g., pausing or terminating a protocol unit while the system is executing it), and/or the number of times the user selects and/or edits a protocol unit (e.g., the user selecting the same protocol unit multiple times).
In some embodiments, during the scanning execution process, the user may edit the scanning protocol to be executed in the selected list, for example, add a protocol unit (for example, add a certain executed protocol unit for scanning again, or add a protocol unit that is not scanned), delete an unexecuted protocol unit, or adjust the position of the unexecuted protocol unit in the selected list, etc. In some embodiments, when all scanning protocol units in the selected list are executed, the scanning may be ended. In some embodiments, when all scanning protocol units in the selected list are executed, the user may add a new scanning protocol to be executed, as shown in fig. 10.
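The editing operations described above (re-adding an executed unit for rescanning, deleting an unexecuted unit, or repositioning an unexecuted unit) can be sketched as plain list manipulation; the restriction that only unexecuted units may be deleted or moved is one reading of the text above, and all names are illustrative:

```python
selected = [2, 5, 7, 8, 11]   # protocol-unit numbers, in scan order
executed = {2, 5}             # units that have already been scanned

def add_unit(selected, unit):
    # Re-adding an already-executed unit schedules it for scanning again.
    selected.append(unit)

def delete_unit(selected, executed, unit):
    if unit in executed:
        raise ValueError("only unexecuted units may be deleted")
    selected.remove(unit)

def move_unit(selected, executed, unit, new_pos):
    if unit in executed:
        raise ValueError("only unexecuted units may be moved")
    selected.remove(unit)
    selected.insert(new_pos, unit)

add_unit(selected, 2)                 # scan unit 2 a second time
delete_unit(selected, executed, 8)    # drop unexecuted unit 8
move_unit(selected, executed, 11, 2)  # pull unit 11 forward
```

After these three edits the selected list reads [2, 5, 11, 7, 2]: unit 8 is gone, unit 11 has moved forward, and unit 2 is queued again at the end.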
FIG. 13 is an exemplary flow diagram illustrating the determination of a scanning protocol to be performed according to some embodiments of the present description.
In some embodiments, the processing device may determine a scanning protocol to be executed by judging whether a touch point (e.g., in a touch screen operation) or a trigger point (e.g., in a mouse operation) applied to the interactive interface is within the presentation icon of one of the plurality of protocol units in the candidate list, and judging whether that protocol unit is presented in the selected list. For ease of understanding, the selection process of the scanning protocol is described in detail below mainly taking a touch screen operation as an example; it should be understood that, in some embodiments, the scanning protocol to be executed may also be selected through any other feasible operation manner (such as a mouse operation, a motion-sensing operation, a gesture operation, and the like).
In step 1310, touch points of the user and the interactive interface are obtained. In some embodiments, step 1310 may be performed by the processing device 120, the ultrasound device 200, or the acquisition module 310.
A touch point refers to a contact point between the user and the interactive interface. In some embodiments, a touch point may be an area on the interactive interface that the user touches, clicks, or presses with a finger, a capacitive glove, or another touch implement (e.g., a stylus). In some embodiments, there may be one or more touch points. In some embodiments, when there are multiple touch points, the center point of the touch points may be selected as the main touch point.
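A minimal sketch of selecting the main touch point, taking "center point" in the text above to mean the centroid of the touch points (an interpretation, not stated explicitly in the patent):

```python
def main_touch_point(points):
    """Return the centroid of one or more touch points as the main touch point.

    `points` is a list of (x, y) coordinate pairs.
    """
    n = len(points)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / n, sum(ys) / n)

main_touch_point([(10, 20), (30, 40)])  # centroid of two touches
```

With a single touch point, the function simply returns that point's coordinates.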
In step 1320, it is determined whether the touch point is within the display icon of one of the protocol units. In some embodiments, step 1320 may be performed by the processing device 120, the ultrasound device 200, or the determination module 320.
The presentation icon may refer to a graphic used to represent the protocol unit in the interactive interface. In some embodiments, the shape of the presentation icon may include a circle (as shown in fig. 6-10), a square, a triangle, a pentagon, a regular hexagon, etc., or any combination thereof.
In some embodiments, the reference coordinate point and the preset range may be determined according to the shape of the presentation icon of the protocol unit; and judging whether the touch point is in a display icon of one protocol unit in the plurality of protocol units displayed in the list to be selected or not based on the reference coordinate point, the preset range and the coordinates of the touch point.
For example only, when the shape of the presentation icon is a circle, the ultrasound device may acquire the coordinates (x1, y1) of the touch point (or trigger point), the radius r of the presentation icon of the protocol unit, and the coordinates (x2, y2) of its center point (i.e., the reference coordinate point). The distance d between the touch point (or trigger point) and the center point of the presentation icon is then

d = √((x1 − x2)² + (y1 − y2)²),

and in response to d ≤ r, the touch point (or trigger point) is determined to be within the presentation icon of the protocol unit.
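The circular hit test above can be sketched in a few lines; the function name and the sample coordinates are illustrative, not from the patent:

```python
import math

def in_circle_icon(touch, center, r):
    """True if the touch point lies within a circular presentation icon
    of radius r centered at `center` (Euclidean distance test)."""
    d = math.hypot(touch[0] - center[0], touch[1] - center[1])
    return d <= r

in_circle_icon((13, 14), (10, 10), 5)   # distance 5, on the boundary -> inside
in_circle_icon((20, 10), (10, 10), 5)   # distance 10 -> outside
```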
As yet another example, when the shape of the presentation icon is a square, the ultrasound apparatus may acquire the coordinates (x1, y1) of the touch point (or trigger point), the side length S of the presentation icon of the protocol unit, and the coordinates (x3, y3) of its upper left corner point (i.e., the reference coordinate point); in response to x3 < x1 < x3 + S and y3 < y1 < y3 + S (with y increasing downward in screen coordinates), it is determined that the touch point (or trigger point) is within the presentation icon of the protocol unit.
As yet another example, when the shape of the presentation icon is a regular hexagon, the ultrasound apparatus may acquire the side length R of the presentation icon of a protocol unit and define the center point of the protocol unit as the coordinate system origin (0, 0) (i.e., the reference coordinate point); determine the coordinates (x1, y1) of the touch point (or trigger point) relative to that origin, and compute the absolute values x = |x1|, y = |y1|. It is first judged, from the side length R and the absolute values, whether the touch point (or trigger point) is within the circumscribed rectangle of the regular hexagon: for a hexagon with two vertices on the vertical axis, the touch point is within the circumscribed rectangle in response to

y ≤ R and x ≤ (√3/2)·R.

If so, it is further judged whether the inequality

√3·y + x ≤ √3·R

holds, which excludes the four corner regions of the rectangle lying outside the slanted edges of the hexagon; in response to the inequality holding, the touch point (or trigger point) is determined to be within the presentation icon of the protocol unit.
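A hedged sketch of the hexagon test above, assuming a pointy-top orientation (two vertices on the vertical axis) since the original figure is not available; the function name is illustrative:

```python
import math

def in_hexagon_icon(touch, center, R):
    """Hit test for a regular-hexagon icon with side length R, two vertices
    on the vertical axis, centered at `center`. Works on absolute offsets,
    so one quadrant's geometry covers all four by symmetry."""
    x = abs(touch[0] - center[0])
    y = abs(touch[1] - center[1])
    # Step 1: circumscribed-rectangle check (width sqrt(3)*R, height 2*R).
    if x > math.sqrt(3) / 2 * R or y > R:
        return False
    # Step 2: slanted-edge check, excluding the rectangle's corner regions.
    return math.sqrt(3) * y + x <= math.sqrt(3) * R + 1e-9

in_hexagon_icon((0, 0), (0, 0), 2)       # center of the icon -> inside
in_hexagon_icon((1.7, 1.9), (0, 0), 2)   # corner of the rectangle -> outside
```

The small epsilon in step 2 guards against floating-point rounding on the boundary; an integer-only implementation would not need it.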
In some embodiments, when multiple touch points (or trigger points) operate simultaneously, the multiple touch points (or trigger points) may be identified and coordinates of the multiple touch points (or trigger points) may be obtained, and it is sequentially determined whether each touch point (or trigger point) is within a presentation icon of one of the protocol units in the candidate list. In some embodiments, multiple touch points (or trigger points) may simultaneously and respectively select multiple protocol units. For example, for the touch point 1 and the touch point 2, the system may determine that the touch point 1 is in the display icon of the protocol unit 3 and the touch point 2 is in the display icon of the protocol unit 7 according to the coordinates of the touch point 1 and the coordinates of the touch point 2, and further determine that the user selects the protocol unit 3 and the protocol unit 7 at the same time.
It can be understood that the above methods are only examples. In some alternative embodiments, whether the touch point is within the presentation icon of one of the protocol units may be judged in a corresponding manner when the shape of the presentation icon is a heart, a five-pointed star, or another special figure. For example, when the presentation icon is heart-shaped, the system may judge whether the touch point selects one of the protocol units by calculating whether the coordinates of the touch point fall within the heart-shaped area, which is not limited in this specification.
Step 1330, in response to the touch point being within the icon of the protocol unit, determining whether the protocol unit is dragged to the selected list. In some embodiments, step 1330 may be performed by processing device 120, ultrasound device 200, or determination module 320.
When the touch point is within the presentation icon of a protocol unit, it indicates that the user has selected that protocol unit. After determining that one of the protocol units has been touched, the user's further operation on the protocol unit needs to be determined, for example, whether the user wants to view the information content corresponding to the protocol unit, or to drag it to the selected list as a scanning protocol to be executed.
In some embodiments, whether the protocol unit is dragged to the selected list may be determined based on the coordinate position of the protocol unit's presentation icon at the moment the touch point leaves the presentation icon of the selected protocol unit. For example only, the system may obtain the length L, the width H, and the top-left corner coordinates (x4, y4) of the border of the selected list, as well as the coordinates (x5, y5) of the center point of the selected protocol unit when the touch point (or trigger point) leaves its presentation icon; in response to the coordinates (x5, y5) of the center point of the protocol unit being within the border of the selected list, i.e., the coordinates (x5, y5) satisfying x4 < x5 < x4 + L and y4 < y5 < y4 + H, it is determined that the protocol unit is dragged to the selected list.
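The border check above can be sketched as a rectangle-containment test; the function name and sample coordinates are illustrative:

```python
def dropped_in_selected_list(center, list_origin, L, H):
    """True if the protocol unit's center point (x5, y5) at the moment of
    touch release lies inside the selected-list border of length L and
    width H whose top-left corner is (x4, y4)."""
    x5, y5 = center
    x4, y4 = list_origin
    return x4 < x5 < x4 + L and y4 < y5 < y4 + H

dropped_in_selected_list((150, 60), (100, 40), 300, 80)  # inside the border
dropped_in_selected_list((50, 60), (100, 40), 300, 80)   # left of the border
```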
In some embodiments, it may be determined that the user does not perform the operation of selecting the protocol unit in response to the touch point (or the trigger point) not being within the displayed icon of any protocol unit in the to-be-selected list, that is, not selecting any protocol unit.
Step 1340, in response to the protocol unit being dragged to the selected list, presents the protocol unit in the selected list. In some embodiments, step 1340 may be performed by the processing device 120, the ultrasound device 200, or the determination module 320.
In some embodiments, the protocol unit may be presented in the selected list in response to the touch point being off of the presentation icon of the selected protocol unit and the position to which the protocol unit is dragged being within the border of the selected list, i.e., the user wants to drag it to the selected list as the scan protocol to be performed.
In some embodiments, the system may determine that the protocol units shown in the selected list are to be scanned. In some embodiments, the protocol units may be presented in the selected list according to the sequential order in which the protocol units are dragged into the selected list. In some embodiments, after a protocol unit is displayed in the selected list, the protocol unit may still be displayed in the to-be-selected list to meet the requirement of the user for multiple scanning selections.
In some embodiments, in response to the user's touch point leaving the presentation icon of one of the protocol units while the position to which the protocol unit has been dragged is outside the border of the selected list, the protocol unit automatically returns to the to-be-selected list in the form of an animation.
For example only, suppose the user selects protocol unit 4 in the candidate list, the coordinates of its center point before dragging are (x7, y7), and when the touch is released (i.e., the touch point leaves the presentation icon of protocol unit 4) the coordinates of its center point are (x8, y8). Assuming the drag lasts a duration T, i.e., protocol unit 4 moves from the point (x7, y7) to the point (x8, y8) within the time T, the coordinates of the center point of protocol unit 4 at a time point t (0 ≤ t ≤ T) are:

x(t) = x7 + (x8 − x7)·t/T

y(t) = y7 + (y8 − y7)·t/T

According to these expressions, the processing device can erase protocol unit 4 as drawn at the previous time point and redraw it at the current time point t, with the corresponding center point coordinates (x(t), y(t)); this continuous draw-erase-draw process over the time T forms the moving animation of protocol unit 4 in the interactive interface.
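The draw-erase-draw animation above amounts to linear interpolation of the center point between the two endpoints; a minimal sketch (function and parameter names are assumptions):

```python
def drag_path(p_start, p_end, T, steps):
    """Linearly interpolate the icon's center point from p_start to p_end
    over duration T, sampled at `steps` evenly spaced frame times."""
    x7, y7 = p_start
    x8, y8 = p_end
    frames = []
    for i in range(steps + 1):
        t = T * i / steps
        frames.append((x7 + (x8 - x7) * t / T,
                       y7 + (y8 - y7) * t / T))
    return frames

drag_path((0, 0), (10, 20), T=1.0, steps=2)
```

Each successive frame position is what the processing device would draw after erasing the previous one; a non-linear easing function could replace the t/T factor for a smoother return animation.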
In some embodiments, in the whole process of the operation of the ultrasound apparatus, the process of selecting the scanning protocol to be executed from the preset scanning protocols by the user may be interrupted or forced to stop due to power failure, crash, and the like of the apparatus. For example, as shown in fig. 14, after the system is initialized, the ultrasound apparatus may first determine whether the scanning protocol to be executed is forced to be interrupted in the selection process, in response to the interruption, restore the protocol record saved before the interruption, and then perform a protocol selection or editing process.
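A minimal sketch of such interruption recovery, persisting the selected list after each edit and reloading it on restart; the file path, file format, and function names are assumptions, not from the patent:

```python
import json
import os
import tempfile

# Hypothetical location for the saved protocol record.
STATE = os.path.join(tempfile.gettempdir(), "selected_protocols.json")

def save_selection(selected):
    """Persist the selected list so it survives a power failure or crash
    that interrupts the protocol selection process."""
    with open(STATE, "w") as f:
        json.dump(selected, f)

def restore_selection():
    """On startup, reload the selection saved before the interruption,
    or return an empty list if no record exists."""
    if os.path.exists(STATE):
        with open(STATE) as f:
            return json.load(f)
    return []

save_selection([2, 5, 7])
restored = restore_selection()
```

In the flow of fig. 14, `restore_selection` would run after system initialization, and the user would then continue with protocol selection or editing from the restored list.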
According to the ultrasonic scanning flow control method provided in some embodiments of the present description, whether the user has selected a protocol unit is judged in a manner that depends on the shape of the presentation icon, and whether the protocol unit is displayed in the selected list is judged according to the positional relationship between the protocol unit and the selected list, thereby improving the accuracy of the judgment results in the protocol selection process and determining the scanning protocol to be executed more accurately.
It should be noted that the above description of flow 400 and/or flow 1300 is provided for illustrative purposes only, and is not intended to limit the scope of the present description. Various changes and modifications will occur to those skilled in the art based on the description herein. However, such changes and modifications do not depart from the scope of the present specification. In some embodiments, flow 400 and/or flow 1300 may include one or more additional operations or may omit one or more of the operations described above. For example, the flow 400 may include one or more additional operations to determine one or more preset scanning protocols and/or scanning protocols to be performed.
It is to be noted that different embodiments may produce different advantages, and in different embodiments, any one or combination of the above advantages may be produced, or any other advantages may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that, in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in each claim. Indeed, claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Numerals describing the number of components, attributes, etc. are used in some embodiments, it being understood that such numerals used in the description of the embodiments are modified in some instances by the use of the modifier "about", "approximately" or "substantially". Unless otherwise indicated, "about", "approximately" or "substantially" indicates that the number allows a variation of ± 20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameter should take into account the specified significant digits and employ a general digit preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the range are approximations, in the specific examples, such numerical values are set forth as precisely as possible within the scope of the application.
For each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., cited in this specification, the entire contents of each are hereby incorporated by reference into this specification. Except where the application history document does not conform to or conflict with the contents of the present specification, it is to be understood that the application history document, as used herein in the present specification or appended claims, is intended to define the broadest scope of the present specification (whether presently or later in the specification) rather than the broadest scope of the present specification. It is to be understood that the descriptions, definitions and/or uses of terms in the accompanying materials of this specification shall control if they are inconsistent or contrary to the descriptions and/or uses of terms in this specification.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (10)

1. An ultrasonic scanning flow control method is characterized by comprising the following steps:
acquiring a preset scanning protocol of a target object, and displaying the preset scanning protocol in a to-be-selected list of an interactive interface;
determining a scanning protocol to be executed selected from the list to be selected by a user, wherein the scanning protocol to be executed is displayed in the selected list of the interactive interface;
and scanning the target object according to the position sequence of the scanning protocol to be executed in the selected list.
2. The method of claim 1, wherein the preset scanning protocol comprises a plurality of protocol units, and a scanning position corresponding to each protocol unit at least covers a part of the target object.
3. The method of claim 1, wherein the determining the scanning protocol to be executed selected by the user from the candidate list comprises:
responding to a selection instruction of the user for a preset scanning protocol in the to-be-selected list, and displaying a corresponding protocol unit in the preset scanning protocol in the selected list;
and determining the protocol units in the selected list as the scanning protocols to be executed.
4. The method according to claim 3, wherein the presenting, in response to a selection instruction of a preset scanning protocol in the candidate list by the user, a corresponding protocol unit in the preset scanning protocol in the selected list comprises:
acquiring a touch point or a trigger point of the user acting on the interactive interface;
judging whether the touch point or the trigger point is in a display icon of one of the protocol units in the to-be-selected list;
in response to the touch point or trigger point being within the presentation icon of the one of the protocol units,
displaying the one of the protocol units in the selected list when the touch point or the trigger point leaves the presentation icon of the one of the protocol units and the position to which the one of the protocol units is dragged is within a border of the selected list; and/or
controlling the one of the protocol units to automatically return to the to-be-selected list in the form of an animation when the touch point or the trigger point leaves the presentation icon of the one of the protocol units and the position to which the one of the protocol units is dragged is not within the border of the selected list.
5. The method according to claim 4, wherein the determining whether the touch point or the trigger point is within a presentation icon of one of the plurality of protocol units in the candidate list comprises:
determining a reference coordinate point and a preset range according to the shape of the display icon of one protocol unit;
and judging whether the touch point or the trigger point is in the display icon of one protocol unit or not based on the reference coordinate point, the preset range and the coordinates of the touch point or the trigger point.
6. The method of claim 1, further comprising:
and in response to the user being interrupted when selecting the scanning protocol to be executed from the preset scanning protocols, loading the scanning protocol to be executed selected by the user before interruption to the selected list when the ultrasonic equipment resumes running.
7. The method of claim 1, further comprising:
and responding to the editing instruction of the user on the protocol units in the preset scanning protocol and/or the scanning protocol to be executed, and adjusting the position sequence of the protocol units of the scanning protocol to be executed in the selected list and/or adding new protocol units.
8. The method of claim 1, further comprising:
and responding to a selection instruction of the scanning mode of the target object, and entering a free scanning process or a protocol selection process of the target object.
9. An ultrasonic scanning flow control system, the system comprising:
the acquisition module is used for acquiring a preset scanning protocol of the target object and displaying the preset scanning protocol in a to-be-selected list of an interactive interface;
the determining module is used for determining a scanning protocol to be executed selected by a user from the list to be selected, and the scanning protocol to be executed is displayed in the selected list of the interactive interface;
and the execution module is used for executing scanning on the target object according to the position sequence of the scanning protocol to be executed in the selected list.
10. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1-8 when executing the computer program.
CN202110962335.1A 2021-08-20 2021-08-20 Ultrasonic scanning flow control method and system Pending CN114035713A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110962335.1A CN114035713A (en) 2021-08-20 2021-08-20 Ultrasonic scanning flow control method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110962335.1A CN114035713A (en) 2021-08-20 2021-08-20 Ultrasonic scanning flow control method and system

Publications (1)

Publication Number Publication Date
CN114035713A true CN114035713A (en) 2022-02-11

Family

ID=80134339

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110962335.1A Pending CN114035713A (en) 2021-08-20 2021-08-20 Ultrasonic scanning flow control method and system

Country Status (1)

Country Link
CN (1) CN114035713A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD1016844S1 (en) * 2022-05-25 2024-03-05 K-Bio HealthCare, Inc. Display screen or portion thereof with graphical user interface

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1558738A (en) * 2001-11-22 2004-12-29 ��ʽ���綫֥ Ultrasonograph, work flow edition system, and ultrasonograph control method
CN103164142A (en) * 2011-12-16 2013-06-19 联想(北京)有限公司 Method of adjusting screen touch point position of picture-in-picture interface and electronic equipment
CN106992985A (en) * 2017-04-10 2017-07-28 上海联影医疗科技有限公司 A kind of protocol groups generation method and device
CN111493917A (en) * 2020-04-23 2020-08-07 上海联影医疗科技有限公司 Image scanning protocol interaction device


Similar Documents

Publication Publication Date Title
US20180317890A1 (en) Method of sharing information in ultrasound imaging
US9841811B2 (en) Visually directed human-computer interaction for medical applications
US8228347B2 (en) User interface and methods for sonographic display device
US10842466B2 (en) Method of providing information using plurality of displays and ultrasound apparatus therefor
US20160350503A1 (en) Medical image display apparatus and method of providing user interface
EP3653131B1 (en) Ultrasound diagnosis apparatus for determining abnormality of fetal heart, and operating method thereof
US20130174077A1 (en) Medical information display apparatus, method, and program
KR20140039954A (en) Ultrasound apparatus and method for providing information using the ultrasound apparatus
US20150160821A1 (en) Method of arranging medical images and medical apparatus using the same
US20140149910A1 (en) Method of displaying medical image acquisition information and medical image display apparatus
EP2889744A1 (en) Method and apparatus for displaying medical images
CN105493133A (en) Method of sharing information in ultrasound imaging
KR20160044401A (en) Method and ultrasound apparatus for providing information using a plurality of display
JP2015016067A (en) Image display method, apparatus, and program
JP6632248B2 (en) Medical image display device, medical image display system, medical image display method, and program
JP2014178458A (en) Mobile display device for medical images
EP2888993A1 (en) Image display device and program
CN101118574A (en) Systems and methods for rule-based volume rendition and navigation
CN114035713A (en) Ultrasonic scanning flow control method and system
JP6158690B2 (en) Image display device
CN202288335U (en) Ultrasonic diagnostic instrument with touch screen selection preset value
CN107809956A (en) Ultrasonic device and its operating method
CN109223034A (en) Ultrasonic imaging method and supersonic imaging apparatus
KR101194294B1 (en) Ultrasonic waves diagnosis method and apparatus for providing user interface on screen
KR20100014135A (en) Ultrasound system and method of offering preview pages

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination