CN116128888A - Three-dimensional ultrasonic image cutting method and device, ultrasonic equipment and storage medium - Google Patents

Three-dimensional ultrasonic image cutting method and device, ultrasonic equipment and storage medium

Info

Publication number
CN116128888A
Authority
CN
China
Prior art keywords
depth
cutting
depth indication
clipping
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111339868.0A
Other languages
Chinese (zh)
Inventor
陈庭轩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sonoscape Medical Corp
Original Assignee
Sonoscape Medical Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sonoscape Medical Corp filed Critical Sonoscape Medical Corp
Priority to CN202111339868.0A
Publication of CN116128888A
Pending legal-status Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • G06T2207/101363D ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20112Image segmentation details
    • G06T2207/20132Image cropping

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

In this scheme, at least two depth indication lines are marked on a first angle view of a three-dimensional ultrasound image; then, after the clipping surface selected by the user on a second angle view of the three-dimensional ultrasound image is obtained, a clipping body corresponding to each depth indication line is determined in the three-dimensional ultrasound image based on the clipping surface and the depth indicated by that line, and all clipping bodies are displayed simultaneously. Because several clipping bodies are determined in the three-dimensional ultrasound image at once and displayed together, a doctor can conveniently compare the clipping bodies at different clipping depths, locate the required clipping portion quickly and accurately, and avoid repeated trial clipping of the three-dimensional ultrasound image, which improves clipping efficiency. Correspondingly, the three-dimensional ultrasound image clipping device, the ultrasound device and the storage medium provide the same technical effects.

Description

Three-dimensional ultrasonic image cutting method and device, ultrasonic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and apparatus for clipping three-dimensional ultrasound images, an ultrasound device, and a storage medium.
Background
At present, occluding objects, impurities or noise are inevitably present in three-dimensional ultrasound images. Clipping the image so that only its main part is retained makes it easier to present the three-dimensional ultrasound image that the doctor actually needs.
Because a three-dimensional ultrasound image is a volume, the occluding objects, impurities and the like inside it may lie at any position and take any shape, and their exact location in the image cannot be determined in advance. A doctor therefore has to perform trial clipping on the three-dimensional ultrasound image repeatedly until the part to be clipped away is identified.
Disclosure of Invention
In view of the foregoing, an object of the present application is to provide a three-dimensional ultrasound image clipping method, apparatus, ultrasound device and storage medium, so as to improve clipping efficiency of three-dimensional ultrasound images. The specific scheme is as follows:
in order to achieve the above object, in one aspect, the present application provides a three-dimensional ultrasound image cropping method, including:
acquiring a three-dimensional ultrasonic image;
marking at least two depth indication lines on a first angle view of the three-dimensional ultrasonic image;
acquiring a clipping surface selected by a user on a second angle view of the three-dimensional ultrasonic image;
and respectively determining a cutting body corresponding to each depth indication line in the three-dimensional ultrasonic image based on the cutting surface and the depth indicated by each depth indication line, and simultaneously displaying each cutting body.
Preferably, the marking at least two depth indication lines on the first angle view of the three-dimensional ultrasound image includes:
marking at least two depth indication lines on the first angle view according to default clipping depth configuration information or clipping depth configuration information input by a user; wherein the clipping depth configuration information includes: the number of depth indication lines, the spacing between two adjacent depth indication lines and/or the depth indicated by each depth indication line.
Preferably, after marking at least two depth indication lines on the first angle view of the three-dimensional ultrasound image, the method further includes:
and adjusting the depth indicated by the depth indication line according to a depth adjustment instruction of a user on any depth indication line.
Preferably, after marking at least two depth indication lines on the first angle view of the three-dimensional ultrasound image, the method further includes:
and simultaneously adjusting the depths indicated by all the depth indication lines according to the depth adjustment instructions of the user on all the depth indication lines.
Preferably, the second angular view is a front view, and the clipping surface is determined by a user tracing on the front view.
Preferably, the determining, in the three-dimensional ultrasound image, a clipping body corresponding to each depth indication line based on the clipping surface and the depth indicated by each depth indication line includes:
and determining a space region formed by the region indicated by the cutting surface and the depth indicated by the depth indication line in the three-dimensional ultrasonic image as a cutting body corresponding to the depth indication line aiming at each depth indication line.
Preferably, the simultaneously displaying the respective cutting bodies includes:
and simultaneously displaying all the cutting bodies and the first angle view marked with at least two depth indication lines on the same display window.
In yet another aspect, the present application further provides a three-dimensional ultrasound image clipping device, including:
the first acquisition module is used for acquiring a three-dimensional ultrasonic image;
the marking module is used for marking at least two depth indication lines on a first angle view of the three-dimensional ultrasonic image;
the second acquisition module is used for acquiring a clipping surface selected by a user on a second angle view of the three-dimensional ultrasonic image;
and the cutting module is used for respectively determining cutting bodies corresponding to each depth indication line in the three-dimensional ultrasonic image based on the cutting surface and the depth indicated by each depth indication line, and simultaneously displaying each cutting body.
In yet another aspect, the present application also provides an ultrasound device including a processor and a memory; the memory is used for storing a computer program, and the computer program is loaded and executed by the processor to realize the three-dimensional ultrasonic image clipping method.
In yet another aspect, the present application further provides a storage medium having stored therein computer-executable instructions that, when loaded and executed by a processor, implement the foregoing three-dimensional ultrasound image cropping method.
After a three-dimensional ultrasound image is acquired, at least two depth indication lines are marked on a first angle view of the image; then, after the clipping surface selected by the user on a second angle view of the three-dimensional ultrasound image is obtained, a clipping body corresponding to each depth indication line is determined in the three-dimensional ultrasound image based on the clipping surface and the depth indicated by that line, and all clipping bodies are displayed simultaneously. Guided by at least two depth indication lines on the first angle view, several clipping bodies can be determined in the three-dimensional ultrasound image at once and displayed together, so that a doctor can conveniently compare the clipping bodies at different clipping depths, locate the required clipping portion quickly and accurately, and avoid repeated trial clipping of the three-dimensional ultrasound image, which improves clipping efficiency.
Correspondingly, the three-dimensional ultrasound image clipping device, the ultrasound device and the storage medium provide the same technical effects.
Drawings
In order to describe the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. It is apparent that the drawings in the following description show only embodiments of the present application, and that a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of a three-dimensional ultrasound image cropping method provided by the application;
FIG. 2 is a schematic diagram of a depth indication line according to the present application;
FIG. 3 is a schematic view of a clipping surface provided in the present application;
FIG. 4 is a schematic view of a clipping body provided by the present application;
FIG. 5 is a flow chart of another method for clipping three-dimensional ultrasound images provided herein;
fig. 6 is a display effect diagram of the same three-dimensional ultrasound image provided in the present application at different display angles;
FIG. 7 is a schematic view of a clipping effect of a three-dimensional ultrasound image provided by the present application;
FIG. 8 is a schematic view of a clipping flow for clipping bodies with different depths according to the present application;
FIG. 9 is a schematic view of a three-dimensional ultrasound image cropping device provided by the present application;
FIG. 10 is a block diagram of a server provided herein;
fig. 11 is a schematic diagram of a terminal provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application. In addition, in the embodiments of the present application, "first," "second," and the like are used to distinguish similar objects, and are not necessarily used to describe a particular order or sequence.
Because a three-dimensional ultrasound image is a volume, the occluding objects, impurities and the like inside it may lie at any position and take any shape, and their exact location in the image cannot be determined in advance. A doctor therefore has to perform trial clipping on the three-dimensional ultrasound image repeatedly until the part to be clipped away is identified.
In view of the above-mentioned problems existing at present, the present application proposes a three-dimensional ultrasound image clipping scheme, which can determine a plurality of clipping bodies in one three-dimensional ultrasound image at the same time and display the clipping bodies at the same time, so that a doctor can conveniently compare the clipping bodies at different clipping depths, thereby rapidly and accurately positioning a required clipping portion without repeatedly performing trial clipping on the three-dimensional ultrasound image, and thereby improving clipping efficiency.
Referring to fig. 1, fig. 1 is a flowchart of a first three-dimensional ultrasound image clipping method according to an embodiment of the present application. As shown in fig. 1, the three-dimensional ultrasound image cropping method may include the steps of:
s101, acquiring a three-dimensional ultrasonic image.
The three-dimensional ultrasound image in this embodiment may be obtained from ultrasound image frames. Thus, in one embodiment, acquiring a three-dimensional ultrasound image comprises: acquiring ultrasound images during the ultrasound examination and extracting a three-dimensional ultrasound image from them.
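For illustration only, the following is a minimal sketch of assembling a three-dimensional volume from consecutive two-dimensional frames; it assumes the per-frame data are already available as numpy arrays, since this embodiment does not specify how frames are captured or how the volume is reconstructed.

```python
import numpy as np

def build_volume(frames):
    """Stack consecutive two-dimensional ultrasound frames (each an H x W array)
    into a three-dimensional volume of shape (D, H, W). Hypothetical helper;
    the frame-grabbing interface itself is not part of this embodiment."""
    return np.stack(frames, axis=0)
```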
S102, marking at least two depth indication lines on a first angle view of the three-dimensional ultrasonic image.
The first angle view of the three-dimensional ultrasound image may be a side view, such as a left view or a right view. The depth indication lines are marked vertically on the first angle view. As shown in FIG. 2, assuming the three-dimensional ultrasound image is a regular cuboid, several depth indication lines perpendicular to the bottom surface may be marked on its right view, each indicating one depth; the depth indicated by each line is also labelled on the right view, such as -7, 0 and +5 in FIG. 2. The depth referred to here is measured along the width of the cuboid.
It should be noted that, the depth indicated by each depth indication line may be an exact value or a range of values.
S103, acquiring a clipping surface selected by a user on a second angle view of the three-dimensional ultrasonic image.
Because the occluding objects, impurities and the like in the three-dimensional ultrasound image may take any shape, letting the user determine the clipping surface by tracing on the second angle view makes it easy to follow the outline of the actual impurity, so that the resulting clipping body is more accurate. Thus, in one embodiment, the second angle view is a front view and the clipping surface is determined by the user tracing on the front view. Of course, the occluding objects, impurities and the like may also have regular shapes, so draggable regular shapes may be provided so that the user can use them to determine the shape of the clipping surface.
Referring to FIG. 3, the clipping surface selected on the front view of the three-dimensional ultrasound image may be circular as shown in A, or irregular as shown in B. Of course, other shapes are also possible.
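As an illustration of the tracing-based selection, the sketch below rasterises a traced closed outline into a two-dimensional clipping mask on the front view; the function name and the use of matplotlib's Path are assumptions for this example, not part of the disclosed tool.

```python
import numpy as np
from matplotlib.path import Path

def trace_to_mask(trace_xy, height, width):
    """Convert a user-traced closed outline (a sequence of (x, y) points on the
    front view) into a boolean clipping mask the size of the front view.
    Illustrative only; the interactive tracing/dragging itself is not modelled."""
    ys, xs = np.mgrid[0:height, 0:width]
    pixel_centres = np.column_stack([xs.ravel(), ys.ravel()])
    inside = Path(trace_xy).contains_points(pixel_centres)
    return inside.reshape(height, width)
```

A regular shape such as the circle of FIG. 3A can be handled the same way, by sampling points along its boundary and passing them as trace_xy.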
Of course, the second angle view and the first angle view described in this embodiment are determined according to the display viewing angle of the three-dimensional ultrasound image, and if the display viewing angle of the three-dimensional ultrasound image changes, the corresponding second angle view and first angle view naturally change correspondingly. In this embodiment, the display view angle of the three-dimensional ultrasound image may be flexibly selected by the user.
S104, based on the cutting surface and the depth indicated by each depth indication line, respectively determining cutting bodies corresponding to each depth indication line in the three-dimensional ultrasonic image, and simultaneously displaying each cutting body.
Referring to FIG. 4, if the clipping surface is the circle shown in FIG. 3 and the three-dimensional ultrasound image is the cuboid shown in FIG. 2, then the clipping body corresponding to each depth indication line is a cylinder of a different depth (each clipping body in FIG. 4 is viewed from the front of the cuboid, so it appears as a circle).
In one embodiment, determining, based on the clipping surface and the depth indicated by each depth indication line, the clipping body corresponding to each depth indication line in the three-dimensional ultrasound image comprises: for each depth indication line, determining the spatial region formed in the three-dimensional ultrasound image by the region indicated by the clipping surface and the depth indicated by that depth indication line as the clipping body corresponding to that depth indication line.
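A minimal sketch of this spatial region, under the assumption that the volume is stored as a (D, H, W) array with axis 0 running from the front face inwards and that the depth is expressed as a number of voxel slices (neither convention is fixed by this disclosure):

```python
import numpy as np

def clipping_body_for_line(volume, clip_mask, depth):
    """Voxels lying inside the region indicated by the clipping surface
    (clip_mask, an H x W boolean array on the front view) and within `depth`
    slices measured from the front face of `volume` (shape (D, H, W))."""
    body = np.zeros_like(volume)
    region = clip_mask[np.newaxis, :, :]                 # extend the 2-D region along depth
    body[:depth] = np.where(region, volume[:depth], 0)   # keep only voxels inside the region
    return body
```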
It should be noted that, when the clipping bodies are displayed, the first angle view marked with at least two depth indication lines may be displayed together with them, so that the user can adjust the depth of each depth indication line at any time. Thus, in one embodiment, simultaneously displaying the clipping bodies comprises: simultaneously displaying all the clipping bodies and the first angle view marked with at least two depth indication lines in the same display window. Specifically, the user may adjust the depth of a single depth indication line, or adjust the depths indicated by several depth indication lines at the same time; adjusting several lines at the same time means translating them as a whole.
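For the side-by-side display, a schematic sketch using matplotlib subplots in the spirit of the 3×3 layout of FIG. 7 (the panel layout, projection and labels are illustrative choices, not the tool's actual rendering):

```python
import matplotlib.pyplot as plt

def show_clipping_results(side_view, bodies, depths):
    """Show the first angle view (side view with its depth indication lines)
    and every clipping body in one window. Each body is reduced to a
    front-view maximum-intensity projection for display purposes."""
    panels = len(bodies) + 1
    cols = 3
    rows = -(-panels // cols)                            # ceiling division
    fig, axes = plt.subplots(rows, cols, figsize=(3 * cols, 3 * rows))
    axes = axes.ravel()
    axes[0].imshow(side_view, cmap="gray")
    axes[0].set_title("side view + depth lines")
    for i, (body, d) in enumerate(zip(bodies, depths), start=1):
        axes[i].imshow(body.max(axis=0), cmap="gray")    # front-view projection
        axes[i].set_title(f"depth {d}")
    for ax in axes[panels:]:
        ax.axis("off")                                   # hide unused panels
    plt.tight_layout()
    plt.show()
```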
Based on the method provided by this embodiment, a software clipping tool can be designed and implemented and installed on a computer device or an ultrasound workstation, so that a user can clip three-dimensional ultrasound images effectively with it.
It can be seen that in this embodiment, after a three-dimensional ultrasound image is acquired, at least two depth indication lines are marked on a first angle view of the image; then, after the clipping surface selected by the user on a second angle view is obtained, a clipping body corresponding to each depth indication line is determined in the three-dimensional ultrasound image based on the clipping surface and the depth indicated by that line, and all clipping bodies are displayed simultaneously. Guided by at least two depth indication lines on the first angle view, several clipping bodies can be determined in the three-dimensional ultrasound image at once and displayed together, so that a doctor can conveniently compare the clipping bodies at different clipping depths, locate the required clipping portion quickly and accurately, and avoid repeated trial clipping of the three-dimensional ultrasound image, which improves clipping efficiency.
Based on the above embodiments, it should be noted that at least two depth indication lines marked on the first angle view of the three-dimensional ultrasound image may be determined according to default clipping depth configuration information or clipping depth configuration information input by a user.
If a software clipping tool is implemented according to the method provided by this application, then after a three-dimensional ultrasound image is loaded into the tool, at least two depth indication lines can be marked on its first angle view using the tool's default clipping depth configuration information. The user can also configure the clipping depth configuration information in the tool, in which case at least two depth indication lines are marked on the first angle view according to the clipping depth configuration information entered by the user.
In one embodiment, marking at least two depth indication lines on the first angle view of the three-dimensional ultrasound image comprises: marking at least two depth indication lines on the first angle view according to default clipping depth configuration information or clipping depth configuration information input by the user. The clipping depth configuration information comprises: the number of depth indication lines, the spacing between two adjacent depth indication lines, and/or the depth indicated by each depth indication line. The depth indication lines may be arranged equidistantly or non-equidistantly.
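A sketch of how such configuration information might be represented and turned into per-line depths; the field names and the dataclass are illustrative assumptions rather than this disclosure's data model:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ClipDepthConfig:
    """Hypothetical container for clipping depth configuration information."""
    num_lines: int = 8                       # number of depth indication lines
    spacing: float = 5.0                     # spacing between two adjacent lines
    depths: Optional[List[float]] = None     # explicit per-line depths (may be non-equidistant)

def depths_from_config(cfg: ClipDepthConfig, start: float = 0.0) -> List[float]:
    """Derive the depth indicated by each depth indication line, either from an
    explicit per-line list or from equidistant spacing."""
    if cfg.depths is not None:
        return list(cfg.depths)
    return [start + i * cfg.spacing for i in range(cfg.num_lines)]
```

For example, ClipDepthConfig(depths=[-7, 0, 5]) would reproduce the non-equidistant labels of FIG. 2, while the defaults generate equidistant lines.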
The user can adjust the depth of each depth indication line at any time, either for a single depth indication line or for several depth indication lines at once; adjusting several lines at once means translating them as a whole.
In one embodiment, after marking at least two depth indication lines on the first angle view of the three-dimensional ultrasound image, the method further comprises: adjusting the depth indicated by any depth indication line according to the user's depth adjustment instruction on that line.
In one embodiment, after marking at least two depth indication lines on the first angle view of the three-dimensional ultrasound image, the method further comprises: simultaneously adjusting the depths indicated by all the depth indication lines according to the user's depth adjustment instruction on all the depth indication lines.
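Expressed over the list of per-line depths from the previous sketch, the two adjustment modes could look as follows (illustrative helpers, not the tool's API):

```python
from typing import List

def adjust_one_line(depths: List[float], index: int, new_depth: float) -> List[float]:
    """Depth adjustment for a single depth indication line."""
    adjusted = list(depths)
    adjusted[index] = new_depth
    return adjusted

def translate_all_lines(depths: List[float], offset: float) -> List[float]:
    """Simultaneous adjustment: translate all depth indication lines as a whole."""
    return [d + offset for d in depths]
```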
Of course, the user may also add or delete one or more depth indication lines marked on the first angle view.
The clipping method provided by the application is further described below in connection with a software clipping tool implemented by the application.
The software clipping tool implemented by this application allows the user to select the display viewing angle of the three-dimensional ultrasound image and the clipping surface, and to adjust the number of clipping bodies (by adjusting the number of depth indication lines), the spacing between clipping planes (by adjusting the spacing between the depth indication lines), the position of a single clipping plane (by adjusting the depth indicated by the corresponding depth indication line), translate several clipping planes as a whole (by translating the depth indication lines together), and enlarge or shrink a selected clipping body.
Referring to fig. 5, a specific process may include:
Step 1: the user loads the three-dimensional ultrasound image into the software clipping tool;
Step 2: the software clipping tool displays the three-dimensional ultrasound image at different viewing angles so that the user can select one. As shown in FIG. 6, different viewing angles of the same three-dimensional ultrasound image are displayed in a 2×2 split-screen layout; the first three panels of FIG. 6 are two-dimensional views at different viewing angles, and the last is a three-dimensional rendering at the front viewing angle.
Step 3: the user traces on the front view of the selected viewing angle to select a clipping area (i.e., the clipping surface).
Step 4: the software clipping tool takes the viewing angle selected by the user as the front, determines the side view of the three-dimensional ultrasound image, displays clipping indication lines of different depths on the side view according to the default clipping depth configuration information, and labels the depth value indicated by each clipping indication line;
Step 5: the software clipping tool responds to the user's adjustment of the clipping depth configuration information, i.e., it obtains the adjustment parameters and adjusts the corresponding clipping indication lines accordingly.
Step 6: for the same clipping area, the software clipping tool clips the three-dimensional ultrasound image to different depths using the different clipping indication lines and displays the resulting clipping bodies on the same interface of the host computer.
The display effect can be seen in FIG. 7, which shows each clipping body, together with the side view marked with clipping indication lines of different depths, in a 3×3 split-screen layout. The split-screen layout may be selected or configured by the user in the software clipping tool.
The first image, in the upper left corner of FIG. 7, is the unclipped three-dimensional ultrasound image at the right viewing angle; eight clipping indication lines are drawn in it, each indicating the depth position of a clipping plane. The remaining eight windows display the clipping bodies; the number in the upper left corner of each window corresponds to the depth number of a clipping indication line, and the clipping body shown in each window is obtained by clipping at the corresponding clipping plane. The user can select a clipping body and zoom it in or out.
Referring to FIG. 8, the software clipping tool clips the three-dimensional ultrasound image to different depths using different clipping indication lines for the same clipping area as follows: it first generates a two-dimensional clipping template from the clipping area, and then applies the template to each clipping indication line in turn, thereby determining the clipping bodies of the same clipping area at different depths.
For example, clipping body 1 is generated by determining the spatial region formed in the three-dimensional ultrasound image by depth 1 and the two-dimensional clipping template, and clipping along the outline of that spatial region to obtain clipping body 1; correspondingly, clipping body 2 is generated by determining the spatial region formed by depth 2 and the two-dimensional clipping template and clipping along its outline. The other clipping bodies are generated in the same way, as the sketch below illustrates.
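A compact sketch of this template reuse, again assuming a (D, H, W) volume and depths expressed as voxel slice counts measured from the front face (conventions chosen for illustration only):

```python
import numpy as np

def bodies_from_template(volume, template_2d, depths):
    """Generate one clipping body per clipping indication line by reusing a
    single two-dimensional clipping template (boolean H x W mask) for the
    same clipping area on a (D, H, W) volume."""
    region = template_2d[np.newaxis, :, :]
    bodies = []
    for depth in depths:
        body = np.zeros_like(volume)
        body[:int(depth)] = np.where(region, volume[:int(depth)], 0)  # extrude template to this depth
        bodies.append(body)
    return bodies
```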
Therefore, this embodiment can present the clipping results at different clipping depths simultaneously; a doctor can intuitively compare the clipping bodies at different depths and freely fine-tune each of them with the software clipping tool, and thus quickly and accurately arrive at the required clipping depth and clipping result.
Referring to fig. 9, fig. 9 is a schematic diagram of a three-dimensional ultrasound image clipping device according to an embodiment of the present application, including:
a first acquisition module 901, configured to acquire a three-dimensional ultrasound image;
a marking module 902, configured to mark at least two depth indication lines on a first angular view of the three-dimensional ultrasound image;
a second obtaining module 903, configured to obtain a clipping plane selected by a user on a second angular view of the three-dimensional ultrasound image;
the clipping module 904 is configured to determine clipping volumes corresponding to each depth indication line in the three-dimensional ultrasound image based on the clipping surface and the depth indicated by each depth indication line, and display each clipping volume at the same time.
In one embodiment, the marking module is specifically configured to:
marking at least two depth indication lines on the first angle view according to default cutting depth configuration information or cutting depth configuration information input by a user; the clipping depth configuration information comprises: the number of depth indication lines, the spacing between two adjacent depth indication lines and/or the depth indicated by each depth indication line.
In one embodiment, the method further comprises:
the first adjusting module is used for adjusting the depth indicated by any depth indication line according to the depth adjusting instruction of the user on the depth indication line.
In one embodiment, the method further comprises:
and the second adjusting module is used for adjusting the depth indicated by all the depth indication lines simultaneously according to the depth adjusting instructions of the user on all the depth indication lines.
In one embodiment, the second angular view is a front view, and the clipping plane is defined by a user tracing on the front view.
In one embodiment, the clipping module is specifically configured to:
and determining a space region formed by the region indicated by the cutting surface and the depth indicated by the depth indication line in the three-dimensional ultrasonic image as a cutting body corresponding to the depth indication line aiming at each depth indication line.
In one embodiment, the clipping module is specifically configured to:
and simultaneously displaying all the cutting bodies and a first angle view marked with at least two depth indication lines on the same display window.
For the detailed working process of each module and unit in this embodiment, reference may be made to the corresponding content disclosed in the foregoing embodiments, which is not repeated here.
Therefore, the embodiment provides a three-dimensional ultrasonic image clipping device, which can determine a plurality of clipping bodies in a three-dimensional ultrasonic image at the same time and display the clipping bodies at the same time, so that doctors can conveniently compare the clipping bodies at different clipping depths, and the needed clipping parts can be positioned rapidly and accurately without repeatedly carrying out trial clipping on the three-dimensional ultrasonic image, thereby improving clipping efficiency.
Further, an embodiment of the present application also provides an ultrasound device, which may be the server 50 shown in FIG. 10 or the terminal 60 shown in FIG. 11. FIGS. 10 and 11 are each a block diagram of an ultrasound device according to an exemplary embodiment, and their contents should not be construed as limiting the scope of use of the present application.
Fig. 10 is a schematic structural diagram of a server according to an embodiment of the present application. The server 50 may specifically include: at least one processor 51, at least one memory 52, a power supply 53, a communication interface 54, an input output interface 55, and a communication bus 56. Wherein the memory 52 is configured to store a computer program that is loaded and executed by the processor 51 to implement the relevant steps in the three-dimensional ultrasound image cropping disclosed in any of the foregoing embodiments.
In this embodiment, the power supply 53 is configured to provide an operating voltage for each hardware device on the server 50; the communication interface 54 can create a data transmission channel between the server 50 and an external device, and the communication protocol to be followed is any communication protocol applicable to the technical solution of the present application, which is not specifically limited herein; the input/output interface 55 is used for acquiring external input data or outputting external output data, and the specific interface type thereof may be selected according to the specific application needs, which is not limited herein.
The memory 52 may be a carrier for storing resources, such as a read-only memory, a random access memory, a magnetic disk, or an optical disk, and the resources stored thereon include an operating system 521, a computer program 522, and data 523, and the storage may be temporary storage or permanent storage.
The operating system 521 manages and controls the hardware devices on the server 50 and the computer program 522, so that the processor 51 can operate on and process the data 523 in the memory 52; it may be Windows Server, Netware, Unix, Linux, etc. In addition to the computer program that performs the three-dimensional ultrasound image clipping method disclosed in any of the foregoing embodiments, the computer program 522 may further include programs for other specific tasks. Besides the three-dimensional ultrasound image, the data 523 may include data such as update information of an application program and developer information of the application program.
Fig. 11 is a schematic structural diagram of a terminal provided in an embodiment of the present application, and the terminal 60 may specifically include, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, or the like.
Generally, the terminal 60 in this embodiment includes: a processor 61 and a memory 62.
The processor 61 may include one or more processing cores, such as a 4-core or 8-core processor, and may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array) or a PLA (Programmable Logic Array). The processor 61 may also include a main processor and a coprocessor: the main processor, also called CPU (Central Processing Unit), processes data in the awake state; the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 61 may integrate a GPU (Graphics Processing Unit) for rendering and drawing the content to be shown on the display screen. In some embodiments, the processor 61 may also include an AI (Artificial Intelligence) processor for handling machine-learning computations.
The memory 62 may include one or more computer-readable storage media, which may be non-transitory, as well as high-speed random access memory and non-volatile memory such as one or more magnetic disk or flash storage devices. In this embodiment, the memory 62 is at least used to store a computer program 621 which, when loaded and executed by the processor 61, implements the relevant steps of the three-dimensional ultrasound image clipping method performed on the terminal side as disclosed in any of the foregoing embodiments. In addition, the resources stored in the memory 62 may also include an operating system 622 and data 623, and the storage may be transient or permanent. The operating system 622 may include Windows, Unix, Linux, etc. The data 623 may include, but is not limited to, update information of the application and three-dimensional ultrasound images.
In some embodiments, the terminal 60 may further include a display 63, an input-output interface 64, a communication interface 65, a sensor 66, a power supply 67, and a communication bus 68.
Those skilled in the art will appreciate that the structure shown in fig. 11 is not limiting of the terminal 60 and may include more or fewer components than shown.
Further, the embodiment of the application also discloses a storage medium, wherein the storage medium stores computer executable instructions, and when the computer executable instructions are loaded and executed by a processor, the three-dimensional ultrasonic image clipping method disclosed in any embodiment is realized. For specific steps of the method, reference may be made to the corresponding contents disclosed in the foregoing embodiments, and no further description is given here.
It should be noted that the foregoing is merely a preferred embodiment of the present application, and is not intended to limit the present application, but any modification, equivalent replacement, improvement, etc. that comes within the spirit and principles of the present application are included in the scope of protection of the present application.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, so that the same or similar parts between the embodiments are referred to each other. For the device disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points refer to the description of the method section.
The principles and embodiments of the present application are described herein with specific examples, the above examples being provided only to assist in understanding the methods of the present application and their core ideas; meanwhile, as those skilled in the art will have variations in embodiments and application ranges based on the ideas of the present application, the present disclosure should not be construed as limiting the present application in view of the above.

Claims (10)

1. A method for clipping a three-dimensional ultrasound image, comprising:
acquiring a three-dimensional ultrasonic image;
marking at least two depth indication lines on a first angle view of the three-dimensional ultrasonic image;
acquiring a clipping surface selected by a user on a second angle view of the three-dimensional ultrasonic image;
and respectively determining a cutting body corresponding to each depth indication line in the three-dimensional ultrasonic image based on the cutting surface and the depth indicated by each depth indication line, and simultaneously displaying each cutting body.
2. The method of three-dimensional ultrasound image cropping according to claim 1, wherein said marking at least two depth indication lines on a first angular view of the three-dimensional ultrasound image comprises:
marking at least two depth indication lines on the first angle view according to default clipping depth configuration information or clipping depth configuration information input by a user; wherein the clipping depth configuration information includes: the number of depth indication lines, the spacing between two adjacent depth indication lines and/or the depth indicated by each depth indication line.
3. The method of three-dimensional ultrasound image cropping according to claim 1, wherein after marking at least two depth indication lines on the first angular view of the three-dimensional ultrasound image, further comprising:
and adjusting the depth indicated by the depth indication line according to a depth adjustment instruction of a user on any depth indication line.
4. The method of three-dimensional ultrasound image cropping according to claim 1, wherein after marking at least two depth indication lines on the first angular view of the three-dimensional ultrasound image, further comprising:
and simultaneously adjusting the depths indicated by all the depth indication lines according to the depth adjustment instructions of the user on all the depth indication lines.
5. The method of three-dimensional ultrasound image cropping according to claim 1, wherein the second angular view is a front view, the cropping surface being determined by a user tracing over the front view.
6. The method of clipping a three-dimensional ultrasound image according to any one of claims 1 to 5, wherein determining a clipping body corresponding to each depth indication line in the three-dimensional ultrasound image based on the clipping surface and the depth indicated by each depth indication line, respectively, comprises:
and determining a space region formed by the region indicated by the cutting surface and the depth indicated by the depth indication line in the three-dimensional ultrasonic image as a cutting body corresponding to the depth indication line aiming at each depth indication line.
7. The method of clipping a three-dimensional ultrasound image according to any one of claims 1 to 5, wherein the simultaneously displaying the respective clipping bodies includes:
and simultaneously displaying all the cutting bodies and the first angle view marked with at least two depth indication lines on the same display window.
8. A three-dimensional ultrasound image cropping device, comprising:
the first acquisition module is used for acquiring a three-dimensional ultrasonic image;
the marking module is used for marking at least two depth indication lines on a first angle view of the three-dimensional ultrasonic image;
the second acquisition module is used for acquiring a clipping surface selected by a user on a second angle view of the three-dimensional ultrasonic image;
and the cutting module is used for respectively determining cutting bodies corresponding to each depth indication line in the three-dimensional ultrasonic image based on the cutting surface and the depth indicated by each depth indication line, and simultaneously displaying each cutting body.
9. An ultrasound device, comprising a processor and a memory; wherein the memory is for storing a computer program that is loaded and executed by the processor to implement the three-dimensional ultrasound image cropping method of any one of claims 1 to 7.
10. A storage medium having stored therein computer executable instructions which, when loaded and executed by a processor, implement the three-dimensional ultrasound image cropping method of any one of claims 1 to 7.
CN202111339868.0A 2021-11-12 2021-11-12 Three-dimensional ultrasonic image cutting method and device, ultrasonic equipment and storage medium Pending CN116128888A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111339868.0A CN116128888A (en) 2021-11-12 2021-11-12 Three-dimensional ultrasonic image cutting method and device, ultrasonic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111339868.0A CN116128888A (en) 2021-11-12 2021-11-12 Three-dimensional ultrasonic image cutting method and device, ultrasonic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116128888A true CN116128888A (en) 2023-05-16

Family

ID=86294312

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111339868.0A Pending CN116128888A (en) 2021-11-12 2021-11-12 Three-dimensional ultrasonic image cutting method and device, ultrasonic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116128888A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination