CN115811961A - Three-dimensional display method and ultrasonic imaging system


Info

Publication number
CN115811961A
Authority
CN
China
Prior art keywords
ablation
lesion
dimensional model
dimensional
focus
Legal status
Pending
Application number
CN202080102793.1A
Other languages
Chinese (zh)
Inventor
于开欣
丛龙飞
王超
周文兵
Current Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd filed Critical Shenzhen Mindray Bio Medical Electronics Co Ltd
Publication of CN115811961A

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00: Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Otolaryngology (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

A three-dimensional display method and an ultrasound imaging system. The method comprises the following steps: acquiring an ultrasound image acquired for a lesion, and registering the ultrasound image with a three-dimensional image containing the lesion; segmenting the lesion in the three-dimensional image, and displaying a three-dimensional model of the lesion in a three-dimensional display window based on the results of the registration and the segmentation; displaying a three-dimensional model of the ablation focus in the three-dimensional display window according to the registration result and acquired ablation parameters; determining an ablation region according to the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus; and, when the spatial relationship between the two models satisfies a rotation condition, automatically rotating the current view from the current viewing angle to a target viewing angle at which the ablation region meets a preset requirement for ablating the lesion, and displaying the three-dimensional model of the lesion and the three-dimensional model of the ablation focus at the target viewing angle. The system comprises an ultrasound probe (110), a transmit circuit (112), a receive circuit (114), a processor (118) and a display (120). This three-dimensional display scheme can automatically rotate the viewing angle for observing the three-dimensional model of the lesion and the three-dimensional model of the ablation focus, which is convenient for the user.

Description

Three-dimensional display method and ultrasonic imaging system

Technical Field
The present application relates to the field of ultrasound imaging technologies, and in particular, to a three-dimensional display method and an ultrasound imaging system.
Background
Real-time ultrasound-guided percutaneous tumor ablation has the advantages of good efficacy, minimal invasiveness and rapid postoperative recovery, and its role in tumor therapy is increasingly important. Current tumor ablation is mainly guided by two-dimensional ultrasound images: the doctor locates the approximate position of the tumor region in a real-time ultrasound or ultrasound contrast image, roughly estimates the two-dimensional plane containing the tumor's maximum diameter, formulates an ablation plan based on the two-dimensional image, and guides the ablation accordingly.
With the development of new technologies, computer three-dimensional reconstruction software and image processing techniques have been used to reconstruct acquired two-dimensional images in three dimensions and render target structures such as tumors, realizing three-dimensional visualization of those structures. A three-dimensional visualization can show regions that are difficult to display in a two-dimensional image and provides objective anatomical information; it is accurate and vivid, and can intuitively and clearly display the positional relationship between the tumor and surrounding tissue from any angle. Based on real-time ultrasound images and the ablation needle position provided by a positioning device, once the real-time ultrasound image, the ablation needle and the three-dimensional image are registered and mapped into the same space, the three-dimensional display window objectively and accurately shows, as the positioning device moves, the positional relationship between the tumor and the ablation needle and the extent to which the current ablation covers the lesion. This allows a doctor to plan the operation intuitively, optimize the surgical scheme, improve surgical skill and thereby improve surgical safety.
However, during an actual operation, the doctor can only see the ablation situation of the three-dimensional tumor displayed at the current angle and cannot see the ablation images on the back side of the tumor or at other angles. After the tumor has been fully ablated at the current angle or at a certain angle, the doctor must manually rotate the view to find the next suitable ablation angle, which is inconvenient in clinical application.
Disclosure of Invention
A series of concepts in a simplified form are introduced in the summary section, which is described in further detail in the detailed description section. The summary of the invention is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
An embodiment of the present invention provides a three-dimensional display method, including:
acquiring an ultrasound image acquired for a lesion;
registering the ultrasonic image with a pre-acquired three-dimensional image containing the lesion;
segmenting the lesion in a three-dimensional image containing the lesion, and displaying a three-dimensional model of the lesion in a three-dimensional display window based on a result of the registration and a result of the segmentation;
acquiring ablation parameters of an ablation focus, and displaying a three-dimensional model of the ablation focus in the three-dimensional display window according to the registration result and the acquired ablation parameters;
determining an ablation region according to the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus; and
when the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus satisfies a rotation condition, automatically rotating the current view of the two models in the three-dimensional display window from the current viewing angle to a target viewing angle at which the ablation region meets a preset requirement for ablating the lesion, and displaying the three-dimensional model of the lesion and the three-dimensional model of the ablation focus at the target viewing angle.
Another aspect of the embodiments of the present application provides a three-dimensional display method, where the method includes:
displaying two or more than two three-dimensional display windows on a display interface;
displaying a current view in one of the two or more three-dimensional display windows, wherein the current view includes a three-dimensional model of a lesion and a three-dimensional model of an ablation lesion at a current viewing angle;
displaying, in the remaining three-dimensional display windows of the two or more windows, views at other viewing angles, each such view including the three-dimensional model of the lesion and the three-dimensional model of the ablation focus at the corresponding viewing angle.
In another aspect, an embodiment of the present invention provides an ultrasound imaging system, including:
an ultrasonic probe;
a transmit circuit for exciting the ultrasound probe to transmit ultrasound waves to a lesion;
a receive circuit for controlling the ultrasound probe to receive ultrasound echoes returned from the lesion to obtain an ultrasound echo signal;
a processor to:
processing the ultrasonic echo signal to obtain an ultrasonic image;
registering the ultrasonic image with a pre-acquired three-dimensional image containing the lesion;
segmenting the lesion in a three-dimensional image containing the lesion, and displaying a three-dimensional model of the lesion in a three-dimensional display window of a display based on a result of the registration and a result of the segmentation;
acquiring ablation parameters of an ablation focus, and displaying a three-dimensional model of the ablation focus in the three-dimensional display window according to the registration result and the acquired ablation parameters;
determining an ablation region according to the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus;
when a rotation condition is satisfied, rotating the current view of the three-dimensional display window from the current viewing angle to a target viewing angle at which the ablation region meets a preset requirement for ablating the lesion, and displaying the three-dimensional model of the lesion and the three-dimensional model of the ablation focus at the target viewing angle;
the display is used for displaying the three-dimensional model of the lesion and the three-dimensional model of the ablation focus.
This three-dimensional display scheme can automatically rotate the viewing angle for observing the three-dimensional model of the lesion and the three-dimensional model of the ablation focus, greatly facilitating the ablation operation.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
In the drawings:
FIG. 1 shows a schematic block diagram of an ultrasound imaging system according to an embodiment of the invention;
FIG. 2 shows a schematic flow diagram of a three-dimensional display method according to an embodiment of the invention;
FIG. 3 is a schematic diagram illustrating a spatial transformation in a three-dimensional display method according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a three-dimensional display window in a three-dimensional display method according to an embodiment of the present invention;
FIG. 5 is a schematic view illustrating a main window and an auxiliary window in a three-dimensional display method according to an embodiment of the present invention;
FIG. 6 illustrates a schematic view of the interface shown in FIG. 5 after rotating the viewing angle, in accordance with one embodiment of the present invention;
FIG. 7 shows a schematic flow diagram of a three-dimensional display method according to another embodiment of the invention;
FIG. 8 is a schematic diagram illustrating a display interface in a three-dimensional display method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, exemplary embodiments according to the present application will be described in detail below with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the application described in the application without inventive step, shall fall within the scope of protection of the application.
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present application. It will be apparent, however, to one skilled in the art, that the present application may be practiced without one or more of these specific details. In other instances, well-known features of the art have not been described in order to avoid obscuring the present application.
It is to be understood that the present application is capable of implementation in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
In order to provide a thorough understanding of the present application, a detailed structure will be presented in the following description in order to explain the technical solutions presented in the present application. Alternative embodiments of the present application are described in detail below, however, the present application may have other implementations in addition to these detailed descriptions.
Next, an ultrasound imaging system according to an embodiment of the present application is first described with reference to fig. 1, and fig. 1 shows a schematic structural block diagram of an ultrasound imaging system 100 according to an embodiment of the present application.
As shown in fig. 1, the ultrasound imaging system 100 includes an ultrasound probe 110, transmit circuitry 112, receive circuitry 114, a processor 118, and a display 120. Further, the ultrasound imaging system may further include a beam forming circuit 116 and a transmit/receive selection switch 122, and the transmit circuit 112 and the receive circuit 114 may be connected to the ultrasound probe 110 through the transmit/receive selection switch 122.
The ultrasound probe 110 includes an array of a plurality of transducer elements for transmitting ultrasound waves according to electrical signals or converting received ultrasound echoes into electrical signals. The plurality of transducers can be arranged in a row to form a linear array, or can be arranged in a two-dimensional matrix to form an area array, and the plurality of transducers can also form a convex array, a phased array and the like.
The transducers transmit ultrasound waves in response to an excitation electrical signal or convert received ultrasound waves into electrical signals, so each transducer can be used to transmit ultrasound to tissue in the target region and to receive ultrasound echoes returned through the tissue. During ultrasonic measurement, the transmit circuit 112 and the receive circuit 114 control which transducers are used to transmit ultrasound and which to receive it, or control the transducers to transmit ultrasound waves and receive echoes in time slots. All transducers participating in transmission can be excited by the electrical signal simultaneously, so that they transmit ultrasound at the same time; alternatively, they can be excited by several electrical signals at certain time intervals, so that ultrasound is transmitted continuously at those intervals.
The ultrasonic probe 110 is provided with a positioning sensor, and when the ultrasonic probe 110 moves, the specific position of the current ultrasonic sector can be known according to the coordinate change of the positioning sensor. In some embodiments, the ultrasound probe 110 is provided with a puncture frame for fixing the ablation needle during the ablation operation, and the angle and position of the ablation needle can be known based on the angle of the puncture frame and the depth of the ablation needle. In some embodiments, a positioning sensor (Vtrax) is disposed on the ablation needle, so that when the ablation needle moves, the angle and position of the ablation needle can be known according to the coordinate change of the positioning sensor.
During ultrasound imaging, the transmit circuit 112 sends delay-focused transmit pulses through the transmit/receive selection switch 122 to the ultrasound probe 110. Excited by the transmit pulses, the probe transmits an ultrasound beam to the target region of the object under examination, receives, after a certain delay, ultrasound echoes carrying tissue information reflected from the target region, and converts the echoes back into electrical signals. The receive circuit 114 receives these electrical signals, obtains ultrasound echo signals, and sends them to the beamforming circuit 116, which performs focusing delay, weighting and channel summation on the echo data before sending it to the processor 118. The processor 118 performs signal detection, signal enhancement, data conversion, logarithmic compression and the like on the data to form an ultrasound image. The ultrasound images obtained by the processor 118 may be displayed on the display 120 or stored in memory.
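As a concrete illustration of the focusing delay, weighting and channel summation mentioned above, the following is a minimal delay-and-sum beamforming sketch on simulated channel data; the channel count, delays, apodization window and random data are all illustrative assumptions, not the system's actual beamformer.

```python
# Hedged sketch of delay-and-sum beamforming; all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_channels, n_samples = 32, 1024
channel_data = rng.normal(size=(n_channels, n_samples))  # stand-in RF channel data
delays = rng.integers(0, 8, size=n_channels)             # per-channel focusing delays (samples)
weights = np.hanning(n_channels)                         # apodization weights

# Align channels by their focusing delays, weight, then sum across channels.
aligned = np.stack([np.roll(channel_data[c], -int(delays[c])) for c in range(n_channels)])
beamformed = (weights[:, None] * aligned).sum(axis=0)
```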
Alternatively, processor 118 may be implemented in software, hardware, firmware, or any combination thereof, and may use single or multiple application-specific integrated circuits (ASICs), single or multiple general-purpose integrated circuits, single or multiple microprocessors, single or multiple programmable logic devices (PLDs), any combination thereof, or other suitable circuits or devices. The processor 118 may also control other components of the ultrasound imaging system 100 to perform the respective steps of the methods in the various embodiments herein.
The display 120 is connected to the processor 118, and the display 120 may be a touch screen, a liquid crystal display, or the like; or the display 120 may be a separate display device such as a liquid crystal display, a television, etc. that is separate from the ultrasound imaging system 100; or the display 120 may be a display screen of an electronic device such as a smartphone, a tablet computer, and so on. The number of the display 120 may be one or more. For example, the display 120 may include a main screen mainly for displaying ultrasound images and a touch screen mainly for human-computer interaction.
The display 120 may display the ultrasound image obtained by the processor 118. In addition, the display 120 may also provide a graphical interface for human-computer interaction for the user while displaying the ultrasound image, and one or more controlled objects are set on the graphical interface, so that the user may be provided with a human-computer interaction device to input an operation instruction to control the controlled objects, thereby performing a corresponding control operation. For example, an icon is displayed on the graphical interface, and the icon can be operated by the man-machine interaction device to execute a specific function, such as rotating the view angle of the current view.
Optionally, the ultrasound imaging system 100 may further include a human-computer interaction device other than the display 120, which is connected to the processor 118, for example, the processor 118 may be connected to the human-computer interaction device through an external input/output port, which may be a wireless communication module, a wired communication module, or a combination thereof. The external input/output port may also be implemented based on USB, bus protocols such as CAN, and/or wired network protocols, etc.
The human-computer interaction device may include an input device for detecting input information of a user, for example, control instructions for the transmission/reception timing of the ultrasonic waves, operation input instructions for drawing points, lines, frames, or the like on the ultrasonic images, or other instruction types. The input device may include one or more of a keyboard, mouse, scroll wheel, trackball, mobile input device (such as a mobile device with a touch screen display, cell phone, etc.), multi-function knob, and the like. The human interaction means may also include an output device such as a printer.
The ultrasound imaging system 100 may also include a memory for storing instructions executed by the processor, storing received ultrasound echoes, storing ultrasound images, and so forth. The memory may be a flash memory card, solid state memory, hard disk, etc. Which may be volatile memory and/or non-volatile memory, removable memory and/or non-removable memory, etc.
It should be understood that the components included in the ultrasound imaging system 100 shown in fig. 1 are merely illustrative and that more or fewer components may be included. This is not a limitation of the present application.
Next, a three-dimensional display method according to an embodiment of the present application will be described with reference to fig. 2. Fig. 2 is a schematic flow chart of a three-dimensional display method 200 according to an embodiment of the present application. The three-dimensional display method 200 of the embodiment of the application can be used for surgical planning of percutaneous needle ablation surgery.
As shown in fig. 2, a three-dimensional display method 200 according to an embodiment of the present application includes the following steps:
step S210, an ultrasound image acquired for the lesion is acquired.
Here, the lesion is a lesion area at a target site of the object under examination. The object may be a patient requiring an ablation operation; the target site may be the liver, in which case the lesion area is a liver tumor region, or another part of the human body such as the prostate, thyroid or breast, in which case the lesion area is a lesion of that part. Hereinafter, the three-dimensional display scheme of the present application is described mainly with the liver as the target site, but it should be understood that this is only exemplary and the scheme may be applied to any other site.
Exemplarily, in conjunction with fig. 1, in step S210 the transmit circuit 112 may, at fixed times, excite the ultrasound probe 110 through the transmit/receive selection switch 122 to transmit ultrasound to the target site of the object, and the receive circuit 114 receives, via the ultrasound probe 110, the ultrasound echo returned from the target site and converts it into an ultrasound echo signal. The beamforming circuit 116 may then process the signal, and the beamformed ultrasound echo data is fed into the processor 118 for the associated processing to obtain an ultrasound image.
According to different imaging modes required by a user, the processor 118 may perform different processing on the ultrasound echo signal to obtain ultrasound data in different modes, and then perform processing such as logarithmic compression, dynamic range adjustment, digital scan conversion, and the like to form ultrasound images in different modes, for example, two-dimensional ultrasound images including a B image, a C image, a D image, and the like.
In practical application, an operator scans the target site of the object with the ultrasound probe; when the anatomical structure of the lesion appears in the scanned image, the image can be frozen, yielding an ultrasound image acquired for the lesion.
Step S220, registering the ultrasound image with a pre-acquired three-dimensional image containing the lesion.
By way of example, the operator may import the three-dimensional image containing the lesion into the ultrasound imaging system before starting the ultrasound measurement, including but not limited to importing it via a storage medium such as a USB disk or optical disc, or via network transmission.
The three-dimensional image containing the lesion may be acquired by a medical imaging device such as a computed tomography (CT) device, a magnetic resonance imaging (MRI) device, a positron emission tomography (PET) device, a digital X-ray imaging device, an ultrasound device, a digital subtraction angiography (DSA) device, or an optical imaging device. Since three-dimensional reconstruction takes a long time, the three-dimensional image containing the lesion is acquired before the operation. Registering this preoperative three-dimensional image with the ultrasound image makes it possible to exploit the spatial information of the three-dimensional image while retaining the real-time nature of the ultrasound image.
Registering the ultrasound image and the three-dimensional image means finding the spatial transformation between them so that corresponding points in the two images are brought into one-to-one geometric correspondence. Registration may be rigid or non-rigid.
In one embodiment, the positioning sensor fixed on the ultrasonic probe can continuously provide position information along with the movement of the ultrasonic probe, the 6-degree-of-freedom space position of the ultrasonic probe can be obtained through the magnetic positioning controller, and the ultrasonic image and the three-dimensional image can be subjected to registration fusion by utilizing the image information and the magnetic positioning information. The processor can be connected to a positioning sensor arranged on the ultrasonic probe in a wired or wireless mode to acquire probe position information. The positioning sensor can adopt any type of structure or principle such as an optical positioning sensor or a magnetic field positioning sensor to position the ultrasonic probe.
The spatial transformation relationship between the ultrasound image and the three-dimensional image is shown in fig. 3 and can be written as:

$$X_{sec} = P \cdot R_{probe} \cdot A \cdot X_{us}$$

where $X_{us}$ is the coordinate of a point in ultrasound image space and $X_{sec}$ is the coordinate of the corresponding point in three-dimensional image space; $A$ is the transformation matrix from ultrasound image space (coordinates $X_{us}, Y_{us}, Z_{us}$) to positioning sensor space (coordinates $X_{sensor}, Y_{sensor}, Z_{sensor}$); $R_{probe}$ is the transformation matrix from positioning sensor space to world coordinate space (coordinates $X_{MG}, Y_{MG}, Z_{MG}$); and $P$ is the transformation matrix from the world coordinate system to three-dimensional image space. During ultrasound measurement the positioning sensor is fixed on the ultrasound probe, so as long as the probe model is unchanged, $A$ is fixed and can be determined by calibration before registration. $R_{probe}$ is read directly from the magnetic positioning controller and changes continuously as the probe moves. $P$ is calculated from the registration result: if the result of registering ultrasound space to three-dimensional image space is $M$, then

$$P = M \cdot A^{-1} \cdot R_{probe}^{-1}, \qquad M = P \cdot R_{probe} \cdot A$$
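To make this coordinate chain concrete, the following numpy sketch builds illustrative 4x4 homogeneous versions of $A$, $R_{probe}$ and $M$ (all numeric values, and the helper name `homogeneous`, are made-up stand-ins, not calibration or registration output) and verifies that $P = M \cdot A^{-1} \cdot R_{probe}^{-1}$ makes the chain agree with $X_{sec} = M \cdot X_{us}$.

```python
# Hedged sketch of the registration coordinate chain; values are illustrative.
import numpy as np

def homogeneous(rot_z_deg: float, translation) -> np.ndarray:
    """Build a 4x4 rigid transform: rotation about the z-axis plus translation."""
    theta = np.deg2rad(rot_z_deg)
    T = np.eye(4)
    T[0, 0], T[0, 1] = np.cos(theta), -np.sin(theta)
    T[1, 0], T[1, 1] = np.sin(theta), np.cos(theta)
    T[:3, 3] = translation
    return T

A = homogeneous(5.0, [0.0, 0.0, 10.0])          # ultrasound image -> sensor space (calibration)
R_probe = homogeneous(30.0, [50.0, 20.0, 0.0])  # sensor space -> world (magnetic controller)
M = homogeneous(42.0, [12.0, -8.0, 3.0])        # ultrasound -> 3D image (registration result)

# P follows from the registration result: P = M · A^-1 · R_probe^-1
P = M @ np.linalg.inv(A) @ np.linalg.inv(R_probe)

X_us = np.array([10.0, 25.0, 0.0, 1.0])         # a point in ultrasound space (homogeneous)
X_sec = P @ R_probe @ A @ X_us                  # X_sec = P · R_probe · A · X_us
assert np.allclose(X_sec, M @ X_us)             # consistent with X_sec = M · X_us
print(X_sec[:3])
```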
the registration method adopted by the embodiment of the application may include automatic registration, interactive registration, manual registration, or any combination of the above three methods. Registration may include anatomical or geometric based registration, pixel intensity correlation based registration, external localization marker based registration, and the like. Registration may also include any other suitable registration means.
In one embodiment, registering the ultrasound image with the three-dimensional image specifically includes: aligning the ultrasound image with the corresponding slice in the three-dimensional image, and determining the coordinate transformation matrix from ultrasound image space to three-dimensional image space from the coordinates of points in the ultrasound image and the coordinates of the corresponding points in the three-dimensional image. The alignment may be performed manually by the user, i.e. a manual alignment operation by the user is received to align the ultrasound image with the corresponding slice in the three-dimensional image.
In another embodiment, the same tissue may be identified in the ultrasound image and the three-dimensional image for automatic alignment. When the target site is the liver, the identified tissue is, for example, a blood vessel or the liver capsule. After aligning the ultrasound image with the three-dimensional image, a spatial transformation matrix may be calculated from the coordinates of the coincident points.
In other embodiments, feature points may first be determined in the ultrasound image and the three-dimensional image. Feature points generally have properties such as translational invariance, rotational invariance, scale invariance, and insensitivity to illumination and modality; which properties they have is determined by the feature-point extraction method. Features of the feature points are then extracted, for example from neighborhood gradient histograms, neighborhood autocorrelation or gray scale. Finally, the feature points of the ultrasound image are matched with the feature points of the three-dimensional image, and the spatial transformation matrix is calculated from the matched feature points.
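Where matched feature points are available in both images, one standard way to compute a rigid spatial transformation from them is the least-squares Kabsch/Procrustes solution sketched below. The text does not name a specific solver, so this is an assumed, illustrative choice rather than the patented method.

```python
# Hedged sketch: rigid transform from matched feature points (Kabsch solution).
import numpy as np

def rigid_from_matches(src: np.ndarray, dst: np.ndarray):
    """Least-squares R, t with dst ≈ R @ src + t; src, dst are (N, 3) matched points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    return R, dst_c - R @ src_c

# Toy self-check: recover a known rotation and translation from exact matches.
rng = np.random.default_rng(0)
pts_us = rng.uniform(0.0, 100.0, size=(20, 3))                   # feature points, ultrasound space
theta = np.deg2rad(25.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 12.0])
R, t = rigid_from_matches(pts_us, pts_us @ R_true.T + t_true)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```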
In addition, the position of an external marker can be identified in the three-dimensional image, and its position in ultrasound image space determined by magnetic navigation, so as to perform automatic alignment. The external markers are, for example, one or more metal markers placed on the patient's body surface, which form distinct bright spots in the three-dimensional image from which the marker positions can be obtained.
When the three-dimensional image of the lesion area is a three-dimensional ultrasound image that carries position information, automatic registration can be performed based on the position information carried by the ultrasound image and the three-dimensional ultrasound image. The three-dimensional image can be acquired by a volume probe, reconstructed from a convex-array or linear-array probe with magnetic navigation equipment using freehand three-dimensional ultrasound reconstruction, or scanned by an area-array probe. A three-dimensional ultrasound image reconstructed from magnetic navigation position information can be obtained by freehand-scanning, in the field, a sequence of ultrasound frames carrying positioning information; since the position information is recorded during scanning, the P matrix described herein can be obtained automatically.
In one embodiment, for ablation of abdominal soft tissues such as the liver and lungs, the patient's respiratory motion shifts the tissue and lesion positions, so a respiration correction function is introduced into the registration process. As shown in fig. 3, $T(t)$ is the spatial mapping added for respiration correction and varies with time; the spatial transformation relationship between the ultrasound image and the three-dimensional image then becomes:

$$X_{sec} = T(t) \cdot P \cdot R_{probe} \cdot A \cdot X_{us}$$
in addition, the position deviation caused by the breathing motion can be corrected by adopting a mode of enabling the patient to breathe stably and the like.
In an embodiment, the ultrasound image and the three-dimensional image may be fused according to the correspondence obtained in step S220, and the fused image displayed in a fusion display window of the display. Specifically, the ultrasound image coordinate system can be mapped to the three-dimensional image coordinate system through the registration matrix; since the positioning sensor is mounted on the ultrasound probe, when the probe moves, the specific positional relationship between the current ultrasound sector and the lesion in the three-dimensional image coordinate system can be known from the coordinate changes of the sensor.
Step S230, segmenting the lesion in the three-dimensional image containing the lesion, and displaying a three-dimensional model of the lesion in a three-dimensional display window based on the result of the registration and the result of the segmentation. For example, a three-dimensional display window is provided on a display interface of the display for displaying the three-dimensional model of the lesion and a three-dimensional model of the ablation focus to be described later.
As an example, the three-dimensional contour of the lesion is first segmented in the three-dimensional image. In embodiments of the present application, any suitable method may be used, including but not limited to automatic, manual or interactive segmentation. Illustratively, automatic segmentation may employ one or more of a random walk model, region growing, graph cut, pattern recognition, Markov random fields, adaptive thresholding, and the like. In manual segmentation, the user outlines the lesion edge on several two-dimensional slices of the three-dimensional data and the contour is interpolated between layers, or the user outlines the edge on every slice and the three-dimensional contour is then generated from the two-dimensional edges. Interactive segmentation adds user interaction to the segmentation process as algorithm input, so that objects with high-level semantics in the image can be extracted completely. For example, a preliminary segmentation range may be selected with a user-drawn box, within which the three-dimensional contour of the lesion is then segmented automatically. As another example, the user may draw some points or lines in the preliminary segmentation range; an interactive segmentation algorithm takes these as input, automatically builds a weighted graph of the similarity between each pixel and the foreground or background, and separates foreground from background by solving the minimum cut, thereby determining the three-dimensional contour of the lesion.
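Of the automatic options listed above, region growing is the simplest to show concretely; the following is a hedged 3D region-growing sketch from a single seed. The 6-connectivity, the intensity tolerance and the toy volume are illustrative assumptions.

```python
# Minimal 3D region growing over 6-connected voxels; parameters illustrative.
import numpy as np
from collections import deque

def region_grow_3d(volume: np.ndarray, seed, tol: float = 10.0) -> np.ndarray:
    """Grow a binary mask from `seed`, accepting voxels within `tol` of the seed intensity."""
    mask = np.zeros(volume.shape, dtype=bool)
    seed = tuple(seed)
    ref = float(volume[seed])
    mask[seed] = True
    queue = deque([seed])
    neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in neighbors:
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < volume.shape[i] for i in range(3)) and not mask[n]:
                if abs(float(volume[n]) - ref) <= tol:
                    mask[n] = True
                    queue.append(n)
    return mask

vol = np.zeros((32, 32, 32))
vol[10:20, 10:20, 10:20] = 100.0          # toy "lesion" block
lesion_mask = region_grow_3d(vol, seed=(15, 15, 15), tol=10.0)
```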
Then, surface reconstruction is performed according to the three-dimensional contour obtained by segmentation to generate a three-dimensional model of the lesion, such as the three-dimensional model 401 of the lesion shown in fig. 4.
A surface rendering method can be used: the surface of the three-dimensional target structure is first reconstructed from the three-dimensional data, i.e. the object surface is reconstructed from the segmentation result and contour lines, and a realistic three-dimensional solid is then generated using a suitable illumination model and texture mapping. It will be appreciated that, since the image shown by the display is still a two-dimensional plane, surface rendering is in effect a realistic projection of the three-dimensional object onto that plane, much like photographing the object from a point: when the viewing angle is at that point, the display shows the object's appearance as seen from it.
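As a concrete example of the surface-reconstruction step, a triangle mesh can be extracted from the binary segmentation with marching cubes, e.g. via scikit-image; this is a sketch of one possible pipeline step under that assumption, not the patent's specific renderer.

```python
# Extract a triangle mesh from a toy spherical "lesion" mask with marching cubes.
import numpy as np
from skimage import measure

zz, yy, xx = np.mgrid[:64, :64, :64]
mask = (((zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2) < 15 ** 2).astype(float)

# verts: (V, 3) vertex coordinates; faces: (F, 3) triangle vertex indices.
verts, faces, normals, values = measure.marching_cubes(mask, level=0.5)
```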
Then, a three-dimensional model of the lesion is displayed at a certain position (e.g., the middle) of the three-dimensional display window according to the coordinates of the lesion's three-dimensional contour in the three-dimensional image and the registration relationship between the ultrasound image and the three-dimensional image obtained in step S220; the ultrasound image is displayed in the three-dimensional display window, and the three-dimensional model of the lesion is displayed at the lesion position in the ultrasound image based on that registration relationship. The positional relationship between the reconstructed three-dimensional model of the lesion and the real-time ultrasound image reflects the lesion's position, size and geometry and its relationship with surrounding tissue. Specifically, after the coordinate $X_{sec\_tumor}$ of the lesion in the three-dimensional image is determined, the coordinate $X_{us\_tumor}$ of the lesion position in the real-time ultrasound image is calculated from the registration matrix between the three-dimensional image and the ultrasound image, and the three-dimensional model of the lesion is displayed, superimposed, at the corresponding position in the three-dimensional display window, where

$$X_{us\_tumor} = M^{-1} \cdot X_{sec\_tumor} = (P \cdot R_{probe} \cdot A)^{-1} \cdot X_{sec\_tumor}$$
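The overlay step amounts to applying $M^{-1}$ to lesion coordinates, exactly as the formula states; below is a short hedged helper (function and variable names are assumptions for illustration).

```python
# Map lesion coordinates from 3D image space into the live ultrasound image
# space for overlay, per X_us_tumor = M^-1 · X_sec_tumor.
import numpy as np

def contour_to_ultrasound(contour_sec: np.ndarray, M: np.ndarray) -> np.ndarray:
    """contour_sec: (N, 4) homogeneous points in 3D image space; returns (N, 4) in ultrasound space."""
    return (np.linalg.inv(M) @ contour_sec.T).T
```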
step S240, acquiring ablation parameters of the ablation focus, and displaying a three-dimensional model of the ablation focus at a corresponding position in the three-dimensional display window according to the acquired ablation parameters and the registration result acquired in step S220, where the corresponding position is the position of the ablation focus determined according to the registration result and the acquired ablation parameters. The embodiment of the present application does not limit the execution sequence of step S230 and step S240. Through the two steps, the three-dimensional model of the focus and the three-dimensional model of the ablation focus can be displayed in the same image space, so that the position relation of the focus and the ablation focus in the same image space can be conveniently displayed for a user. According to the registration result, the image space of the three-dimensional display window can be the image space corresponding to the three-dimensional image, can be the image space corresponding to the ultrasonic image, and can be any other image space as long as the three-dimensional model of the lesion and the three-dimensional model of the ablation lesion are ensured to be displayed in the same image space. The example of the present application will be described by taking an example in which two three-dimensional models are displayed in an image space of a three-dimensional image.
The three-dimensional model of the ablation focus includes a single-needle ablation model or a multi-needle combined ablation model. The single-needle ablation model is the three-dimensional model of the ablation focus displayed in the three-dimensional display window for a single ablation needle inserted into the lesion, based on the registration result and the acquired ablation parameters. The multi-needle combined ablation model is the three-dimensional model of the ablation focus displayed in the window for at least two ablation needles inserted into the lesion sequentially or simultaneously, based on the registration result and the acquired ablation parameters. For example, when the ablation focus is ellipsoidal, the single-needle ablation model displayed in the window is a single ellipsoid, and the multi-needle combined ablation model comprises a plurality of ellipsoids. As an example, the three-dimensional model of the ablation focus may be displayed in a different color from the three-dimensional model of the lesion so that the two are easy to distinguish. In some embodiments, both models may be displayed as semi-transparent graphics to facilitate viewing their overlapping region. Fig. 4 illustrates a three-dimensional model 402 of an ablation focus in one embodiment.
From the three-dimensional model of the lesion and the three-dimensional model of the ablation focus displayed on the display interface as described above, the operator can intuitively determine the ablation region, i.e. the overlapping region of the two models. As an example, the overlapping region may be displayed in a color different from both models.
Illustratively, the coordinates of the ablation focus center in the ultrasound image can be determined first, and its coordinates in the three-dimensional image obtained through the coordinate transformation matrix from ultrasound image space to three-dimensional image space; the three-dimensional model of the ablation focus is then rendered in the image space of the three-dimensional image according to the center coordinates and the size of the ablation focus. When a puncture frame is mounted on the ultrasound probe, the coordinates of the ablation focus center in the ultrasound image can be determined from the puncture frame angle and the depth of the ablation path. When the three-dimensional model of the ablation focus is a multi-needle combined ablation model, center coordinates may be set separately for each single-needle ablation model.
Besides the coordinates of the ablation focus center, the ultrasound imaging system needs to acquire ablation parameters set by the operator: for example, the power and continuous ablation time of the ablation needle, from which the size of the needle's ablation range is obtained, so as to ensure that the ablation region contains the whole three-dimensional model of the lesion plus its safety boundary. The safety boundary is the margin, generally required during ablation, that covers the lesion edge and extends outward a certain distance to ensure complete ablation of the entire lesion.
In this step, the operator may input a given power and ablation duration for the ablation procedure, and the extent of the ablation region is obtained from these operating parameters. The operator can also set the required extent of the ablation region, i.e. a preset ablation range, and corresponding operating parameters such as power and ablation duration are selected according to that range. Since the ablation range of an ablation needle is generally an ellipsoid, when setting the extent of the ablation region only the lengths of the major and minor axes of the ellipsoid need be set. It should be noted that the shape of the ablation focus is not limited to an ellipsoid and may also be spherical or cylindrical; the operator may set the shape of the ablation focus according to the ablation needle actually used and set different parameters for different shapes.
The three-dimensional model of the ablation focus may also be rendered as a surface. Assuming the ablation focus is ellipsoidal, the operator can set its major diameter, minor diameter, tip distance, path depth and so on according to the model of the ablation needle actually used, and the system performs surface rendering of the ablation focus at the coordinate origin using these parameters. The tip distance is the distance from the ablation needle's heat source to the needle tip, and the center point of the ablation focus is located at the needle's path depth from the needle tip. If the needle is inserted using a puncture frame mounted on the ultrasound probe, the puncture frame angle $\beta$ is set first, and the position $T_{us\_ablate} = (x_{us\_ablate}, y_{us\_ablate})$ of the ablation focus in the current ultrasound sector is calculated from $\beta$ and the path depth $d$:

$$x_{us\_ablate} = d \cdot \sin\beta$$
$$y_{us\_ablate} = d \cdot \cos\beta$$
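A direct transcription of these two formulas, with $\beta$ and $d$ as inputs (the function name and the sample values are illustrative):

```python
# Position of the ablation focus in the current ultrasound sector from the
# puncture frame angle beta and the path depth d.
import numpy as np

def ablation_center_in_sector(beta_deg: float, depth_mm: float):
    """Returns (x_us_ablate, y_us_ablate) = (d·sin(beta), d·cos(beta))."""
    beta = np.deg2rad(beta_deg)
    return depth_mm * np.sin(beta), depth_mm * np.cos(beta)

x_us_ablate, y_us_ablate = ablation_center_in_sector(beta_deg=15.0, depth_mm=60.0)
```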
in another embodiment, the coordinates of the lesion center in the ultrasound image may be determined from a positioning sensor disposed on the ablation needle.
Specifically, when the operator punctures using a positioning sensor mounted on the ablation needle, the angle and depth of the ablation focus are obtained from the coordinate changes of that sensor: the coordinates of the ablation focus in positioning-sensor space are determined from the sensor, converted into ultrasound image space by the sensor-to-ultrasound transformation matrix, and finally mapped into three-dimensional image space by the registration matrix between the ultrasound and three-dimensional image spaces, realizing three-dimensional visualization. If the coordinate of the ablation focus in ultrasound space is $T_{us\_ablate}$, its coordinate in three-dimensional image space is

$$T_{sec\_ablate} = M \cdot T_{us\_ablate} = P \cdot R_{probe} \cdot A \cdot T_{us\_ablate}$$
when the ultrasonic probe is provided with the puncture frame, an operator moves the ultrasonic probe provided with the positioning sensor, and the ablation focus in the three-dimensional image spaceWill follow R in the mapping relation matrix probe Move with changes in the position of the object. As an example, if the operator is satisfied with the location of the current lesion, the save may be clicked, i.e. the location where the current lesion is located is considered to be ablated.
In one embodiment, two or more three-dimensional display windows may be shown on the display interface, one displaying the current view and the rest displaying views at other viewing angles, so that the operator can observe the lesion and the ablation focus from multiple viewing angles simultaneously.
Specifically, referring to fig. 5, the two or more three-dimensional display windows include a main window 501 for displaying the current view and an auxiliary window 502 for displaying views at other viewing angles. In some embodiments, the auxiliary window 502 may be superimposed at a corner of the main window 501, as shown in fig. 5, to save screen space without obscuring the three-dimensional model of the lesion or the three-dimensional model of the ablation focus. The auxiliary window 502 may of course also be displayed side by side with the main window 501. The main window 501 is generally larger than the auxiliary window 502, but the two may also be the same size.
As an example, the other viewing angles displayed by at least one auxiliary window include the viewing angle opposite to the current viewing angle of the main window. For example, the main window 501 in fig. 5 shows a view of the front side of the lesion, and the auxiliary window 502 shows a view of its back side. Auxiliary windows showing other viewing angles, such as the left, right and bottom sides of the lesion, can also be provided so that the operator can understand the ablation situation at different angles more comprehensively. As an example, an angular relationship between the main window and the auxiliary window may be preset, and the viewing angle of the auxiliary window determined from the main window's viewing angle and this relationship; for instance, with a preset clockwise angular difference of 30°, the auxiliary window displays the view at a viewing angle rotated 30° clockwise relative to the main window. As an example, the other viewing angles displayed in the auxiliary windows are the target viewing angles described below; when there are multiple target viewing angles and one auxiliary window, the view at the angle with the largest unablated region, or the view at the angle requiring the smallest rotation, may be displayed. When there are multiple target viewing angles and multiple auxiliary windows, the corresponding views may be displayed in the auxiliary windows in order of unablated-region size or of rotation angle.
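The preset angular relationship can be applied as a fixed rotation of the main window's view orientation; below is a hedged sketch using a rotation about the vertical axis (the axis choice and the sign convention for "clockwise" are assumptions for illustration).

```python
# Derive an auxiliary-window view orientation from the main view and a preset offset.
import numpy as np

def offset_view(main_view: np.ndarray, offset_deg: float) -> np.ndarray:
    """Rotate a 3x3 view orientation matrix about the y-axis by offset_deg."""
    a = np.deg2rad(offset_deg)
    R_y = np.array([[np.cos(a),  0.0, np.sin(a)],
                    [0.0,        1.0, 0.0],
                    [-np.sin(a), 0.0, np.cos(a)]])
    return R_y @ main_view

aux_view = offset_view(np.eye(3), -30.0)   # e.g. 30° clockwise relative to the main view
```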
Step S250, determining an ablation region according to the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus.
As one implementation, the ablation region is the coincidence region of the three-dimensional model of the lesion and the three-dimensional model of the ablation focus. The two models may be mapped into the same image space based on the registration result, their coincidence region calculated, and that region taken as the ablation region. Specifically, in the same image space it can be determined, for each coordinate point of the three-dimensional model of the lesion, whether it falls within the range of the three-dimensional model of the ablation focus; lesion positions falling inside are considered ablated, those falling outside are considered not ablated, and the set of coordinate points inside the ablation model is taken as the ablated region. The ablated coordinate region {P1, P2, ...} and the non-ablated coordinate region {N1, N2, ...} of the lesion can be computed by traversal, connected-domain calculation over the coordinates, and the like. Because, in the model rendering, the z-axis extent of the tumor coordinates displayed in the current view differs from the z-axis extent in other views, the ablation proportion and the largest connected ablated region in the current view can be determined from the z-axis extents of the ablated and non-ablated coordinate regions. It will be understood that, in other embodiments, since the ablation region is the coincidence region of the two models, the ablation region may equally be determined, by the same principle, by checking whether each coordinate point of the three-dimensional model of the ablation focus falls within the range of the three-dimensional model of the lesion.
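A hedged sketch of this containment test for ellipsoidal ablation foci follows: sampled lesion coordinates are split into ablated {P1, P2, ...} and non-ablated {N1, N2, ...} sets. Axis-aligned ellipsoids and all numeric values are simplifying assumptions.

```python
# Split lesion coordinates into ablated / non-ablated sets against ellipsoidal foci.
import numpy as np

def inside_ellipsoid(points: np.ndarray, center: np.ndarray, semi_axes: np.ndarray) -> np.ndarray:
    """Boolean mask of (N, 3) points inside an axis-aligned ellipsoid."""
    return (((points - center) / semi_axes) ** 2).sum(axis=1) <= 1.0

def split_ablated(lesion_pts: np.ndarray, ablation_foci):
    """ablation_foci: iterable of (center, semi_axes) pairs (multi-needle combination)."""
    ablated = np.zeros(len(lesion_pts), dtype=bool)
    for center, semi_axes in ablation_foci:
        ablated |= inside_ellipsoid(lesion_pts, center, semi_axes)
    return lesion_pts[ablated], lesion_pts[~ablated]   # {P1, P2, ...}, {N1, N2, ...}

lesion_pts = np.random.default_rng(1).uniform(-10.0, 10.0, size=(5000, 3))
foci = [(np.array([0.0, 0.0, 0.0]), np.array([12.0, 8.0, 8.0]))]
P_set, N_set = split_ablated(lesion_pts, foci)
ablation_ratio = len(P_set) / len(lesion_pts)
```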
Step S260, when the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus satisfies the rotation condition, automatically rotating the current view of the two models in the three-dimensional display window from the current viewing angle to a target viewing angle at which the ablation region meets the preset requirement for ablating the lesion, and displaying the three-dimensional model of the lesion and the three-dimensional model of the ablation focus at the target viewing angle.
In one embodiment, the spatial relationship satisfying the rotation condition includes: in the three-dimensional display window, the containment between the spatial position of the three-dimensional model of the lesion and that of the three-dimensional model of the ablation focus at some viewing angle meets a preset condition. For example, it may be judged whether the mutually contained portion of the two spatial positions at any viewing angle reaches a certain proportion, and the rotation condition deemed satisfied when the contained portion at another viewing angle reaches that proportion. Alternatively, it may be judged whether the two spatial positions contain each other at any viewing angle; for example, if at some other viewing angle the spatial position of the lesion model is not contained within that of the ablation focus model, the rotation condition is deemed satisfied. Alternatively, it may be judged whether an object of interest is present in the mutually contained portion of the two spatial positions at any viewing angle; if an object of interest (for example, tissue of a specific form) is present, the rotation condition may be deemed satisfied.
In one embodiment, whether the rotation condition is satisfied is determined according to the projected area of the ablation region at the current view angle. The projected area refers to the display area, in the display window, of the view of a three-dimensional model at the current view angle; the projection is the figure obtained by mapping the three-dimensional model, at any view angle, onto the plane of the three-dimensional display window. As described above, because the image presented by the display is a two-dimensional plane observed from a certain view angle, most of the lesion may appear ablated at the current view angle while a large non-ablated region still exists at other view angles. Continuing to observe at the current view angle is then unfavorable to the subsequent ablation operation, and it is difficult to determine whether the lesion has been completely ablated. Therefore, when most of the lesion area at the current view angle has been ablated, the rotation condition may be determined to be satisfied, triggering automatic rotation to a view angle at which the non-ablated region of the lesion is larger, so that ablation can conveniently continue.
Whether most of the lesion area at the current view angle has been ablated can be judged by comparing the projected area of the ablation region at the current view angle with the projected area of the three-dimensional model of the lesion; that is, if the proportion of the projected area of the ablation region within the projected area of the three-dimensional model of the lesion exceeds a predetermined threshold, the rotation condition is determined to be satisfied.
In addition, since the actual ablation region is the overlapping region of the lesion and the ablation focus, it can also be determined whether most of the region of the lesion at the current view angle has been ablated by comparing the projected area of the ablation region at the current view angle with the projected area of the three-dimensional model of the ablation focus, that is, if the proportion of the projected area of the ablation region in the projected area of the three-dimensional model of the ablation focus exceeds a predetermined threshold, it is determined that the rotation condition is satisfied.
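The projected-area tests of the two preceding paragraphs can likewise be sketched as follows; the orthographic projection, the pixel size, the rotation matrix R, and the 0.8 threshold are illustrative assumptions rather than values taken from the disclosure. Comparing the same ratio against a lower bound, as in the paragraph that follows, flags the opposite situation in which too little of the ablation region is visible.

    import numpy as np

    def projected_area(points, R, pixel_size=0.5):
        """Approximate area of the orthographic projection of a 3-D point set
        onto the display plane after applying view rotation R (3x3)."""
        xy = (points @ R.T)[:, :2]                       # drop the depth axis
        pix = np.unique(np.floor(xy / pixel_size).astype(int), axis=0)
        return len(pix) * pixel_size ** 2

    def rotation_condition(ablation_pts, reference_pts, R, threshold=0.8):
        """True when the ablation region covers more than `threshold` of the
        reference projection. `reference_pts` may be the lesion model (first
        variant above) or the ablation-focus model (second variant)."""
        ref = projected_area(reference_pts, R)
        return ref > 0 and projected_area(ablation_pts, R) / ref > threshold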
In some cases, after at least one ablation, that is, after at least one three-dimensional model of an ablation focus has been generated, the ablation region seen at the current view angle is still very small, or no ablation region is seen at the current view angle at all. This indicates that the overlapping region of the lesion and the ablation focus may lie in the direction opposite to the current view angle, or in some other direction, so the view angle of the current view needs to be rotated in order to better observe the ablation region. Specifically, after the three-dimensional model of the at least one ablation focus is displayed, if the projected area of the ablation region is smaller than a predetermined threshold, or the proportion of the projected area of the ablation region within the projected area of the three-dimensional model of the lesion is smaller than a predetermined threshold, it is determined that the rotation condition is satisfied.
In some embodiments, the user may also judge whether the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus satisfies the rotation condition. When the user judges that the view angle needs to be rotated, the rotation can be triggered by manually clicking an automatic-rotation function key.
After the rotation condition is determined to be satisfied, a rotation operation is executed: the current view is rotated to a target view angle at which the ablation region meets the predetermined requirement for ablating the lesion, so that the ablation effect can be better observed. The target view angle may be one that makes it convenient for the user to observe the ablation region and continue ablating the lesion.
The target view angle may be a view angle at which the projected area of the ablation region is relatively small. As one implementation, the projected area of the ablation region may be compared with the projected area of the three-dimensional model of the lesion; that is, the target view angle is the view angle at which the ratio of the projected area of the ablation region to the projected area of the three-dimensional model of the lesion is smallest, so the current view is rotated to the target view angle with the largest non-ablated region. Alternatively, at least one view angle may first be determined at which this ratio is less than or equal to a predetermined threshold; if there is exactly one such view angle, it is taken as the target view angle with the largest non-ablated region, and if there is more than one, one of them is selected as the target view angle, for example the view angle requiring the smallest rotation.
As another implementation, the projected area of the ablation region may be compared with the projected area of the three-dimensional model of the ablation focus to find a view angle at which the projected area of the ablation region is relatively small; that is, the target view angle is the view angle at which the ratio of the projected area of the ablation region to the projected area of the three-dimensional model of the ablation focus is smallest, or a view angle at which that ratio is less than or equal to a predetermined threshold.
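By way of a simplified illustration of the target-view selection just described, the sketch below sweeps candidate view rotations about a single (vertical) axis and keeps the angle whose ablated-to-lesion projection ratio is smallest; an actual implementation would search over view directions on a sphere, and the helper projected_area repeats the one from the earlier sketch.

    import numpy as np

    def projected_area(points, R, pixel_size=0.5):
        # Same helper as in the previous sketch.
        xy = (points @ R.T)[:, :2]
        pix = np.unique(np.floor(xy / pixel_size).astype(int), axis=0)
        return len(pix) * pixel_size ** 2

    def rot_y(theta):
        # Rotation about the vertical (y) axis by theta radians.
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, 0.0, s],
                         [0.0, 1.0, 0.0],
                         [-s, 0.0, c]])

    def find_target_view(ablation_pts, lesion_pts, n_candidates=72):
        # Keep the candidate angle with the smallest ablated/lesion projection
        # ratio, i.e. the view showing the largest non-ablated region.
        best_theta, best_ratio = 0.0, float("inf")
        for theta in np.linspace(0.0, 2.0 * np.pi, n_candidates, endpoint=False):
            R = rot_y(theta)
            lesion_area = projected_area(lesion_pts, R)
            if lesion_area == 0.0:
                continue
            ratio = projected_area(ablation_pts, R) / lesion_area
            if ratio < best_ratio:
                best_theta, best_ratio = theta, ratio
        return best_theta, best_ratio

When several candidate angles fall at or below the predetermined threshold, choosing the one closest to the current view angle implements the smallest-rotation selection mentioned above.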
After the target view angle is determined, the included angle between the current view angle and the target view angle is calculated as the view rotation angle, and the view is rotated by the calculated angle.
Specifically, the coordinates of the projection of the non-ablated region of the three-dimensional model of the lesion in the current view can be obtained by traversing the coordinate depths of the three-dimensional model of the lesion, and the first coordinate (x₁, y₁) of the center position of the projection of the non-ablated region displayed in the current view can then be calculated. In a similar manner, the second coordinate (x₂, y₂) of the projected center position of the non-ablated region at the target view angle is calculated, and the included angle α between the line connecting (x₁, y₁) with the coordinate origin and the line connecting (x₂, y₂) with the coordinate origin is taken as the rotation angle, where:
tan α = (x₁y₂ - x₂y₁) / (x₁x₂ + y₁y₂)
α = arctan((x₁y₂ - x₂y₁) / (x₁x₂ + y₁y₂))
In the above manner, the line connecting the coordinate origin with the center point of the non-ablated region defines both the start and the end of the rotation. Alternatively, the line connecting the coordinate origin with the center point of the lesion projection may define the start of the rotation. Specifically, a third coordinate of the projected center position of the three-dimensional model of the lesion at the current view angle is determined, and a second coordinate of the projected center position of the non-ablated region of the three-dimensional model of the lesion at the target view angle is determined; the included angle between the line connecting the third coordinate with the coordinate origin and the line connecting the second coordinate with the coordinate origin is calculated and taken as the rotation angle.
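The angle computation above reduces to the signed angle between two lines through the coordinate origin; using atan2 of the cross and dot products yields the same α as the tan α formula while remaining well defined when x₁x₂ + y₁y₂ = 0. A minimal sketch:

    import math

    def view_rotation_angle(p_current, p_target):
        """Signed rotation angle (radians) between the lines joining the
        coordinate origin to p_current = (x1, y1) and p_target = (x2, y2)."""
        (x1, y1), (x2, y2) = p_current, p_target
        cross = x1 * y2 - x2 * y1        # proportional to sin(alpha)
        dot = x1 * x2 + y1 * y2          # proportional to cos(alpha)
        return math.atan2(cross, dot)

    # Example: projection centers on the +x axis (current) and +y axis (target).
    alpha = view_rotation_angle((1.0, 0.0), (0.0, 1.0))
    print(math.degrees(alpha))   # 90.0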
In practical applications, multiple view angles may meet the predetermined requirement; in that case, the view angle requiring the smallest rotation may be determined as the target view angle.
Referring to the above, in some embodiments two or more three-dimensional display windows are displayed on the display interface, where one window (e.g., the main window) displays the current view at the current view angle. In one example, when the current view rotates, the views in the other windows rotate synchronously with it. For example, as shown in fig. 5, the main window 501 shows a view of the front side of the lesion and the auxiliary window 502 shows a view of the rear side; if the view in the main window 501 is rotated to the rear side of the lesion, the view in the auxiliary window 502 is rotated to the front side, as shown in fig. 6. If the view angle in the main window 501 is rotated to the left side of the lesion, the view angle in the auxiliary window 502 is correspondingly rotated to the right side.
In other examples, when the current view rotates, the views at the other view angles may remain fixed. For example, if the view angle in the main window 501 rotates to the left side of the lesion, the view angle in the auxiliary window 502 may continue to show the rear side of the lesion. Both linking policies are sketched below.
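By way of illustration, the two linking policies can be sketched as follows; the class name, the yaw-angle representation, and the 180-degree default offset are assumptions for the example, not the patent's interface.

    class LinkedViews:
        """Main/auxiliary window view angles, in degrees of yaw."""

        def __init__(self, main_yaw=0.0, aux_offset=180.0, synchronized=True):
            self.main_yaw = main_yaw % 360.0
            self.aux_offset = aux_offset          # auxiliary relative to main
            self.aux_yaw = (main_yaw + aux_offset) % 360.0
            self.synchronized = synchronized

        def rotate_main(self, delta):
            self.main_yaw = (self.main_yaw + delta) % 360.0
            if self.synchronized:
                # Auxiliary keeps its offset, e.g. always the opposite side.
                self.aux_yaw = (self.main_yaw + self.aux_offset) % 360.0
            # In the fixed policy the auxiliary yaw simply stays unchanged.

    views = LinkedViews()        # main: front (0 deg); auxiliary: back (180 deg)
    views.rotate_main(180.0)     # main now shows the back, auxiliary the front
    print(views.main_yaw, views.aux_yaw)   # 180.0 0.0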
Thereafter, steps S240 to S260 may be performed repeatedly, continuing to generate ablation foci based on the rotated view, until the lesion is completely ablated.
In summary, the three-dimensional display method 200 of the embodiment of the invention can automatically rotate the view angle for observing the three-dimensional model of the lesion and the three-dimensional model of the ablation focus, greatly facilitating the ablation operation.
Referring to fig. 7, another aspect of the present application provides a three-dimensional display method 700, including:
in step S710, two or more three-dimensional display windows are displayed on the display interface;
in step S720, displaying a current view in one of the two or more three-dimensional display windows, wherein the current view includes a three-dimensional model of a lesion and a three-dimensional model of an ablation focus at a current view angle;
in step S730, displaying, in the remaining windows of the two or more windows, views at other view angles, each including the three-dimensional model of the lesion and the three-dimensional model of the ablation focus at the corresponding view angle.
In one embodiment, the current view further comprises an ultrasound image, and the ultrasound image, the three-dimensional model of the lesion, and the three-dimensional model of the ablation focus in the same three-dimensional display window are in the same image space, which may be, for example, the image space of the ultrasound image or the image space of the three-dimensional image. Similarly, the views at the other view angles may also include ultrasound images, and the three-dimensional model of the lesion, the three-dimensional model of the ablation focus, and the ultrasound image at the other view angles are likewise in the same image space.
Further, referring to FIG. 8, FIG. 8 shows a schematic diagram of a display interface 800 of the three-dimensional display method 700 according to an embodiment of the invention. In this embodiment, in addition to the first three-dimensional display window 801 displaying the current view and the second three-dimensional display window 802 displaying views at other view angles, the display interface 800 displays an ultrasound window 803 and a fusion display window 804; the ultrasound window 803 is used for displaying an ultrasound image, for example a real-time ultrasound image, and the fusion display window 804 is used for displaying a fusion image of an ultrasound image and a three-dimensional image containing the lesion. Besides the layout shown in fig. 8, the first three-dimensional display window 801, the second three-dimensional display window 802, the ultrasound window 803, and the fusion display window 804 may also adopt other suitable layouts.
The three-dimensional display method 700 according to the embodiment of the present invention adopts a multi-view display mode with two or more windows, helping the user better understand the ablation situation at different view angles.
Referring now back to fig. 1, an embodiment of the invention further provides an ultrasound imaging system 100, which may be used to implement the three-dimensional display method 200 described above. The ultrasound imaging system 100 may include some or all of the following components: an ultrasound probe 110, transmit circuitry 112, receive circuitry 114, beamforming circuitry 116, a processor 118, a display 120, a transmit/receive selection switch 122, and memory 124; the relevant description of these components may be found above. Only the main functions of the ultrasound imaging system 100 are described below, and details already described above are omitted.
Specifically, the transmitting circuit 112 is used for exciting the ultrasonic probe 110 to transmit ultrasonic waves to the lesion; the receiving circuit 114 is used for controlling the ultrasonic probe to receive the ultrasonic echoes returned from the lesion to obtain ultrasonic echo signals; the processor 118 is configured to: process the ultrasonic echo signals to obtain an ultrasound image; register the ultrasound image with a pre-acquired three-dimensional image containing the lesion; segment the lesion in the three-dimensional image containing the lesion, and display a three-dimensional model of the lesion in a three-dimensional display window of the display 120 based on the result of the registration and the result of the segmentation; acquire ablation parameters of an ablation focus, and display a three-dimensional model of the ablation focus in the three-dimensional display window according to the registration result and the acquired ablation parameters; determine an ablation region according to the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus; and, when the rotation condition is met, automatically rotate the current view of the three-dimensional display window from the current view angle to a target view angle at which the ablation region meets the predetermined requirement for ablating the lesion, and display the three-dimensional model of the lesion and the three-dimensional model of the ablation focus at the target view angle.
In one embodiment, the ablation region is the coincidence region of the three-dimensional model of the lesion and the three-dimensional model of the ablation focus.
In one embodiment, satisfying the rotation condition includes: the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus satisfies a rotation condition, or a rotation instruction input by a user is received.
In one embodiment, the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus satisfying a rotation condition includes: in the three-dimensional display window, the mutual inclusion of the spatial position of the three-dimensional model of the lesion and the spatial position of the three-dimensional model of the ablation focus at any view angle meets a preset condition.
In one embodiment, the spatial relationship between the three-dimensional model of the lesion and the three-dimensional model of the ablation focus satisfying a rotation condition includes: at the current view angle, the projected area of the ablation region in the three-dimensional display window and the projected area of the three-dimensional model of the lesion, or the projected area of the ablation region and the projected area of the three-dimensional model of the ablation focus, meet a preset condition.
In one embodiment, meeting the preset condition includes: the proportion of the projected area of the ablation region within the projected area of the three-dimensional model of the lesion exceeds a predetermined threshold, or the proportion of the projected area of the ablation region within the projected area of the three-dimensional model of the ablation focus exceeds a predetermined threshold.
In one embodiment, the target view angle at which the ablation region meets the predetermined requirement for ablating the lesion includes: the view angle at which the ratio of the projected area of the ablation region to the projected area of the three-dimensional model of the lesion is smallest, or a view angle at which that ratio is below a predetermined threshold; or, the view angle at which the ratio of the projected area of the ablation region to the projected area of the three-dimensional model of the ablation focus is smallest, or a view angle at which that ratio is below a predetermined threshold.
In one embodiment, rotating the current view from the current view angle to a target view angle at which the projection of the ablation region meets the predetermined requirement includes: determining a first coordinate of the center position of the non-ablated region of the three-dimensional model of the lesion at the current view angle; determining a second coordinate of the center position of the non-ablated region of the three-dimensional model of the lesion at the target view angle; and determining the included angle between the line connecting the first coordinate with the coordinate origin and the line connecting the second coordinate with the coordinate origin, and taking this included angle as the rotation angle.
In one embodiment, rotating the current view from the current view angle to a target view angle at which the projection of the ablation region meets the predetermined requirement includes: determining a third coordinate of the center position of the three-dimensional model of the lesion at the current view angle; determining a second coordinate of the center position of the non-ablated region of the three-dimensional model of the lesion at the target view angle; and determining the included angle between the line connecting the third coordinate with the coordinate origin and the line connecting the second coordinate with the coordinate origin, and taking this included angle as the rotation angle.
In one embodiment, the processor 118 is further configured to display two or more three-dimensional display windows on the display interface of the display 120, wherein one window displays the current view and the remaining windows display views from other viewing angles.
Illustratively, the two or more windows include a main window for displaying a current view and an auxiliary window for displaying views at other viewing angles, and the size of the main window may be larger than the size of the auxiliary window.
For example, when the current view is rotated, the views at the other view angles may remain fixed, or may rotate synchronously with it.
Furthermore, according to an embodiment of the present application, there is also provided a computer storage medium on which program instructions are stored, which when executed by a computer or a processor are used for executing the corresponding steps of the three-dimensional display method 200 of the embodiment of the present application. The storage medium may include, for example, a memory card of a smart phone, a storage component of a tablet computer, a hard disk of a personal computer, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a portable compact disc read-only memory (CD-ROM), a USB memory, or any combination of the above storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
In addition, according to the embodiment of the application, a computer program is further provided, and the computer program can be stored on a cloud or a local storage medium. When being executed by a computer or a processor, the computer program is used for executing the corresponding steps of the three-dimensional display method of the embodiment of the present application.
Based on the above description, the three-dimensional display method and the ultrasonic imaging system according to the embodiment of the present application can automatically rotate the viewing angle of the three-dimensional model of the observed lesion and the three-dimensional model of the ablation lesion, greatly facilitating the surgical operation of the doctor.
Although the example embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above-described example embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present application. All such changes and modifications are intended to be included within the scope of the present application as claimed in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the present application, various features of the present application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the application and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some of the modules according to embodiments of the present application. The present application may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etcetera does not indicate any ordering; these words may be interpreted as names.
The above description is only for the specific embodiments of the present application or the description thereof, and the protection scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope disclosed in the present application, and shall be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (47)

  1. A method of three-dimensional display, the method comprising:
    acquiring an ultrasonic image acquired aiming at a focus;
    registering the ultrasonic image with a pre-acquired three-dimensional image containing the lesion;
    segmenting the focus in a three-dimensional image containing the focus, and displaying a three-dimensional model of the focus in a three-dimensional display window according to the result of the registration and the result of the segmentation;
    acquiring ablation parameters of an ablation focus, and displaying a three-dimensional model of the ablation focus in the three-dimensional display window according to the registration result and the acquired ablation parameters;
    determining an ablation region according to the spatial relationship between the three-dimensional model of the focus and the three-dimensional model of the ablation focus; and
    when the spatial relationship between the three-dimensional model of the focus and the three-dimensional model of the ablation focus meets the rotation condition, automatically rotating the current view of the three-dimensional model of the focus and the three-dimensional model of the ablation focus in the three-dimensional display window from the current view angle to a target view angle at which an ablation area meets the preset requirement for ablating the focus, and displaying the three-dimensional model of the focus and the three-dimensional model of the ablation focus at the target view angle.
  2. The method of claim 1, wherein the ablation region is a coincidence region of the three-dimensional model of the lesion and the three-dimensional model of the ablation focus.
  3. The method of claim 2, wherein determining the ablation region comprises:
    and mapping the three-dimensional model of the focus and the three-dimensional model of the ablation focus to the same image space based on the registration result, calculating a coincidence region of the three-dimensional model of the focus and the three-dimensional model of the ablation focus, and determining the coincidence region as the ablation region.
  4. The method of claim 3, wherein the calculating a coincidence region of the three-dimensional model of the lesion and the three-dimensional model of the ablation focus comprises:
    judging, in the same image space, whether each coordinate point of the three-dimensional model of the lesion falls within the range of the three-dimensional model of the ablation focus, and determining the set of coordinate points falling within the range of the three-dimensional model of the ablation focus as the ablation region.
  5. The method of claim 1, wherein the satisfying a rotation condition comprises:
    in the three-dimensional display window, the inclusion conditions of the space position of the three-dimensional model of the focus and the space position of the three-dimensional model of the ablation focus at any view angle meet preset conditions.
  6. The method of claim 1, wherein the satisfying a rotation condition comprises:
    at the current view angle, the projection area of the ablation region and the projection area of the three-dimensional model of the lesion in the three-dimensional display window, or the projection area of the ablation region and the projection area of the three-dimensional model of the ablation focus, meet a preset condition.
  7. The method of claim 6, wherein the projection area of the ablation region and the projection area of the three-dimensional model of the lesion, or the projection area of the ablation region and the projection area of the three-dimensional model of the ablation focus, in the three-dimensional display window satisfying a preset condition comprises:
    the proportion of the projection area of the ablation region in the projection area of the three-dimensional model of the lesion exceeds a predetermined threshold.
  8. The method of claim 6, wherein the projection area of the ablation region and the projection area of the three-dimensional model of the lesion, or the projection area of the ablation region and the projection area of the three-dimensional model of the ablation focus, in the three-dimensional display window satisfying a preset condition comprises:
    the proportion of the projection area of the ablation region in the projection area of the three-dimensional model of the ablation focus exceeds a predetermined threshold.
  9. The method of claim 1, wherein the three-dimensional model of the ablation focus is at least one of a single-needle ablation model and a multi-needle combined ablation model.
  10. The method of claim 9, wherein the single-needle ablation model is a three-dimensional model of an ablation focus displayed in the three-dimensional display window according to a single ablation needle inserted into the lesion, based on the result of the registration and the ablation parameters.
  11. The method of claim 9, wherein the multi-needle combined ablation model is a three-dimensional model of an ablation focus displayed in the three-dimensional display window according to at least two ablation needles inserted into the lesion sequentially or simultaneously, based on the result of the registration and the ablation parameters.
  12. The method of claim 1, wherein the ablation zone satisfying a target view angle of a predetermined requirement to ablate the lesion comprises:
    a view angle at which a ratio of a projected area of the ablation region to a projected area of the three-dimensional model of the lesion is a minimum, or a view angle at which a ratio of a projected area of the ablation region to a projected area of the three-dimensional model of the lesion is below a predetermined threshold.
  13. The method of claim 1, wherein the ablation zone satisfying a target view angle of a predetermined requirement to ablate the lesion comprises:
    and the view angle with the minimum ratio of the projection area of the ablation region to the projection area of the three-dimensional model of the ablation focus or the view angle with the ratio of the projection area of the ablation region to the projection area of the three-dimensional model of the ablation focus lower than a preset threshold value.
  14. The method of claim 1, wherein the target view is a view with a minimum required rotation angle when multiple views meet the predetermined requirement.
  15. The method of claim 1, wherein automatically rotating a current view of the three-dimensional model of the lesion and the three-dimensional model of the lesion in the three-dimensional display window from a current perspective to a target perspective at which an ablation region meets a predetermined requirement for ablating the lesion comprises:
    determining a first coordinate of a projection center position of an unablated area of a three-dimensional model of the lesion at the current view angle;
    determining a second coordinate of a projection center position of an unablated area of the three-dimensional model of the lesion at the target view angle;
    and determining an included angle between the connecting line of the first coordinate and the origin of coordinates and the connecting line of the second coordinate and the origin of coordinates, and taking the included angle as the rotating rotation angle.
  16. The method of claim 1, wherein said automatically rotating a current view of the three-dimensional model of the lesion and the three-dimensional model of the lesion in the three-dimensional display window from a current perspective to a target perspective at which an ablation region meets a predetermined requirement for ablating the lesion comprises:
    determining a third coordinate of a projection center position of the three-dimensional model of the lesion at the current view angle;
    determining a second coordinate of a projection center position of an unablated area of the three-dimensional model of the lesion at the target view angle;
    and determining an included angle between the connecting line of the third coordinate and the origin of coordinates and the connecting line of the second coordinate and the origin of coordinates, and taking the included angle as the rotating rotation angle.
  17. The method of claim 1, wherein the registering comprises:
    aligning the ultrasound image with a corresponding slice in the three-dimensional image;
    and determining a coordinate transformation matrix from the ultrasonic image space to the three-dimensional image space according to the coordinates of the points in the ultrasonic image and the coordinates of the corresponding points in the three-dimensional image.
  18. The method of claim 17, wherein the aligning comprises at least one of:
    receiving a manual alignment of a user;
    identifying the same tissue in the ultrasound image and the three-dimensional image for automatic alignment; or
    The location of the imaging of the external marker in the three-dimensional image is identified and the location of the external marker in the space of the ultrasound image is determined based on magnetic navigation for automatic alignment.
  19. The method of claim 1, wherein when the three-dimensional image is a three-dimensional ultrasound image reconstructed based on magnetic navigation position information, the registering comprises: and automatically registering based on the position information of the ultrasonic image and the three-dimensional ultrasonic image.
  20. The method of claim 1, wherein the acquired ablation parameters include parameters reflecting a lesion shape and location.
  21. The method of claim 20, wherein the parameters reflecting the shape and location of the lesion comprise at least one of:
    the major diameter and the minor diameter of the ellipsoidal ablation focus, the heat source position, the ablation path depth and the puncture frame angle.
  22. The method of claim 1, wherein said displaying a three-dimensional model of an ablation focus in said three-dimensional display window according to the result of said registering and the acquired ablation parameters comprises:
    determining the coordinates of the center of an ablation focus in the ultrasonic image, and determining the coordinates of the center of the ablation focus in the three-dimensional image according to a registration matrix from the ultrasonic image to the three-dimensional image;
    and drawing a three-dimensional model of the ablation focus in the three-dimensional image space according to the coordinates of the center of the ablation focus in the three-dimensional image and the size of the ablation focus.
  23. The method of claim 22, wherein the determining coordinates of a lesion center in the ultrasound image comprises:
    and determining the coordinates of the center of the ablation focus in the ultrasonic image according to the angle of the puncture frame and the depth of the ablation path.
  24. The method of claim 22, wherein the determining coordinates of a lesion center in the ultrasound image comprises:
    and determining the coordinates of the center of the ablation focus in the ultrasonic image according to a positioning sensor arranged on the ablation needle.
  25. The method of claim 1, wherein the shape of the three-dimensional model of the ablation focus comprises an ellipsoid, a sphere, or a cylinder.
  26. The method of claim 1, further comprising:
    and displaying two or more than two three-dimensional display windows on the display interface, wherein one three-dimensional display window displays the current view, and the other three-dimensional display windows display views under other visual angles.
  27. The method of claim 26, wherein the two or more three-dimensional display windows comprise a main window and an auxiliary window, the main window is used for displaying the current view, the auxiliary window is used for displaying the views in the other views, and the size of the main window is larger than that of the auxiliary window.
  28. The method of claim 27, wherein the auxiliary window is displayed superimposed on a corner of the main window.
  29. The method of claim 26, wherein the other views comprise an inverse view of the current view.
  30. The method of claim 26, wherein the view at the other view angle is fixed while the rotation of the current view occurs.
  31. The method of claim 26, wherein when the rotation of the current view occurs, the views at the other views are rotated synchronously therewith.
  32. The method of claim 1, wherein displaying the three-dimensional model of the lesion in a three-dimensional display window comprises:
    segmenting a three-dimensional contour of the lesion in the three-dimensional image;
    performing surface reconstruction according to the three-dimensional contour obtained by segmentation to generate a three-dimensional model of the focus;
    and displaying a three-dimensional model of the focus in the three-dimensional display window according to the coordinates of the three-dimensional outline of the focus in the three-dimensional image.
  33. A three-dimensional display method, the method comprising:
    displaying two or more than two three-dimensional display windows on a display interface;
    displaying a current view in one of the two or more three-dimensional display windows, wherein the current view includes a three-dimensional model of a lesion and a three-dimensional model of an ablation focus at a current viewing angle;
    displaying, in remaining three-dimensional display windows of the two or more three-dimensional display windows, views at other viewing angles, including the three-dimensional model of the lesion and the three-dimensional model of the ablation focus at the other viewing angles.
  34. The method of claim 33, wherein the current view and/or the view at the other perspective is also displayed with an ultrasound image.
  35. The method of claim 33, further comprising:
    and displaying an ultrasonic window and a fusion display window on the display interface, wherein the ultrasonic window is used for displaying an ultrasonic image, and the fusion display window is used for displaying a fusion image of the ultrasonic image and a three-dimensional image containing the focus.
  36. An ultrasound imaging system, comprising:
    an ultrasonic probe;
    the transmitting circuit is used for exciting the ultrasonic probe to transmit ultrasonic waves to the focus;
    the receiving circuit is used for controlling the ultrasonic probe to receive the ultrasonic echo returned from the focus to obtain an ultrasonic echo signal;
    a processor configured to:
    processing the ultrasonic echo signal to obtain an ultrasonic image;
    registering the ultrasonic image with a pre-acquired three-dimensional image containing the lesion;
    segmenting the lesion in a three-dimensional image containing the lesion, and displaying a three-dimensional model of the lesion in a three-dimensional display window of a display based on a result of the registration and a result of the segmentation;
    acquiring ablation parameters of an ablation focus, and displaying a three-dimensional model of the ablation focus in the three-dimensional display window according to the registration result and the acquired ablation parameters;
    determining an ablation region according to the spatial relationship between the three-dimensional model of the focus and the three-dimensional model of the ablation focus;
    when the rotation condition is met, rotating the current view of the three-dimensional display window from the current view angle to a target view angle of an ablation area meeting the preset requirement of ablating the focus, and displaying a three-dimensional model of the focus and a three-dimensional model of the ablation focus under the target view angle;
    the display is used for displaying the three-dimensional model of the focus and the three-dimensional model of the ablation focus.
  37. The ultrasound imaging system of claim 36, wherein the ablation region is a coincidence region of the three-dimensional model of the lesion and the three-dimensional model of the ablation focus.
  38. The ultrasound imaging system of claim 36 or 37, wherein the satisfying a rotation condition comprises: and the spatial relationship between the three-dimensional model of the focus and the three-dimensional model of the ablation focus meets a rotation condition, or a rotation instruction input by a user is received.
  39. The ultrasound imaging system of claim 38, wherein the spatial relationship of the three-dimensional model of the lesion and the three-dimensional model of the ablation focus satisfying a rotation condition comprises:
    in the three-dimensional display window, the inclusion conditions of the space position of the three-dimensional model of the focus and the space position of the three-dimensional model of the ablation focus at any visual angle meet preset conditions.
  40. The ultrasound imaging system of claim 38, wherein the spatial relationship of the three-dimensional model of the lesion and the three-dimensional model of the ablation focus satisfying a rotation condition comprises:
    at the current view angle, the projection area of the ablation region and the projection area of the three-dimensional model of the lesion in the three-dimensional display window, or the projection area of the ablation region and the projection area of the three-dimensional model of the ablation focus, meet a preset condition.
  41. The ultrasound imaging system of claim 40, wherein the meeting of the preset condition comprises:
    the proportion of the projection area of the ablation region in the projection area of the three-dimensional model of the lesion exceeds a predetermined threshold, or
    The proportion of the projection area of the ablation region in the projection area of the three-dimensional model of the lesion exceeds a predetermined threshold.
  42. The ultrasound imaging system of claim 36, wherein the ablation zone satisfying a target view angle of a predetermined requirement to ablate the lesion comprises:
    a view angle at which a ratio of a projected area of the ablation region to a projected area of the three-dimensional model of the lesion is minimum, or a view angle at which a ratio of a projected area of the ablation region to a projected area of the three-dimensional model of the lesion is lower than a predetermined threshold;
    or, a view angle at which a ratio of a projected area of the ablation region to a projected area of the three-dimensional model of the ablation focus is minimum, or a view angle at which a ratio of a projected area of the ablation region to a projected area of the three-dimensional model of the ablation focus is lower than a predetermined threshold.
  43. The ultrasound imaging system of claim 36, wherein the processor automatically rotates a current view of the three-dimensional model of the lesion and the three-dimensional model of the lesion in the three-dimensional display window from a current perspective to a target perspective at which an ablation region meets a predetermined requirement for ablating the lesion, comprising:
    determining a first coordinate of a projection center position of an unablated area of the three-dimensional model of the lesion at the current view angle;
    determining a second coordinate of a projection center position of an unablated area of the three-dimensional model of the lesion at the target view angle;
    and determining an included angle between a connecting line of the first coordinate and the origin of coordinates and a connecting line of the second coordinate and the origin of coordinates, and taking the included angle as the rotating rotation angle.
  44. The ultrasound imaging system of claim 36, wherein the processor automatically rotates a current view of the three-dimensional model of the lesion and the three-dimensional model of the lesion in the three-dimensional display window from a current view angle to a target view angle at which an ablation region meets a predetermined requirement for ablating the lesion, comprising:
    determining a third coordinate of a projection center position of the three-dimensional model of the lesion at the current view angle;
    determining a second coordinate of a projection center position of an unablated area of the three-dimensional model of the lesion at the target view angle;
    and determining an included angle between a connecting line of the third coordinate and the origin of coordinates and a connecting line of the second coordinate and the origin of coordinates, and taking the included angle as the rotating rotation angle.
  45. The ultrasound imaging system of claim 36, wherein the processor is further configured to:
    and displaying two or more than two three-dimensional display windows on a display interface of the display, wherein one three-dimensional display window displays the current view, and the other three-dimensional display windows display views under other visual angles.
  46. The ultrasound imaging system of claim 45, wherein the two or more three-dimensional display windows include a main window for displaying the current view and an auxiliary window for displaying views at the other viewing angles, the main window having a size larger than the auxiliary window.
  47. The ultrasound imaging system of claim 45, wherein the views at the other viewing angles are fixed or rotated synchronously with the current view while the rotation occurs.
CN202080102793.1A 2020-08-04 2020-08-04 Three-dimensional display method and ultrasonic imaging system Pending CN115811961A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/106893 WO2022027251A1 (en) 2020-08-04 2020-08-04 Three-dimensional display method and ultrasonic imaging system

Publications (1)

Publication Number Publication Date
CN115811961A true CN115811961A (en) 2023-03-17

Family

ID=80118674

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080102793.1A Pending CN115811961A (en) 2020-08-04 2020-08-04 Three-dimensional display method and ultrasonic imaging system

Country Status (2)

Country Link
CN (1) CN115811961A (en)
WO (1) WO2022027251A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114469309B (en) * 2022-02-16 2022-10-21 上海睿刀医疗科技有限公司 Ablation device, electrode needle layout strategy obtaining method, electronic equipment and storage medium
CN114820731A (en) * 2022-03-10 2022-07-29 青岛海信医疗设备股份有限公司 CT image and three-dimensional body surface image registration method and related device
CN116523802B (en) * 2023-07-04 2023-08-29 天津大学 Enhancement optimization method for liver ultrasonic image
CN117503344A (en) * 2023-12-12 2024-02-06 中国人民解放军总医院第一医学中心 Method and device for confirming power of multiple puncture needles, electronic equipment and storage medium
CN117853570B (en) * 2024-03-08 2024-05-10 科普云医疗软件(深圳)有限公司 Anesthesia puncture auxiliary positioning method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101859341A (en) * 2009-04-13 2010-10-13 盛林 Image-guided ablation surgery planning device
CN104605926A (en) * 2013-11-05 2015-05-13 深圳迈瑞生物医疗电子股份有限公司 Ultrasound intervention ablation system and working method thereof
CN103971574B (en) * 2014-04-14 2017-01-18 中国人民解放军总医院 Ultrasonic guidance tumor puncture training simulation system
US10398337B2 (en) * 2017-03-07 2019-09-03 Biosense Webster (Israel) Ltd. Automatic tracking and adjustment of the view angle during catheter ablation treatment
EP3612091A1 (en) * 2017-04-18 2020-02-26 Boston Scientific Scimed Inc. Electroanatomical mapping tools facilitated by activation waveforms
US20200155086A1 (en) * 2017-05-02 2020-05-21 Apn Health, Llc Determining and displaying the 3d location and orientation of a cardiac-ablation balloon
WO2020033947A1 (en) * 2018-08-10 2020-02-13 Covidien Lp Systems for ablation visualization
CN111012484A (en) * 2020-01-06 2020-04-17 南京康友医疗科技有限公司 Real-time ablation area imaging system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116725673A (en) * 2023-08-10 2023-09-12 卡本(深圳)医疗器械有限公司 Ultrasonic puncture navigation system based on three-dimensional reconstruction and multi-modal medical image registration
CN116725673B (en) * 2023-08-10 2023-10-31 卡本(深圳)医疗器械有限公司 Ultrasonic puncture navigation system based on three-dimensional reconstruction and multi-modal medical image registration

Also Published As

Publication number Publication date
WO2022027251A1 (en) 2022-02-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination