CN115317128A - Ablation simulation method and device - Google Patents

Ablation simulation method and device

Info

Publication number
CN115317128A
CN115317128A
Authority
CN
China
Prior art keywords
simulated
ablation
needle
determining
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211091789.7A
Other languages
Chinese (zh)
Inventor
黄光亮
张晓儿
谢晓燕
于开欣
韩晓涛
丛龙飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mindray Bio Medical Electronics Co Ltd
First Affiliated Hospital of Sun Yat Sen University
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
First Affiliated Hospital of Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd and First Affiliated Hospital of Sun Yat Sen University
Publication of CN115317128A publication Critical patent/CN115317128A/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 18/04 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating
    • A61B 18/12 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating by passing a current through the tissue to be heated, e.g. high-frequency current
    • A61B 18/14 Probes or electrodes therefor
    • A61B 2018/00571 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for achieving a particular surgical effect
    • A61B 2018/00577 Ablation
    • A61B 2018/1405 Electrodes having a specific shape
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2063 Acoustic tracking systems, e.g. using ultrasound
    • A61B 2034/2065 Tracking using image or pattern recognition

Abstract

Embodiments of the invention provide an ablation simulation method and device, comprising the following steps: transmitting ultrasonic waves to a target tissue containing a lesion through an ultrasonic probe, and generating an ultrasonic image of the target tissue from the received ultrasonic echo data; acquiring three-dimensional image data of the target tissue; determining a spatial transformation matrix between the ultrasonic image space in which the ultrasonic image is located and the three-dimensional image space in which the three-dimensional image data is located; determining position information of a simulated ablation needle and position information of a simulated lesion in a target space according to the spatial transformation matrix, where the target space is either the ultrasonic image space or the three-dimensional image space; and calculating and displaying the relative position between the simulated ablation needle and the simulated lesion in real time according to their position information. The relative position between the simulated ablation needle and the simulated lesion enables quantitative evaluation of the needle insertion position and intuitively guides the doctor during needle insertion.

Description

Ablation simulation method and device
Technical Field
The embodiment of the invention relates to the technical field of medical ultrasound, in particular to an ablation simulation method and equipment.
Background
Tumors are neoplasms formed by the abnormal proliferation of local tissue cells and readily give rise to a variety of diseases, so their treatment is of great clinical significance. With the continuous development of medical technology, tumor ablation therapies that directly destroy the tumor by physical means such as radio frequency, microwave, freezing, high-frequency electrocautery, laser, and high-intensity focused ultrasound offer high efficacy, minimal invasiveness, and rapid postoperative recovery, and are widely used in clinical practice.
At present, whether in the preoperative planning stage or in the operative stage of tumor ablation, the procedure is usually simulated by means of an ultrasonic imaging device: the tumor and the ablation range of the ablation needle are simulated, and the proportion of the simulated lesion covered by the simulated ablation range is calculated to guide the doctor in performing tumor ablation. The insertion position of the ablation needle directly determines the ablation effect. The coverage proportion quantitatively evaluates the lesion ablation effect, but it cannot intuitively guide the doctor during needle insertion.
Disclosure of Invention
Embodiments of the invention provide an ablation simulation method and device that quantitatively evaluate the insertion position of an ablation needle so as to intuitively guide the doctor during needle insertion.
In a first aspect, an embodiment of the present invention provides an ablation simulation method, including:
transmitting ultrasonic waves to target tissues containing focuses through an ultrasonic probe, and generating ultrasonic images of the target tissues according to received ultrasonic echo data;
acquiring three-dimensional image data of a target tissue;
determining a spatial transformation matrix between an ultrasonic image space in which an ultrasonic image is located and a three-dimensional image space in which three-dimensional image data is located;
determining position information of a simulated ablation needle and position information of a simulated focus in a target space according to the spatial transformation matrix, wherein the target space is an ultrasonic image space or a three-dimensional image space;
and calculating and displaying the relative position between the simulated ablation needle and the simulated focus in real time according to the position information of the simulated ablation needle and the position information of the simulated focus.
In one embodiment, determining a spatial transformation matrix between an ultrasound image space in which an ultrasound image is located and a three-dimensional image space in which three-dimensional image data is located includes:
acquiring a first transformation matrix of an ultrasonic image space to a position sensor space;
acquiring a second transformation matrix from the space of the position sensor to the space of world coordinates;
determining a third transformation matrix from the world coordinate space to the three-dimensional image space;
and determining a spatial transformation matrix according to the first transformation matrix, the second transformation matrix and the third transformation matrix.
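As a sketch, the three transforms above can be composed into the overall spatial transformation matrix by multiplying 4x4 homogeneous matrices in order (ultrasound image space to position sensor space, sensor space to world space, world space to three-dimensional image space). The function and variable names below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def compose_spatial_transform(T_us_to_sensor, T_sensor_to_world, T_world_to_3d):
    """Compose the three 4x4 homogeneous transforms described above:
    ultrasound image space -> sensor space -> world space -> 3D image space."""
    return T_world_to_3d @ T_sensor_to_world @ T_us_to_sensor

def translation(tx, ty, tz):
    """Helper: a pure-translation homogeneous matrix (toy stand-in for
    the calibrated/tracked transforms)."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

# toy example: three translations compose into one overall transform
T = compose_spatial_transform(translation(1, 0, 0),
                              translation(0, 2, 0),
                              translation(0, 0, 3))
# map the ultrasound-space origin into three-dimensional image space
p = T @ np.array([0.0, 0.0, 0.0, 1.0])
```

A point expressed in ultrasound image coordinates is mapped into the three-dimensional image space by left-multiplying with the composed matrix; the inverse matrix maps in the opposite direction.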
In one embodiment, determining a third transformation matrix from world coordinate space to three-dimensional image space comprises:
registering the ultrasonic image and the three-dimensional image data based on the in-vitro marker or based on the image information to obtain a registration matrix;
a third transformation matrix is determined from the registration matrix.
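One common way to obtain such a registration matrix from paired in-vitro marker positions is a least-squares rigid fit (the Kabsch/Procrustes method). The sketch below illustrates that idea under this assumption; the patent does not specify which registration algorithm is used:

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping
    paired marker points src -> dst, both (N, 3) arrays (Kabsch method)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    M = np.eye(4)                                # 4x4 registration matrix
    M[:3, :3], M[:3, 3] = R, t
    return M
```

Image-information-based registration (the other option mentioned above) would instead optimize a similarity metric between the images; the marker-based fit shown here only needs the paired marker coordinates.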
In one embodiment, if there are a plurality of simulated ablation needles of the same type, determining the position information of the simulated ablation needles includes: determining position information of the central position of the plurality of simulated ablation needles. If there are a plurality of simulated ablation needles whose types are not all the same, determining the position information of the simulated ablation needles includes: determining the ablation center of the plurality of simulated ablation needles according to the simulated ablation range of each simulated ablation needle, and determining position information of that ablation center.
In one embodiment, determining position information of a simulated ablation needle and position information of a simulated lesion center in a target space according to a spatial transformation matrix comprises:
acquiring a needle inserting angle and a needle inserting depth of a simulated ablation needle;
determining position information of the simulated ablation needle in an ultrasonic image space according to the needle inserting angle and the needle inserting depth;
acquiring position information of a simulated focus center in a three-dimensional image space;
when the target space is the ultrasonic image space, the position information of the simulated focus center in the three-dimensional image space is converted into the position information of the simulated focus center in the ultrasonic image space according to the space transformation matrix,
alternatively,
and when the target space is a three-dimensional image space, converting the position information of the simulated ablation needle in the ultrasonic image space into the position information of the simulated ablation needle in the three-dimensional image space according to the space transformation matrix.
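The conversion described above can be illustrated as follows: points along the simulated needle are generated in the ultrasound image plane from the insertion angle and depth, then mapped into the target space with the spatial transformation matrix (its inverse would map the other way). The planar needle model and all names here are illustrative assumptions:

```python
import numpy as np

def needle_points_in_image(entry_xy, angle_deg, depth, n=50):
    """Sample n points along a simulated needle path in the (2D) ultrasound
    image plane, from an entry point, insertion angle and insertion depth.
    Returned as homogeneous coordinates with the image plane at z = 0."""
    theta = np.deg2rad(angle_deg)
    d = np.linspace(0.0, depth, n)
    x = entry_xy[0] + d * np.cos(theta)
    y = entry_xy[1] + d * np.sin(theta)
    return np.stack([x, y, np.zeros(n), np.ones(n)], axis=1)

def to_target_space(points_h, spatial_transform):
    """Map homogeneous points into the target space with the spatial
    transformation matrix (use np.linalg.inv(...) for the reverse mapping)."""
    return (spatial_transform @ points_h.T).T[:, :3]

# a vertical 10-unit insertion, mapped with the identity transform
pts = needle_points_in_image((0.0, 0.0), 90.0, 10.0, n=11)
mapped = to_target_space(pts, np.eye(4))
```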
In one embodiment, acquiring three-dimensional image data of a target tissue includes:
acquiring at least one of computed tomography data, magnetic resonance imaging data, and three-dimensional ultrasound data of a target tissue,
alternatively,
three-dimensional image data of the target tissue is acquired by performing three-dimensional reconstruction on an ultrasonic image of the target tissue acquired by an ultrasonic probe.
In one embodiment, the method further comprises:
fusing and displaying the ultrasonic image and the three-dimensional image data;
and performing three-dimensional visual display on the simulated ablation needle and the simulated focus in the three-dimensional image displayed in a fusion mode.
In one embodiment, a three-dimensional visualization of a simulated ablation needle and a simulated lesion includes:
performing three-dimensional visual display on the simulated ablation needle and the simulated focus from a target angle through a window, wherein the target angle is determined and/or adjusted according to operation information input by a user;
alternatively,
and performing three-dimensional visual display on the simulated ablation needle and the simulated lesion from different angles through a plurality of windows, wherein the displayed angle in each window is determined and/or adjusted according to the operation information input by the user.
In one embodiment, the method further comprises:
determining a simulated ablation range of the simulated ablation needle;
and calculating and displaying the proportion of the simulated lesion covered by the simulated ablation range in real time according to the position information and the simulated ablation range of the simulated ablation needle and the position information and the size of the simulated lesion.
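Assuming the simulated lesion and the simulated ablation range are represented as boolean voxel masks on a common grid (one possible representation; the patent does not fix one), the coverage proportion can be computed as the fraction of lesion voxels that fall inside the ablation mask:

```python
import numpy as np

def coverage_ratio(lesion_mask, ablation_mask):
    """Proportion of the simulated lesion covered by the simulated ablation
    range, both given as boolean voxel masks on the same grid."""
    lesion_voxels = lesion_mask.sum()
    if lesion_voxels == 0:
        return 0.0
    return float((lesion_mask & ablation_mask).sum()) / float(lesion_voxels)

def sphere_mask(shape, center, radius):
    """Toy helper: a spherical voxel mask (stand-in for a segmented lesion
    or a simulated ablation range)."""
    z, y, x = np.indices(shape)
    return ((z - center[0])**2 + (y - center[1])**2
            + (x - center[2])**2) <= radius**2
```

With the needle position updated in real time, recomputing this ratio per frame gives the live coverage display described above.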
In one embodiment, determining a simulated ablation range for a simulated ablation needle comprises:
acquiring ablation parameters of a simulated ablation needle, wherein the ablation parameters comprise an energy radiation range and a heat source position;
and determining the simulated ablation range of the simulated ablation needle according to the ablation parameters.
In one embodiment, if there are a plurality of ablation simulation needles, determining a simulated ablation range of the ablation simulation needle according to the ablation parameters includes:
determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameters corresponding to each simulated ablation needle to obtain a plurality of simulated energy distributions corresponding to a plurality of simulated ablation needles;
determining a combined energy value in a combined simulated energy distribution range corresponding to the plurality of simulated ablation needles based on the plurality of simulated energy distributions;
determining an energy distribution area with a combined energy value larger than a first energy threshold value from the combined simulation energy distribution range;
and determining simulated ablation ranges corresponding to the plurality of simulated ablation needles according to the energy distribution areas with the combined energy value larger than the first energy threshold value.
In one embodiment, determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameters corresponding to each simulated ablation needle comprises:
determining a three-dimensional coordinate point in the energy radiation range according to the energy radiation range of each simulated ablation needle and the heat source position of each simulated ablation needle;
calculating the energy value of the three-dimensional coordinate point and determining a target three-dimensional coordinate point with the energy value larger than a second energy threshold value;
determining the target three-dimensional coordinate points as the simulated ablation range of each simulated ablation needle;
and determining the simulated energy distribution corresponding to each simulated ablation needle according to the simulated ablation range of each simulated ablation needle and the energy value of the target three-dimensional coordinate point.
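The per-needle and combined energy computations above can be sketched as follows. The linear decay model, the energy constant, and the thresholds are purely illustrative stand-ins for a real ablation energy model:

```python
import numpy as np

def needle_energy(grid_pts, heat_source, radiated_range, e0=100.0):
    """Toy per-needle simulated energy distribution: energy decays linearly
    with distance from the heat source and is zero outside the energy
    radiation range. grid_pts: (N, 3) coordinate points."""
    r = np.linalg.norm(grid_pts - heat_source, axis=-1)
    return np.where(r <= radiated_range, e0 * (1.0 - r / radiated_range), 0.0)

def combined_ablation_region(grid_pts, needles, first_energy_threshold):
    """Sum the simulated energy distributions of all needles and keep the
    region whose combined energy value exceeds the first energy threshold."""
    total = np.zeros(len(grid_pts))
    for heat_source, radiated_range in needles:
        total += needle_energy(grid_pts, np.asarray(heat_source), radiated_range)
    return total > first_energy_threshold

# two overlapping needles on a small set of coordinate points
grid = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [5.0, 0.0, 0.0]])
needles = [((0.0, 0.0, 0.0), 2.0), ((2.0, 0.0, 0.0), 2.0)]
mask = combined_ablation_region(grid, needles, 80.0)
```

The point midway between the two heat sources receives energy from both needles, so it can exceed the threshold even though neither needle alone would reach it there, which is the effect of the combined energy value described above.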
In one embodiment, the method further comprises:
the distance between the boundary of the simulated ablation range and the boundary of the at-risk (dangerous) tissue is determined and displayed on the display interface.
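If both boundaries are represented as point sets sampled on the respective surfaces (an assumed representation; the patent does not specify one), the minimal boundary distance can be computed by brute force:

```python
import numpy as np

def min_boundary_distance(boundary_a, boundary_b):
    """Smallest Euclidean distance between two boundaries given as
    (N, 3) and (M, 3) point sets sampled on the simulated ablation
    surface and the at-risk tissue surface (brute-force sketch)."""
    diff = boundary_a[:, None, :] - boundary_b[None, :, :]
    return float(np.sqrt((diff ** 2).sum(axis=-1)).min())
```

A distance of zero would mean the simulated ablation range touches the at-risk tissue; for large point sets a spatial index (e.g. a k-d tree) would replace the brute-force pairwise computation.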
In one embodiment, the method further comprises:
when the simulated ablation range intersects the dangerous tissue, warning information is output to prompt the user that the dangerous tissue would be affected.
In a second aspect, an embodiment of the present invention provides an ultrasound imaging apparatus, including:
an ultrasonic probe;
the transmitting circuit is used for outputting the corresponding transmitting sequence to the ultrasonic probe according to a set mode so as to control the ultrasonic probe to transmit the corresponding ultrasonic wave;
the receiving circuit is used for receiving the ultrasonic echo signal output by the ultrasonic probe and outputting ultrasonic echo data;
a display for outputting visual information;
a processor for performing the ablation simulation method of any of the first aspects.
In a third aspect, embodiments of the present invention provide a computer-readable storage medium, in which computer-executable instructions are stored, and when executed by a processor, the computer-executable instructions are used to implement the ablation simulation method according to any one of the first aspect.
According to the ablation simulation method and device provided by the embodiments of the invention, an ultrasonic image of the target tissue is generated and three-dimensional image data of the target tissue are acquired; the spatial transformation matrix between the ultrasonic image space in which the ultrasonic image is located and the three-dimensional image space in which the three-dimensional image data are located is then determined; position information of the simulated ablation needle and of the simulated lesion is determined in the same target space according to the spatial transformation matrix; and finally the relative position between the simulated ablation needle and the simulated lesion is calculated and displayed in real time from that position information. Quantitative evaluation of the needle insertion position is thereby realized, and the doctor can be intuitively guided during needle insertion.
Drawings
Fig. 1 is a block diagram of an ultrasound imaging apparatus according to an embodiment of the present invention;
FIG. 2 is a flow chart of an ablation simulation method provided in accordance with an embodiment of the present invention;
FIGS. 3A and 3B are schematic diagrams of an interface provided by an embodiment of the present invention;
FIG. 4 is a flowchart of determining a spatial transformation matrix according to an embodiment of the present invention;
FIG. 5 is a flow chart of an ablation simulation method according to yet another embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating a cube vertex determination according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of 15 basic cubes according to an embodiment of the present invention;
fig. 8 is a flowchart of an ablation simulation method according to another embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings, where like elements in different embodiments are given like reference numbers. In the following description, numerous specific details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that, in different instances, some of the features may be omitted or replaced with other elements, materials, or methods. In some instances, certain operations related to the present application have not been shown or described in this specification, in order not to obscure the core of the application with unnecessary detail; a detailed description of these operations is unnecessary, as those skilled in the art can fully understand them from the description in the specification and the general knowledge in the art.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments. Likewise, the various steps or actions in the method descriptions may be exchanged or transposed in order, as will be apparent to a person skilled in the art. Thus, the various sequences in the specification and drawings are for the purpose of describing certain embodiments only and are not intended to imply a required order, unless such an order must be followed.
The numbering of components herein, e.g., "first" and "second", is used only to distinguish the described objects and does not have any sequential or technical meaning. The terms "connected" and "coupled" as used herein include both direct and indirect connection (coupling), unless otherwise specified.
As shown in fig. 1, the ultrasound imaging apparatus provided by the present invention may include: an ultrasound probe 20, a transmit/receive circuit 30 (i.e., a transmit circuit 310 and a receive circuit 320), a beam-forming module 40, an IQ demodulation module 50, a memory 60, a processor 70, and a human-computer interaction device. The processor 70 may include a control module 710 and an image processing module 720.
The ultrasonic probe 20 includes a transducer (not shown) composed of a plurality of array elements arranged in an array: the array elements may be arranged in a row to form a linear array, arranged in a two-dimensional matrix to form an area array, or arranged to form a convex array. The array elements emit ultrasonic beams according to excitation electric signals, or convert received ultrasonic beams into electric signals; each array element can thus perform the mutual conversion of electric pulse signals and ultrasonic beams, thereby transmitting ultrasonic waves into a target region of human tissue (in this embodiment, a target tissue containing a lesion, such as a tumor) and receiving the echoes of the ultrasonic waves reflected back through the tissue. In performing ultrasonic detection, the transmitting circuit 310 and the receiving circuit 320 control which array elements are used to transmit ultrasonic beams and which are used to receive them, or control the time slots in which the array elements transmit ultrasonic beams or receive their echoes. The array elements participating in ultrasonic transmission may be excited by electric signals simultaneously, so that the ultrasonic waves are transmitted simultaneously; or they may be excited by several electric signals separated by certain time intervals, so that ultrasonic waves are transmitted successively with those time intervals.
In this embodiment, the user selects an appropriate position and angle by moving the ultrasound probe 20 to transmit ultrasound waves to the target tissue 10 including the lesion and receive echoes of the ultrasound waves returned by the target tissue 10 including the lesion, and obtains and outputs electrical signals of the echoes, where the electrical signals of the echoes are channel analog electrical signals formed by using the receiving array elements as channels, and carry amplitude information, frequency information, and time information.
The transmitting circuit 310 is configured to generate a transmitting sequence according to the control of the control module 710 of the processor 70, where the transmitting sequence is configured to control some or all of the plurality of array elements to transmit ultrasonic waves to the biological tissue, and parameters of the transmitting sequence include the position of the array element for transmission, the number of array elements, and ultrasonic beam transmitting parameters (e.g., amplitude, frequency, number of transmissions, transmitting interval, transmitting angle, wave pattern, focusing position, etc.). In some cases, the transmit circuitry 310 is further configured to phase delay the transmitted beams to cause different transmit elements to transmit ultrasound at different times so that each transmitted ultrasound beam can be focused at a predetermined region of interest. In different operating modes, such as a B image mode, a C image mode, and a D image mode (doppler mode), the parameters of the transmitted sequence may be different, and the echo signals received by the receiving circuit 320 and processed by subsequent modules and corresponding algorithms may generate a B image reflecting the tissue anatomy, a C image reflecting the tissue anatomy and blood flow information, and a D image reflecting the doppler spectrum image.
The receiving circuit 320 is used for receiving the electrical signal of the ultrasonic echo from the ultrasonic probe 20 and processing the electrical signal of the ultrasonic echo. The receive circuit 320 may include one or more amplifiers, analog-to-digital converters (ADCs), and the like. The amplifier is used for amplifying the electric signal of the received ultrasonic echo after proper gain compensation, the analog-to-digital converter is used for sampling the analog echo signal according to a preset time interval so as to convert the analog echo signal into a digitized signal, and the digitized echo signal still retains amplitude information, frequency information and phase information. The data output from the receiving circuit 320 may be output to the beam forming module 40 for processing or may be output to the memory 60 for storage.
The beam forming module 40 is connected to the receiving circuit 320 and performs beam forming processing, such as delaying and weighted summation, on the signals it outputs. Because the distances from an ultrasonic receiving point in the measured tissue to the individual receiving array elements differ, the channel data of the same receiving point output by different receiving array elements have delay differences; delay processing is therefore performed to align the phases, and the different channel data of the same receiving point are then weighted and summed to obtain the beamformed ultrasonic image data, also called radio-frequency data (RF data). The beam forming module 40 outputs the RF data to the IQ demodulation module 50. In some embodiments, the beam forming module 40 may also output the RF data to the memory 60 for buffering or saving, or directly to the image processing module 720 of the processor 70 for image processing.
The beam forming module 40 may perform the above functions in hardware, firmware, or software. For example, it may include a central controller circuit (CPU), one or more microprocessor chips, or any other electronic component capable of processing input data according to specific logic instructions; when implemented in software, it may execute instructions stored on a tangible, non-transitory computer-readable medium (e.g., the memory 60) to perform the beamforming calculation using any suitable beamforming method.
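The delay-and-sum processing described above can be sketched minimally as follows. This is an illustrative one-point sketch, not the device's actual implementation; integer-sample delays and the wrap-around of `np.roll` are simplifications:

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples, weights=None):
    """Minimal delay-and-sum beamforming for one receiving point:
    align each channel by its per-channel delay (in whole samples),
    then apply a weighted sum. channel_data: (n_channels, n_samples).
    Note: np.roll wraps samples around, a sketch-only simplification."""
    n_ch, n_s = channel_data.shape
    if weights is None:
        weights = np.ones(n_ch) / n_ch
    out = np.zeros(n_s)
    for c in range(n_ch):
        out += weights[c] * np.roll(channel_data[c], -delays_samples[c])
    return out

# two channels carrying the same echo, one delayed by a single sample;
# the matching delays re-align them so the summed peak doubles
sig = np.zeros(8)
sig[3] = 1.0
channels = np.stack([sig, np.roll(sig, 1)])
out = delay_and_sum(channels, [0, 1], weights=np.array([1.0, 1.0]))
```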
The IQ demodulation module 50 removes the signal carrier by IQ demodulation, extracts the tissue structure information contained in the signal, and performs filtering to remove noise; the signal obtained at this point is called the baseband signal (IQ data pairs). The IQ demodulation module 50 outputs the IQ data pairs to the image processing module 720 of the processor 70 for image processing. In some embodiments, the IQ demodulation module 50 also buffers or saves the IQ data pairs to the memory 60, so that the image processing module 720 can read the data from the memory 60 for subsequent image processing.
The processor 70 may be configured as a central controller circuit (CPU), one or more microprocessors, a graphics controller circuit (GPU), or any other electronic component capable of processing input data according to specific logic instructions. It may control peripheral electronic components according to input or predetermined instructions, read data from and/or save data to the memory 60, and process input data by executing programs in the memory 60, for example performing one or more processing operations on the acquired ultrasound data according to one or more operating modes. These processing operations include, but are not limited to, adjusting or defining the form of the ultrasonic waves emitted by the ultrasonic probe 20, generating various image frames for display by the display 80 of the subsequent human-computer interaction device, adjusting or defining the content and form displayed on the display 80, and adjusting one or more image display settings (e.g., ultrasound images, interface components, positioning regions) displayed on the display 80.
The image processing module 720 is used to process the data output by the beam synthesis module 40 or the data output by the IQ demodulation module 50 to generate a gray-scale image of signal intensity variation within the scanning range, which reflects the anatomical structure inside the tissue, and is called B image. The image processing module 720 may output the B image to the display 80 of the human-computer interaction device for display.
The human-computer interaction device is used for human-computer interaction, that is, receiving user input and outputting visual information. User input may be received through a keyboard, operating buttons, a mouse, a trackball, and the like, or through a touch screen integrated with the display; visual information is output using the display 80.
The memory 60 may be a tangible and non-transitory computer-readable medium, such as a flash memory card, solid-state memory, or hard disk, for storing data or programs. For example, the memory 60 may store acquired ultrasound data or image frames generated by the processor 70 that are not displayed immediately, or the memory 60 may store a graphical user interface, one or more default image display settings, and programming instructions for the processor, the beam synthesis module, or the IQ demodulation module.
It should be noted that the structure shown in fig. 1 is merely illustrative, and may include more or fewer components than those shown in fig. 1, or have a different configuration than that shown in fig. 1. The components shown in fig. 1 may be implemented in hardware and/or software. The ultrasound imaging device shown in fig. 1 may be used to perform the ablation simulation method provided by any of the embodiments of the present invention.
Referring to fig. 2, an ablation simulation method according to an embodiment of the present invention may include:
S201, transmitting ultrasonic waves to target tissue containing a lesion through an ultrasonic probe, and generating an ultrasonic image of the target tissue according to received ultrasonic echo data.
When performing ablation simulation with ultrasound, whether in the preoperative planning stage or the surgical stage, an ultrasound image of the target tissue needs to be acquired first. The target tissue in this embodiment is a tissue containing a lesion to be ablated, such as a tumor. Specifically, ultrasonic echo data may be generated by moving an ultrasonic probe of the ultrasonic imaging apparatus at an appropriate position, transmitting ultrasonic waves at an appropriate angle to target tissue, and receiving ultrasonic echoes returned from the target tissue, and then generating an ultrasonic image of the target tissue from the ultrasonic echo data.
S202, three-dimensional image data of the target tissue is obtained.
In this embodiment, three-dimensional image data of the target tissue stored in advance may be acquired from a storage device, or the three-dimensional image data of the target tissue may be acquired directly by an acquisition device. The three-dimensional image data of the target tissue in this embodiment may be any one or more of Computed Tomography (CT) data, Magnetic Resonance Imaging (MRI) data, and three-dimensional ultrasound (3D-US) data. For example, CT data of the target tissue may be acquired for ablation simulation, or CT data and MRI data of the target tissue may both be acquired for ablation simulation.
In an alternative embodiment, a 3D reconstruction technique may also be used to reconstruct three-dimensional image data of the target tissue from its two-dimensional image data. Specifically, three-dimensional ultrasound image data of the target tissue may be obtained by three-dimensionally reconstructing two-dimensional ultrasound images of the target tissue acquired by the ultrasound probe, for example by performing a freehand scan to acquire a segment of ultrasound cine with localization information, from which the 3D-US data of the target tissue are reconstructed.
It should be noted that the execution order of S201 and S202 is not limited in this embodiment: the two-dimensional ultrasound image of the target tissue may be acquired first and then the three-dimensional image data, or the three-dimensional image data of the target tissue may be acquired first and then the two-dimensional ultrasound image.
S203, determining a space transformation matrix between the ultrasonic image space where the ultrasonic image is located and the three-dimensional image space where the three-dimensional image data is located.
The ultrasound image is located in an ultrasound image space and the three-dimensional image data is located in a three-dimensional image space, which may be converted to the same space by a spatial transformation matrix although they are located in different image spaces. The spatial transformation matrix may be a matrix capable of transforming data in the ultrasound image space to data in the three-dimensional image space, or may be a matrix capable of transforming data in the three-dimensional image space to data in the ultrasound image space, according to actual transformation requirements.
S204, determining the position information of the simulated ablation needle and the position information of the simulated focus in a target space according to the space transformation matrix, wherein the target space is an ultrasonic image space or a three-dimensional image space.
When the ultrasonic imaging equipment is used for ablation simulation, the puncture frame can be arranged on the ultrasonic probe and used for fixing the ablation needle. The position information of the simulated ablation needle in the ultrasonic image space can be acquired according to the information provided by the positioning sensor arranged on the ultrasonic probe. Optionally, the position information of the simulated ablation needle in the ultrasound image space can be acquired according to the information provided by the positioning sensor arranged at the tail part of the ablation needle. A doctor usually marks the position of a lesion in three-dimensional image data of a target tissue, and may obtain position information of a simulated lesion in a three-dimensional image space according to the marking of the doctor. In this embodiment, the position information of the simulated ablation needle may be position information of a heat source center of the simulated ablation needle, and the position information of the simulated lesion may be position information of a center of the simulated lesion.
In order to facilitate quantitative evaluation of the position of the insertion needle, position information of the simulated ablation needle and the simulated lesion needs to be determined in the same target space. When the target space is an ultrasonic image space, the position information of the simulated focus in the three-dimensional image space can be converted into the position information of the simulated focus in the ultrasonic image space according to the space transformation matrix; when the target space is a three-dimensional image space, the position information of the simulated ablation needle in the ultrasonic image space can be converted into the position information of the simulated ablation needle in the three-dimensional image space according to the spatial transformation matrix. The position information of the simulated ablation needle and the simulated focus in the target space can be obtained by converting the space transformation matrix.
When a single needle is used to ablate a small tumor, the position information of the heat source center of the single simulated ablation needle is determined; when multiple needles are used to ablate a large tumor, the center position of the multiple simulated ablation needles or the position information of the ablation center is determined. Specifically, if there are multiple simulated ablation needles and they are all of the same type, determining the position information of the simulated ablation needles includes: determining the position information of the center position of the multiple simulated ablation needles. If there are multiple simulated ablation needles and their types are not all the same, determining the position information of the simulated ablation needles includes: determining the ablation center of the multiple simulated ablation needles according to the simulated ablation range of each needle, and determining the position information of that ablation center. It should be noted that the division between tumor sizes may follow actual clinical requirements, which this embodiment does not limit.
For the case that the target space is a three-dimensional image space, the position information of the plurality of simulated ablation needles in the ultrasound image space can be respectively obtained, then the central positions of the plurality of simulated ablation needles or the position information of the ablation center in the ultrasound image space are determined, and finally the central positions of the plurality of simulated ablation needles or the position information of the ablation center in the three-dimensional image space is obtained through one-time conversion of a space transformation matrix; or after the position information of the plurality of simulated ablation needles in the ultrasound image space is respectively obtained, the position information of the plurality of simulated ablation needles in the three-dimensional image space is respectively obtained through multiple times of conversion of the spatial transformation matrix, and finally, the center positions of the plurality of simulated ablation needles or the position information of the ablation center is determined according to the position information of the plurality of simulated ablation needles in the three-dimensional image space.
Taking the insertion of two ablation needles as an example, suppose the position coordinates of the two simulated ablation needles are P1 and P2. If the two ablation needles are of the same type, the center position of the two simulated ablation needles, (P1+P2)/2, may be determined as the position information of the simulated ablation needles. If the two ablation needles are of different types, the ablation center of the two simulated ablation needles is determined, where the distance from the ablation center to each simulated ablation needle is inversely related to the size of that needle's simulated ablation range. Suppose the distance between P1 and P2 is 20mm, the simulated ablation range of needle P1 is a sphere with a radius of 4mm, and the simulated ablation range of needle P2 is a sphere with a radius of 1mm; the ablation center may then be the point on the line connecting P1 and P2 at a distance of 16mm from P2.
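The two-needle example above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names and the restriction to two needles are assumptions, and the ablation center is placed so that its distance to each needle is inversely proportional to that needle's simulated ablation radius.

```python
import numpy as np

def needle_center(positions):
    """Same-type needles: the center position is the mean of the needle coordinates."""
    return np.mean(np.asarray(positions, float), axis=0)

def ablation_center(p1, p2, r1, r2):
    """Two needles of different types: the ablation center lies on the segment
    P1-P2, closer to the needle with the larger simulated ablation radius
    (distance to each needle inversely related to its ablation range)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    t = r2 / (r1 + r2)  # fraction of the way from P1 toward P2
    return p1 + t * (p2 - p1)

p1 = np.array([0.0, 0.0, 0.0])
p2 = np.array([20.0, 0.0, 0.0])
center = ablation_center(p1, p2, 4.0, 1.0)  # 4 mm from P1, i.e. 16 mm from P2
```

With the radii from the text (4mm and 1mm) and a 20mm separation, the computed center falls 16mm from P2, matching the example.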
S205, calculating and displaying the relative position between the simulated ablation needle and the simulated focus in real time according to the position information of the simulated ablation needle and the position information of the simulated focus.
After the position information of the simulated ablation needle and the simulated focus is determined in the same space, the relative position between the simulated ablation needle and the simulated focus can be determined according to the position information. The relative position in the embodiment can be expressed in various ways, and no matter what way the relative position is displayed, the quantitative evaluation of the position of the inserted ablation needle can be realized through the relative position, and the doctor can be intuitively guided to insert the needle through displaying the relative position.
In an alternative embodiment, the distance between the simulated ablation needle and the simulated lesion can be calculated and displayed in real time; the distance may be a Euclidean distance, as shown in fig. 3A and 3B. For a single-needle ablation scenario, fig. 3A shows a simulated tumor (301) and a single simulated ablation needle (302) in the target space (303), together with a text box (304) reading "distance between the simulated ablation needle and the simulated lesion: 10mm"; this may be the distance between the heat source center of the simulated ablation needle and the center of the simulated lesion. For a two-needle ablation scenario, fig. 3B shows a simulated tumor (301) and two simulated ablation needles (305) in the target space (303), together with a text box (304) reading "distance between the simulated ablation needles and the simulated lesion: 5mm"; this may be the distance between the center position of the two simulated ablation needles and the center of the simulated lesion. Quantitative evaluation of the needle position can be achieved through this distance, and the physician can adjust the needle with the goal of reducing it, so displaying the distance intuitively guides needle insertion.
In another alternative embodiment, the relative position between the simulated ablation needle and the simulated lesion may also be represented by an angle. Taking the single-needle ablation scenario shown in fig. 3A as an example, an angle between a straight line where the needle bar of the simulated ablation needle is located and a straight line passing through the center point of the simulated lesion and parallel to the body position of the patient can be calculated; taking the two-needle ablation scenario shown in fig. 3B as an example, the angle between the straight line passing through the centers of the two simulated ablation needles and the straight line passing through the center of the simulated lesion and parallel to the patient position can be calculated. The quantitative evaluation of the position of the inserted ablation needle can be realized through the angle, and the doctor can adjust the position of the inserted ablation needle by taking the reduction of the angle as a target, so that the doctor can be intuitively guided to insert the needle by displaying the angle. Alternatively, the angle and distance may also be combined to represent the relative position between the simulated ablation needle and the simulated lesion.
In yet another alternative embodiment, a three-dimensional cartesian coordinate system may be established in the target space with the center of the simulated lesion as the coordinate origin, and then the position coordinates of the simulated ablation needle in the established three-dimensional cartesian coordinate system may be determined according to the position information of the simulated ablation needle and the simulated lesion in the target space, so as to represent the relative position between the simulated ablation needle and the simulated lesion with the position coordinates.
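The three representations above (distance, angle, lesion-centered coordinates) can be sketched together. This is an assumed illustration: the function name, the reference direction standing in for "parallel to the patient's body position", and the use of the unsigned angle between lines are not specified by the text.

```python
import numpy as np

def relative_metrics(needle_pos, lesion_center,
                     needle_dir=None, ref_dir=(1.0, 0.0, 0.0)):
    """Return (Euclidean distance, angle in degrees or None, lesion-centered
    coordinates) of the needle relative to the simulated lesion.
    ref_dir is an assumed direction parallel to the patient's body axis."""
    needle_pos = np.asarray(needle_pos, float)
    lesion_center = np.asarray(lesion_center, float)
    dist = np.linalg.norm(needle_pos - lesion_center)
    coords = needle_pos - lesion_center  # Cartesian frame at the lesion center
    angle = None
    if needle_dir is not None:
        u = np.asarray(needle_dir, float)
        v = np.asarray(ref_dir, float)
        # angle between the needle-shaft line and the reference line
        cosang = abs(np.dot(u, v)) / (np.linalg.norm(u) * np.linalg.norm(v))
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return dist, angle, coords
```

A needle at (3, 4, 0) with a lesion at the origin, for example, yields a distance of 5 and, for a shaft direction perpendicular to the reference line, an angle of 90 degrees.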
In order to guide the needle insertion more intuitively, in addition to displaying the relative position between the simulated ablation needle and the simulated lesion on the display interface, the trend of change in that relative position between the needle's current position and its previous position may be determined. For example, it may be determined whether the distance between the simulated ablation needle at the current position and the simulated lesion has increased or decreased compared to the distance at the previous position, and this trend may be displayed on the display interface. The trend can be displayed by means of text, icons, colors, and so on. For example, "increase" or "decrease" may be displayed directly; a red icon may be used to warn that the distance has increased and a green icon to indicate that it has decreased. When the physician sees the indication that the distance has increased, the physician knows that the current adjustment direction deviates from the ablation target and needs to be adjusted again; when the physician sees the indication that the distance has decreased, the physician knows that the current adjustment direction is correct and can continue adjusting. By displaying the trend of the relative position on the display interface, the physician can be guided even more intuitively during needle insertion.
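The trend indication can be sketched as a small helper; the function name, tolerance, and the mapping of trends to icon colors are illustrative assumptions, not the patent's interface.

```python
def distance_trend(prev_dist, curr_dist, tol=1e-6):
    """Compare the current needle-to-lesion distance with the previous one and
    report the trend shown to the physician ("increase" might be rendered as a
    red icon, "decrease" as a green one)."""
    if curr_dist > prev_dist + tol:
        return "increase"   # moving away from the ablation target
    if curr_dist < prev_dist - tol:
        return "decrease"   # adjustment direction is correct
    return "unchanged"
```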
According to the ablation simulation method provided by the embodiment, the ultrasonic image of the target tissue is generated, the three-dimensional image data of the target tissue is obtained, the spatial transformation matrix between the ultrasonic image space where the ultrasonic image is located and the three-dimensional image space where the three-dimensional image data is located is determined, the position information of the simulated ablation needle and the position information of the simulated focus are determined in the same target space according to the spatial transformation matrix, and finally the relative position between the simulated ablation needle and the simulated focus is calculated and displayed in real time according to the position information of the simulated ablation needle and the position information of the simulated focus, so that the quantitative evaluation of the position of the insertion needle is realized, and the insertion needle can be intuitively guided by a doctor.
In order to quantitatively evaluate the needle inserting position of the ablation needle, the relative position between the simulated ablation needle and the simulated lesion needs to be calculated in the same space, and a spatial transformation matrix for performing spatial transformation is particularly important. Referring to fig. 4, on the basis of the foregoing embodiment, in the ablation simulation method provided in this embodiment, determining a spatial transformation matrix between an ultrasound image space in which an ultrasound image is located and a three-dimensional image space in which three-dimensional image data is located may include:
S2031, a first transformation matrix from the ultrasonic image space to the position sensor space is obtained.
The position sensor may be generally fixed to the ultrasound probe, and continuously provides position information as the ultrasound probe moves. The first transformation matrix in this embodiment is a transformation matrix from an ultrasound image space to a position sensor space, which may be denoted as a. When the position sensor is fixed on the ultrasonic probe and the model of the ultrasonic probe is unchanged, A is fixed and can be obtained by a calibration method.
S2032, a second transformation matrix from the position sensor space to the world coordinate space is obtained.
The second transformation matrix in this embodiment is the transformation matrix from the position sensor space to the world coordinate space, which may be denoted as R_probe. R_probe can be read directly from the magnetic positioning controller. As the ultrasound probe moves, R_probe changes continuously.
S2033, determining a third transformation matrix from the world coordinate space to the three-dimensional image space.
The third transformation matrix in the present embodiment is a transformation matrix from the world coordinate space to the three-dimensional image space, and may be denoted as P. P may be determined by a registration result between the ultrasound image of the target tissue and the three-dimensional image data of the target tissue.
In an alternative embodiment, the ultrasound image of the target tissue and the three-dimensional image data of the target tissue may be registered based on the in-vitro markers or based on the image information to obtain a registration matrix M, and then the third transformation matrix P may be determined according to the registration matrix M.
S2034, a space transformation matrix is determined according to the first transformation matrix, the second transformation matrix and the third transformation matrix.
After obtaining the first transformation matrix A, the second transformation matrix R_probe, and the third transformation matrix P, the spatial transformation matrix is determined from A, R_probe, and P. When the target space is the three-dimensional image space, the spatial transformation matrix T may be expressed as T = P·R_probe·A; when the target space is the ultrasound image space, the spatial transformation matrix T may be expressed as T = (P·R_probe·A)^(-1).
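The chained transform can be sketched with homogeneous 4×4 matrices. The translation values below are purely illustrative stand-ins: in practice A would come from probe calibration, R_probe from the magnetic positioning controller, and P from the registration result.

```python
import numpy as np

def translation(tx, ty, tz):
    """Homogeneous 4x4 translation matrix (illustrative example transform)."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

A = translation(1.0, 0.0, 0.0)        # ultrasound image space -> position sensor space
R_probe = translation(0.0, 2.0, 0.0)  # position sensor space -> world coordinate space
P = translation(0.0, 0.0, 3.0)        # world coordinate space -> 3D image space

T_to_3d = P @ R_probe @ A             # target space = three-dimensional image space
T_to_us = np.linalg.inv(T_to_3d)      # target space = ultrasound image space

def apply(T, point):
    """Apply a homogeneous transform to a 3D point."""
    return (T @ np.append(np.asarray(point, float), 1.0))[:3]
```

Applying T_to_3d and then T_to_us returns the original point, reflecting that the two spatial transformation matrices are inverses of each other.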
In the embodiment, a first transformation matrix from an ultrasonic image space to a position sensor space, a second transformation matrix from the position sensor space to a world coordinate space, and a third transformation matrix from the world coordinate space to a three-dimensional image space are sequentially obtained, and then a space transformation matrix is determined according to the first transformation matrix, the second transformation matrix, and the third transformation matrix. The spatial transformation matrix can realize the mutual conversion of the position information between the ultrasonic image space and the three-dimensional image space, thereby obtaining the position information in the same target space.
In the following, how to determine the position information of the simulated ablation needle and the position information of the simulated lesion center in the target space according to the spatial transformation matrix will be described in detail respectively for the case where the target space is an ultrasound image space and the case where the target space is a three-dimensional image space. In one embodiment, determining the position information of the simulated ablation needle and the position information of the simulated lesion center in the target space according to the spatial transformation matrix may include:
acquiring a needle inserting angle and a needle inserting depth of a simulated ablation needle;
determining position information of the simulated ablation needle in an ultrasonic image space according to the needle inserting angle and the needle inserting depth;
acquiring position information of a simulated focus center in a three-dimensional image space;
when the target space is the ultrasonic image space, the position information of the simulated focus center in the three-dimensional image space is converted into the position information of the simulated focus center in the ultrasonic image space according to the space transformation matrix,
or,
and when the target space is a three-dimensional image space, converting the position information of the simulated ablation needle in the ultrasonic image space into the position information of the simulated ablation needle in the three-dimensional image space according to the space transformation matrix.
Ablation parameters differ between types of ablation needles: for example, some ablation needles have a spherical energy radiation range while others have an ellipsoidal one; some ablation needles have the heat source at the needle tip, while others have it at a distance d_hot from the needle tip. Therefore, in use, the corresponding ablation parameters need to be set according to the model of the ablation needle actually used. For an ablation needle with a spherical energy radiation range, the radius, the tip distance, the needle insertion depth, and so on need to be set; for an ablation needle with an ellipsoidal energy radiation range, the long diameter, the short diameter, the tip distance, the needle insertion depth, and so on need to be set. The insertion depth d of the simulated ablation needle may be derived from the set needle insertion depth. When ablation is performed with a puncture guide, the angle of the puncture guide needs to be set first, and the insertion angle β of the simulated ablation needle can be obtained from the set angle of the puncture guide. After the insertion angle β and the insertion depth d of the simulated ablation needle are obtained, the position information P_us_probe(x_us_probe, y_us_probe) of the simulated ablation needle in the current ultrasound sector, i.e. in the ultrasound image space, can be determined from β and d, where x_us_probe = d·sinβ and y_us_probe = d·cosβ. When d_hot > 0, x_us_probe = (d−d_hot)·sinβ and y_us_probe = (d−d_hot)·cosβ. Physicians usually mark the position of the lesion in the three-dimensional image data of the target tissue, and the position information P_sec_tumor of the simulated lesion center in the three-dimensional image space can be obtained from this mark.
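The in-plane position formulas can be sketched directly; the function name is an assumption, and the angle convention (β measured so that x = d·sinβ, y = d·cosβ) follows the formulas in the text.

```python
import math

def needle_position(beta_deg, d, d_hot=0.0):
    """Heat-source position of the simulated needle in the ultrasound image
    plane: x = d'·sin(beta), y = d'·cos(beta), where d' = d - d_hot when the
    heat source sits a distance d_hot behind the needle tip (d_hot > 0)."""
    d_eff = d - d_hot if d_hot > 0 else d
    beta = math.radians(beta_deg)
    return (d_eff * math.sin(beta), d_eff * math.cos(beta))
```

At β = 0 the needle lies along the y axis of the sector, so a 10mm insertion depth gives (0, 10); a 2mm heat-source offset shortens the effective depth to 8mm.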
When the target space is the ultrasound image space, the position information P_sec_tumor of the simulated lesion center in the three-dimensional image space can be converted according to the spatial transformation matrix T into the position information P_us_tumor of the simulated lesion center in the ultrasound image space (P_us_tumor = T·P_sec_tumor). Having thus obtained the position information P_us_probe of the simulated ablation needle and the position information P_us_tumor of the simulated lesion in the ultrasound image space, the distance between the simulated ablation needle and the simulated lesion, D_dist = |P_us_tumor − P_us_probe|, can express the relative position between the simulated ablation needle and the simulated lesion, allowing quantitative evaluation of the needle position. Without considering whether there is at-risk tissue surrounding the tumor, a smaller D_dist indicates a better needle position.
When the target space is the three-dimensional image space, the position information P_us_probe of the simulated ablation needle in the ultrasound image space can be converted according to the spatial transformation matrix T into the position information P_sec_probe of the simulated ablation needle in the three-dimensional image space (P_sec_probe = T·P_us_probe). Having thus obtained the position information P_sec_probe of the simulated ablation needle and the position information P_sec_tumor of the simulated lesion in the three-dimensional image space, the distance between the simulated ablation needle and the simulated lesion, D_dist = |P_sec_tumor − P_sec_probe|, can express the relative position between the simulated ablation needle and the simulated lesion, allowing quantitative evaluation of the needle position. Without considering whether there is at-risk tissue surrounding the tumor, a smaller D_dist indicates a better needle position.
Referring to fig. 5, in order to more intuitively guide the physician to perform the insertion during the ablation process, on the basis of any of the above embodiments, the ablation simulation method provided in this embodiment may further include:
S501, fusion display is carried out on the ultrasonic image and the three-dimensional image data.
And S502, performing three-dimensional visual display on the simulated ablation needle and the simulated focus in the three-dimensional image displayed in a fusion mode.
The image information and the positioning information can be utilized to carry out registration fusion on the ultrasonic image and the three-dimensional image data of the target tissue, and the fusion result is displayed on a display interface. By fusing and displaying the ultrasonic image and the three-dimensional image data, the relative position of the current ultrasonic sector image of the target tissue in the three-dimensional image data can be visually checked. The three-dimensional visual display of the simulated ablation needle and the simulated focus under the same space coordinate system can be realized in the three-dimensional image displayed in a fusion manner. When the ultrasonic probe moves, the relative position of the current ultrasonic sector and the three-dimensional image data changes along with the change of the coordinates of the position sensor.
In an alternative embodiment, the simulated ablation needle and the simulated lesion are displayed in a three-dimensional manner through a window, and the simulated ablation needle and the simulated lesion are displayed in a three-dimensional manner from a target angle, wherein the target angle is determined and/or adjusted according to the operation information input by the user. That is to say, the relative position between the simulated ablation needle and the simulated focus can be checked from any angle through one display window on the display interface, so that the position relationship between the simulated ablation needle and the simulated focus can be conveniently and comprehensively known, and the needle inserting process can be better guided. For example, the simulated ablation needle and the simulated focus can be displayed in a rotating manner at a preset angular speed, so that 360-degree display without dead angles is realized; the simulated ablation needle and the simulated lesion may also be displayed visually in three dimensions from a user-specified perspective.
In another alternative embodiment, the simulated ablation needle and the simulated lesion may be three-dimensionally visualized through a plurality of windows from different angles, and the displayed angle in each window is preset or determined and/or adjusted according to the operation information input by the user. That is, each window can display the simulated ablation needle and the simulated lesion in a three-dimensional visualization mode from any angle. The relative position between the simulated ablation needle and the simulated focus is displayed from a plurality of different angles through a plurality of windows, so that comparison is convenient, the position relation between the simulated ablation needle and the simulated focus is conveniently and rapidly and comprehensively known, and the needle inserting process can be better guided.
The three-dimensional visual display of the simulated ablation needle and the simulated lesion can be realized by referring to the following modes:
and marking the focus in the three-dimensional image, such as marking a tumor, and drawing the marked focus to realize three-dimensional visualization of the focus. In the three-dimensional visualization, a three-dimensional model stereo image of a target structure is reconstructed by using the target structure in a series of original two-dimensional images based on a drawing method. The rendering method can be divided into surface rendering and volume rendering.
The surface drawing is to adopt a segmentation technology to carry out operations such as contour recognition, extraction and the like on a series of two-dimensional images, finally restore a three-dimensional model of the detected object and display the three-dimensional model in a surface mode. The premise of lesion mapping is to segment the lesions in the volume data, and the accurate segmentation of the lesions is an important premise in the stages of subsequent treatment plan formulation, computer-assisted surgery, needle insertion result verification and the like. There are many ways to segment the lesion, such as manual segmentation, semi-automatic segmentation, and automatic segmentation. There are also many lesion segmentation algorithms, such as level set segmentation, graph segmentation, region growing, etc. in the traditional algorithm; FCN, U-Net, V-Net and the like in the deep learning method.
After the lesion is accurately segmented, surface rendering may be performed based on the segmented boundaries. The Marching Cubes (MC) algorithm is a classic surface rendering algorithm, also known as "iso-surface extraction". Essentially, a series of two-dimensional slice data is treated as a three-dimensional data field, from which material at a certain threshold is extracted and connected into triangular patches in a topologically consistent manner. The basic idea of the MC algorithm is to process each voxel in the volumetric data field one by one and determine the structural form of the isosurface inside the voxel based on the values at its vertices. In implementing the algorithm, two main calculations are performed on the intra-voxel isosurface structure: 1. computing the triangular patches that approximate the isosurface inside the voxel; 2. computing the normal vectors at the vertices of the triangular patches.
Referring to fig. 6, after a vertex energy value is calculated, it is compared with the set energy threshold: if the vertex energy value is smaller than the energy threshold, the point is an exterior point (OUTSIDE) and its bit is set to 1; if the vertex energy value is greater than the energy threshold, the point is an interior point (INSIDE) and its bit is set to 0. From the states of the 8 vertices of a cube, an index value between 0 and 255 is obtained. The cube has rotational symmetry, and rotation does not affect the topology of the iso-surface. Referring to fig. 7, 15 basic cubes suffice to cover all 256 possible cases. From these 15 basic cubes a lookup table of length 256 can be created, recording the iso-surface connection pattern for every case. Querying the lookup table with the index value yields the edge numbers of the three vertices of each iso-surface triangle. After the edge numbers are obtained, linear interpolation along each edge gives the coordinates of the triangle vertices.
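The two per-voxel steps just described can be sketched as follows; the 256-entry connection table itself is omitted, the bit convention (below threshold = OUTSIDE = 1) follows the description above, and the function names are illustrative:

```python
import numpy as np

def cube_index(vertex_values, threshold):
    """Pack the OUTSIDE/INSIDE states of a cube's 8 vertices into one
    index in [0, 255]; a vertex below the threshold is OUTSIDE (bit = 1)."""
    index = 0
    for bit, value in enumerate(vertex_values):
        if value < threshold:
            index |= 1 << bit
    return index

def edge_vertex(p0, p1, v0, v1, threshold):
    """Linearly interpolate where the iso-surface crosses edge p0-p1,
    given the scalar values v0 and v1 at the edge endpoints."""
    p0 = np.asarray(p0, dtype=float)
    p1 = np.asarray(p1, dtype=float)
    t = (threshold - v0) / (v1 - v0)
    return p0 + t * (p1 - p0)
```

The index returned by `cube_index` would be used to look up the edge numbers of the triangle vertices in the length-256 table, and `edge_vertex` then places each triangle vertex on its edge.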
Volume rendering is a technique for generating a two-dimensional image on the screen directly from a three-dimensional data field. A digital image corresponds to a two-dimensional array whose data elements, describing color and intensity, are called pixels. Similarly, a three-dimensional data field can be described by a three-dimensional array of corresponding values, whose elements are called voxels. Like the two-dimensional raster of a digital image, the volume data field can be seen as a three-dimensional raster. A typical three-dimensional data field is the medical-image three-dimensional data field: a Computed Tomography (CT) or Magnetic Resonance Imaging (MRI) scan produces a series of medical image slices, which are regularized according to position and angle information to form a regular data field composed of a uniform grid in three-dimensional space, where each grid node is a voxel describing attribute information such as the density of the object. The greatest advantage of volume rendering is that the internal structure of an object can be explored and irregularly shaped objects, such as muscles, can be depicted; surface rendering is weaker in these respects, but volume rendering is somewhat slower. To improve rendering speed, a surface rendering method can be adopted: the surface of the three-dimensional target structure is first reconstructed from the three-dimensional data, i.e., the object surface is reconstructed from the segmentation result and contour lines, and a realistic three-dimensional entity is then generated using a suitable illumination model and texture mapping.
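Two of the simplest compositing schemes used when generating a 2D image directly from a 3D data field can be sketched as follows; the choice of projection axis and the opacity values are illustrative assumptions:

```python
import numpy as np

def mip_render(volume, axis=0):
    """Maximum intensity projection: project a 3D data field directly to a
    2D image by keeping the largest voxel value along each axis-aligned ray."""
    return np.asarray(volume).max(axis=axis)

def composite_ray(samples, alphas):
    """Front-to-back alpha compositing along one ray of voxel samples,
    accumulating emitted intensity attenuated by the opacity passed so far."""
    color, transparency = 0.0, 1.0
    for c, a in zip(samples, alphas):
        color += transparency * a * c
        transparency *= 1.0 - a
        if transparency < 1e-4:  # early ray termination
            break
    return color
```

A full ray-casting renderer would march rays through the regularized grid and apply `composite_ray` per pixel; the sketch only shows the compositing arithmetic.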
Referring to fig. 8, in order to perform quantitative evaluation on the position of the needle more fully so as to guide the physician to perform needle insertion more accurately, the ablation simulation method provided in this embodiment may further include, on the basis of any of the above embodiments:
s801, determining a simulated ablation range of the simulated ablation needle.
The ablation ranges of different types of ablation needles often differ. For example, the ablation range of one ablation needle may be a sphere with a radius of 10mm, while that of another may be an ellipsoid with a major diameter of 20mm and a minor diameter of 15mm. Therefore, in order to determine the proportion of the simulated lesion covered by the simulated ablation range, the simulated ablation range of the simulated ablation needle needs to be determined first.
The ablation parameters of the simulated ablation needle can be obtained, and then the simulated ablation range of the simulated ablation needle can be determined according to the ablation parameters. The ablation parameters can include an energy radiation range and a heat source position, wherein the energy radiation range can be used for indicating the shape and size of the ablation range, and the heat source position can be used for indicating the distance from the center of the heat source to the needle tip. Specifically, the ablation parameters of the simulated ablation needle can be obtained according to the type of the simulated ablation needle and the mapping relationship between the preset type and the ablation parameters.
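The preset mapping from needle type to ablation parameters can be as simple as a lookup table; the needle-type names and parameter values below are hypothetical placeholders, not data from this disclosure:

```python
# Hypothetical preset mapping: needle type -> ablation parameters.
# "radiation_range" holds the ellipsoid semi-axes (mm), and
# "heat_source_offset" the distance from the heat-source center to the needle tip (mm).
ABLATION_PARAMS = {
    "sphere_10mm": {"radiation_range": (10.0, 10.0, 10.0), "heat_source_offset": 5.0},
    "ellipsoid_20x15mm": {"radiation_range": (10.0, 7.5, 7.5), "heat_source_offset": 8.0},
}

def get_ablation_params(needle_type):
    """Return the preset ablation parameters for a simulated needle type."""
    return ABLATION_PARAMS[needle_type]
```

The simulated ablation range is then derived from the looked-up energy radiation range and heat source position.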
S802, calculating and displaying the proportion of the simulated focus covered by the simulated ablation range in real time according to the position information and the simulated ablation range of the simulated ablation needle and the position information and the size of the simulated focus.
After the simulated ablation range of the simulated ablation needle is determined, three-dimensional volume data corresponding to the simulated ablation range can be determined in the target space according to the position information of the simulated ablation needle and the simulated ablation range. And determining three-dimensional volume data corresponding to the simulated focus in the target space according to the position information and the size of the simulated focus. And finally, calculating the proportion of the simulated focus covered by the simulated ablation range according to the three-dimensional volume data corresponding to the simulated ablation range and the three-dimensional volume data corresponding to the simulated focus, and displaying the proportion on a display interface for a user to view.
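Once both regions are expressed as voxel masks in the same target space, the coverage proportion reduces to a voxel count; a minimal sketch (mask construction from the position information is assumed to happen elsewhere):

```python
import numpy as np

def coverage_ratio(ablation_mask, lesion_mask):
    """Fraction of the simulated lesion's voxels covered by the simulated
    ablation range; both inputs are boolean volumes in the target space."""
    lesion_voxels = lesion_mask.sum()
    if lesion_voxels == 0:
        return 0.0
    covered = np.logical_and(ablation_mask, lesion_mask).sum()
    return covered / lesion_voxels
```

The resulting ratio (e.g. 0.85) is what would be displayed and updated in real time as the needle position changes.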
Optionally, when the three-dimensional visualization display is performed on the simulated ablation needle and the simulated lesion in the three-dimensional image displayed in the fusion manner, the three-dimensional visualization display may be performed on the simulated ablation range of the simulated ablation needle in the three-dimensional image displayed in the fusion manner, so that the user may visually check the simulated ablation range of the simulated ablation needle.
On the basis of calculating and displaying in real time the relative position between the simulated ablation needle and the simulated lesion, the ablation simulation method provided by this embodiment further calculates and displays in real time the proportion of the simulated lesion covered by the simulated ablation range. The relative position and the coverage proportion together enable a more comprehensive quantitative evaluation of the needle position, and more intuitively guide the physician to adjust the needle position in the direction that increases the coverage proportion and brings the simulated ablation needle closer to the simulated lesion, making it easier to find a more suitable needle position.
When the tumor is small, a single needle is used for ablation, and the simulated ablation range of the simulated ablation needle can be obtained accurately from the ablation parameters. When the tumor is large, multiple needles are needed, and because of heat conduction in the tissue between the needles, the simulated ablation range of multiple simulated ablation needles is no longer a simple superposition of the individual needles' simulated ablation ranges. Accurately determining the simulated ablation range of multiple simulated ablation needles is a prerequisite for obtaining an accurate coverage proportion. How to determine the simulated ablation range in the case of multiple simulated ablation needles is described in detail below.
In an alternative embodiment, if there are a plurality of simulated ablation needles, determining a simulated ablation range of the simulated ablation needle according to the ablation parameters may include:
determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameter corresponding to each simulated ablation needle to obtain a plurality of simulated energy distributions corresponding to the plurality of simulated ablation needles;
determining a combined energy value in a combined simulated energy distribution range corresponding to the plurality of simulated ablation needles based on the plurality of simulated energy distributions;
determining an energy distribution area with a combined energy value larger than a first energy threshold value from the combined simulation energy distribution range;
and determining simulated ablation ranges corresponding to the plurality of simulated ablation needles according to the energy distribution areas of which the combined energy value is greater than the first energy threshold value.
Determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameter corresponding to each simulated ablation needle may include:
determining a three-dimensional coordinate point in the energy radiation range according to the energy radiation range of each simulated ablation needle and the heat source position of each simulated ablation needle;
calculating the energy values of the three-dimensional coordinate points and determining a target three-dimensional coordinate point with the energy value larger than a second energy threshold value from the energy values;
determining the target three-dimensional coordinate points as the simulated ablation range of each simulated ablation needle;
and determining the simulated energy distribution corresponding to each simulated ablation needle according to the simulated ablation range of each simulated ablation needle and the energy value of the target three-dimensional coordinate point.
The energy distribution of the simulated ablation range of a single simulated ablation needle is calculated first. If the shape of the simulated ablation range set by the user is an ellipsoid, the energy distribution of the ablation needle can be calculated from the long and short diameters of the simulated ablation ellipsoid. The formula is as follows:

(x-x1)²/A1² + (y-y1)²/B1² + (z-z1)²/C1² ≤ 1

In the formula, x, y and z are the coordinates of a point in three-dimensional space; x1, y1 and z1 are the energy center, i.e., the heat source position; and A1, B1 and C1 are the energy radiation ranges of the simulated ablation ellipsoid. Transforming the formula yields:

(x-x1)²·B1²·C1² + (y-y1)²·A1²·C1² + (z-z1)²·A1²·B1² ≤ A1²·B1²·C1²

Setting

F1 = (x-x1)²·B1²·C1² + (y-y1)²·A1²·C1² + (z-z1)²·A1²·B1²

G1 = A1²·B1²·C1²

if

F1 ≤ G1

the point is within the radiation range, and the energy of the point is then calculated as

energy1 = 1 - sqrt(F1/G1)

From this equation, when a coordinate point is at the center of the ellipsoid the energy takes its maximum value 1, and when the point is at the edge of the ellipsoid the energy takes its minimum value 0. The energy1 of each coordinate point is compared with the energy threshold energyThre: if energy1 > energyThre, the energy at this point satisfies the ablation condition and the point can be included in the ablation range. energyThre can be obtained from the ablation range set by the user, i.e., it is the energy threshold at which the boundary of the user-set ablation range is just reached under the current energy radiation.
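The per-point energy test above can be sketched directly; here `center` plays the role of (x1, y1, z1), `radii` of (A1, B1, C1), and the normalized squared radius equals the ratio of the two sides of the transformed inequality (the function names are illustrative):

```python
import numpy as np

def point_energy(point, center, radii):
    """Energy of one coordinate point under a single simulated needle:
    1 at the heat-source center, falling to 0 at the edge of the
    radiation ellipsoid, and 0 outside it."""
    d = (np.asarray(point, float) - np.asarray(center, float)) / np.asarray(radii, float)
    r2 = float(np.dot(d, d))  # normalized squared radius; <= 1 inside the ellipsoid
    if r2 > 1.0:
        return 0.0
    return 1.0 - np.sqrt(r2)

def in_simulated_range(point, center, radii, energy_thre):
    """A point belongs to the simulated ablation range when its energy
    exceeds the threshold (energyThre) derived from the user-set range."""
    return point_energy(point, center, radii) > energy_thre
```

Iterating `in_simulated_range` over the three-dimensional coordinate points of the radiation range yields the target coordinate points that form the single-needle simulated ablation range.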
Next, the energy superposition between multiple ablation needles is calculated. With the above scheme, the energy distribution of a single ablation needle is obtained. If there are two ablation needles, the energy distribution of ablation needle 1 is

energy1 = 1 - sqrt(F1/G1)

where F1 = (x-x1)²·B1²·C1² + (y-y1)²·A1²·C1² + (z-z1)²·A1²·B1² and G1 = A1²·B1²·C1², and the energy distribution of ablation needle 2, with energy center (x2, y2, z2) and energy radiation ranges A2, B2, C2, is

energy2 = 1 - sqrt(F2/G2)

with F2 and G2 formed in the same way from the parameters of needle 2. For coordinate points within the radiation range of both needles, the total energy under the radiation of ablation needle 1 and ablation needle 2 is

energy = energy1 + energy2
Based on the fixed energy threshold, the simulated ablation range of the two needles in combined ablation can be obtained. The ablation range can be rendered with a surface rendering method to realize three-dimensional visualization of the ablation range.
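The superposition-and-threshold step for multiple needles can be sketched over a grid of candidate points; the per-needle ellipsoid model follows the single-needle formula above, while the function and parameter names are illustrative:

```python
import numpy as np

def combined_ablation_mask(points, needles, energy_thre):
    """For each point, sum the ellipsoidal energy contributions of all
    needles and keep the points whose combined energy exceeds the
    threshold. `points` is (N, 3); each needle is (center, radii)."""
    points = np.asarray(points, dtype=float)
    total = np.zeros(len(points))
    for center, radii in needles:
        d = (points - np.asarray(center, float)) / np.asarray(radii, float)
        r2 = np.einsum('ij,ij->i', d, d)  # normalized squared radius per point
        total += np.where(r2 <= 1.0, 1.0 - np.sqrt(r2), 0.0)
    return total > energy_thre
```

Points between two nearby needles can exceed the threshold through the summed energies even when neither needle alone would cover them, which is exactly why the combined range is not a simple union of the single-needle ranges.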
When a lesion is ablated, there may be dangerous tissue around it. For example, when a tumor in the liver is ablated, the dangerous tissues that may be encountered include the large blood vessels around the tumor, the gallbladder, the kidney, the heart, and so on. Touching dangerous tissue during ablation may pose risks that are difficult to predict. Therefore, during ablation, the ablation needle should not only be as close to the center of the lesion as possible, but the ablation range of the ablation needle should also be as far from the dangerous tissue as possible. In order to improve the safety of ablation, the ablation simulation method provided by this embodiment may further include, on the basis of any of the above embodiments: determining the distance between the boundary of the simulated ablation range and the boundary of the dangerous tissue and displaying it on the display interface.
When dangerous tissue is taken into account in ablation simulation, since damage to any part of the dangerous tissue may have serious consequences, the distance between the center of the dangerous tissue and the simulated ablation range should not be used as the evaluation criterion; instead, the distance between the boundary of the simulated ablation range and the boundary of the dangerous tissue should be used, and the larger this distance, the better.
The boundary position of the simulated ablation range is recorded as T_sec_probe, and the boundary position of the dangerous tissue as T_sec_danger. The distance D between the boundary of the simulated ablation range and the boundary of the dangerous tissue can be quantitatively evaluated by the following expression:
D = 1 / min ||t_p - t_d||  (t_p ∈ T_sec_probe, t_d ∈ T_sec_danger), if T_sec_probe and T_sec_danger do not intersect; D &lt; 0 otherwise
Here, whether T_sec_danger and T_sec_probe intersect is determined by checking whether any coordinates of the simulated ablation range coincide with coordinates of the dangerous tissue. If coordinates coincide, T_sec_danger and T_sec_probe intersect, and the simulated ablation range of the simulated ablation needle would damage the dangerous tissue; otherwise, T_sec_danger and T_sec_probe do not intersect.
In the above expression, when the dangerous tissue and the simulated ablation range do not intersect, the farther apart they are, the closer the final value of D is to 0, indicating a higher evaluation; when the dangerous tissue intersects the simulated ablation range, the value of D is negative, indicating that the dangerous tissue has been touched, which is relatively dangerous.
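Since the exact expression for D appears only as an image in the source, the following sketch is an assumed realization that reproduces the behavior described above (D approaches 0 from above as the disjoint boundaries move apart, and D is negative when coordinates coincide); the point-set representation and the fixed value -1 for intersection are illustrative:

```python
import numpy as np

def boundary_distance_score(probe_boundary, danger_boundary):
    """Evaluate the needle position from the minimum distance between the
    boundary point sets T_sec_probe and T_sec_danger."""
    probe = np.asarray(probe_boundary, dtype=float)
    danger = np.asarray(danger_boundary, dtype=float)
    diff = probe[:, None, :] - danger[None, :, :]     # all pairwise offsets
    d_min = np.sqrt((diff ** 2).sum(axis=2)).min()    # minimum boundary distance
    if d_min == 0.0:   # boundaries intersect: dangerous tissue touched
        return -1.0
    return 1.0 / d_min  # farther apart -> score approaches 0 (safer)
```

For large boundary sets, a spatial index (e.g. a k-d tree) would replace the all-pairs distance computation; the brute-force form is kept here for clarity.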
When the simulated ablation range intersects the dangerous tissue, the dangerous tissue may be damaged, which is dangerous. In this case, warning information can be output to promptly alert the user that dangerous tissue has been touched, so that the user can adjust the insertion position of the simulated ablation needle in time and safety is ensured.
Reference is made herein to various exemplary embodiments. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary embodiments without departing from the scope hereof. For example, the various operational steps, as well as the components used to perform the operational steps, may be implemented in differing ways depending upon the particular application or consideration of any number of cost functions associated with operation of the system (e.g., one or more steps may be deleted, modified or incorporated into other steps).
Additionally, as will be appreciated by one skilled in the art, the principles herein may be reflected in a computer program product on a computer readable storage medium having computer readable program code pre-loaded thereon. Any tangible, non-transitory computer-readable storage medium may be used, including magnetic storage devices (hard disks, floppy disks, etc.), optical storage devices (CD-ROMs, DVDs, blu Ray disks, etc.), flash memory, and/or the like. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including means for implementing the function specified. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified.
While the principles herein have been illustrated in various embodiments, many modifications of structure, arrangement, proportions, elements, materials, and components particularly adapted to specific environments and operative requirements may be employed without departing from the principles and scope of the present disclosure. The above modifications and other changes or modifications are intended to be included within the scope of this document.
The foregoing detailed description has been described with reference to various embodiments. However, one skilled in the art will recognize that various modifications and changes may be made without departing from the scope of the present disclosure. Accordingly, the disclosure is to be considered in all respects as illustrative and not restrictive, and all such modifications are intended to be included within the scope thereof. Also, advantages, other advantages, and solutions to problems have been described above with regard to various embodiments. However, the benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or element of any or all the claims. As used herein, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, system, article, or apparatus. Furthermore, the term "coupled," and any other variation thereof, as used herein, refers to a physical connection, an electrical connection, a magnetic connection, an optical connection, a communicative connection, a functional connection, and/or any other connection.
The present invention has been described in terms of specific examples, which are provided to aid in understanding the invention and are not intended to be limiting. For a person skilled in the art to which the invention pertains, several simple deductions, modifications or substitutions may be made according to the idea of the invention.

Claims (13)

1. An ablation simulation method, comprising:
transmitting ultrasonic waves to target tissues containing focuses through an ultrasonic probe, and generating ultrasonic images of the target tissues according to received ultrasonic echo data;
acquiring three-dimensional image data of the target tissue;
determining a spatial transformation matrix between an ultrasonic image space where the ultrasonic image is located and a three-dimensional image space where the three-dimensional image data is located;
determining position information of a simulated ablation needle and position information of a simulated focus in a target space according to the spatial transformation matrix, wherein the target space is the ultrasonic image space or the three-dimensional image space;
calculating and displaying the relative position between the simulated ablation needle and the simulated focus in real time according to the position information of the simulated ablation needle and the position information of the simulated focus;
determining a simulated ablation range of the simulated ablation needle;
calculating and displaying the proportion of the simulated lesion covered by the simulated ablation range in real time according to the position information and the simulated ablation range of the simulated ablation needle and the position information and the size of the simulated lesion;
wherein the determining a simulated ablation range of the simulated ablation needle comprises: acquiring ablation parameters of the simulated ablation needle, wherein the ablation parameters comprise an energy radiation range and a heat source position; determining a simulated ablation range of the simulated ablation needle according to the ablation parameters;
if a plurality of simulated ablation needles are provided, determining the simulated ablation range of the simulated ablation needle according to the ablation parameters comprises: determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameter corresponding to each simulated ablation needle to obtain a plurality of simulated energy distributions corresponding to the plurality of simulated ablation needles; determining a combined energy value in a combined simulated energy distribution range corresponding to the plurality of simulated ablation needles based on the plurality of simulated energy distributions; determining an energy distribution area with a combined energy value larger than a first energy threshold value from the combined simulated energy distribution range; and determining simulated ablation ranges corresponding to the plurality of simulated ablation needles according to the energy distribution area of which the combined energy value is greater than the first energy threshold value.
2. The method of claim 1, wherein determining a spatial transformation matrix between an ultrasound image space in which the ultrasound image is located and a three-dimensional image space in which the three-dimensional image data is located comprises:
acquiring a first transformation matrix of the ultrasonic image space to the position sensor space;
acquiring a second transformation matrix from the position sensor space to a world coordinate space;
determining a third transformation matrix from the world coordinate space to the three-dimensional image space;
determining the spatial transformation matrix from the first transformation matrix, the second transformation matrix, and the third transformation matrix.
3. The method of claim 2, wherein the determining a third transformation matrix of the world coordinate space to the three-dimensional image space comprises:
registering the ultrasonic image and the three-dimensional image data based on an in-vitro marker or based on image information to obtain a registration matrix;
determining the third transformation matrix from the registration matrix.
4. The method of claim 1, wherein if there are a plurality of simulated ablation needles of the same type, the determining the position information of the simulated ablation needle comprises: determining position information of the central positions of the plurality of simulated ablation needles; and if there are a plurality of simulated ablation needles whose types are not completely the same, the determining the position information of the simulated ablation needle comprises: determining ablation centers of the plurality of simulated ablation needles according to the simulated ablation range of each simulated ablation needle, and determining position information of the ablation centers of the plurality of simulated ablation needles.
5. The method of claim 1, wherein determining position information of a simulated ablation needle and position information of a simulated lesion center in a target space according to the spatial transformation matrix comprises:
acquiring a needle inserting angle and a needle inserting depth of the simulated ablation needle;
determining the position information of the simulated ablation needle in the ultrasonic image space according to the needle inserting angle and the needle inserting depth;
acquiring the position information of the simulated focus center in the three-dimensional image space;
when the target space is the ultrasonic image space, converting the position information of the simulated focus center in the three-dimensional image space into the position information of the simulated focus center in the ultrasonic image space according to the spatial transformation matrix,
or,
when the target space is the three-dimensional image space, converting the position information of the simulated ablation needle in the ultrasonic image space into the position information of the simulated ablation needle in the three-dimensional image space according to the space transformation matrix.
6. The method of claim 1, wherein said acquiring three-dimensional image data of said target tissue comprises:
acquiring at least one of computed tomography data, magnetic resonance imaging data, and three-dimensional ultrasound data of the target tissue,
or,
and acquiring three-dimensional image data of the target tissue by performing three-dimensional reconstruction on the ultrasonic image of the target tissue acquired by the ultrasonic probe.
7. The method of claim 1, wherein the method further comprises:
fusing and displaying the ultrasonic image and the three-dimensional image data;
and carrying out three-dimensional visual display on the simulated ablation needle and the simulated focus in the three-dimensional image displayed in a fusion mode.
8. The method of claim 7, wherein said three-dimensionally visualizing said simulated ablation needle and said simulated lesion comprises:
performing three-dimensional visual display on the simulated ablation needle and the simulated lesion from a target angle through a window, wherein the target angle is determined and/or adjusted according to operation information input by a user;
or,
and performing three-dimensional visual display on the simulated ablation needle and the simulated lesion from different angles through a plurality of windows, wherein the displayed angle in each window is preset or determined and/or adjusted according to operation information input by a user.
9. The method of claim 1, wherein determining the simulated energy distribution for each simulated ablation needle based on the ablation parameters for each simulated ablation needle comprises:
determining a three-dimensional coordinate point in the energy radiation range according to the energy radiation range of each simulated ablation needle and the heat source position of each simulated ablation needle;
calculating the energy value of the three-dimensional coordinate point and determining a target three-dimensional coordinate point with the energy value larger than a second energy threshold value;
determining the target three-dimensional coordinate points as the simulated ablation range of each simulated ablation needle;
and determining the simulated energy distribution corresponding to each simulated ablation needle according to the simulated ablation range of each simulated ablation needle and the energy value of the target three-dimensional coordinate point.
10. The method of claim 1, wherein the method further comprises:
determining and displaying on a display interface a distance between a boundary of the simulated ablation volume and a boundary of a hazardous tissue.
11. The method of claim 1, wherein the method further comprises:
and when the simulated ablation range intersects with the dangerous tissue, outputting warning information, wherein the warning information is used for prompting a user to touch the dangerous tissue.
12. An ultrasound imaging apparatus, comprising:
an ultrasonic probe;
the transmitting circuit is used for outputting a corresponding transmitting sequence to the ultrasonic probe according to a set mode so as to control the ultrasonic probe to transmit corresponding ultrasonic waves;
the receiving circuit is used for receiving the ultrasonic echo signal output by the ultrasonic probe and outputting ultrasonic echo data;
a display for outputting visual information;
a processor for performing the ablation simulation method of any of claims 1-11.
13. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, are configured to implement the ablation simulation method of any of claims 1-11.
CN202211091789.7A 2022-01-04 2022-09-07 Ablation simulation method and device Pending CN115317128A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210001938 2022-01-04
CN202210001938X 2022-01-04

Publications (1)

Publication Number Publication Date
CN115317128A true CN115317128A (en) 2022-11-11

Family

ID=83930999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211091789.7A Pending CN115317128A (en) 2022-01-04 2022-09-07 Ablation simulation method and device

Country Status (1)

Country Link
CN (1) CN115317128A (en)


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115841873A (en) * 2022-11-21 2023-03-24 天津市鹰泰利安康医疗科技有限责任公司 Ablation simulation method and system
CN115841873B (en) * 2022-11-21 2024-01-16 天津市鹰泰利安康医疗科技有限责任公司 Ablation simulation method and system
CN115778545A (en) * 2022-12-22 2023-03-14 天津市鹰泰利安康医疗科技有限责任公司 Ablation positioning method and system
CN115778545B (en) * 2022-12-22 2023-11-14 天津市鹰泰利安康医疗科技有限责任公司 Ablation positioning method and system
CN116531089A (en) * 2023-07-06 2023-08-04 中国人民解放军中部战区总医院 Image-enhancement-based blocking anesthesia ultrasonic guidance data processing method
CN116531089B (en) * 2023-07-06 2023-10-20 中国人民解放军中部战区总医院 Image-enhancement-based blocking anesthesia ultrasonic guidance data processing method

Similar Documents

Publication Publication Date Title
US20200281662A1 (en) Ultrasound system and method for planning ablation
EP2997924B1 (en) Surgical navigation system
US20170095226A1 (en) Ultrasonic diagnostic apparatus and medical image diagnostic apparatus
EP2666430B1 (en) Navigation system
AU2006201451B2 (en) Registration of ultrasound data with pre-acquired image
CN103445866B (en) Surgical planning system and navigation system
AU2006201644B2 (en) Registration of electro-anatomical map with pre-acquired imaging using ultrasound
CN115317128A (en) Ablation simulation method and device
EP2064991B1 (en) Flashlight view of an anatomical structure
JP5622995B2 (en) Display of catheter tip using beam direction for ultrasound system
AU2006201645A1 (en) Display of two-dimensional ultrasound fan
JP2006305359A (en) Software product for three-dimensional cardiac imaging using ultrasound contour reconstruction
KR102439769B1 (en) Medical imaging apparatus and operating method for the same
CN115811961A (en) Three-dimensional display method and ultrasonic imaging system
WO2014031531A1 (en) System and method for image guided medical procedures
Singla et al. Intra‐operative ultrasound‐based augmented reality guidance for laparoscopic surgery
JP7171168B2 (en) Medical image diagnosis device and medical image processing device
CN107835661A (en) Ultrasonoscopy processing system and method and its device, supersonic diagnostic appts
CN112043377B (en) Ultrasound visual field simulation auxiliary ablation path planning method and system for any section of CT
CN115530973A (en) Ablation visualization method and system
WO2015087203A1 (en) Imaging systems and methods for monitoring treatment of tissue lesions
CN115998334A (en) Ablation effect display method and ultrasonic imaging system
EP3703012A1 (en) Map of body cavity
US20240122653A1 (en) Parametric contour map visualization for needle guide system
CN115998423A (en) Display method for simulated ablation and ultrasonic imaging system

Legal Events

Date Code Title Description
PB01 Publication